kayoyin / transformer-slt

License: Apache-2.0
Sign Language Translation with Transformers (COLING'2020, ECCV'20 SLRTP Workshop)

Programming Languages

ASL, Python, Perl, Emacs Lisp, Shell, Smalltalk

Projects that are alternatives to or similar to transformer-slt

Sockeye
Sequence-to-sequence framework with a focus on Neural Machine Translation based on Apache MXNet
Stars: ✭ 990 (+976.09%)
Mutual labels:  transformer, neural-machine-translation
Nmt Keras
Neural Machine Translation with Keras
Stars: ✭ 501 (+444.57%)
Mutual labels:  transformer, neural-machine-translation
transformer
Neutron: a PyTorch-based implementation of the Transformer and its variants.
Stars: ✭ 60 (-34.78%)
Mutual labels:  transformer, neural-machine-translation
zero
Zero -- A neural machine translation system
Stars: ✭ 121 (+31.52%)
Mutual labels:  transformer, neural-machine-translation
Pytorch Seq2seq
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+3615.22%)
Mutual labels:  transformer, neural-machine-translation
Transformer Dynet
An Implementation of Transformer (Attention Is All You Need) in DyNet
Stars: ✭ 57 (-38.04%)
Mutual labels:  transformer, neural-machine-translation
Joeynmt
Minimalist NMT for educational purposes
Stars: ✭ 420 (+356.52%)
Mutual labels:  transformer, neural-machine-translation
Njunmt Tf
An open-source neural machine translation system developed by the Natural Language Processing Group at Nanjing University.
Stars: ✭ 97 (+5.43%)
Mutual labels:  transformer, neural-machine-translation
Neural-Machine-Translation
Several basic neural machine translation models implemented in PyTorch & TensorFlow
Stars: ✭ 29 (-68.48%)
Mutual labels:  transformer, neural-machine-translation
NiuTrans.NMT
A fast neural machine translation system, developed in C++ and relying on NiuTensor for fast tensor APIs.
Stars: ✭ 112 (+21.74%)
Mutual labels:  transformer, neural-machine-translation
towhee
Towhee is a framework dedicated to making neural data processing pipelines simple and fast.
Stars: ✭ 821 (+792.39%)
Mutual labels:  transformer
image-classification
A collection of SOTA Image Classification Models in PyTorch
Stars: ✭ 70 (-23.91%)
Mutual labels:  transformer
graphtrans
Representing Long-Range Context for Graph Neural Networks with Global Attention
Stars: ✭ 45 (-51.09%)
Mutual labels:  transformer
laravel5-hal-json
Laravel 5 HAL+JSON API Transformer Package
Stars: ✭ 15 (-83.7%)
Mutual labels:  transformer
transform-graphql
⚙️ Transformer function to transform GraphQL directives, e.g. to create model CRUD directives.
Stars: ✭ 23 (-75%)
Mutual labels:  transformer
speech-transformer
A Transformer implementation specialized for speech recognition tasks, written in PyTorch.
Stars: ✭ 40 (-56.52%)
Mutual labels:  transformer
YOLOv5-Lite
🍅🍅🍅YOLOv5-Lite: lighter, faster, and easier to deploy. Evolved from YOLOv5; the model is only 930+ KB (int8) or 1.7 MB (fp16) and reaches 10+ FPS on a Raspberry Pi 4B at a 320×320 input size.
Stars: ✭ 1,230 (+1236.96%)
Mutual labels:  transformer
TransPose
PyTorch Implementation for "TransPose: Keypoint localization via Transformer", ICCV 2021.
Stars: ✭ 250 (+171.74%)
Mutual labels:  transformer
transformer
A PyTorch Implementation of "Attention Is All You Need"
Stars: ✭ 28 (-69.57%)
Mutual labels:  transformer
visualization
A collection of visualization functions
Stars: ✭ 189 (+105.43%)
Mutual labels:  transformer

transformer-slt

This repository gathers data and code supporting the experiments in the paper Better Sign Language Translation with STMC-Transformer.

Installation

This code is based on OpenNMT-py v1.0.0 and requires all of its dependencies (including torch==1.6.0). NLTK is additionally required for the NMT evaluation metrics.

The recommended way to install is shown below:

# create a new virtual environment
virtualenv --python=python3 venv
source venv/bin/activate

# clone the repo
git clone https://github.com/kayoyin/transformer-slt.git
cd transformer-slt

# install python dependencies
pip install -r requirements.txt

# install OpenNMT-py
python setup.py install
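
To verify the installation, you can run a quick sanity check (an illustrative snippet, not part of the repo; it assumes the steps above put torch and OpenNMT-py's onmt package on the path):

# sanity_check.py -- confirm the pinned dependencies resolved correctly
import torch
import onmt  # installed by "python setup.py install" above

print("torch:", torch.__version__)        # expected: 1.6.0
print("OpenNMT-py:", onmt.__version__)    # expected: 1.0.0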

Sample Usage

Data processing

onmt_preprocess -train_src data/phoenix2014T.train.gloss -train_tgt data/phoenix2014T.train.de -valid_src data/phoenix2014T.dev.gloss -valid_tgt data/phoenix2014T.dev.de -save_data data/dgs -lower 
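
onmt_preprocess expects each source/target pair to be parallel, one sentence per line. A quick line-alignment check before preprocessing (an illustrative snippet; check_parallel.py is not part of the repo):

# check_parallel.py -- confirm the gloss/German files are line-aligned
for split in ("train", "dev"):
    src = f"data/phoenix2014T.{split}.gloss"
    tgt = f"data/phoenix2014T.{split}.de"
    with open(src, encoding="utf-8") as f_src, open(tgt, encoding="utf-8") as f_tgt:
        n_src = sum(1 for _ in f_src)
        n_tgt = sum(1 for _ in f_tgt)
    assert n_src == n_tgt, f"{split}: {n_src} source vs {n_tgt} target lines"
    print(f"{split}: {n_src} aligned sentence pairs")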

Training

python train.py -data data/dgs -save_model model -keep_checkpoint 1 \
          -layers 2 -rnn_size 512 -word_vec_size 512 -transformer_ff 2048 -heads 8  \
          -encoder_type transformer -decoder_type transformer -position_encoding \
          -max_generator_batches 2 -dropout 0.1 \
          -early_stopping 3 -early_stopping_criteria accuracy ppl \
          -batch_size 2048 -accum_count 3 -batch_type tokens -normalization tokens \
          -optim adam -adam_beta2 0.998 -decay_method noam -warmup_steps 3000 -learning_rate 0.5 \
          -max_grad_norm 0 -param_init 0  -param_init_glorot \
          -label_smoothing 0.1 -valid_steps 100 -save_checkpoint_steps 100 \
          -world_size 1 -gpu_ranks 0
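
The -decay_method noam flag selects the inverse-square-root warmup schedule from "Attention Is All You Need". As a rough sketch of the effective learning rate under the flags above (assuming OpenNMT-py's standard Noam formula, with -rnn_size as the model dimension; this helper is illustrative, not repo code):

# noam_lr.py -- effective learning rate under -decay_method noam (illustrative)
def noam_lr(step, lr=0.5, model_dim=512, warmup=3000):
    # lr * model_dim^-0.5 * min(step^-0.5, step * warmup^-1.5)
    return lr * model_dim ** -0.5 * min(step ** -0.5, step * warmup ** -1.5)

for step in (100, 1000, 3000, 10000):
    print(step, round(noam_lr(step), 6))
# the rate peaks at step 3000 (the -warmup_steps value), then decays as 1/sqrt(step)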

Inference

python translate.py -model model [model2 model3 ...] -src data/phoenix2014T.test.gloss -output pred.txt -gpu 0 -replace_unk -beam_size 4
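
Passing several checkpoints to -model (the optional model2 model3 ... above) makes translate.py decode with an ensemble of the models; a single checkpoint works as well. The output pred.txt contains one hypothesis per line, which you can eyeball against the references (illustrative snippet, not part of the repo):

# peek_predictions.py -- print the first few hypotheses next to their references
with open("pred.txt", encoding="utf-8") as f_hyp, \
     open("data/phoenix2014T.test.de", encoding="utf-8") as f_ref:
    for i, (hyp, ref) in enumerate(zip(f_hyp, f_ref)):
        if i == 3:
            break
        print("HYP:", hyp.strip())
        print("REF:", ref.strip())
        print()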

Scoring

# BLEU-1,2,3,4
python tools/bleu.py 1 pred.txt data/phoenix2014T.test.de
python tools/bleu.py 2 pred.txt data/phoenix2014T.test.de
python tools/bleu.py 3 pred.txt data/phoenix2014T.test.de
python tools/bleu.py 4 pred.txt data/phoenix2014T.test.de

# ROUGE
python tools/rouge.py pred.txt data/phoenix2014T.test.de

# METEOR
python tools/meteor.py pred.txt data/phoenix2014T.test.de
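
The tools/* scripts wrap the NLTK metrics mentioned under Installation. If you prefer to score in-process, here is a minimal sketch of corpus-level BLEU-4 with NLTK directly (illustrative only, not the repo's implementation):

# bleu_nltk.py -- corpus BLEU over pred.txt (illustrative)
from nltk.translate.bleu_score import corpus_bleu, SmoothingFunction

def load(path):
    with open(path, encoding="utf-8") as f:
        return [line.strip().split() for line in f]

hyps = load("pred.txt")
refs = [[r] for r in load("data/phoenix2014T.test.de")]  # one reference per sentence

# uniform 4-gram weights give standard BLEU-4; adjust the weights for BLEU-1..3
score = corpus_bleu(refs, hyps, weights=(0.25, 0.25, 0.25, 0.25),
                    smoothing_function=SmoothingFunction().method3)
print(f"BLEU-4: {100 * score:.2f}")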

To do:

  • Add configurations & steps to recreate paper results

Reference

Please cite the paper below if you find the resources in this repository useful:

@inproceedings{yin-read-2020-better,
    title = "Better Sign Language Translation with {STMC}-Transformer",
    author = "Yin, Kayo  and
      Read, Jesse",
    booktitle = "Proceedings of the 28th International Conference on Computational Linguistics",
    month = dec,
    year = "2020",
    address = "Barcelona, Spain (Online)",
    publisher = "International Committee on Computational Linguistics",
    url = "https://www.aclweb.org/anthology/2020.coling-main.525",
    doi = "10.18653/v1/2020.coling-main.525",
    pages = "5975--5989",
    abstract = "Sign Language Translation (SLT) first uses a Sign Language Recognition (SLR) system to extract sign language glosses from videos. Then, a translation system generates spoken language translations from the sign language glosses. This paper focuses on the translation system and introduces the STMC-Transformer which improves on the current state-of-the-art by over 5 and 7 BLEU respectively on gloss-to-text and video-to-text translation of the PHOENIX-Weather 2014T dataset. On the ASLG-PC12 corpus, we report an increase of over 16 BLEU. We also demonstrate the problem in current methods that rely on gloss supervision. The video-to-text translation of our STMC-Transformer outperforms translation of GT glosses. This contradicts previous claims that GT gloss translation acts as an upper bound for SLT performance and reveals that glosses are an inefficient representation of sign language. For future SLT research, we therefore suggest an end-to-end training of the recognition and translation models, or using a different sign language annotation scheme.",
}