
tunz / Transformer Pytorch

License: MIT
Transformer implementation in PyTorch.

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives to or similar to Transformer Pytorch

Conformer
Implementation of the convolutional module from the Conformer paper, for use in Transformers
Stars: ✭ 103 (-30.87%)
Mutual labels:  transformer
Overlappredator
[CVPR 2021, Oral] PREDATOR: Registration of 3D Point Clouds with Low Overlap.
Stars: ✭ 106 (-28.86%)
Mutual labels:  transformer
Nlp research
NLP research: a TensorFlow-based NLP deep learning project supporting four tasks: text classification, sentence matching, sequence labeling, and text generation
Stars: ✭ 141 (-5.37%)
Mutual labels:  transformer
Ghostnet
CV backbones including GhostNet, TinyNet and TNT, developed by Huawei Noah's Ark Lab.
Stars: ✭ 1,744 (+1070.47%)
Mutual labels:  transformer
Kiss
Code for the paper "KISS: Keeping it Simple for Scene Text Recognition"
Stars: ✭ 108 (-27.52%)
Mutual labels:  transformer
Mmsegmentation
OpenMMLab Semantic Segmentation Toolbox and Benchmark.
Stars: ✭ 2,875 (+1829.53%)
Mutual labels:  transformer
Bert ocr.pytorch
Unofficial PyTorch implementation of 2D Attentional Irregular Scene Text Recognizer
Stars: ✭ 101 (-32.21%)
Mutual labels:  transformer
The Story Of Heads
This is a repository with the code for the ACL 2019 paper "Analyzing Multi-Head Self-Attention: Specialized Heads Do the Heavy Lifting, the Rest Can Be Pruned" and the paper "Analyzing Source and Target Contributions to NMT Predictions".
Stars: ✭ 146 (-2.01%)
Mutual labels:  transformer
Cjstoesm
A tool that can transform CommonJS to ESM
Stars: ✭ 109 (-26.85%)
Mutual labels:  transformer
Transformer In Generating Dialogue
An Implementation of 'Attention is all you need' with Chinese Corpus
Stars: ✭ 121 (-18.79%)
Mutual labels:  transformer
Multiturndialogzoo
Multi-turn dialogue baselines written in PyTorch
Stars: ✭ 106 (-28.86%)
Mutual labels:  transformer
Getting Started With Google Bert
Build and train state-of-the-art natural language processing models using BERT
Stars: ✭ 107 (-28.19%)
Mutual labels:  transformer
Symfony Jsonapi
JSON API Transformer Bundle for Symfony 2 and Symfony 3
Stars: ✭ 114 (-23.49%)
Mutual labels:  transformer
Protoc Gen Struct Transformer
Transformation functions generator for Protocol Buffers.
Stars: ✭ 105 (-29.53%)
Mutual labels:  transformer
Onnxt5
Summarization, translation, sentiment-analysis, text-generation and more at blazing speed using a T5 version implemented in ONNX.
Stars: ✭ 143 (-4.03%)
Mutual labels:  transformer
Esbuild Jest
A Jest transformer using esbuild
Stars: ✭ 100 (-32.89%)
Mutual labels:  transformer
Bertqa Attention On Steroids
BertQA - Attention on Steroids
Stars: ✭ 112 (-24.83%)
Mutual labels:  transformer
Tensorflowasr
End-to-end speech recognition models built on TensorFlow 2, with an RTF (real-time factor) of around 0.1 / Mandarin state-of-the-art automatic speech recognition in TensorFlow 2
Stars: ✭ 145 (-2.68%)
Mutual labels:  transformer
Tupe
Transformer with Untied Positional Encoding (TUPE). Code for the paper "Rethinking Positional Encoding in Language Pre-training". Improves existing models like BERT.
Stars: ✭ 143 (-4.03%)
Mutual labels:  transformer
Sightseq
Computer vision tools for fairseq, containing PyTorch implementation of text recognition and object detection
Stars: ✭ 116 (-22.15%)
Mutual labels:  transformer

Transformer

This is a PyTorch implementation of the Transformer model, in the spirit of tensorflow/tensor2tensor.
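
The core building block in any Transformer implementation is scaled dot-product attention. The snippet below is a minimal, illustrative PyTorch sketch of that operation, not the code from this repository; the tensor shapes and masking convention are assumptions.

import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v, mask=None):
    # q, k, v: (batch, heads, seq_len, d_k); mask is broadcastable to the score shape.
    d_k = q.size(-1)
    scores = torch.matmul(q, k.transpose(-2, -1)) / d_k ** 0.5  # (batch, heads, seq_q, seq_k)
    if mask is not None:
        # Positions where mask == 0 are excluded from attention.
        scores = scores.masked_fill(mask == 0, float('-inf'))
    weights = F.softmax(scores, dim=-1)
    return torch.matmul(weights, v)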

Prerequisite

I tested it with PyTorch 1.0.0 and Python 3.6.8.

It uses spaCy to tokenize text for the wmt32k dataset. So, if you want to run the wmt32k problem, which is a de/en translation dataset, you should first download the language models with the following commands.

$ pip install spacy
$ python -m spacy download en
$ python -m spacy download de
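
For reference, here is a minimal sketch of how those spaCy models can be used to tokenize a de/en sentence pair. It assumes the spaCy 2.x 'en'/'de' shortcut models installed by the commands above; it is not the repository's actual preprocessing code.

import spacy

# Load the shortcut models installed above (spaCy 2.x style).
nlp_en = spacy.load('en')
nlp_de = spacy.load('de')

def tokenize(nlp, text):
    # Return plain token strings, roughly what a wmt32k preprocessing
    # step would feed into a vocabulary builder.
    return [tok.text for tok in nlp.tokenizer(text)]

print(tokenize(nlp_en, "The quick brown fox jumps over the lazy dog."))
print(tokenize(nlp_de, "Der schnelle braune Fuchs springt über den faulen Hund."))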

Usage

  1. Train a model.
$ python train.py --problem wmt32k --output_dir ./output --data_dir ./wmt32k_data
or
$ python train.py --problem lm1b --output_dir ./output --data_dir ./lm1b_data

If you want to try fast_transformer, pass the --model argument shown below after installing tcop-pytorch.

$ python train.py --problem lm1b --output_dir ./output --data_dir ./lm1b_data --model fast_transformer
  2. You can translate a single sentence with the trained model (see the decoding sketch below).
$ python decoder.py --translate --data_dir ./wmt32k_data --model_dir ./output/last/models
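
As a rough illustration of what such a translation step involves, the sketch below shows generic greedy decoding with an encoder-decoder Transformer. The model interface (encode/decode) and the bos_id/eos_id token ids are hypothetical placeholders, not the actual API of decoder.py.

import torch

def greedy_translate(model, src_ids, bos_id, eos_id, max_len=100):
    # src_ids: (1, src_len) tensor of source token ids for one sentence.
    # model is assumed to expose encode()/decode() returning vocabulary logits.
    model.eval()
    with torch.no_grad():
        memory = model.encode(src_ids)           # encoder output, reused at every step
        out = torch.tensor([[bos_id]], dtype=torch.long)
        for _ in range(max_len):
            logits = model.decode(out, memory)   # (1, cur_len, vocab_size)
            next_id = logits[:, -1].argmax(dim=-1, keepdim=True)
            out = torch.cat([out, next_id], dim=1)
            if next_id.item() == eos_id:
                break
    return out[0, 1:].tolist()                   # drop the leading BOS token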