sooftware / transformer

License: Apache-2.0
A PyTorch Implementation of "Attention Is All You Need"

Programming Languages

python

Projects that are alternatives to or similar to transformer

Nlp Tutorials
Simple implementations of NLP models. Tutorials are written in Chinese on my website https://mofanpy.com
Stars: ✭ 394 (+1307.14%)
Mutual labels:  transformer, seq2seq, attention
Neural sp
End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (+1357.14%)
Mutual labels:  transformer, seq2seq, attention
Text Classification Models Pytorch
Implementation of State-of-the-art Text Classification Models in PyTorch
Stars: ✭ 379 (+1253.57%)
Mutual labels:  transformer, seq2seq, attention
transformer
Neutron: A pytorch based implementation of Transformer and its variants.
Stars: ✭ 60 (+114.29%)
Mutual labels:  transformer, seq2seq, attention-is-all-you-need
Sockeye
Sequence-to-sequence framework with a focus on Neural Machine Translation based on Apache MXNet
Stars: ✭ 990 (+3435.71%)
Mutual labels:  transformer, seq2seq, attention-is-all-you-need
Pytorch Seq2seq
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+12107.14%)
Mutual labels:  transformer, seq2seq, attention
kospeech
Open-Source Toolkit for End-to-End Korean Automatic Speech Recognition leveraging PyTorch and Hydra.
Stars: ✭ 456 (+1528.57%)
Mutual labels:  transformer, seq2seq, attention-is-all-you-need
Pytorch Original Transformer
My implementation of the original transformer model (Vaswani et al.). I've additionally included the playground.py file for visualizing concepts that are otherwise hard to grasp. IWSLT pretrained models are currently included.
Stars: ✭ 411 (+1367.86%)
Mutual labels:  transformer, attention, attention-is-all-you-need
Awesome Fast Attention
A list of efficient attention modules
Stars: ✭ 627 (+2139.29%)
Mutual labels:  transformer, attention, attention-is-all-you-need
Speech Transformer
A PyTorch implementation of Speech Transformer, an end-to-end ASR system using a Transformer network, on Mandarin Chinese.
Stars: ✭ 565 (+1917.86%)
Mutual labels:  transformer, attention, attention-is-all-you-need
Multiturndialogzoo
Multi-turn dialogue baselines written in PyTorch
Stars: ✭ 106 (+278.57%)
Mutual labels:  transformer, seq2seq, attention
Machine Translation
Stars: ✭ 51 (+82.14%)
Mutual labels:  transformer, seq2seq, attention-is-all-you-need
Kospeech
Open-Source Toolkit for End-to-End Korean Automatic Speech Recognition.
Stars: ✭ 190 (+578.57%)
Mutual labels:  transformer, seq2seq, attention-is-all-you-need
Self Attention Cv
Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Stars: ✭ 209 (+646.43%)
Mutual labels:  transformer, attention
Jddc solution 4th
4th-place solution in the 2018 JDDC competition
Stars: ✭ 235 (+739.29%)
Mutual labels:  transformer, attention
chinese ancient poetry
seq2seq attention tensorflow textrank context
Stars: ✭ 30 (+7.14%)
Mutual labels:  seq2seq, attention
Paddlenlp
NLP Core Library and Model Zoo based on PaddlePaddle 2.0
Stars: ✭ 212 (+657.14%)
Mutual labels:  transformer, seq2seq
seq2seq-pytorch
Sequence to Sequence Models in PyTorch
Stars: ✭ 41 (+46.43%)
Mutual labels:  transformer, attention
tensorflow-chatbot-chinese
Web chatbot | TensorFlow implementation of a seq2seq model with Bahdanau attention and Word2Vec pretrained embeddings
Stars: ✭ 50 (+78.57%)
Mutual labels:  seq2seq, attention
transformer
A simple TensorFlow implementation of the Transformer
Stars: ✭ 25 (-10.71%)
Mutual labels:  transformer, attention-is-all-you-need

transformer

A PyTorch implementation of the Transformer model from Attention Is All You Need.

Intro

This repository focuses on implementing the contents of the paper as faithfully as possible
while keeping the code readable. To improve readability, I designed the model structure to
mirror the blocks in the Transformer architecture figure from the paper.
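
For reference, the core operation the paper builds on is scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. The snippet below is a minimal, self-contained sketch of that formula in plain PyTorch, included here only as an illustration; it is not this repository's exact module.

import math
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(query, key, value, mask=None):
    """Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V (Vaswani et al., 2017)."""
    d_k = query.size(-1)
    # (batch, q_len, d_k) x (batch, d_k, k_len) -> (batch, q_len, k_len)
    score = torch.matmul(query, key.transpose(-2, -1)) / math.sqrt(d_k)
    if mask is not None:
        score = score.masked_fill(mask == 0, float('-inf'))  # hide padded/future positions
    attn = F.softmax(score, dim=-1)
    return torch.matmul(attn, value), attn

q = k = v = torch.randn(3, 10, 64)  # batch 3, length 10, d_model 64
context, attn = scaled_dot_product_attention(q, k, v)
print(context.shape, attn.shape)  # (3, 10, 64), (3, 10, 10)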

Installation

This project recommends Python 3.7 or higher. We also recommend creating a new virtual environment for this project (using virtualenv or conda).

Prerequisites

  • Numpy: pip install numpy (refer to the Numpy documentation if you have trouble installing it).
  • PyTorch: refer to the PyTorch website to install the version appropriate for your environment.
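
Once both are installed, a quick sanity check (an optional snippet, not part of the repository) confirms the environment:

import numpy as np
import torch

print('numpy:', np.__version__)
print('torch:', torch.__version__)
print('CUDA available:', torch.cuda.is_available())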

Install from source

Currently, installation is only supported from source using setuptools. Check out the source code and run the following command:

pip install -e .
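
If the installation succeeded, the package should be importable. A minimal smoke test, assuming the top-level Transformer class shown in the Usage section below:

from transformer import Transformer  # should succeed without errors
print(Transformer)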

Usage

import torch
import torch.nn as nn
from transformer import Transformer

BATCH_SIZE, SEQ_LENGTH, D_MODEL = 3, 10, 64

cuda = torch.cuda.is_available()
device = torch.device('cuda' if cuda else 'cpu')

# Dummy batch: zero-filled token ids padded to SEQ_LENGTH, plus each sequence's valid length
inputs = torch.zeros(BATCH_SIZE, SEQ_LENGTH).long().to(device)
input_lengths = torch.LongTensor([10, 9, 8])  # valid lengths must not exceed SEQ_LENGTH
targets = torch.LongTensor([[1, 3, 3, 3, 3, 3, 4, 5, 6, 2],
                            [1, 3, 3, 3, 3, 3, 4, 5, 2, 0],
                            [1, 3, 3, 3, 3, 3, 4, 2, 0, 0]]).to(device)  # padded with 0
target_lengths = torch.LongTensor([9, 8, 7])

model = nn.DataParallel(Transformer(num_input_embeddings=30, num_output_embeddings=50, 
                                    d_model=64, 
                                    num_encoder_layers=3, num_decoder_layers=3)).to(device)

# Forward propagate
outputs = model(inputs, input_lengths, targets, target_lengths)

# Inference
outputs = model(inputs, input_lengths)
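
Two practical follow-ups to the example above (standard PyTorch idioms, not behavior documented by this repository): the underlying Transformer can be reached through the DataParallel wrapper, and the output tensor's shape can be inspected directly.

# Unwrap nn.DataParallel to reach the underlying Transformer module
transformer = model.module

# Count trainable parameters (a common PyTorch idiom)
num_params = sum(p.numel() for p in transformer.parameters() if p.requires_grad)
print(f'{num_params:,} trainable parameters')

# Inspect the shape of the most recent forward pass
print(outputs.size())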

Troubleshooting and Contributing

If you have any questions, bug reports, or feature requests, please open an issue on GitHub or
contact [email protected].

I appreciate any kind of feedback or contribution. Feel free to proceed with small issues like bug fixes or documentation improvements. For major contributions and new features, please discuss them with the collaborators in the corresponding issues.

Code Style

I follow PEP 8 for code style. The docstring style is especially important, since the documentation is generated from docstrings.
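
For illustration, a docstring in that spirit (a hypothetical example; the repository's exact docstring convention may differ):

def forward(self, inputs, input_lengths):
    """
    Forward propagate a batch of padded input sequences.

    Args:
        inputs (torch.LongTensor): padded token id sequences of shape (batch, seq_length)
        input_lengths (torch.LongTensor): valid length of each sequence in the batch

    Returns:
        torch.Tensor: model outputs for the batch
    """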

Author

sooftware