
659 open-source projects that are alternatives to, or similar to, transformer

Neural sp
End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (+1357.14%)
Mutual labels:  transformer, seq2seq, attention
Pytorch Original Transformer
My implementation of the original Transformer model (Vaswani et al.). Also includes playground.py for visualizing concepts that are otherwise hard to grasp. Currently includes IWSLT pretrained models.
Stars: ✭ 411 (+1367.86%)
Pytorch Seq2seq
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+12107.14%)
Mutual labels:  transformer, seq2seq, attention
Speech Transformer
A PyTorch implementation of Speech Transformer, an end-to-end ASR system using the Transformer network, for Mandarin Chinese.
Stars: ✭ 565 (+1917.86%)
Awesome Fast Attention
A list of efficient attention modules
Stars: ✭ 627 (+2139.29%)
transformer
Neutron: a PyTorch-based implementation of the Transformer and its variants.
Stars: ✭ 60 (+114.29%)
Sockeye
A sequence-to-sequence framework based on Apache MXNet, with a focus on neural machine translation
Stars: ✭ 990 (+3435.71%)
Machine Translation
Stars: ✭ 51 (+82.14%)
Nlp Tutorials
Simple implementations of NLP models. Tutorials are written in Chinese on my website https://mofanpy.com
Stars: ✭ 394 (+1307.14%)
Mutual labels:  transformer, seq2seq, attention
Text Classification Models Pytorch
Implementation of State-of-the-art Text Classification Models in Pytorch
Stars: ✭ 379 (+1253.57%)
Mutual labels:  transformer, seq2seq, attention
Multiturndialogzoo
Multi-turn dialogue baselines written in PyTorch
Stars: ✭ 106 (+278.57%)
Mutual labels:  transformer, seq2seq, attention
Kospeech
Open-Source Toolkit for End-to-End Korean Automatic Speech Recognition.
Stars: ✭ 190 (+578.57%)
kospeech
Open-Source Toolkit for End-to-End Korean Automatic Speech Recognition leveraging PyTorch and Hydra.
Stars: ✭ 456 (+1528.57%)
Visual-Transformer-Paper-Summary
Summary of Transformer applications for computer vision tasks.
Stars: ✭ 51 (+82.14%)
Mutual labels:  transformer, attention
attention-is-all-you-need-paper
Implementation of Vaswani, Ashish, et al. "Attention is all you need." Advances in neural information processing systems. 2017.
Stars: ✭ 97 (+246.43%)
ai challenger 2018 sentiment analysis
Fine-grained Sentiment Analysis of User Reviews --- AI CHALLENGER 2018
Stars: ✭ 16 (-42.86%)
Mutual labels:  transformer, attention
Keras Transformer
Transformer implemented in Keras
Stars: ✭ 273 (+875%)
Mutual labels:  transformer, attention
Dab
Data Augmentation by Backtranslation (DAB) ヽ( •_-)ᕗ
Stars: ✭ 294 (+950%)
Transformer
A TensorFlow Implementation of the Transformer: Attention Is All You Need
Stars: ✭ 3,646 (+12921.43%)
Nmt Keras
Neural Machine Translation with Keras
Stars: ✭ 501 (+1689.29%)
Cell Detr
Official and maintained implementation of the paper Attention-Based Transformers for Instance Segmentation of Cells in Microstructures [BIBM 2020].
Stars: ✭ 26 (-7.14%)
Mutual labels:  transformer, attention
chatbot
A Chinese chatbot based on deep learning, with detailed tutorials and thoroughly commented code; a good choice for learning.
Stars: ✭ 94 (+235.71%)
Mutual labels:  seq2seq, attention
NLP-paper
🎨 NLP (natural language processing) tutorials 🎨 https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (-17.86%)
Mutual labels:  transformer, seq2seq
CrabNet
Predict materials properties using only the composition information!
Stars: ✭ 57 (+103.57%)
Mutual labels:  transformer, attention
Witwicky
Witwicky: An implementation of Transformer in PyTorch.
Stars: ✭ 21 (-25%)
Asr
Stars: ✭ 54 (+92.86%)
Mutual labels:  transformer, seq2seq
Deeplearning Nlp Models
A small, interpretable codebase containing the re-implementation of a few "deep" NLP models in PyTorch. Colab notebooks to run with GPUs. Models: word2vec, CNNs, transformer, gpt.
Stars: ✭ 64 (+128.57%)
Mutual labels:  transformer, attention
deep-molecular-optimization
Molecular optimization by capturing chemist’s intuition using the Seq2Seq with attention and the Transformer
Stars: ✭ 60 (+114.29%)
Mutual labels:  transformer, seq2seq
pytorch-transformer-chatbot
A simple chitchat chatbot using the Transformer API introduced in PyTorch v1.2
Stars: ✭ 44 (+57.14%)
Mutual labels:  transformer, seq2seq
classifier multi label seq2seq attention
Multi-label text classification with BERT and ALBERT, using seq2seq, attention, and beam search
Stars: ✭ 26 (-7.14%)
Mutual labels:  seq2seq, attention
Transformer Tensorflow
TensorFlow implementation of 'Attention Is All You Need (2017. 6)'
Stars: ✭ 319 (+1039.29%)
Mutual labels:  transformer, attention
Embedding
A summary of embedding model code and study notes
Stars: ✭ 25 (-10.71%)
Mutual labels:  transformer, seq2seq
Bertqa Attention On Steroids
BertQA - Attention on Steroids
Stars: ✭ 112 (+300%)
Mutual labels:  transformer, attention
Sightseq
Computer vision tools for fairseq, containing PyTorch implementation of text recognition and object detection
Stars: ✭ 116 (+314.29%)
Mutual labels:  transformer, attention
Transformers.jl
Julia Implementation of Transformer models
Stars: ✭ 173 (+517.86%)
Mutual labels:  transformer, attention
Seq2seqchatbots
A wrapper around tensor2tensor to flexibly train, interact, and generate data for neural chatbots.
Stars: ✭ 466 (+1564.29%)
Mutual labels:  transformer, seq2seq
Joeynmt
Minimalist NMT for educational purposes
Stars: ✭ 420 (+1400%)
Mutual labels:  transformer, seq2seq
Graphtransformer
Graph Transformer Architecture. Source code for "A Generalization of Transformer Networks to Graphs", DLG-AAAI'21.
Stars: ✭ 187 (+567.86%)
Mutual labels:  transformer, attention
Tensorflow Ml Nlp
Natural language processing with TensorFlow and machine learning (from logistic regression to a Transformer chatbot)
Stars: ✭ 176 (+528.57%)
Mutual labels:  transformer, seq2seq
Pytorch Transformer
A PyTorch implementation of "Attention Is All You Need"
Stars: ✭ 199 (+610.71%)
Self Attention Cv
Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Stars: ✭ 209 (+646.43%)
Mutual labels:  transformer, attention
visualization
A collection of visualization functions
Stars: ✭ 189 (+575%)
Mutual labels:  transformer, attention
Transformers
🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
Stars: ✭ 55,742 (+198978.57%)
Mutual labels:  transformer, seq2seq
tensorflow-ml-nlp-tf2
Hands-on materials for "Natural language processing with TensorFlow 2 and machine learning (from logistic regression to BERT and GPT-3)"
Stars: ✭ 245 (+775%)
Mutual labels:  transformer, seq2seq
Medical Transformer
Pytorch Code for "Medical Transformer: Gated Axial-Attention for Medical Image Segmentation"
Stars: ✭ 153 (+446.43%)
Mutual labels:  transformer, attention
Njunmt Tf
An open-source neural machine translation system developed by Natural Language Processing Group, Nanjing University.
Stars: ✭ 97 (+246.43%)
Mutual labels:  transformer, attention
learningspoons
NLP lecture notes and source code
Stars: ✭ 29 (+3.57%)
Mutual labels:  transformer, attention
chinese ancient poetry
seq2seq, attention, TensorFlow, TextRank, context
Stars: ✭ 30 (+7.14%)
Mutual labels:  seq2seq, attention
Transformer Temporal Tagger
Code and data from the paper "BERT Got a Date: Introducing Transformers to Temporal Tagging"
Stars: ✭ 55 (+96.43%)
Mutual labels:  transformer, seq2seq
Paddlenlp
NLP Core Library and Model Zoo based on PaddlePaddle 2.0
Stars: ✭ 212 (+657.14%)
Mutual labels:  transformer, seq2seq
Transformers without tears
Transformers without Tears: Improving the Normalization of Self-Attention
Stars: ✭ 80 (+185.71%)
h-transformer-1d
Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning
Stars: ✭ 121 (+332.14%)
Mutual labels:  transformer, attention
seq2seq-pytorch
Sequence to Sequence Models in PyTorch
Stars: ✭ 41 (+46.43%)
Mutual labels:  transformer, attention
tensorflow-chatbot-chinese
Web chatbot | TensorFlow implementation of a seq2seq model with Bahdanau attention and pretrained Word2Vec embeddings
Stars: ✭ 50 (+78.57%)
Mutual labels:  seq2seq, attention
speech-transformer
A Transformer implementation specialized for speech recognition tasks, using PyTorch.
Stars: ✭ 40 (+42.86%)
Relation-Extraction-Transformer
NLP: Relation extraction with position-aware self-attention transformer
Stars: ✭ 63 (+125%)
Mutual labels:  transformer, attention
Nlp Tutorial
Natural Language Processing Tutorial for Deep Learning Researchers
Stars: ✭ 9,895 (+35239.29%)
Mutual labels:  transformer, attention
Jddc solution 4th
4th-place solution for the 2018 JDDC competition
Stars: ✭ 235 (+739.29%)
Mutual labels:  transformer, attention
transformer
A simple TensorFlow implementation of the Transformer
Stars: ✭ 25 (-10.71%)
TRAR-VQA
[ICCV 2021] TRAR: Routing the Attention Spans in Transformers for Visual Question Answering -- Official Implementation
Stars: ✭ 49 (+75%)
Mutual labels:  transformer, attention
1-60 of 659 similar projects