
579 open-source projects that are alternatives to, or similar to, seq2seq-pytorch

Neural sp
End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (+895.12%)
Pytorch Seq2seq
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+8236.59%)
CrabNet
Predict materials properties using only the composition information!
Stars: ✭ 57 (+39.02%)
Mutual labels:  transformer, attention, self-attention
AttnSleep
[IEEE TNSRE] "An Attention-based Deep Learning Approach for Sleep Stage Classification with Single-Channel EEG"
Stars: ✭ 76 (+85.37%)
Mutual labels:  attention, self-attention
Jddc solution 4th
4th-place solution for the 2018 JDDC competition
Stars: ✭ 235 (+473.17%)
Mutual labels:  transformer, attention
Deeplearning Nlp Models
A small, interpretable codebase containing the re-implementation of a few "deep" NLP models in PyTorch. Colab notebooks to run with GPUs. Models: word2vec, CNNs, transformer, gpt.
Stars: ✭ 64 (+56.1%)
Mutual labels:  transformer, attention
MASTER-pytorch
Code for the paper "MASTER: Multi-Aspect Non-local Network for Scene Text Recognition" (Pattern Recognition 2021)
Stars: ✭ 263 (+541.46%)
Mutual labels:  transformer, self-attention
Medical Transformer
Pytorch Code for "Medical Transformer: Gated Axial-Attention for Medical Image Segmentation"
Stars: ✭ 153 (+273.17%)
Mutual labels:  transformer, attention
learningspoons
NLP lecture notes and source code
Stars: ✭ 29 (-29.27%)
Mutual labels:  transformer, attention
Transformer-Transducer
PyTorch implementation of "Transformer Transducer: A Streamable Speech Recognition Model with Transformer Encoders and RNN-T Loss" (ICASSP 2020)
Stars: ✭ 61 (+48.78%)
Transformer Tensorflow
TensorFlow implementation of 'Attention Is All You Need (2017. 6)'
Stars: ✭ 319 (+678.05%)
Mutual labels:  transformer, attention
visualization
A collection of visualization functions
Stars: ✭ 189 (+360.98%)
Mutual labels:  transformer, attention
Nlp Tutorials
Simple implementations of NLP models. Tutorials are written in Chinese on my website https://mofanpy.com
Stars: ✭ 394 (+860.98%)
Mutual labels:  transformer, attention
Transformers.jl
Julia Implementation of Transformer models
Stars: ✭ 173 (+321.95%)
Mutual labels:  transformer, attention
Relation-Extraction-Transformer
NLP: Relation extraction with position-aware self-attention transformer
Stars: ✭ 63 (+53.66%)
Mutual labels:  transformer, attention
Self Attention Cv
Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Stars: ✭ 209 (+409.76%)
Mutual labels:  transformer, attention
query-selector
Long-Term Series Forecasting with Query Selector: an efficient model of sparse attention
Stars: ✭ 63 (+53.66%)
Mutual labels:  transformer, self-attention
h-transformer-1d
Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning
Stars: ✭ 121 (+195.12%)
Mutual labels:  transformer, attention
protein-transformer
Predicting protein structure through sequence modeling
Stars: ✭ 77 (+87.8%)
Mutual labels:  attention, sequence-to-sequence
text-generation-transformer
Text generation based on the Transformer
Stars: ✭ 36 (-12.2%)
ai challenger 2018 sentiment analysis
Fine-grained Sentiment Analysis of User Reviews --- AI CHALLENGER 2018
Stars: ✭ 16 (-60.98%)
Mutual labels:  transformer, attention
Text Classification Models Pytorch
Implementation of State-of-the-art Text Classification Models in Pytorch
Stars: ✭ 379 (+824.39%)
Mutual labels:  transformer, attention
Sightseq
Computer vision tools for fairseq, containing PyTorch implementation of text recognition and object detection
Stars: ✭ 116 (+182.93%)
Mutual labels:  transformer, attention
Pytorch Original Transformer
My implementation of the original transformer model (Vaswani et al.). I've additionally included the playground.py file for visualizing otherwise seemingly hard concepts. Currently includes IWSLT pretrained models.
Stars: ✭ 411 (+902.44%)
Mutual labels:  transformer, attention
Bertqa Attention On Steroids
BertQA - Attention on Steroids
Stars: ✭ 112 (+173.17%)
Mutual labels:  transformer, attention
Sockeye
Sequence-to-sequence framework with a focus on Neural Machine Translation based on Apache MXNet
Stars: ✭ 990 (+2314.63%)
Keras Transformer
Transformer implemented in Keras
Stars: ✭ 273 (+565.85%)
Mutual labels:  transformer, attention
Machine Translation
Stars: ✭ 51 (+24.39%)
Nlp Tutorial
Natural Language Processing Tutorial for Deep Learning Researchers
Stars: ✭ 9,895 (+24034.15%)
Mutual labels:  transformer, attention
Nmt Keras
Neural Machine Translation with Keras
Stars: ✭ 501 (+1121.95%)
Walk-Transformer
From Random Walks to Transformer for Learning Node Embeddings (ECML-PKDD 2020) (In Pytorch and Tensorflow)
Stars: ✭ 26 (-36.59%)
Mutual labels:  transformer, self-attention
Awesome Fast Attention
list of efficient attention modules
Stars: ✭ 627 (+1429.27%)
Mutual labels:  transformer, attention
Cell Detr
Official and maintained implementation of the paper Attention-Based Transformers for Instance Segmentation of Cells in Microstructures [BIBM 2020].
Stars: ✭ 26 (-36.59%)
Mutual labels:  transformer, attention
Speech Transformer
A PyTorch implementation of Speech Transformer, an End-to-End ASR with Transformer network on Mandarin Chinese.
Stars: ✭ 565 (+1278.05%)
Mutual labels:  transformer, attention
Visual-Transformer-Paper-Summary
Summary of Transformer applications for computer vision tasks.
Stars: ✭ 51 (+24.39%)
Mutual labels:  transformer, attention
dhs summit 2019 image captioning
Image captioning using attention models
Stars: ✭ 34 (-17.07%)
Mutual labels:  attention, sequence-to-sequence
RNNSearch
An implementation of attention-based neural machine translation using Pytorch
Stars: ✭ 43 (+4.88%)
Mutual labels:  attention, sequence-to-sequence
iPerceive
Applying Common-Sense Reasoning to Multi-Modal Dense Video Captioning and Video Question Answering | Python3 | PyTorch | CNNs | Causality | Reasoning | LSTMs | Transformers | Multi-Head Self Attention | Published in IEEE Winter Conference on Applications of Computer Vision (WACV) 2021
Stars: ✭ 52 (+26.83%)
Mutual labels:  attention, self-attention
TRAR-VQA
[ICCV 2021] TRAR: Routing the Attention Spans in Transformers for Visual Question Answering -- Official Implementation
Stars: ✭ 49 (+19.51%)
Mutual labels:  transformer, attention
R-MeN
Transformer-based Memory Networks for Knowledge Graph Embeddings (ACL 2020) (Pytorch and Tensorflow)
Stars: ✭ 74 (+80.49%)
Mutual labels:  transformer, self-attention
Multiturndialogzoo
Multi-turn dialogue baselines written in PyTorch
Stars: ✭ 106 (+158.54%)
Mutual labels:  transformer, attention
transformer
A PyTorch Implementation of "Attention Is All You Need"
Stars: ✭ 28 (-31.71%)
Mutual labels:  transformer, attention
Athena
An open-source implementation of a sequence-to-sequence based speech processing engine
Stars: ✭ 542 (+1221.95%)
Njunmt Tf
An open-source neural machine translation system developed by Natural Language Processing Group, Nanjing University.
Stars: ✭ 97 (+136.59%)
Mutual labels:  transformer, attention
Graphtransformer
Graph Transformer Architecture. Source code for "A Generalization of Transformer Networks to Graphs", DLG-AAAI'21.
Stars: ✭ 187 (+356.1%)
Mutual labels:  transformer, attention
Graph Transformer
Transformer for Graph Classification (Pytorch and Tensorflow)
Stars: ✭ 191 (+365.85%)
Mutual labels:  transformer
Torchnlp
Easy to use NLP library built on PyTorch and TorchText
Stars: ✭ 233 (+468.29%)
Mutual labels:  transformer
Sentimentanalysis
Sentiment analysis neural network trained by fine-tuning BERT, ALBERT, or DistilBERT on the Stanford Sentiment Treebank.
Stars: ✭ 186 (+353.66%)
Mutual labels:  transformer
MGAN
Exploiting Coarse-to-Fine Task Transfer for Aspect-level Sentiment Classification (AAAI'19)
Stars: ✭ 44 (+7.32%)
Mutual labels:  attention
Fairseq Image Captioning
Transformer-based image captioning extension for pytorch/fairseq
Stars: ✭ 180 (+339.02%)
Mutual labels:  transformer
Transformer Clinic
Understanding the Difficulty of Training Transformers
Stars: ✭ 179 (+336.59%)
Mutual labels:  transformer
Meshed Memory Transformer
Meshed-Memory Transformer for Image Captioning. CVPR 2020
Stars: ✭ 230 (+460.98%)
Mutual labels:  transformer
Tensorflow Ml Nlp
Natural Language Processing with TensorFlow and Machine Learning (from logistic regression to a Transformer chatbot)
Stars: ✭ 176 (+329.27%)
Mutual labels:  transformer
End2end Asr Pytorch
End-to-End Automatic Speech Recognition on PyTorch
Stars: ✭ 175 (+326.83%)
Mutual labels:  transformer
nested-transformer
Nested Hierarchical Transformer https://arxiv.org/pdf/2105.12723.pdf
Stars: ✭ 174 (+324.39%)
Mutual labels:  transformer
Ner Bert Pytorch
PyTorch solution for the named entity recognition task using Google AI's pre-trained BERT model.
Stars: ✭ 249 (+507.32%)
Mutual labels:  transformer
Multigraph transformer
Official code of the paper "Multi-Graph Transformer for Free-Hand Sketch Recognition"; keywords: transformer, multi-graph transformer, graph classification, sketch recognition, free-hand sketch.
Stars: ✭ 231 (+463.41%)
Mutual labels:  transformer
Gpt 2 Tensorflow2.0
OpenAI GPT-2 pre-training and sequence prediction implementation in TensorFlow 2.0
Stars: ✭ 172 (+319.51%)
Mutual labels:  transformer
Eeg Dl
A Deep Learning library for EEG Tasks (Signals) Classification, based on TensorFlow.
Stars: ✭ 165 (+302.44%)
Mutual labels:  transformer
Effective transformer
Running BERT without Padding
Stars: ✭ 169 (+312.2%)
Mutual labels:  transformer
1-60 of 579 similar projects