Pytorch Seq2seq: Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+9952.94%)
Mutual labels: lstm, attention, sequence-to-sequence, encoder-decoder
Rnn For Joint Nlu: PyTorch implementation of "Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling" (https://arxiv.org/abs/1609.01454)
Stars: ✭ 176 (+417.65%)
Mutual labels: lstm, attention, encoder-decoder
Banglatranslator: Bangla Machine Translator
Stars: ✭ 21 (-38.24%)
Mutual labels: lstm, attention, encoder-decoder
Screenshot To Code: A neural network that transforms a design mock-up into a static website.
Stars: ✭ 13,561 (+39785.29%)
Mutual labels: lstm, encoder-decoder
seq2seq-pytorch: Sequence to Sequence Models in PyTorch
Stars: ✭ 41 (+20.59%)
Mutual labels: attention, sequence-to-sequence
EBIM-NLI: Enhanced BiLSTM Inference Model for Natural Language Inference
Stars: ✭ 24 (-29.41%)
Mutual labels: lstm, attention
datastories-semeval2017-task6: Deep-learning model presented in "DataStories at SemEval-2017 Task 6: Siamese LSTM with Attention for Humorous Text Comparison".
Stars: ✭ 20 (-41.18%)
Mutual labels: lstm, attention
iPerceive: Applying Common-Sense Reasoning to Multi-Modal Dense Video Captioning and Video Question Answering | Python3 | PyTorch | CNNs | Causality | Reasoning | LSTMs | Transformers | Multi-Head Self-Attention | Published in IEEE Winter Conference on Applications of Computer Vision (WACV) 2021
Stars: ✭ 52 (+52.94%)
Mutual labels: lstm, attention
Datastories Semeval2017 Task4: Deep-learning model presented in "DataStories at SemEval-2017 Task 4: Deep LSTM with Attention for Message-level and Topic-based Sentiment Analysis".
Stars: ✭ 184 (+441.18%)
Mutual labels: lstm, attention
Deep Time Series Prediction: Seq2Seq, BERT, Transformer, and WaveNet models for time series prediction.
Stars: ✭ 183 (+438.24%)
Mutual labels: lstm, attention
protein-transformer: Predicting protein structure through sequence modeling
Stars: ✭ 77 (+126.47%)
Mutual labels: attention, sequence-to-sequence
automatic-personality-prediction: [AAAI 2020] Modeling Personality with Attentive Networks and Contextual Embeddings
Stars: ✭ 43 (+26.47%)
Mutual labels: lstm, attention
learningspoons: NLP lecture notes and source code
Stars: ✭ 29 (-14.71%)
Mutual labels: lstm, attention
ConvLSTM-PyTorch: ConvLSTM/ConvGRU (Encoder-Decoder) with PyTorch on Moving-MNIST
Stars: ✭ 202 (+494.12%)
Mutual labels: lstm, encoder-decoder
Abstractive Summarization: Implementation of abstractive summarization using an LSTM encoder-decoder architecture with local attention.
Stars: ✭ 128 (+276.47%)
Mutual labels: lstm, encoder-decoder
Deep News Summarization: News summarization using a sequence-to-sequence model with attention in TensorFlow.
Stars: ✭ 167 (+391.18%)
Mutual labels: lstm, encoder-decoder
deep-trans: Transliterating English to Hindi using Recurrent Neural Networks
Stars: ✭ 44 (+29.41%)
Mutual labels: lstm, sequence-to-sequence