All Projects → minimal-nmt → Similar Projects or Alternatives

455 Open source projects that are alternatives of or similar to minimal-nmt

transformer
Neutron: A PyTorch-based implementation of the Transformer and its variants.
Stars: ✭ 60 (+66.67%)
Seq2seq chatbot
A TensorFlow implementation of a simple seq2seq-based dialogue system with embedding, attention, and beam search; the dataset is Cornell Movie Dialogs.
Stars: ✭ 308 (+755.56%)
Tf Seq2seq
Sequence to sequence learning using TensorFlow.
Stars: ✭ 387 (+975%)
Sockeye
Sequence-to-sequence framework with a focus on Neural Machine Translation, based on Apache MXNet.
Stars: ✭ 990 (+2650%)
Seq2seq chatbot new
A TensorFlow implementation of a simple seq2seq-based dialogue system with embedding, attention, and beam search; the dataset is Cornell Movie Dialogs.
Stars: ✭ 144 (+300%)
TS3000 TheChatBOT
A social-networking chatbot trained on a Reddit dataset. It supports open-ended queries, built on the concept of Neural Machine Translation. Beware: it is sarcastic, just like its creator 😝 BTW it uses the PyTorch framework and Python 3.
Stars: ✭ 20 (-44.44%)
Poetry Seq2seq
Chinese Poetry Generation
Stars: ✭ 159 (+341.67%)
Image Caption Generator
A neural network that generates captions for an image using a CNN and an RNN with beam search.
Stars: ✭ 126 (+250%)
Mutual labels:  beam-search, attention-mechanism
Nmtpytorch
Sequence-to-Sequence Framework in PyTorch
Stars: ✭ 392 (+988.89%)
Word-Level-Eng-Mar-NMT
Translating English sentences to Marathi using Neural Machine Translation
Stars: ✭ 37 (+2.78%)
Image-Caption
Image captioning in PyTorch using an LSTM or a Transformer.
Stars: ✭ 36 (+0%)
Mutual labels:  beam-search, attention-mechanism
A-Persona-Based-Neural-Conversation-Model
No description or website provided.
Stars: ✭ 22 (-38.89%)
Mutual labels:  seq2seq, attention-mechanism
Tensorflow end2end speech recognition
End-to-end speech recognition implementation based on TensorFlow (CTC, attention, and MTL training).
Stars: ✭ 305 (+747.22%)
Mutual labels:  beam-search, attention-mechanism
Video-Cap
🎬 Video Captioning: ICCV '15 paper implementation
Stars: ✭ 44 (+22.22%)
Mutual labels:  seq2seq, attention-mechanism
ttslearn
ttslearn: Library accompanying the book Pythonで学ぶ音声合成 (Text-to-Speech with Python).
Stars: ✭ 158 (+338.89%)
Mutual labels:  seq2seq, attention-mechanism
Neural sp
End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (+1033.33%)
Mutual labels:  seq2seq, attention-mechanism
Nlp made easy
Explains NLP building blocks in a simple manner.
Stars: ✭ 232 (+544.44%)
Mutual labels:  seq2seq, beam-search
Nmt Keras
Neural Machine Translation with Keras
Stars: ✭ 501 (+1291.67%)
RNNSearch
An implementation of attention-based neural machine translation in PyTorch.
Stars: ✭ 43 (+19.44%)
SequenceToSequence
A seq2seq dialogue/MT model with attention, implemented in TensorFlow.
Stars: ✭ 11 (-69.44%)
Mutual labels:  seq2seq, attention-mechanism
NLP-paper
🎨🎨 NLP (natural language processing) tutorials 🎨🎨 https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (-36.11%)
Mutual labels:  seq2seq, attention-mechanism
Pytorch Chatbot
A PyTorch seq2seq chatbot.
Stars: ✭ 336 (+833.33%)
Mutual labels:  seq2seq, beam-search
Im2latex
Image to LaTeX (Seq2seq + Attention with Beam Search) - Tensorflow
Stars: ✭ 342 (+850%)
Mutual labels:  seq2seq, beam-search
Joeynmt
Minimalist NMT for educational purposes
Stars: ✭ 420 (+1066.67%)
neural-chat
An AI chatbot using seq2seq
Stars: ✭ 30 (-16.67%)
Mutual labels:  seq2seq, beam-search
Seq2seq.pytorch
Sequence-to-Sequence learning using PyTorch
Stars: ✭ 514 (+1327.78%)
Xmunmt
An implementation of RNNsearch using TensorFlow
Stars: ✭ 69 (+91.67%)
Awesome Speech Recognition Speech Synthesis Papers
Automatic Speech Recognition (ASR), Speaker Verification, Speech Synthesis, Text-to-Speech (TTS), Language Modelling, Singing Voice Synthesis (SVS), Voice Conversion (VC)
Stars: ✭ 2,085 (+5691.67%)
Mutual labels:  seq2seq, attention-mechanism
Nspm
🤖 Neural SPARQL Machines for Knowledge Graph Question Answering.
Stars: ✭ 156 (+333.33%)
Seq2seq
A PyTorch-based Chinese chatbot with integrated beam search.
Stars: ✭ 200 (+455.56%)
Mutual labels:  seq2seq, beam-search
dynmt-py
Neural machine translation implementation using dynet's python bindings
Stars: ✭ 17 (-52.78%)
Seq2seq Summarizer
Pointer-generator reinforced seq2seq summarization in PyTorch
Stars: ✭ 306 (+750%)
Mutual labels:  seq2seq, attention-mechanism
S2VT-seq2seq-video-captioning-attention
S2VT (seq2seq) video captioning with Bahdanau & Luong attention, implemented in TensorFlow.
Stars: ✭ 18 (-50%)
Mutual labels:  seq2seq, attention-mechanism
Pytorch Seq2seq
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+9394.44%)
MoChA-pytorch
PyTorch Implementation of "Monotonic Chunkwise Attention" (ICLR 2018)
Stars: ✭ 65 (+80.56%)
Mutual labels:  seq2seq, attention-mechanism
Openseq2seq
Toolkit for efficient experimentation with Speech Recognition, Text2Speech and NLP
Stars: ✭ 1,378 (+3727.78%)
Tensorflow Shakespeare
Neural machine translation between the writings of Shakespeare and modern English using TensorFlow
Stars: ✭ 244 (+577.78%)
beam search
Beam search for neural network sequence to sequence (encoder-decoder) models.
Stars: ✭ 31 (-13.89%)
Mutual labels:  seq2seq, beam-search
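Many entries above (and minimal-nmt itself) pair seq2seq decoding with beam search. For orientation, the core of the algorithm can be sketched in a few lines of plain Python; the `step_logprobs` callback and token conventions below are illustrative, not taken from any of the listed projects:

```python
import math

def beam_search(step_logprobs, beam_width, max_len, eos=0):
    """Minimal beam search sketch.

    step_logprobs(prefix) -> {token: logprob} gives next-token scores
    for a partial hypothesis (in a real model, a decoder forward pass).
    """
    beams = [([], 0.0)]  # (token sequence, cumulative log-probability)
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            if seq and seq[-1] == eos:
                candidates.append((seq, score))  # finished beam carries over
                continue
            for tok, lp in step_logprobs(seq).items():
                candidates.append((seq + [tok], score + lp))
        # keep only the top-k highest-scoring hypotheses
        beams = sorted(candidates, key=lambda b: b[1], reverse=True)[:beam_width]
        if all(seq and seq[-1] == eos for seq, _ in beams):
            break
    return beams

# Toy usage: a fixed next-token distribution, beam width 2
dist = {1: math.log(0.6), 2: math.log(0.3), 0: math.log(0.1)}
best = beam_search(lambda seq: dist, beam_width=2, max_len=3)
```

Real implementations add length normalization and batched decoder states, but the expand-score-prune loop above is the common skeleton.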
2018-dlsl
UPC Deep Learning for Speech and Language 2018
Stars: ✭ 18 (-50%)
seq3
Source code for the NAACL 2019 paper "SEQ^3: Differentiable Sequence-to-Sequence-to-Sequence Autoencoder for Unsupervised Abstractive Sentence Compression"
Stars: ✭ 121 (+236.11%)
Mutual labels:  seq2seq
rnnt decoder cuda
An efficient implementation of RNN-T Prefix Beam Search in C++/CUDA.
Stars: ✭ 60 (+66.67%)
Mutual labels:  beam-search
Machine-Translation-Hindi-to-english-
Machine translation is the task of converting text from one language to another. Unlike traditional phrase-based translation systems, which consist of many small sub-components tuned separately, neural machine translation attempts to build and train a single, large neural network that reads a sentence and outputs a correct translation.
Stars: ✭ 19 (-47.22%)
Mutual labels:  attention-mechanism
CVAE Dial
CVAE_XGate model in paper "Xu, Dusek, Konstas, Rieser. Better Conversations by Modeling, Filtering, and Optimizing for Coherence and Diversity"
Stars: ✭ 16 (-55.56%)
Mutual labels:  seq2seq
resolutions-2019
A list of data mining and machine learning papers that I implemented in 2019.
Stars: ✭ 19 (-47.22%)
Mutual labels:  attention-mechanism
h-transformer-1d
Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning
Stars: ✭ 121 (+236.11%)
Mutual labels:  attention-mechanism
tensorflow-ml-nlp-tf2
Hands-on materials for "Natural Language Processing with TensorFlow 2 and Machine Learning (from Logistic Regression to BERT and GPT-3)".
Stars: ✭ 245 (+580.56%)
Mutual labels:  seq2seq
visdial
Visual Dialog: Light-weight Transformer for Many Inputs (ECCV 2020)
Stars: ✭ 27 (-25%)
Mutual labels:  attention-mechanism
efficient-attention
An implementation of the efficient attention module.
Stars: ✭ 191 (+430.56%)
Mutual labels:  attention-mechanism
NiuTrans.NMT
A fast Neural Machine Translation system, developed in C++ and relying on NiuTensor for fast tensor APIs.
Stars: ✭ 112 (+211.11%)
dodrio
Exploring attention weights in transformer-based models with linguistic knowledge.
Stars: ✭ 233 (+547.22%)
Mutual labels:  attention-mechanism
Linear-Attention-Mechanism
Attention mechanism
Stars: ✭ 27 (-25%)
Mutual labels:  attention-mechanism
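The attention-mechanism label recurs throughout this list. As a common reference point, scaled dot-product attention for a single query can be sketched in plain Python (the helper names and toy vectors here are illustrative, not from any listed project):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def attention(query, keys, values):
    """Scaled dot-product attention for one query vector.

    Scores each key against the query, normalizes with softmax,
    and returns the weighted sum of the values plus the weights.
    """
    scale = math.sqrt(len(query))
    weights = softmax([dot(query, k) / scale for k in keys])
    context = [sum(w * v[i] for w, v in zip(weights, values))
               for i in range(len(values[0]))]
    return context, weights

# Toy usage: the query matches the first key, so its value dominates
ctx, w = attention([1.0, 0.0], [[1.0, 0.0], [0.0, 1.0]], [[10.0, 0.0], [0.0, 10.0]])
```

The projects above differ mainly in how the scores are computed (additive Bahdanau vs. multiplicative Luong, linear-complexity variants, etc.), but all follow this score-normalize-mix pattern.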
DCAN
[AAAI 2020] Code release for "Domain Conditioned Adaptation Network" https://arxiv.org/abs/2005.06717
Stars: ✭ 27 (-25%)
Mutual labels:  attention-mechanism
Attention mechanism-event-extraction
Attention mechanism in CNNs to extract events of interest
Stars: ✭ 17 (-52.78%)
Mutual labels:  attention-mechanism
probabilistic nlg
Tensorflow Implementation of Stochastic Wasserstein Autoencoder for Probabilistic Sentence Generation (NAACL 2019).
Stars: ✭ 28 (-22.22%)
Mutual labels:  seq2seq
SiGAT
Source code for Signed Graph Attention Networks (ICANN 2019) & SDGNN (AAAI 2021).
Stars: ✭ 37 (+2.78%)
Mutual labels:  attention-mechanism
Multi-task-Conditional-Attention-Networks
A prototype implementation of our submitted paper: Conversion Prediction Using Multi-task Conditional Attention Networks to Support the Creation of Effective Ad Creatives.
Stars: ✭ 21 (-41.67%)
Mutual labels:  attention-mechanism
Hierarchical-Word-Sense-Disambiguation-using-WordNet-Senses
Word Sense Disambiguation using Word Specific models, All word models and Hierarchical models in Tensorflow
Stars: ✭ 33 (-8.33%)
Mutual labels:  attention-mechanism
lang2logic-PyTorch
PyTorch port of the paper "Language to Logical Form with Neural Attention"
Stars: ✭ 34 (-5.56%)
Mutual labels:  seq2seq
RETRO-pytorch
Implementation of RETRO, DeepMind's retrieval-based attention network, in PyTorch.
Stars: ✭ 473 (+1213.89%)
Mutual labels:  attention-mechanism
chatbot
A Chinese chatbot based on deep learning, with detailed tutorials and thoroughly commented code; a good choice for learning.
Stars: ✭ 94 (+161.11%)
Mutual labels:  seq2seq
1-60 of 455 similar projects