
j-min / MoChA-pytorch

Licence: other
PyTorch Implementation of "Monotonic Chunkwise Attention" (ICLR 2018)


Projects that are alternatives of or similar to MoChA-pytorch

Seq2seq chatbot
A TensorFlow implementation of a simple seq2seq-based dialogue system, with embedding, attention, beam_search, etc.; the dataset is Cornell Movie Dialogs
Stars: ✭ 308 (+373.85%)
Mutual labels:  seq2seq, attention-mechanism
ttslearn
ttslearn: Library for the book "Pythonで学ぶ音声合成" (Text-to-speech with Python)
Stars: ✭ 158 (+143.08%)
Mutual labels:  seq2seq, attention-mechanism
Neural sp
End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (+527.69%)
Mutual labels:  seq2seq, attention-mechanism
Sockeye
Sequence-to-sequence framework with a focus on Neural Machine Translation based on Apache MXNet
Stars: ✭ 990 (+1423.08%)
Mutual labels:  seq2seq, attention-mechanism
SequenceToSequence
A seq2seq with attention dialogue/MT model implemented by TensorFlow.
Stars: ✭ 11 (-83.08%)
Mutual labels:  seq2seq, attention-mechanism
Seq2seq Summarizer
Pointer-generator reinforced seq2seq summarization in PyTorch
Stars: ✭ 306 (+370.77%)
Mutual labels:  seq2seq, attention-mechanism
Seq2seq chatbot new
A TensorFlow implementation of a simple seq2seq-based dialogue system, with embedding, attention, beam_search, etc.; the dataset is Cornell Movie Dialogs
Stars: ✭ 144 (+121.54%)
Mutual labels:  seq2seq, attention-mechanism
Video-Cap
🎬 Video Captioning: ICCV '15 paper implementation
Stars: ✭ 44 (-32.31%)
Mutual labels:  seq2seq, attention-mechanism
S2VT-seq2seq-video-captioning-attention
S2VT (seq2seq) video captioning with Bahdanau & Luong attention, implemented in TensorFlow
Stars: ✭ 18 (-72.31%)
Mutual labels:  seq2seq, attention-mechanism
Poetry Seq2seq
Chinese Poetry Generation
Stars: ✭ 159 (+144.62%)
Mutual labels:  seq2seq, attention-mechanism
Awesome Speech Recognition Speech Synthesis Papers
Automatic Speech Recognition (ASR), Speaker Verification, Speech Synthesis, Text-to-Speech (TTS), Language Modelling, Singing Voice Synthesis (SVS), Voice Conversion (VC)
Stars: ✭ 2,085 (+3107.69%)
Mutual labels:  seq2seq, attention-mechanism
minimal-nmt
A minimal NMT example to serve as a seq2seq+attention reference.
Stars: ✭ 36 (-44.62%)
Mutual labels:  seq2seq, attention-mechanism
NLP-paper
🎨🎨 NLP (Natural Language Processing) tutorial 🎨🎨 https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (-64.62%)
Mutual labels:  seq2seq, attention-mechanism
A-Persona-Based-Neural-Conversation-Model
No description or website provided.
Stars: ✭ 22 (-66.15%)
Mutual labels:  seq2seq, attention-mechanism
transformer
Neutron: A PyTorch-based implementation of Transformer and its variants.
Stars: ✭ 60 (-7.69%)
Mutual labels:  seq2seq
attention-mechanism-keras
Attention mechanism in Keras, implemented as layers like Dense and RNN...
Stars: ✭ 19 (-70.77%)
Mutual labels:  attention-mechanism
SentimentAnalysis
Sentiment Analysis: Deep Bi-LSTM+attention model
Stars: ✭ 32 (-50.77%)
Mutual labels:  attention-mechanism
deep-molecular-optimization
Molecular optimization by capturing chemist’s intuition using the Seq2Seq with attention and the Transformer
Stars: ✭ 60 (-7.69%)
Mutual labels:  seq2seq
Image-Caption
Using LSTM or Transformer to solve Image Captioning in PyTorch
Stars: ✭ 36 (-44.62%)
Mutual labels:  attention-mechanism
FragmentVC
Any-to-any voice conversion by end-to-end extracting and fusing fine-grained voice fragments with attention
Stars: ✭ 134 (+106.15%)
Mutual labels:  attention-mechanism

PyTorch Implementation of Monotonic Chunkwise Attention

Requirements

  • PyTorch 0.4

TODOs

  • Soft MoChA (see the sketch after this list)
  • Hard MoChA
  • Linear Time Decoding
  • Experiments with real-world datasets
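
For orientation, below is a minimal sketch of the soft MoChA attention used during training, assuming the monotonic attention weights alpha (from the monotonic attention recurrence) and the chunk energies u are already computed for one decoder step. The function names and shapes are illustrative assumptions, not this repo's actual API:

```python
import torch
import torch.nn.functional as F

def moving_sum(x, back, forward):
    """Return y with y[:, j] = sum of x[:, j-back : j+forward+1].
    x: (batch, T). Implemented as a 1-D convolution with a ones kernel."""
    kernel = torch.ones(1, 1, back + forward + 1, device=x.device)
    return F.conv1d(F.pad(x.unsqueeze(1), (back, forward)), kernel).squeeze(1)

def soft_mocha(alpha, u, chunk_size):
    """Chunkwise soft attention weights beta from the MoChA paper,
    computed with moving sums instead of an explicit double loop.
    alpha: (batch, T) monotonic attention weights for one decoder step.
    u:     (batch, T) chunk energies for the same step."""
    exp_u = torch.exp(u - u.max(dim=1, keepdim=True).values)  # shared scale cancels
    # denom[:, k] = sum_{l=k-w+1}^{k} exp(u[:, l])
    denom = moving_sum(exp_u, back=chunk_size - 1, forward=0).clamp(min=1e-10)
    # beta[:, j] = exp(u[:, j]) * sum_{k=j}^{j+w-1} alpha[:, k] / denom[:, k]
    return exp_u * moving_sum(alpha / denom, back=0, forward=chunk_size - 1)
```

The context vector for the decoder step is then the beta-weighted sum of the encoder outputs.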

Model figure

[Model figure]

Linear Time Decoding

It is not clear whether the authors' TensorFlow implementation supports decoding in linear time: it computes energies over the entire encoder output sequence rather than scanning forward from the previously attended encoder output.
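
As a sketch of what decoding in linear time could look like, the loop below resumes scanning from the previously attended index rather than re-scoring every encoder output at each step. The monotonic_energy and chunk_energy callables and the shapes are hypothetical stand-ins, not this repo's modules:

```python
import torch
import torch.nn.functional as F

def hard_mocha_step(h, s, monotonic_energy, chunk_energy, t_prev, chunk_size=3):
    """One hard-MoChA decoder step in linear time.
    h: (T, enc_dim) encoder outputs; s: (dec_dim,) current decoder state.
    t_prev: index attended at the previous decoder step.
    Because scanning resumes at t_prev and attention is monotonic, the
    total scan over all decoder steps is O(T), not O(T * num_steps)."""
    T = h.size(0)
    for j in range(t_prev, T):  # resume from the previous attention position
        p_select = torch.sigmoid(monotonic_energy(s, h[j]))
        if p_select >= 0.5:  # hard monotonic attention stops here
            lo = max(0, j - chunk_size + 1)  # chunk of width w ending at j
            u = torch.stack([chunk_energy(s, h[k]) for k in range(lo, j + 1)])
            beta = F.softmax(u, dim=0)  # soft attention within the chunk
            return (beta.unsqueeze(1) * h[lo:j + 1]).sum(dim=0), j
    return h.new_zeros(h.size(1)), T  # no position selected: empty context
```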

References

  • Chung-Cheng Chiu and Colin Raffel. "Monotonic Chunkwise Attention." ICLR 2018. arXiv:1712.05382
