jacoxu / Encoder_decoder

Four styles of encoder-decoder models in Python, using Theano, Keras, and Seq2Seq


Projects that are alternatives to or similar to Encoder_decoder

Pytorch Seq2seq
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+1170.63%)
Mutual labels:  attention, seq2seq, encoder-decoder
tensorflow-chatbot-chinese
Web chatbot | TensorFlow implementation of a seq2seq model with Bahdanau attention and pretrained Word2Vec embeddings
Stars: ✭ 50 (-81.41%)
Mutual labels:  seq2seq, attention
chinese ancient poetry
seq2seq attention tensorflow textrank context
Stars: ✭ 30 (-88.85%)
Mutual labels:  seq2seq, attention
NeuralCodeTranslator
Neural Code Translator provides instructions, datasets, and a deep learning infrastructure (based on seq2seq) that aims at learning code transformations
Stars: ✭ 32 (-88.1%)
Mutual labels:  seq2seq, encoder-decoder
Rnn For Joint Nlu
Pytorch implementation of "Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling" (https://arxiv.org/abs/1609.01454)
Stars: ✭ 176 (-34.57%)
Mutual labels:  attention, encoder-decoder
Deep Time Series Prediction
Seq2Seq, Bert, Transformer, WaveNet for time series prediction.
Stars: ✭ 183 (-31.97%)
Mutual labels:  attention, seq2seq
ai-visual-storytelling-seq2seq
Implementation of seq2seq model for Visual Storytelling Challenge (VIST) http://visionandlanguage.net/VIST/index.html
Stars: ✭ 50 (-81.41%)
Mutual labels:  seq2seq, encoder-decoder
Banglatranslator
Bangla Machine Translator
Stars: ✭ 21 (-92.19%)
Mutual labels:  attention, encoder-decoder
classifier multi label seq2seq attention
Multi-label text classification with BERT/ALBERT, seq2seq, attention, and beam search
Stars: ✭ 26 (-90.33%)
Mutual labels:  seq2seq, attention
transformer
A PyTorch Implementation of "Attention Is All You Need"
Stars: ✭ 28 (-89.59%)
Mutual labels:  seq2seq, attention
RNNSearch
An implementation of attention-based neural machine translation using Pytorch
Stars: ✭ 43 (-84.01%)
Mutual labels:  seq2seq, attention
Chinese Chatbot
A Chinese chatbot trained on 100,000 dialogue pairs, using an attention mechanism; it generates a meaningful reply to most ordinary questions. The trained model is included and can be run directly.
Stars: ✭ 124 (-53.9%)
Mutual labels:  attention, seq2seq
Multiturndialogzoo
Multi-turn dialogue baselines written in PyTorch
Stars: ✭ 106 (-60.59%)
Mutual labels:  attention, seq2seq
Base-On-Relation-Method-Extract-News-DA-RNN-Model-For-Stock-Prediction--Pytorch
A dual-stage attention-based model with a relation-based news extraction method for stock prediction
Stars: ✭ 33 (-87.73%)
Mutual labels:  seq2seq, attention
Pointer Networks Experiments
Sorting numbers with pointer networks
Stars: ✭ 53 (-80.3%)
Mutual labels:  attention, seq2seq
Transformer Temporal Tagger
Code and data form the paper BERT Got a Date: Introducing Transformers to Temporal Tagging
Stars: ✭ 55 (-79.55%)
Mutual labels:  seq2seq, encoder-decoder
Nlp Tutorials
Simple implementations of NLP models. Tutorials are written in Chinese on my website https://mofanpy.com
Stars: ✭ 394 (+46.47%)
Mutual labels:  attention, seq2seq
Neural sp
End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (+51.67%)
Mutual labels:  attention, seq2seq
chatbot
A Chinese chatbot based on deep learning, with detailed tutorials and thoroughly commented code; a good resource for learning.
Stars: ✭ 94 (-65.06%)
Mutual labels:  seq2seq, attention
Embedding
A summary of embedding model code and study notes
Stars: ✭ 25 (-90.71%)
Mutual labels:  seq2seq, encoder-decoder

encoder_decoder

A discussion of four neural network sequence decoding models [http://jacoxu.com/?p=1852]

Requirements: Keras [https://github.com/fchollet/keras], Seq2Seq [https://github.com/farizrahman4u/seq2seq]

NOTE:

The suggested Keras version is 0.3.3 or 0.3.2 rather than 1.0.0 or the latest version, because seq2seq calls some old-style Keras functions.

model - 1: basic encoder-decoder
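
As a rough sketch of the idea in plain NumPy (not the repository's actual Keras code; the toy RNN cell, weights, and sizes here are made up for illustration): the encoder folds the whole input sequence into one fixed-length vector, and the basic decoder unrolls from that vector alone, never seeing it again after initialization.

```python
import numpy as np

rng = np.random.default_rng(0)
H = 4  # toy hidden size

# Toy RNN cell h_t = tanh(W x_t + U h_{t-1}); weights are random placeholders.
W, U = rng.normal(size=(H, H)), rng.normal(size=(H, H))

def rnn_step(x, h):
    return np.tanh(W @ x + U @ h)

# Encoder: compress the input sequence into a single context vector c.
inputs = [rng.normal(size=H) for _ in range(5)]
h = np.zeros(H)
for x in inputs:
    h = rnn_step(x, h)
c = h  # fixed-length summary of the whole sequence

# Basic decoder: initialized with c, then driven by zero inputs only;
# it sees neither c again nor its own previous outputs.
outputs, h = [], c
for _ in range(5):
    h = rnn_step(np.zeros(H), h)
    outputs.append(h)
```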

model - 2: encoder-decoder with feedback

model - 3: encoder-decoder with peek
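
In the peek variant the decoder gets to look at the encoder context at every step, not just at initialization. A toy NumPy sketch (illustrative only; the extra weight V and the cell are placeholders):

```python
import numpy as np

rng = np.random.default_rng(0)
H = 4  # toy hidden size
W, U, V = (rng.normal(size=(H, H)) for _ in range(3))

def rnn_step(x, h, c):
    # "Peek": the encoder context c enters every decoding step.
    return np.tanh(W @ x + U @ h + V @ c)

c = rng.normal(size=H)  # stand-in for the encoder's context vector

outputs, h, y = [], c, np.zeros(H)
for _ in range(5):
    h = rnn_step(y, h, c)  # feedback plus a peek at c each step
    y = h
    outputs.append(y)
```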

model - 4: encoder-decoder with attention
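
With attention, the single fixed context vector is replaced by a per-step weighted average over all encoder hidden states, with weights recomputed from the current decoder state. A minimal NumPy sketch (dot-product scoring is an illustrative choice here, not necessarily the scoring used in the repository):

```python
import numpy as np

rng = np.random.default_rng(0)
H = 4  # toy hidden size
W, U, V = (rng.normal(size=(H, H)) for _ in range(3))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

enc_states = rng.normal(size=(5, H))  # one hidden state per input step (stand-in)

outputs, h, y = [], enc_states[-1], np.zeros(H)
for _ in range(5):
    # Score each encoder state against the current decoder state,
    # then form a softmax-weighted average as the step's context.
    weights = softmax(enc_states @ h)
    context = weights @ enc_states  # per-step context replaces the fixed c
    h = np.tanh(W @ y + U @ h + V @ context)
    y = h
    outputs.append(y)
```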

results: four encoder-decoder modes

Question: How do I change the encoder-decoder mode?

Answer: Change decoder_mode in Line 144 of the code. For example, set decoder_mode = 3 to run the attention mode.
