ywk991112 / Pytorch Chatbot

Pytorch seq2seq chatbot

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to Pytorch Chatbot

Pytorch Seq2seq
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+917.26%)
Mutual labels:  pytorch-tutorial, seq2seq, sequence-to-sequence
Seq2seq chatbot
A simple TensorFlow implementation of a seq2seq-based dialogue system, with embedding, attention, and beam_search features; the dataset is Cornell Movie Dialogs.
Stars: ✭ 308 (-8.33%)
Mutual labels:  chatbot, beam-search, seq2seq
Mlds2018spring
Machine Learning and having it Deep and Structured (MLDS) in 2018 spring
Stars: ✭ 124 (-63.1%)
Mutual labels:  chatbot, seq2seq, sequence-to-sequence
Sequence To Sequence 101
a series of tutorials on sequence to sequence learning, implemented with PyTorch.
Stars: ✭ 62 (-81.55%)
Mutual labels:  pytorch-tutorial, seq2seq, sequence-to-sequence
Tf Seq2seq
Sequence to sequence learning using TensorFlow.
Stars: ✭ 387 (+15.18%)
Mutual labels:  beam-search, seq2seq, sequence-to-sequence
Chatlearner
A chatbot implemented in TensorFlow based on the seq2seq model, with certain rules integrated.
Stars: ✭ 528 (+57.14%)
Mutual labels:  chatbot, beam-search, sequence-to-sequence
Seq2seq chatbot new
A simple TensorFlow implementation of a seq2seq-based dialogue system, with embedding, attention, and beam_search features; the dataset is Cornell Movie Dialogs.
Stars: ✭ 144 (-57.14%)
Mutual labels:  chatbot, beam-search, seq2seq
A-Persona-Based-Neural-Conversation-Model
No description or website provided.
Stars: ✭ 22 (-93.45%)
Mutual labels:  seq2seq, sequence-to-sequence
Conversational-AI-Chatbot-using-Practical-Seq2Seq
A simple open domain generative based chatbot based on Recurrent Neural Networks
Stars: ✭ 17 (-94.94%)
Mutual labels:  chatbot, seq2seq
RNNSearch
An implementation of attention-based neural machine translation using Pytorch
Stars: ✭ 43 (-87.2%)
Mutual labels:  seq2seq, sequence-to-sequence
pytorch-transformer-chatbot
A simple chitchat chatbot using the Transformer API introduced in PyTorch v1.2.
Stars: ✭ 44 (-86.9%)
Mutual labels:  chatbot, seq2seq
chatbot
kbqa task-oriented qa seq2seq ir neo4j jena seq2seq tf chatbot chat
Stars: ✭ 32 (-90.48%)
Mutual labels:  chatbot, seq2seq
keras-chatbot-web-api
Simple keras chat bot using seq2seq model with Flask serving web
Stars: ✭ 51 (-84.82%)
Mutual labels:  chatbot, seq2seq
CVAE Dial
CVAE_XGate model in paper "Xu, Dusek, Konstas, Rieser. Better Conversations by Modeling, Filtering, and Optimizing for Coherence and Diversity"
Stars: ✭ 16 (-95.24%)
Mutual labels:  seq2seq, sequence-to-sequence
Neural Conversation Models
Tensorflow based Neural Conversation Models
Stars: ✭ 29 (-91.37%)
Mutual labels:  chatbot, seq2seq
classy
classy is a simple-to-use library for building high-performance Machine Learning models in NLP.
Stars: ✭ 61 (-81.85%)
Mutual labels:  seq2seq, sequence-to-sequence
minimal-nmt
A minimal nmt example to serve as an seq2seq+attention reference.
Stars: ✭ 36 (-89.29%)
Mutual labels:  seq2seq, beam-search
transformer
Neutron: A pytorch based implementation of Transformer and its variants.
Stars: ✭ 60 (-82.14%)
Mutual labels:  seq2seq, beam-search
Deepqa
My tensorflow implementation of "A neural conversational model", a Deep learning based chatbot
Stars: ✭ 2,811 (+736.61%)
Mutual labels:  chatbot, seq2seq
chatbot
🤖️ A task-oriented chatbot based on PyTorch (supports private deployment and Docker deployment).
Stars: ✭ 77 (-77.08%)
Mutual labels:  chatbot, seq2seq

pytorch-chatbot

This is a PyTorch seq2seq chatbot tutorial for the Formosa Speech Grand Challenge, modified from the practical-pytorch seq2seq-translation-batched tutorial.
A tutorial introducing this repo is available on the official PyTorch website; a tutorial in Chinese is also available.

Update

A new version is implemented in the "dev" branch.

Requirement

  • python 3.5+
  • pytorch 0.4.0
  • tqdm

Get started

Clone the repository

git clone https://github.com/ywk991112/pytorch-chatbot

Corpus

In the corpus file, input-output sequence pairs should be on adjacent lines. For example,

I'll see you next time.
Sure. Bye.
How are you?
Better than ever.

The corpus files should be placed under a path like,

pytorch-chatbot/data/<corpus file name>

Otherwise, the corpus file will be tracked by git.
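The adjacent-line convention above can be sketched as follows. This is a minimal illustration, assuming non-overlapping pairs (lines 1-2 form one pair, lines 3-4 the next); the repo's actual loading code may differ.

```python
# Pair adjacent lines of a corpus into (input, output) training examples.
def load_pairs(lines):
    """Group a flat list of utterances into non-overlapping input/output pairs."""
    return [(lines[i], lines[i + 1]) for i in range(0, len(lines) - 1, 2)]

corpus = [
    "I'll see you next time.",
    "Sure. Bye.",
    "How are you?",
    "Better than ever.",
]
pairs = load_pairs(corpus)
# Each pair is one training example: (query, response).
```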

Pretrained Model

The pretrained model, trained on the movie_subtitles corpus with a bidirectional RNN layer and hidden size 512, can be downloaded from this link. The pretrained model file should be placed in a directory as follows.

mkdir -p save/model/movie_subtitles/1-1_512
mv 50000_backup_bidir_model.tar save/model/movie_subtitles/1-1_512
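The directory name in the commands above appears to encode the model's hyperparameters. Here is a hypothetical helper reconstructing that layout, assuming `1-1_512` means `<n_layers>-<n_layers>_<hidden_size>`; this naming scheme is inferred from the example path and may not match the repo's actual code.

```python
# Build the checkpoint directory path from hyperparameters
# (assumed scheme: save/model/<corpus>/<n_layers>-<n_layers>_<hidden_size>).
def checkpoint_dir(corpus_name, n_layers, hidden_size, root="save/model"):
    return f"{root}/{corpus_name}/{n_layers}-{n_layers}_{hidden_size}"

path = checkpoint_dir("movie_subtitles", 1, 512)
# → save/model/movie_subtitles/1-1_512
```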

Training

Run this command to start training; adjust the argument values to your own needs.

python main.py -tr <CORPUS_FILE_PATH> -la 1 -hi 512 -lr 0.0001 -it 50000 -b 64 -p 500 -s 1000

Continue training from a saved model:

python main.py -tr <CORPUS_FILE_PATH> -l <MODEL_FILE_PATH> -lr 0.0001 -it 50000 -b 64 -p 500 -s 1000

For more options,

python main.py -h
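The flags in the commands above plausibly map to hyperparameters as sketched below with `argparse`. The flag-to-name mapping here is an assumption for illustration; run `python main.py -h` in the repo for the authoritative list.

```python
import argparse

# Hypothetical reconstruction of the trainer's CLI: -la layers, -hi hidden size,
# -lr learning rate, -it iterations, -b batch size, -p print interval, -s save interval.
parser = argparse.ArgumentParser(description="seq2seq chatbot trainer (sketch)")
parser.add_argument("-tr", "--train", help="corpus file path for training")
parser.add_argument("-la", "--layer", type=int, default=1, help="number of RNN layers")
parser.add_argument("-hi", "--hidden", type=int, default=512, help="hidden size")
parser.add_argument("-lr", "--learning_rate", type=float, default=0.0001)
parser.add_argument("-it", "--iteration", type=int, default=50000)
parser.add_argument("-b", "--batch_size", type=int, default=64)
parser.add_argument("-p", "--print_every", type=int, default=500, help="log interval")
parser.add_argument("-s", "--save_every", type=int, default=1000, help="checkpoint interval")

args = parser.parse_args(["-tr", "data/movie_subtitles", "-la", "1", "-hi", "512", "-b", "64"])
```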

Testing

Models are saved in pytorch-chatbot/save/model during training; this path can be changed in config.py.
Evaluate the saved model on input sequences from the corpus:

python main.py -te <MODEL_FILE_PATH> -c <CORPUS_FILE_PATH>

Test the model interactively with manually entered input sequences:

python main.py -te <MODEL_FILE_PATH> -c <CORPUS_FILE_PATH> -i

Beam search with size k.

python main.py -te <MODEL_FILE_PATH> -c <CORPUS_FILE_PATH> -be k [-i] 
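The beam search behind the `-be k` flag keeps the k highest-scoring partial hypotheses at each decoding step. A generic sketch of the search logic, with a toy scoring function standing in for the repo's RNN decoder (the names here are illustrative, not the repo's):

```python
# Generic beam search over a step function that, given the sequence so far,
# returns (token, log_prob) candidates for the next token.
def beam_search(step_fn, start_token, end_token, beam_size, max_len=10):
    beams = [([start_token], 0.0)]  # (token sequence, cumulative log-prob)
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            if seq[-1] == end_token:
                candidates.append((seq, score))  # finished beam carries over
                continue
            for tok, logp in step_fn(seq):
                candidates.append((seq + [tok], score + logp))
        # keep only the k highest-scoring hypotheses
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_size]
        if all(seq[-1] == end_token for seq, _ in beams):
            break
    return beams[0][0]

def toy_step(seq):
    # hypothetical scorer: first step offers two words, then end-of-sentence
    if len(seq) == 1:
        return [("hello", -0.1), ("hi", -0.2)]
    return [("</s>", -0.1)]

best = beam_search(toy_step, "<s>", "</s>", beam_size=2)
# → ['<s>', 'hello', '</s>']
```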