
lc222 / Seq2seq_chatbot

A simple TensorFlow implementation of a seq2seq-based dialogue system, with embedding, attention, and beam search; the dataset is Cornell Movie Dialogs.

Programming Languages

python

Projects that are alternatives of or similar to Seq2seq chatbot

Seq2seq chatbot new
A simple TensorFlow implementation of a seq2seq-based dialogue system, with embedding, attention, and beam search; the dataset is Cornell Movie Dialogs.
Stars: ✭ 144 (-53.25%)
Mutual labels:  chatbot, beam-search, attention-mechanism, seq2seq
Poetry Seq2seq
Chinese Poetry Generation
Stars: ✭ 159 (-48.38%)
Mutual labels:  beam-search, attention-mechanism, seq2seq
Pytorch Chatbot
Pytorch seq2seq chatbot
Stars: ✭ 336 (+9.09%)
Mutual labels:  chatbot, beam-search, seq2seq
Tf Seq2seq
Sequence to sequence learning using TensorFlow.
Stars: ✭ 387 (+25.65%)
Mutual labels:  beam-search, seq2seq, nmt
Chatlearner
A chatbot implemented in TensorFlow based on the seq2seq model, with certain rules integrated.
Stars: ✭ 528 (+71.43%)
Mutual labels:  chatbot, beam-search, nmt
Nlp pytorch project
Embedding, NMT, Text_Classification, Text_Generation, NER etc.
Stars: ✭ 153 (-50.32%)
Mutual labels:  chatbot, seq2seq, nmt
minimal-nmt
A minimal nmt example to serve as an seq2seq+attention reference.
Stars: ✭ 36 (-88.31%)
Mutual labels:  seq2seq, beam-search, attention-mechanism
Dynamic Seq2seq
A seq2seq-based Chinese chatbot
Stars: ✭ 303 (-1.62%)
Mutual labels:  chatbot, seq2seq
Video-Cap
🎬 Video Captioning: ICCV '15 paper implementation
Stars: ✭ 44 (-85.71%)
Mutual labels:  seq2seq, attention-mechanism
Image-Caption
Using LSTM or Transformer to solve Image Captioning in Pytorch
Stars: ✭ 36 (-88.31%)
Mutual labels:  beam-search, attention-mechanism
ttslearn
ttslearn: Library for Pythonで学ぶ音声合成 (Text-to-speech with Python)
Stars: ✭ 158 (-48.7%)
Mutual labels:  seq2seq, attention-mechanism
RNNSearch
An implementation of attention-based neural machine translation using Pytorch
Stars: ✭ 43 (-86.04%)
Mutual labels:  seq2seq, nmt
Conversational-AI-Chatbot-using-Practical-Seq2Seq
A simple open domain generative based chatbot based on Recurrent Neural Networks
Stars: ✭ 17 (-94.48%)
Mutual labels:  chatbot, seq2seq
Neural Conversation Models
Tensorflow based Neural Conversation Models
Stars: ✭ 29 (-90.58%)
Mutual labels:  chatbot, seq2seq
A-Persona-Based-Neural-Conversation-Model
No description or website provided.
Stars: ✭ 22 (-92.86%)
Mutual labels:  seq2seq, attention-mechanism
pytorch-transformer-chatbot
A simple chitchat chatbot built with the Transformer API introduced in PyTorch v1.2
Stars: ✭ 44 (-85.71%)
Mutual labels:  chatbot, seq2seq
chatbot
kbqa task-oriented qa seq2seq ir neo4j jena seq2seq tf chatbot chat
Stars: ✭ 32 (-89.61%)
Mutual labels:  chatbot, seq2seq
transformer
Neutron: A pytorch based implementation of Transformer and its variants.
Stars: ✭ 60 (-80.52%)
Mutual labels:  seq2seq, beam-search
Seq2seq Summarizer
Pointer-generator reinforced seq2seq summarization in PyTorch
Stars: ✭ 306 (-0.65%)
Mutual labels:  attention-mechanism, seq2seq
chatbot
🤖️ A task-oriented chatbot based on PyTorch (a Chatbot that supports private and Docker deployment)
Stars: ✭ 77 (-75%)
Mutual labels:  chatbot, seq2seq

================================================= Update ===========================================================

The trained model has been uploaded to Baidu Cloud; feel free to download it. As for training speed: on a CPU with 16 GB of RAM, training finishes in about a day.

Link: https://pan.baidu.com/s/1hrNxaSk  Password: d2sn

================================================= End of update; the main text follows ===============================================

This project is a simple TensorFlow implementation of a seq2seq-based chatbot dialogue system.

A walkthrough of the code is available in my Zhihu column article:

Implementing a Deep-Learning Dialogue System from Scratch: A Simple Chatbot Implementation (从头实现深度学习的对话系统--简单chatbot代码实现)

The code builds on DeepQA, adding beam search and an attention mechanism on top.
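As a rough illustration of the attention mechanism mentioned above, here is a minimal, framework-free sketch of Luong-style dot-product attention in plain Python. The function name, shapes, and toy inputs are illustrative assumptions, not the repository's actual code:

```python
import math

def dot_attention(decoder_state, encoder_states):
    """Luong-style dot-product attention (simplified sketch).

    decoder_state:  list of floats, the current decoder hidden state
    encoder_states: list of hidden-state vectors, one per encoder step

    Returns (context vector, attention weights).
    """
    # Score each encoder state by its dot product with the decoder state.
    scores = [sum(d * e for d, e in zip(decoder_state, enc))
              for enc in encoder_states]
    # Softmax over the scores (shift by the max for numerical stability).
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    # Context vector: attention-weighted sum of the encoder states.
    context = [sum(w * enc[i] for w, enc in zip(weights, encoder_states))
               for i in range(len(decoder_state))]
    return context, weights

# Toy example: 3 encoder steps, hidden size 2.
context, weights = dot_attention([1.0, 1.0],
                                 [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
```

The decoder then feeds the context vector, concatenated with its own state, into the output projection; the repository implements the TensorFlow equivalent.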

The final result is shown in the screenshot below.

At test time, the system replies with the beam_size most probable responses to the user's input:
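Beam search is what produces those beam_size candidate replies: at each decoding step it keeps only the beam_size highest-probability partial sequences. A minimal pure-Python sketch of the idea (the step function, tokens, and signature here are toy assumptions, not the repo's API):

```python
import math

def beam_search(step_fn, start_token, end_token, beam_size, max_len):
    """Return the beam_size highest-scoring sequences.

    step_fn(seq) must return a list of (token, prob) candidates for
    extending seq; probabilities are combined in log space so long
    sequences do not underflow.
    """
    beams = [([start_token], 0.0)]  # (sequence, cumulative log-prob)
    finished = []
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            for token, prob in step_fn(seq):
                candidates.append((seq + [token], score + math.log(prob)))
        # Keep only the beam_size best partial sequences.
        candidates.sort(key=lambda c: c[1], reverse=True)
        beams = []
        for seq, score in candidates[:beam_size]:
            (finished if seq[-1] == end_token else beams).append((seq, score))
        if not beams:
            break
    finished.extend(beams)
    finished.sort(key=lambda f: f[1], reverse=True)
    return finished[:beam_size]

# Toy step function: proposes two tokens per step, then ends the sequence.
def toy_step(seq):
    if len(seq) >= 3:
        return [("<eos>", 1.0)]
    return [("hi", 0.6), ("hello", 0.4)]

results = beam_search(toy_step, "<s>", "<eos>", beam_size=2, max_len=5)
```

With beam_size=1 this degenerates to greedy decoding; larger beams trade decoding time for better candidate coverage.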

# Usage

1. Clone the repository (the data folder already contains the preprocessed dataset, so there is no need to download it separately).

2. To train the model, set the decode parameter on line 34 of chatbot.py to False, then start training.

(I will upload my trained model later so everyone can use it.)

3. After training finishes (roughly one day for 30 epochs), set the decode parameter back to True

and you can start testing. Type whatever you want to ask and see how it replies.

Note that you also need to update the absolute paths to the dataset and the final model file, otherwise you may get errors.

These appear in three places: lines 44, 57, and 82. After that, you are all set and can enjoy chatting.

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].