
jiayiwang5 / Chinese Chatbot

A Chinese chatbot trained on 100,000 dialogue pairs with an attention mechanism; it generates a meaningful reply to most general questions. The trained model is included, so the project can be run directly — if it doesn't run, I'll livestream myself eating my keyboard.

Projects that are alternatives to or similar to Chinese Chatbot

Pytorch Seq2seq
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+2656.45%)
Mutual labels:  jupyter-notebook, lstm, rnn, attention, seq2seq
Nlp Models Tensorflow
Gathers machine learning and Tensorflow deep learning models for NLP problems, 1.13 < Tensorflow < 2.0
Stars: ✭ 1,603 (+1192.74%)
Mutual labels:  chatbot, jupyter-notebook, lstm, attention
Poetry Seq2seq
Chinese Poetry Generation
Stars: ✭ 159 (+28.23%)
Mutual labels:  jupyter-notebook, lstm, rnn, seq2seq
Machine Learning
My Attempt(s) In The World Of ML/DL....
Stars: ✭ 78 (-37.1%)
Mutual labels:  jupyter-notebook, lstm, rnn, attention
Rnn For Joint Nlu
Pytorch implementation of "Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling" (https://arxiv.org/abs/1609.01454)
Stars: ✭ 176 (+41.94%)
Mutual labels:  jupyter-notebook, lstm, rnn, attention
Screenshot To Code
A neural network that transforms a design mock-up into a static website.
Stars: ✭ 13,561 (+10836.29%)
Mutual labels:  jupyter-notebook, jupyter, lstm, seq2seq
Natural Language Processing With Tensorflow
Natural Language Processing with TensorFlow, published by Packt
Stars: ✭ 222 (+79.03%)
Mutual labels:  jupyter-notebook, lstm, rnn, seq2seq
Base-On-Relation-Method-Extract-News-DA-RNN-Model-For-Stock-Prediction--Pytorch
A dual-stage attention mechanism model based on a relational news-extraction method for stock prediction
Stars: ✭ 33 (-73.39%)
Mutual labels:  lstm, rnn, seq2seq, attention
Time Attention
Implementation of RNN for Time Series prediction from the paper https://arxiv.org/abs/1704.02971
Stars: ✭ 52 (-58.06%)
Mutual labels:  lstm, rnn, attention
Pointer Networks Experiments
Sorting numbers with pointer networks
Stars: ✭ 53 (-57.26%)
Mutual labels:  lstm, attention, seq2seq
Bitcoin Price Prediction Using Lstm
Bitcoin price Prediction ( Time Series ) using LSTM Recurrent neural network
Stars: ✭ 67 (-45.97%)
Mutual labels:  jupyter-notebook, lstm, rnn
Rnn Notebooks
RNN(SimpleRNN, LSTM, GRU) Tensorflow2.0 & Keras Notebooks (Workshop materials)
Stars: ✭ 48 (-61.29%)
Mutual labels:  jupyter-notebook, lstm, rnn
Neural Networks
All about Neural Networks!
Stars: ✭ 34 (-72.58%)
Mutual labels:  jupyter-notebook, lstm, rnn
Attentive Neural Processes
implementing "recurrent attentive neural processes" to forecast power usage (w. LSTM baseline, MCDropout)
Stars: ✭ 33 (-73.39%)
Mutual labels:  jupyter-notebook, rnn, attention
Tensorflow seq2seq chatbot
Stars: ✭ 81 (-34.68%)
Mutual labels:  chatbot, lstm, seq2seq
Know Your Intent
State of the Art results in Intent Classification using Sematic Hashing for three datasets: AskUbuntu, Chatbot and WebApplication.
Stars: ✭ 116 (-6.45%)
Mutual labels:  chatbot, jupyter-notebook, jupyter
Linear Attention Recurrent Neural Network
A recurrent attention module consisting of an LSTM cell which can query its own past cell states by the means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer Network. The LARNN cell with attention can be easily used inside a loop on the cell state, just like any other RNN. (LARNN)
Stars: ✭ 119 (-4.03%)
Mutual labels:  jupyter-notebook, lstm, rnn
Lstm Sentiment Analysis
Sentiment Analysis with LSTMs in Tensorflow
Stars: ✭ 886 (+614.52%)
Mutual labels:  jupyter-notebook, lstm, rnn
Pytorch Pos Tagging
A tutorial on how to implement models for part-of-speech tagging using PyTorch and TorchText.
Stars: ✭ 96 (-22.58%)
Mutual labels:  jupyter-notebook, lstm, rnn
Cnn lstm for text classify
CNN, LSTM, NBOW, and fastText for Chinese text classification
Stars: ✭ 90 (-27.42%)
Mutual labels:  lstm, rnn, attention

Chinese-ChatBot (Chinese Chatbot)

Environment

Program      Version
python       3.6.8
tensorflow   1.13.1
Keras        2.2.4
windows10
jupyter

Main references

Key points

  • LSTM
  • seq2seq
  • attention — experiments showed that adding the attention mechanism speeds up training, accelerates convergence, and improves results.
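The three building blocks above can be wired together as a small Keras model. This is a minimal illustration under assumed toy vocabulary/dimension sizes, not the repo's actual code; it uses the tf.keras dot-product `Attention` layer for brevity (the repo targets TF 1.13 / Keras 2.2.4 and likely implements attention by hand).

```python
# Minimal seq2seq-with-attention sketch (hypothetical sizes, not the repo's code).
import numpy as np
from tensorflow.keras import layers, Model

VOCAB = 1000          # assumed vocabulary size
EMB, UNITS = 64, 128  # assumed embedding / LSTM dimensions

# Encoder: embed the source sentence and run an LSTM, keeping all outputs.
enc_in = layers.Input(shape=(None,), name="enc_in")
enc_emb = layers.Embedding(VOCAB, EMB)(enc_in)
enc_out, h, c = layers.LSTM(UNITS, return_sequences=True, return_state=True)(enc_emb)

# Decoder: an LSTM initialised from the encoder's final state.
dec_in = layers.Input(shape=(None,), name="dec_in")
dec_emb = layers.Embedding(VOCAB, EMB)(dec_in)
dec_out = layers.LSTM(UNITS, return_sequences=True)(dec_emb, initial_state=[h, c])

# Dot-product (Luong-style) attention over the encoder outputs.
context = layers.Attention()([dec_out, enc_out])
concat = layers.Concatenate()([dec_out, context])
probs = layers.Dense(VOCAB, activation="softmax")(concat)

model = Model([enc_in, dec_in], probs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# One dummy forward pass: per-timestep distributions over the vocabulary.
out = model.predict([np.zeros((2, 7)), np.zeros((2, 5))], verbose=0)
```

At inference time the decoder would instead be run one token at a time, feeding each prediction back in; the training-time model above uses teacher forcing.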

Corpus and training environment

The Qingyun corpus (100,000 dialogue pairs), trained on Google Colaboratory.

Running

Option 1: full pipeline

  • Data preprocessing
    get_data
  • Model training
    chatbot_train (this is the version mounted on Google Colab; to run locally, the paths etc. need minor adjustments)
  • Model inference
    chatbot_inference_Attention
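The preprocessing step typically turns question/answer pairs into padded index sequences. The sketch below illustrates that idea with the standard Keras text utilities; the character-level split and all names are assumptions, not the contents of get_data.

```python
# Hypothetical preprocessing sketch in the spirit of get_data (not the repo's code).
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences

pairs = [("你好", "你好,很高兴见到你"), ("再见", "再见")]  # toy corpus

# Character-level tokenisation: insert spaces between Chinese characters.
questions = [" ".join(q) for q, _ in pairs]
answers = [" ".join(a) for _, a in pairs]

tok = Tokenizer(filters="")            # keep punctuation as its own token
tok.fit_on_texts(questions + answers)  # build a shared vocabulary

# Map characters to indices and right-pad each batch to a uniform length.
q_seq = pad_sequences(tok.texts_to_sequences(questions), padding="post")
a_seq = pad_sequences(tok.texts_to_sequences(answers), padding="post")
```

The resulting integer matrices are what an Embedding layer consumes; a real pipeline would also add start/end-of-sequence markers for the decoder.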

Option 2: load the pretrained model

  • Run chatbot_inference_Attention
  • Load models/W--184-0.5949-.h5
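The checkpoint name suggests it was written by Keras's ModelCheckpoint (epoch 184, loss 0.5949) — an assumption on my part. Loading such a file is a one-liner; the toy model below only exists to demonstrate the save/load round trip, since the repo's file isn't available here.

```python
# Sketch of restoring an .h5 checkpoint, as chatbot_inference_Attention presumably does.
# The toy model stands in for the repo's chatbot architecture.
import os
import tempfile
from tensorflow.keras import layers, models

toy = models.Sequential([layers.Input(shape=(3,)), layers.Dense(4)])
path = os.path.join(tempfile.mkdtemp(), "W--184-0.5949-.h5")
toy.save(path)  # ModelCheckpoint writes files with names like this

# What the inference notebook does with models/W--184-0.5949-.h5:
restored = models.load_model(path)
```

If the file holds weights only (rather than a full model), the architecture must be rebuilt first and restored with `model.load_weights(path)` instead.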

UI (Tkinter)

Attention weight visualization

Miscellaneous

  • In the training notebook chat_bot, of the last three code cells, the first two mount Google Drive and the last collects the losses for plotting; for some reason the TensorBoard callback didn't work, hence this workaround.
  • In the inference notebook, the second-to-last cell takes text input only (no UI) and the last cell provides the UI; run whichever one fits your needs.
  • The code prints many intermediate outputs, which I hope help you understand it.
  • models contains a model I trained; it should run as-is, but you can also train your own.
  • I have limited experience and have not found a metric that quantifies dialogue quality, so the loss only roughly reflects training progress.
  • To be continued.
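The loss-collection workaround mentioned above can be done with a small custom callback. This is a generic sketch on a toy model, not the notebook's exact cell:

```python
# Generic sketch: record per-epoch losses for plotting when TensorBoard misbehaves.
import numpy as np
from tensorflow.keras import layers, models
from tensorflow.keras.callbacks import Callback

class LossHistory(Callback):
    """Append the training loss to a list at the end of every epoch."""
    def __init__(self):
        super().__init__()
        self.losses = []

    def on_epoch_end(self, epoch, logs=None):
        self.losses.append(float(logs["loss"]))

# Toy model and data, just to exercise the callback.
model = models.Sequential([layers.Input(shape=(2,)), layers.Dense(1)])
model.compile(optimizer="sgd", loss="mse")
x, y = np.random.rand(16, 2), np.random.rand(16, 1)

history = LossHistory()
model.fit(x, y, epochs=3, verbose=0, callbacks=[history])
# history.losses now holds one float per epoch, ready for plt.plot(...)
```

Note that `model.fit` also returns a History object whose `history["loss"]` records the same values, so the custom callback is mainly useful when you want extra bookkeeping (e.g. per-batch losses).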