adambcomer / Tensorflow Seq2seq Dialogs

Build conversation Seq2Seq models with TensorFlow

Projects that are alternatives of or similar to Tensorflow Seq2seq Dialogs

Tensorflow seq2seq chatbot
Stars: ✭ 81 (+88.37%)
Mutual labels:  chatbot, neural-networks, seq2seq
Seq2seqchatbots
A wrapper around tensor2tensor to flexibly train, interact, and generate data for neural chatbots.
Stars: ✭ 466 (+983.72%)
Mutual labels:  chatbot, neural-networks, seq2seq
pytorch-transformer-chatbot
A simple chitchat chatbot built with the Transformer API introduced in PyTorch v1.2
Stars: ✭ 44 (+2.33%)
Mutual labels:  chatbot, seq2seq
chatbot
🤖️ A task-oriented chatbot based on PyTorch (supports private and Docker deployment)
Stars: ✭ 77 (+79.07%)
Mutual labels:  chatbot, seq2seq
Komputation
Komputation is a neural network framework for the Java Virtual Machine written in Kotlin and CUDA C.
Stars: ✭ 295 (+586.05%)
Mutual labels:  neural-networks, seq2seq
chatbot
kbqa task-oriented qa seq2seq ir neo4j jena seq2seq tf chatbot chat
Stars: ✭ 32 (-25.58%)
Mutual labels:  chatbot, seq2seq
Conversational-AI-Chatbot-using-Practical-Seq2Seq
A simple open domain generative based chatbot based on Recurrent Neural Networks
Stars: ✭ 17 (-60.47%)
Mutual labels:  chatbot, seq2seq
Seq2seq chatbot links
Links to the implementations of neural conversational models for different frameworks
Stars: ✭ 270 (+527.91%)
Mutual labels:  chatbot, seq2seq
Tensorflow Ml Nlp
Natural language processing with TensorFlow and machine learning (from logistic regression to a Transformer chatbot)
Stars: ✭ 176 (+309.3%)
Mutual labels:  chatbot, seq2seq
Seq2seq Chatbot For Keras
This repository contains a new generative model of chatbot based on seq2seq modeling.
Stars: ✭ 322 (+648.84%)
Mutual labels:  chatbot, seq2seq
Seq2seq chatbot
A TensorFlow implementation of a simple seq2seq dialog system with embedding, attention, and beam_search; the dataset is Cornell Movie Dialogs
Stars: ✭ 308 (+616.28%)
Mutual labels:  chatbot, seq2seq
Pytorch Chatbot
Pytorch seq2seq chatbot
Stars: ✭ 336 (+681.4%)
Mutual labels:  chatbot, seq2seq
keras-chatbot-web-api
Simple keras chat bot using seq2seq model with Flask serving web
Stars: ✭ 51 (+18.6%)
Mutual labels:  chatbot, seq2seq
Debug seq2seq
[unmaintained] Make seq2seq for keras work
Stars: ✭ 233 (+441.86%)
Mutual labels:  chatbot, seq2seq
Neural Conversation Models
Tensorflow based Neural Conversation Models
Stars: ✭ 29 (-32.56%)
Mutual labels:  chatbot, seq2seq
Tensorflow Tutorials
Step-by-step TensorFlow practice source code, from the basics to applications
Stars: ✭ 2,096 (+4774.42%)
Mutual labels:  chatbot, seq2seq
Deepqa
My tensorflow implementation of "A neural conversational model", a Deep learning based chatbot
Stars: ✭ 2,811 (+6437.21%)
Mutual labels:  chatbot, seq2seq
Conversation Tensorflow
TensorFlow implementation of Conversation Models
Stars: ✭ 143 (+232.56%)
Mutual labels:  chatbot, seq2seq
Nlp pytorch project
Embedding, NMT, Text_Classification, Text_Generation, NER etc.
Stars: ✭ 153 (+255.81%)
Mutual labels:  chatbot, seq2seq
Dynamic Seq2seq
A Chinese chatbot based on seq2seq
Stars: ✭ 303 (+604.65%)
Mutual labels:  chatbot, seq2seq

Tensorflow Seq2Seq For Conversations

Build conversation Seq2Seq models with TensorFlow

Takes dialog data and trains a model to generate responses to conversational input.

Dependencies

Data format

Data must be formatted as input text,output text for each exchange, and the file must be named dialogs.csv. Note: there is no space between the input text or output text and the comma.

Example data:
hi,hello
how are you?,i'm well
what is your name?,my name is john
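Loading this format amounts to reading a two-column CSV into (input, response) pairs. A minimal sketch of such a loader (not the project's actual code; the function name is hypothetical):

```python
import csv

def load_dialog_pairs(path):
    """Read (input, response) pairs from a two-column CSV like dialogs.csv."""
    pairs = []
    with open(path, newline="") as f:
        for row in csv.reader(f):
            if len(row) == 2:  # skip blank or malformed lines
                pairs.append((row[0], row[1]))
    return pairs
```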

Before TensorFlow builds the model, we compile a dictionary of all the words in the training set so they can be converted to vectors. By default the code applies a minimum word-frequency threshold; to use every word in the dataset, uncomment one line in the create_dictionary() function near the bottom of the file.
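The dictionary step can be sketched as follows. This is a minimal illustration, not the project's actual create_dictionary(); the threshold value of 3 is an assumption:

```python
from collections import Counter

MIN_FREQUENCY = 3  # assumed threshold; the real code defines its own minimum

def build_dictionary(sentences, min_frequency=MIN_FREQUENCY):
    """Map each sufficiently frequent word to an integer id."""
    counts = Counter(word for sentence in sentences for word in sentence.split())
    vocab = {}
    for word, count in counts.items():
        # Set min_frequency to 1 to keep every word in the dataset.
        if count >= min_frequency:
            vocab[word] = len(vocab)
    return vocab
```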

Once the dictionary is built (this can take a few hours!), TensorFlow builds the model and starts training.

train.py

python train.py -dialog_path -units -layers -training_iterations -batch_size --restore --display_out --load_dictionary

Required to build:

  • dialog_path: path to the dialog csv ex: /user/dialog_folder/
  • units: number of neurons per GRU cell
  • layers: number of layers deep for the recurrent model (min. 1)
  • training_iterations: number of mini-batches to train on
  • batch_size: number of dialog pairs per mini-batch

Not required:

  • --restore: restores the model from a past save
  • --display_out: displays a feed-forward of the model to the console
  • --load_dictionary: uses a pre-built dictionary. Use this once you have built the model at least once; it saves computing time.

All of the optional arguments are boolean flags: passing a flag enables it, and omitting it leaves it disabled.
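A command-line interface of this shape could be declared with argparse. This is a sketch of the expected argument layout, not the project's actual parser:

```python
import argparse

def make_parser():
    parser = argparse.ArgumentParser(description="Train a Seq2Seq dialog model.")
    # Required positional arguments
    parser.add_argument("dialog_path", help="path to the folder containing dialogs.csv")
    parser.add_argument("units", type=int, help="neurons per GRU cell")
    parser.add_argument("layers", type=int, help="recurrent layers (min. 1)")
    parser.add_argument("training_iterations", type=int, help="mini-batches to train on")
    parser.add_argument("batch_size", type=int, help="dialog pairs per mini-batch")
    # Optional boolean flags: present means enabled
    parser.add_argument("--restore", action="store_true", help="restore from a past save")
    parser.add_argument("--display_out", action="store_true", help="print a feed-forward pass")
    parser.add_argument("--load_dictionary", action="store_true", help="reuse a pre-built dictionary")
    return parser
```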

Example:

python train.py ~/adam/tensorflow_seq2seq/ 512 4 1000000 32

This makes a new model with the data at "~/adam/tensorflow_seq2seq/". The model has 512 neurons per GRU cell and is 4 layers deep. The program will run for 1,000,000 iterations with a batch size of 32.

python train.py ~/adam/tensorflow_seq2seq/ 512 4 1000000 32 --restore --display_out --load_dictionary

This builds the same model as above and restores it from a past save, displays a feed-forward pass after each training iteration, and reuses the dictionary generated on the first run of the model.

test.py

python test.py -dialog_path -units -layers

Required to build:

  • dialog_path: use the same path as used to train the model
  • units: use the same number of neurons as when training
  • layers: use the same number of layers as when training

Example:

python test.py ~/adam/tensorflow_seq2seq/ 512 4

This uses the model above after training. Once it is running, a "> " prompt will appear. Type in anything you want to see the output of the model.
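The interactive loop behaves roughly like this (a sketch only; respond_to stands in for the model's feed-forward pass, and the function name is hypothetical):

```python
def chat_loop(respond_to, prompt="> ", input_fn=input, output_fn=print):
    """Read user text at a '> ' prompt and print the model's reply until EOF."""
    while True:
        try:
            text = input_fn(prompt)
        except EOFError:  # Ctrl-D ends the session
            break
        output_fn(respond_to(text))
```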

Notes:

FYI: Training this model requires something like a GTX TITAN X or a compute cluster, and a lot of time. Not for the Deep Learning weary.
