
HLTCHKUST / Moel

License: MIT
MoEL: Mixture of Empathetic Listeners

Programming Languages

Python

Projects that are alternatives to or similar to Moel

Lic2019 Competition
2019 Language and Intelligence Technology Competition: knowledge-graph-based proactive chat
Stars: ✭ 109 (+186.84%)
Mutual labels:  chatbot, dialogue-systems
tf2-transformer-chatbot
Transformer Chatbot in TensorFlow 2 with TPU support.
Stars: ✭ 94 (+147.37%)
Mutual labels:  chatbot, transformer
Transformer In Generating Dialogue
An implementation of 'Attention Is All You Need' on a Chinese corpus
Stars: ✭ 121 (+218.42%)
Mutual labels:  chatbot, transformer
Convai Bot 1337
NIPS Conversational Intelligence Challenge 2017 Winner System: Skill-based Conversational Agent with Supervised Dialog Manager
Stars: ✭ 65 (+71.05%)
Mutual labels:  chatbot, dialogue-systems
Neuraldialog Cvae
TensorFlow implementation of Knowledge-Guided CVAE for dialog generation (ACL 2017), released by Tiancheng Zhao (Tony) from the Dialog Research Center, LTI, CMU
Stars: ✭ 279 (+634.21%)
Mutual labels:  chatbot, dialogue-systems
Multiturndialogzoo
Multi-turn dialogue baselines written in PyTorch
Stars: ✭ 106 (+178.95%)
Mutual labels:  chatbot, transformer
Variational-Transformer
Variational Transformers for Diverse Response Generation
Stars: ✭ 79 (+107.89%)
Mutual labels:  transformer, dialogue-systems
Korean restaurant reservation
Implementation of a Korean restaurant reservation dialogue system based on Hybrid Code Networks.
Stars: ✭ 73 (+92.11%)
Mutual labels:  chatbot, dialogue-systems
SpaceFusion
NAACL'19: "Jointly Optimizing Diversity and Relevance in Neural Response Generation"
Stars: ✭ 73 (+92.11%)
Mutual labels:  chatbot, dialogue-systems
pytorch-transformer-chatbot
A simple chitchat chatbot using the Transformer API introduced in PyTorch v1.2
Stars: ✭ 44 (+15.79%)
Mutual labels:  chatbot, transformer
Tensorflow Ml Nlp
Natural language processing with TensorFlow and machine learning, from logistic regression to a Transformer chatbot
Stars: ✭ 176 (+363.16%)
Mutual labels:  chatbot, transformer
Seq2seqchatbots
A wrapper around tensor2tensor to flexibly train, interact, and generate data for neural chatbots.
Stars: ✭ 466 (+1126.32%)
Mutual labels:  chatbot, transformer
Xpersona
XPersona: Evaluating Multilingual Personalized Chatbot
Stars: ✭ 54 (+42.11%)
Mutual labels:  chatbot, transformer
Meld
MELD: A Multimodal Multi-Party Dataset for Emotion Recognition in Conversation
Stars: ✭ 373 (+881.58%)
Mutual labels:  chatbot, dialogue-systems
Deeppavlov
An open source library for deep learning end-to-end dialog systems and chatbots.
Stars: ✭ 5,525 (+14439.47%)
Mutual labels:  chatbot, dialogue-systems
Keras Textclassification
Chinese text classification with Keras: long-text and short-sentence classification, multi-label classification, and sentence-pair similarity; base classes for building embedding layers and network graphs; includes FastText, TextCNN, CharCNN, TextRNN, RCNN, DCNN, DPCNN, VDCNN, CRNN, BERT, XLNet, ALBERT, Attention, DeepMoji, HAN, CapsuleNet, Transformer encoder, Seq2seq, SWEM, LEAM, and TextGCN
Stars: ✭ 914 (+2305.26%)
Mutual labels:  transformer
Dialogflow Sendgrid
📮 Dialogflow + Sendgrid = AI Mailbox
Stars: ✭ 33 (-13.16%)
Mutual labels:  chatbot
Witwicky
Witwicky: An implementation of Transformer in PyTorch.
Stars: ✭ 21 (-44.74%)
Mutual labels:  transformer
Cog
Bringing the power of the command line to chat
Stars: ✭ 910 (+2294.74%)
Mutual labels:  chatbot
Conversational Ai
Conversational AI Reading Materials
Stars: ✭ 34 (-10.53%)
Mutual labels:  dialogue-systems

MoEL: Mixture of Empathetic Listeners

License: MIT

This is the PyTorch implementation of the paper:

MoEL: Mixture of Empathetic Listeners. Zhaojiang Lin, Andrea Madotto, Jamin Shin, Peng Xu, Pascale Fung. EMNLP 2019 [PDF]

This code has been written using PyTorch >= 0.4.1. If you use any source code or datasets included in this toolkit in your work, please cite the following paper. The BibTeX entry is listed below:

@article{lin2019moel,
  title={MoEL: Mixture of Empathetic Listeners},
  author={Lin, Zhaojiang and Madotto, Andrea and Shin, Jamin and Xu, Peng and Fung, Pascale},
  journal={arXiv preprint arXiv:1908.07687},
  year={2019}
}

Abstract

Previous research on empathetic dialogue systems has mostly focused on generating responses given certain emotions. However, being empathetic not only requires the ability to generate emotional responses but, more importantly, requires understanding user emotions and replying appropriately. In this paper, we propose a novel end-to-end approach for modeling empathy in dialogue systems: Mixture of Empathetic Listeners (MoEL). Our model first captures the user emotions and outputs an emotion distribution. Based on this, MoEL softly combines the output states of the appropriate Listener(s), each of which is optimized to react to certain emotions, to generate an empathetic response. Human evaluations on the empathetic-dialogues dataset confirm that MoEL outperforms the multitask training baseline in terms of empathy, relevance, and fluency. Furthermore, a case study on the responses generated by different Listeners shows the high interpretability of our model.

MoEL Architecture:

The proposed model, Mixture of Empathetic Listeners, has an emotion tracker, n empathetic listeners along with a shared listener, and a meta listener that fuses the information from the listeners and produces the empathetic response.
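
As a rough illustration of this soft combination, here is a minimal PyTorch sketch. It is not the repository's actual code: the class and attribute names are made up, and plain linear layers stand in for the transformer-decoder listeners so the mixture logic stays visible.

import torch
import torch.nn as nn

class ListenerMixture(nn.Module):
    """Sketch of MoEL-style soft listener combination (illustrative only).

    In the real model each listener is a transformer decoder; linear
    layers are used here purely to keep the example short.
    """

    def __init__(self, hidden_dim: int, n_listeners: int = 32):
        super().__init__()
        self.emotion_classifier = nn.Linear(hidden_dim, n_listeners)
        self.listeners = nn.ModuleList(
            [nn.Linear(hidden_dim, hidden_dim) for _ in range(n_listeners)]
        )
        self.shared_listener = nn.Linear(hidden_dim, hidden_dim)
        self.meta_listener = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, context_repr: torch.Tensor):
        # context_repr: (batch, hidden) summary from the emotion tracker
        emotion_logits = self.emotion_classifier(context_repr)
        weights = torch.softmax(emotion_logits, dim=-1)       # (batch, n_listeners)
        states = torch.stack(
            [listener(context_repr) for listener in self.listeners], dim=1
        )                                                     # (batch, n_listeners, hidden)
        mixed = (weights.unsqueeze(-1) * states).sum(dim=1)   # soft combination
        fused = self.meta_listener(mixed + self.shared_listener(context_repr))
        # emotion_logits can also feed an auxiliary emotion-classification loss
        return fused, emotion_logits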

Attention on the Listeners

Visualization of attention on the listeners: the left side shows the context followed by the responses generated by MoEL. The heat map illustrates the attention weights over the 32 listeners.
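
A heat map like this can be reproduced from the saved attention weights. The sketch below is an assumption rather than part of this repository (the function name and the (n_examples, n_listeners) input shape are illustrative); it plots such a weight matrix with matplotlib.

import matplotlib.pyplot as plt
import numpy as np

def plot_listener_attention(weights, listener_names=None,
                            out_path="listener_attention.png"):
    """Plot an (n_examples, n_listeners) attention-weight matrix as a heat map."""
    weights = np.asarray(weights)
    fig, ax = plt.subplots(figsize=(10, 3))
    im = ax.imshow(weights, aspect="auto", cmap="viridis")
    ax.set_xlabel("listener (emotion)")
    ax.set_ylabel("dialogue example")
    if listener_names is not None:
        ax.set_xticks(range(len(listener_names)))
        ax.set_xticklabels(listener_names, rotation=90, fontsize=6)
    fig.colorbar(im, ax=ax, label="attention weight")
    fig.tight_layout()
    fig.savefig(out_path)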

Dependencies

Check the packages needed or simply run the command:

❱❱❱ pip install -r requirements.txt

Download the pre-trained GloVe embeddings glove.6B.300d.txt and place the file inside the folder /vectors/.
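
For reference, GloVe files store one token per line followed by its vector components. The --pretrain_emb flag presumably triggers this loading inside main.py; the standalone loader below is only a sketch of the file format, not the repository's code.

import numpy as np

def load_glove(path="vectors/glove.6B.300d.txt", vocab=None, dim=300):
    """Build a word -> vector map, optionally restricted to a vocabulary."""
    embeddings = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            word, *values = line.rstrip().split(" ")
            if vocab is None or word in vocab:
                embeddings[word] = np.asarray(values, dtype=np.float32)
    return embeddings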

Experiment

Quick Result

To skip training, please check generation_result.txt.

Dataset

The dataset (empathetic-dialogue) is preprocessed and stored in npy format: sys_dialog_texts.train.npy, sys_target_texts.train.npy, and sys_emotion_texts.train.npy, which consist of parallel lists of contexts (source), responses (target), and emotion labels (additional supervision).
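
To inspect the preprocessed splits, the arrays can be loaded directly with NumPy. The folder name below is an assumption; point the paths at wherever the .npy files actually live. Since the files hold Python lists of strings, allow_pickle=True is needed.

import numpy as np

# Hypothetical paths: adjust to the actual location of the .npy files.
contexts = np.load("empathetic-dialogue/sys_dialog_texts.train.npy", allow_pickle=True)
targets  = np.load("empathetic-dialogue/sys_target_texts.train.npy", allow_pickle=True)
emotions = np.load("empathetic-dialogue/sys_emotion_texts.train.npy", allow_pickle=True)

assert len(contexts) == len(targets) == len(emotions)  # parallel lists
print(contexts[0], targets[0], emotions[0], sep="\n")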

Training & Test

MoEL

❱❱❱ python3 main.py --model experts  --label_smoothing --noam --emb_dim 300 --hidden_dim 300 --hop 1 --heads 2 --topk 5 --cuda --pretrain_emb --softmax --basic_learner --schedule 10000 --save_path save/moel/

Transformer baseline

❱❱❱ python3 main.py --model trs  --label_smoothing --noam --emb_dim 300 --hidden_dim 300 --hop 2 --heads 2 --cuda --pretrain_emb --save_path save/trs/

Multitask Transformer baseline

❱❱❱ python3 main.py --model trs  --label_smoothing --noam --emb_dim 300 --hidden_dim 300 --hop 2 --heads 2 --cuda --pretrain_emb --multitask --save_path save/multi-trs/
