
FudanNLP / Irl_gen

Licence: other
This is an implementation of the paper 'Toward Diverse Text Generation with Inverse Reinforcement Learning' (IJCAI 2018): https://arxiv.org/abs/1804.11258

Programming Languages

python
Roff

Projects that are alternatives to or similar to Irl_gen

Tensorflow novelist
Imitates Shakespeare to write plays! Even more impressively, it can also write Jin Yong-style wuxia novels! Star it to follow updates!!
Stars: ✭ 244 (+662.5%)
Mutual labels:  text-generation
porn-description-generator
Generates new porn descriptions based on an edited dataset of xhamster video descriptions uploaded between 2007-2016.
Stars: ✭ 40 (+25%)
Mutual labels:  text-generation
text-generator
Golang text generator for generating SEO texts
Stars: ✭ 18 (-43.75%)
Mutual labels:  text-generation
Gpt 2 Cloud Run
Text-generation API via GPT-2 for Cloud Run
Stars: ✭ 254 (+693.75%)
Mutual labels:  text-generation
Deep-Reinforcement-Learning-With-Python
Master classic RL, deep RL, distributional RL, inverse RL, and more using OpenAI Gym and TensorFlow with extensive Math
Stars: ✭ 222 (+593.75%)
Mutual labels:  inverse-reinforcement-learning
gap-text2sql
GAP-text2SQL: Learning Contextual Representations for Semantic Parsing with Generation-Augmented Pre-Training
Stars: ✭ 83 (+159.38%)
Mutual labels:  text-generation
Gpt2 Newstitle
A Chinese news-title generation project using GPT-2, with extremely detailed code comments.
Stars: ✭ 235 (+634.38%)
Mutual labels:  text-generation
Gumbel-CRF
Implementation of NeurIPS 20 paper: Latent Template Induction with Gumbel-CRFs
Stars: ✭ 51 (+59.38%)
Mutual labels:  text-generation
pistoBot
Create an AI that chats like you
Stars: ✭ 121 (+278.13%)
Mutual labels:  text-generation
Transformer-QG-on-SQuAD
Implement Question Generator with SOTA pre-trained Language Models (RoBERTa, BERT, GPT, BART, T5, etc.)
Stars: ✭ 28 (-12.5%)
Mutual labels:  text-generation
transformer-drg-style-transfer
This repository has scripts and Jupyter notebooks for all the steps in the 'Transforming Delete, Retrieve, Generate' approach for controlled text style transfer
Stars: ✭ 97 (+203.13%)
Mutual labels:  text-generation
Pontryagin-Differentiable-Programming
A unified end-to-end learning and control framework that is able to learn a (neural) control objective function, dynamics equation, control policy, or/and optimal trajectory in a control system.
Stars: ✭ 111 (+246.88%)
Mutual labels:  inverse-reinforcement-learning
hangul ipsum
A Korean-language version of the lorem ipsum generator
Stars: ✭ 17 (-46.87%)
Mutual labels:  text-generation
Rnn.wgan
Code for training and evaluation of the model from "Language Generation with Recurrent Generative Adversarial Networks without Pre-training"
Stars: ✭ 252 (+687.5%)
Mutual labels:  text-generation
Keras-Generating-Sentences-from-a-Continuous-Space
Text Variational Autoencoder inspired by the paper 'Generating Sentences from a Continuous Space' Bowman et al. https://arxiv.org/abs/1511.06349
Stars: ✭ 32 (+0%)
Mutual labels:  text-generation
Summarization Papers
Summarization Papers
Stars: ✭ 238 (+643.75%)
Mutual labels:  text-generation
text-wgan
Improved Training of Wasserstein GANs for Text Generation
Stars: ✭ 20 (-37.5%)
Mutual labels:  text-generation
gpt-neo-fine-tuning-example
Fine-tune EleutherAI GPT-Neo and GPT-J-6B to generate Netflix movie descriptions using Hugging Face and DeepSpeed
Stars: ✭ 157 (+390.63%)
Mutual labels:  text-generation
Cancerify
Turn an innocent text into torturous hell
Stars: ✭ 44 (+37.5%)
Mutual labels:  text-generation
caffe-char-rnn
Multi-layer Recurrent Neural Networks (with LSTM) for character-level language models in Caffe
Stars: ✭ 25 (-21.87%)
Mutual labels:  text-generation

Summary

This is an implementation of the paper 'Toward Diverse Text Generation with Inverse Reinforcement Learning' (IJCAI 2018): https://arxiv.org/abs/1804.11258

Requirement

Python >=2.7.12

Tensorflow >=1.2.0

nltk == 3.2.5

Synthetic

python irl_generation.py

In our experiments, we add a weight x to the entropy term so that it can be used to adjust the balance between the 'quality' and 'diversity' of the generated texts. The training runs for various values of x are stored in the 'synthetic/save' folder.
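The role of the entropy weight can be sketched with a toy objective (illustrative only; the function and variable names below are not taken from irl_generation.py): the generator maximizes expected reward plus x times the policy entropy, so a larger x pushes probability mass toward more diverse outputs.

```python
import math

def entropy_regularized_objective(probs, rewards, x):
    """Illustrative generator objective: expected reward plus
    x times the policy entropy. Larger x favors diversity."""
    expected_reward = sum(p * r for p, r in zip(probs, rewards))
    entropy = -sum(p * math.log(p) for p in probs if p > 0.0)
    return expected_reward + x * entropy

# With equal rewards, any x > 0 favors the higher-entropy
# (more uniform, hence more diverse) policy.
peaked = entropy_regularized_objective([0.98, 0.01, 0.01], [1.0, 1.0, 1.0], 0.1)
uniform = entropy_regularized_objective([1.0 / 3] * 3, [1.0, 1.0, 1.0], 0.1)
```

At x = 0 the objective reduces to expected reward alone, which matches the quality-only end of the table below.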


x             Result
0.0           0.916
0.02          0.918
0.04          0.914
0.06          2.89
0.08          3.46
0.085         6.91
0.09          4.78
0.095         7.36
0.1           8.38
0.11          7.76
0.12          8.24
0.14          8.03
0.16          8.26
0.25          8.69
0.35          8.86
0.5           9.01
0.75          9.06
1.0           9.12
1.25          9.20
Ground Truth  5.75
MLE           9.03

Image COCO

python irl_coco.py

The results are obtained after 50 epochs of pretraining and 50 epochs of IRL training. No smoothing function is used when calculating BLEU (the same setting as the LeakGAN GitHub repo); (f) denotes forward BLEU and (b) backward BLEU.

x          0.35   0.45   0.55   0.65   0.75
BLEU-2(f)  0.924  0.889  0.901  0.830  0.827
BLEU-3(f)  0.851  0.789  0.800  0.671  0.647
BLEU-4(f)  0.772  0.690  0.710  0.573  0.556
BLEU-5(f)  0.724  0.624  0.656  0.564  0.571
BLEU-2(b)  0.813  0.851  0.846  0.881  0.883
BLEU-3(b)  0.689  0.733  0.715  0.761  0.754
BLEU-4(b)  0.614  0.657  0.632  0.667  0.652
BLEU-5(b)  0.590  0.627  0.605  0.618  0.606
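As a rough sketch of how unsmoothed BLEU-n can be computed with NLTK (one of the stated requirements), assuming tokenized sentences; the corpus loading done by irl_coco.py is omitted and the data here is illustrative:

```python
from nltk.translate.bleu_score import sentence_bleu

def bleu_n(references, hypothesis, n):
    """BLEU-n with uniform n-gram weights and no smoothing
    function, matching a LeakGAN-style evaluation."""
    weights = tuple(1.0 / n for _ in range(n))
    return sentence_bleu(references, hypothesis, weights=weights)

# Forward BLEU (f): a generated sentence is scored against
# test-set references. Backward BLEU (b) swaps the roles:
# test sentences are scored against the generated corpus.
refs = [["a", "man", "riding", "a", "horse"],
        ["a", "person", "on", "a", "horse"]]
hyp = ["a", "man", "on", "a", "horse"]
score = bleu_n(refs, hyp, 2)
```

Without smoothing, a hypothesis with any zero n-gram overlap scores 0, which is why longer n-grams drop off sharply in the table above.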

We also evaluated BLEU and Self-BLEU (see TexyGen for details). The results are listed below, using the same smoothing function for BLEU as the TexyGen GitHub repo:

x            0.35   0.45   0.55   0.65   0.75
BLEU-2       0.922  0.887  0.906  0.828  0.824
BLEU-3       0.844  0.780  0.802  0.653  0.627
BLEU-4       0.751  0.654  0.680  0.463  0.415
BLEU-5       0.645  0.513  0.559  0.308  0.256
Self-BLEU-2  0.936  0.899  0.892  0.829  0.831
Self-BLEU-3  0.881  0.797  0.802  0.641  0.621
Self-BLEU-4  0.829  0.701  0.716  0.464  0.408
Self-BLEU-5  0.781  0.609  0.646  0.325  0.263
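A minimal sketch of Self-BLEU: each generated sentence is scored against all the others as references, so higher values indicate less diversity. The smoothing choice here (NLTK's method1, which I believe matches TexyGen's setting) is an assumption, and the example data is illustrative:

```python
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

def self_bleu(sentences, n):
    """Self-BLEU-n over a list of tokenized sentences: each one
    is scored against all the others as references. Smoothing
    keeps zero n-gram overlaps from collapsing the score to 0."""
    weights = tuple(1.0 / n for _ in range(n))
    smooth = SmoothingFunction().method1  # assumed TexyGen-style
    scores = []
    for i, hyp in enumerate(sentences):
        refs = sentences[:i] + sentences[i + 1:]
        scores.append(sentence_bleu(refs, hyp, weights=weights,
                                    smoothing_function=smooth))
    return sum(scores) / len(scores)

# Identical sentences give maximal Self-BLEU (no diversity);
# fully distinct sentences score near zero.
same = [["a", "dog", "runs"]] * 3
varied = [["a", "dog", "runs"], ["the", "cat", "sleeps"],
          ["birds", "fly", "high"]]
```

This is why larger entropy weights x, which increase diversity, lower the Self-BLEU columns in the table above.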

Generated examples for each weight are listed in the imagecoco/speech folder.

Many thanks to the SeqGAN and LeakGAN authors; parts of this code are adapted from theirs.
