hugochan / RL-based-Graph2Seq-for-NQG

License: Apache-2.0
Code & data accompanying the ICLR 2020 paper "Reinforcement Learning Based Graph-to-Sequence Model for Natural Question Generation"

Programming Languages

Python
139,335 projects; the #7 most used programming language

Projects that are alternatives to or similar to RL-based-Graph2Seq-for-NQG

Transformer-QG-on-SQuAD
Implement Question Generator with SOTA pre-trained Language Models (RoBERTa, BERT, GPT, BART, T5, etc.)
Stars: ✭ 28 (-73.08%)
Mutual labels:  text-generation, question-generation
keras-deep-learning
Various implementations and projects on CNN, RNN, LSTM, GAN, etc.
Stars: ✭ 22 (-78.85%)
Mutual labels:  text-generation
php-text-generator
Fast SEO text generator based on a mask.
Stars: ✭ 19 (-81.73%)
Mutual labels:  text-generation
MLH-Quizzet
This is a smart Quiz Generator that generates a dynamic quiz from any uploaded text/PDF document using NLP. This can be used for self-analysis, question paper generation, and evaluation, thus reducing human effort.
Stars: ✭ 23 (-77.88%)
Mutual labels:  question-generation
TIMME
TIMME: Twitter Ideology-detection via Multi-task Multi-relational Embedding (code & data)
Stars: ✭ 57 (-45.19%)
Mutual labels:  graph-neural-networks
sdn-nfv-papers
This is a paper list about Resource Allocation in Network Functions Virtualization (NFV) and Software-Defined Networking (SDN).
Stars: ✭ 40 (-61.54%)
Mutual labels:  graph-neural-networks
Meta-GDN AnomalyDetection
Implementation of TheWebConf 2021 -- Few-shot Network Anomaly Detection via Cross-network Meta-learning
Stars: ✭ 22 (-78.85%)
Mutual labels:  graph-neural-networks
InterGCN-ABSA
[COLING 2020] Jointly Learning Aspect-Focused and Inter-Aspect Relations with Graph Convolutional Networks for Aspect Sentiment Analysis
Stars: ✭ 41 (-60.58%)
Mutual labels:  graph-neural-networks
awesome-graph-explainability-papers
Papers about explainability of GNNs
Stars: ✭ 153 (+47.12%)
Mutual labels:  graph-neural-networks
Zero-shot-Fact-Verification
Codes for ACL-IJCNLP 2021 Paper "Zero-shot Fact Verification by Claim Generation"
Stars: ✭ 39 (-62.5%)
Mutual labels:  question-generation
graph-convnet-tsp
Code for the paper 'An Efficient Graph Convolutional Network Technique for the Travelling Salesman Problem' (INFORMS Annual Meeting Session 2019)
Stars: ✭ 196 (+88.46%)
Mutual labels:  graph-neural-networks
GNN4CD
Supervised community detection with line graph neural networks
Stars: ✭ 67 (-35.58%)
Mutual labels:  graph-neural-networks
fiction generator
Fiction generator with TensorFlow. A novel generator that imitates the writing style of Wang Xiaobo.
Stars: ✭ 27 (-74.04%)
Mutual labels:  text-generation
GraphLIME
This is a Pytorch implementation of GraphLIME
Stars: ✭ 40 (-61.54%)
Mutual labels:  graph-neural-networks
deepsphere-cosmo-tf1
A spherical convolutional neural network for cosmology (TFv1).
Stars: ✭ 119 (+14.42%)
Mutual labels:  graph-neural-networks
Chinese-Hip-pop-Generation
Generate Chinese hip-pop lyrics using GAN
Stars: ✭ 121 (+16.35%)
Mutual labels:  text-generation
GNNSCVulDetector
Smart Contract Vulnerability Detection Using Graph Neural Networks (IJCAI-20 Accepted)
Stars: ✭ 42 (-59.62%)
Mutual labels:  graph-neural-networks
Entity2Topic
[NAACL2018] Entity Commonsense Representation for Neural Abstractive Summarization
Stars: ✭ 20 (-80.77%)
Mutual labels:  text-generation
Spectrum
Spectrum is an AI that uses machine learning to generate Rap song lyrics
Stars: ✭ 37 (-64.42%)
Mutual labels:  text-generation
RgxGen
Regex: generate matching and non-matching strings based on a regex pattern.
Stars: ✭ 45 (-56.73%)
Mutual labels:  text-generation

RL-based-Graph2Seq-for-NQG

Code & data accompanying the ICLR 2020 paper "Reinforcement Learning Based Graph-to-Sequence Model for Natural Question Generation".

Get started

Prerequisites

This code is written in Python 3. You will need to install a few Python packages in order to run it. We recommend using virtualenv to manage your Python packages and environments. Take the following steps to create a Python virtual environment.

  • If you have not installed virtualenv, install it with pip install virtualenv.
  • Create a virtual environment with virtualenv venv.
  • Activate the virtual environment with source venv/bin/activate.
  • Install the package requirements with pip install -r requirements.txt.
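
For convenience, here are the same steps collected into a single snippet (assuming a Unix-like shell; on Windows, activate the environment with venv\Scripts\activate instead):

    # Create and activate an isolated environment, then install the dependencies.
    pip install virtualenv
    virtualenv venv
    source venv/bin/activate
    pip install -r requirements.txt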

Run the model

  • Download the preprocessed data from squad-split1 and squad-split2, and put it under the root directory so that the file hierarchy looks like data/squad-split1 and data/squad-split2.

  • Run the model

    python main.py -config config/squad_split1/graph2seq_static_bert_finetune_word_70k_0.4_bs_60.yml
    

    Note that you can specify the output path by modifying out_dir in a config file. If you want to finetune a pretrained model, specify the path to the pretrained model via pretrained and set out_dir to null. If you just want to load a pretrained model and evaluate it on a test set, set both trainset and devset to null. A config sketch illustrating these options follows this list.

  • Finetune the model using RL

    python main.py -config config/squad_split1/rl_graph2seq_static_bert_finetune_word_70k_0.4_bs_60.yml
    
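As a rough illustration of the config options described in the note above, a minimal fragment of one of the .yml config files might look like the following. Only the keys named in this README (out_dir, pretrained, trainset, devset) are shown, the paths are placeholders, and all remaining settings come from the config files shipped with the repository:

    # Hypothetical config fragment; paths below are placeholders.
    # Normal training: checkpoints and logs are written to out_dir.
    out_dir: out/squad_split1/my_graph2seq_run

    # Finetuning a pretrained model: point `pretrained` at the saved model
    # and set out_dir to null.
    # pretrained: out/squad_split1/my_graph2seq_run
    # out_dir: null

    # Evaluation only: also set both dataset keys to null.
    # trainset: null
    # devset: null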

Reference

If you find this code useful, please consider citing the following paper:

Yu Chen, Lingfei Wu and Mohammed J. Zaki. "Reinforcement Learning Based Graph-to-Sequence Model for Natural Question Generation." In Proceedings of the 8th International Conference on Learning Representations (ICLR 2020), Addis Ababa, Ethiopia, Apr. 26-30, 2020.

    @inproceedings{chen2019reinforcement,
      author    = {Chen, Yu and Wu, Lingfei and Zaki, Mohammed J.},
      title     = {Reinforcement Learning Based Graph-to-Sequence Model for Natural Question Generation},
      booktitle = {Proceedings of the 8th International Conference on Learning Representations},
      month     = {Apr. 26-30,},
      year      = {2020}
    }