
nanguoshun / LSR

Licence: other
Pytorch Implementation of our ACL 2020 Paper "Reasoning with Latent Structure Refinement for Document-Level Relation Extraction"

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives to or similar to LSR

DiffuseVAE
A combination of VAEs and Diffusion Models for efficient, controllable and high-fidelity generation from low-dimensional latents
Stars: ✭ 81 (-33.06%)
Mutual labels:  latent-variable-models
LatentDiffEq.jl
Latent Differential Equations models in Julia.
Stars: ✭ 34 (-71.9%)
Mutual labels:  latent-variable-models
lava
Latent Variable Models in R https://kkholst.github.io/lava/
Stars: ✭ 28 (-76.86%)
Mutual labels:  latent-variable-models
cross-lingual-struct-flow
PyTorch implementation of ACL paper https://arxiv.org/abs/1906.02656
Stars: ✭ 23 (-80.99%)
Mutual labels:  latent-variable-models

LSR

This repository is the PyTorch implementation of the LSR model from our ACL 2020 paper "Reasoning with Latent Structure Refinement for Document-Level Relation Extraction".

Requirements

python==3.6.7 
torch==1.3.1 + CUDA==9.2
OR torch==1.5.1 + CUDA==10.1
tqdm==4.29.1
numpy==1.15.4
spacy==2.1.3
networkx==2.4
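
For convenience, these pinned versions can be installed with pip. A minimal sketch, assuming a fresh Python 3.6 environment (pick the torch build that matches your CUDA version):

# torch==1.3.1 for CUDA 9.2, or torch==1.5.1 for CUDA 10.1
pip install torch==1.5.1 tqdm==4.29.1 numpy==1.15.4 spacy==2.1.3 networkx==2.4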

Overview of LSR

Node Constructor:


Overview of the Node Constructor: a context encoder is applied to obtain contextualized representations of the sentences. The representations of mentions and of words in the meta dependency path are extracted as mention nodes and MDP nodes. Average pooling is used to construct each entity node from its mention nodes. For example, the entity node Lutsenko is constructed by averaging the representations of its mentions Lutsenko and He.
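
To make the averaging step concrete, here is a minimal PyTorch sketch; the tensor names and hidden size are illustrative assumptions, not the repository's actual code:

import torch

# Contextualized representations of the two mentions of the entity
# Lutsenko ("Lutsenko" and "He"), as produced by the context encoder.
mention_nodes = torch.randn(2, 768)   # (num_mentions, hidden_dim)

# The entity node is the average of its mention nodes.
entity_node = mention_nodes.mean(dim=0)   # (hidden_dim,)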

Dynamic Reasoner:

Overview of the Dynamic Reasoner: each block consists of two sub-modules, structure induction and multi-hop reasoning. The first module takes the nodes constructed by the Node Constructor as input. Node representations are fed into two feed-forward networks before the bilinear transformation, and the latent document-level structure is computed via the Matrix-Tree Theorem. The second module takes this structure as input and updates the node representations using densely connected graph convolutional networks. We stack N blocks, corresponding to N rounds of refinement; each iteration outputs a latent structure for inference.
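
As a rough illustration of the structure-induction sub-module, the sketch below follows the Matrix-Tree formulation used in Learning Structured Text Representations (see Related Repo); every function and parameter name here is an assumption for illustration, not the repository's API:

import torch

def structure_induction(nodes, parent_ff, child_ff, bilinear, root_ff):
    # nodes:      (n, d) node representations from the Node Constructor
    # parent_ff, child_ff: the two feed-forward networks applied before
    #             the bilinear transformation
    # bilinear:   (d, d) weight matrix of the bilinear edge scorer
    # root_ff:    linear layer mapping each node to a scalar root score
    n = nodes.size(0)
    parent = torch.tanh(parent_ff(nodes))    # (n, d)
    child = torch.tanh(child_ff(nodes))      # (n, d)

    # Edge weights A_ij = exp(parent_i^T W child_j), self-loops removed.
    A = torch.exp(parent @ bilinear @ child.t()) * (1.0 - torch.eye(n))
    r = torch.exp(root_ff(nodes)).squeeze(-1)     # (n,) root weights

    # Matrix-Tree Theorem: form the Laplacian, replace its first row
    # with the root weights, and invert; edge marginals follow in
    # closed form from the inverse.
    L = torch.diag(A.sum(dim=0)) - A
    L_bar = L.clone()
    L_bar[0] = r
    inv = torch.inverse(L_bar)

    # Marginal probability that node i is the head of node j.
    mask = torch.ones(n)
    mask[0] = 0.0
    P = mask.unsqueeze(0) * A * inv.diagonal().unsqueeze(0) \
        - mask.unsqueeze(1) * A * inv.t()
    return P   # (n, n) latent structure

The returned marginals act as a soft adjacency matrix over which the densely connected GCNs propagate information in the multi-hop reasoning sub-module.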

Dataset

For the dataset and pretrained embeddings, please download them here; they are officially provided by DocRED: A Large-Scale Document-Level Relation Extraction Dataset.

Data Preprocessing

After downloading the dataset, put the files train_annotated.json, dev.json and test.json into the ./data directory, and the files in the pre directory into code/prepro_data. Then run:

cd code
python3 gen_data.py

For the BERT encoder:

cd code
python3 gen_data_bert.py

Training

In order to train the model, run:

cd code
python3 train.py

For the BERT encoder, please set '--model_name' to 'LSR_bert', as in the example below.
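
For example (the exact command line is an assumption based on the flag name above):

cd code
python3 train.py --model_name LSR_bert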

Test

After training, the model can be tested by running:

python3 test.py

Related Repo

The code is adapted from the repository of the ACL 2019 paper DocRED: A Large-Scale Document-Level Relation Extraction Dataset and from the PyTorch implementation of Learning Structured Text Representations. We would like to thank Yang Liu and Vidhisha Balachandran for their constructive suggestions.

Citation

If you find our work or the code useful, please consider citing our paper:

@inproceedings{nan2020lsr,
 author = {Nan, Guoshun and Guo, Zhijiang and Sekulić, Ivan and Lu, Wei},
 booktitle = {Proc. of ACL},
 title = {Reasoning with Latent Structure Refinement for Document-Level Relation Extraction},
 year = {2020}
}