Cartus / Aggcn

License: MIT
Attention Guided Graph Convolutional Networks for Relation Extraction (the authors' PyTorch implementation of the ACL 2019 paper)

Programming Languages

python

Projects that are alternatives of or similar to Aggcn

Tacred Relation
PyTorch implementation of the position-aware attention model for relation extraction
Stars: ✭ 271 (-14.78%)
Mutual labels:  relation-extraction, information-extraction
Gcn Over Pruned Trees
Graph Convolution over Pruned Dependency Trees Improves Relation Extraction (authors' PyTorch implementation)
Stars: ✭ 312 (-1.89%)
Mutual labels:  relation-extraction, information-extraction
Open Ie Papers
Open Information Extraction (OpenIE) and Open Relation Extraction (ORE) papers and data.
Stars: ✭ 150 (-52.83%)
Mutual labels:  relation-extraction, information-extraction
Tre
[AKBC 19] Improving Relation Extraction by Pre-trained Language Representations
Stars: ✭ 95 (-70.13%)
Mutual labels:  relation-extraction, information-extraction
IE Paper Notes
Paper notes for Information Extraction, including Relation Extraction (RE), Named Entity Recognition (NER), Entity Linking (EL), Event Extraction (EE), Named Entity Disambiguation (NED).
Stars: ✭ 14 (-95.6%)
Mutual labels:  information-extraction, relation-extraction
Pytorch multi head selection re
BERT + reproduce "Joint entity recognition and relation extraction as a multi-head selection problem" for Chinese and English IE
Stars: ✭ 105 (-66.98%)
Mutual labels:  relation-extraction, information-extraction
ReQuest
Indirect Supervision for Relation Extraction Using Question-Answer Pairs (WSDM'18)
Stars: ✭ 26 (-91.82%)
Mutual labels:  information-extraction, relation-extraction
Open Entity Relation Extraction
Knowledge triples extraction and knowledge base construction based on dependency syntax for open domain text.
Stars: ✭ 350 (+10.06%)
Mutual labels:  relation-extraction, information-extraction
InformationExtractionSystem
Information Extraction System can perform NLP tasks like Named Entity Recognition, Sentence Simplification, Relation Extraction etc.
Stars: ✭ 27 (-91.51%)
Mutual labels:  information-extraction, relation-extraction
CogIE
CogIE: An Information Extraction Toolkit for Bridging Text and CogNet. ACL 2021
Stars: ✭ 47 (-85.22%)
Mutual labels:  information-extraction, relation-extraction
PSPE
Pretrained Span and Span Pair Encoder; code for "Pre-training Entity Relation Encoder with Intra-span and Inter-span Information", EMNLP 2020. It is based on our NERE toolkit (https://github.com/Receiling/NERE).
Stars: ✭ 17 (-94.65%)
Mutual labels:  information-extraction, relation-extraction
knowledge-graph-nlp-in-action
From model training to deployment: hands-on knowledge graph (Knowledge Graph) and natural language processing (NLP). Uses TensorFlow, BERT+Bi-LSTM+CRF, Neo4j, etc., covering tasks such as Named Entity Recognition, Text Classification, Information Extraction, and Relation Extraction.
Stars: ✭ 58 (-81.76%)
Mutual labels:  information-extraction, relation-extraction
Distre
[ACL 19] Fine-tuning Pre-Trained Transformer Language Models to Distantly Supervised Relation Extraction
Stars: ✭ 75 (-76.42%)
Mutual labels:  relation-extraction, information-extraction
Information Extraction Chinese
Chinese Named Entity Recognition with IDCNN/biLSTM+CRF, and Relation Extraction with biGRU+2ATT (Chinese entity recognition and relation extraction).
Stars: ✭ 1,888 (+493.71%)
Mutual labels:  relation-extraction, information-extraction
Usc Ds Relationextraction
Distantly Supervised Relation Extraction
Stars: ✭ 378 (+18.87%)
Mutual labels:  relation-extraction, information-extraction
lima
The Libre Multilingual Analyzer, a Natural Language Processing (NLP) C++ toolkit.
Stars: ✭ 75 (-76.42%)
Mutual labels:  information-extraction, relation-extraction
Casrel
A Novel Cascade Binary Tagging Framework for Relational Triple Extraction. Accepted by ACL 2020.
Stars: ✭ 329 (+3.46%)
Mutual labels:  relation-extraction, information-extraction
DocuNet
Code and dataset for the IJCAI 2021 paper "Document-level Relation Extraction as Semantic Segmentation".
Stars: ✭ 84 (-73.58%)
Mutual labels:  information-extraction, relation-extraction
Multiple Relations Extraction Only Look Once
Multiple-Relations-Extraction-Only-Look-Once. Look at the sentence only once and extract the multiple pairs of entities and their corresponding relations. An end-to-end joint multi-relation extraction model, applicable to the http://lic2019.ccf.org.cn/kg information extraction task.
Stars: ✭ 269 (-15.41%)
Mutual labels:  relation-extraction, information-extraction
Oie Resources
A curated list of Open Information Extraction (OIE) resources: papers, code, data, etc.
Stars: ✭ 283 (-11.01%)
Mutual labels:  relation-extraction, information-extraction

Attention Guided Graph Convolutional Networks for Relation Extraction

This paper/code introduces Attention Guided Graph Convolutional Networks (AGGCNs), which operate over dependency trees, for the large-scale sentence-level relation extraction task (TACRED).

You can find the paper here

See below for an overview of the model architecture:

AGGCN Architecture
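
For intuition, the sketch below shows the core attention-guided graph convolution idea in plain PyTorch: multi-head self-attention scores act as soft adjacency matrices, and a graph convolution is applied over each attention-induced graph. This is an illustrative simplification, not the released model; the full implementation (densely connected layers, dependency-tree inputs, etc.) is in this repository.

# Minimal, self-contained sketch of attention-guided graph convolution.
# NOT the authors' implementation; class name and dimensions are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionGuidedGCNSketch(nn.Module):
    def __init__(self, hidden_dim: int, num_heads: int = 2):
        super().__init__()
        self.num_heads = num_heads
        self.query = nn.Linear(hidden_dim, hidden_dim * num_heads)
        self.key = nn.Linear(hidden_dim, hidden_dim * num_heads)
        # one graph-convolution weight matrix per attention head
        self.gcn_weights = nn.ModuleList(
            [nn.Linear(hidden_dim, hidden_dim) for _ in range(num_heads)]
        )
        self.combine = nn.Linear(hidden_dim * num_heads, hidden_dim)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # h: (batch, seq_len, hidden_dim) token representations
        b, n, d = h.size()
        q = self.query(h).view(b, n, self.num_heads, d).transpose(1, 2)
        k = self.key(h).view(b, n, self.num_heads, d).transpose(1, 2)
        # attention scores serve as soft, fully connected adjacency matrices
        adj = F.softmax(q @ k.transpose(-1, -2) / d ** 0.5, dim=-1)  # (b, heads, n, n)
        head_outputs = []
        for i in range(self.num_heads):
            # graph convolution over the i-th attention-induced graph
            neighbourhood = adj[:, i] @ h  # aggregate neighbour representations
            head_outputs.append(F.relu(self.gcn_weights[i](neighbourhood)))
        return self.combine(torch.cat(head_outputs, dim=-1))

if __name__ == "__main__":
    layer = AttentionGuidedGCNSketch(hidden_dim=8, num_heads=2)
    tokens = torch.randn(4, 10, 8)   # batch of 4 sentences, 10 tokens each
    print(layer(tokens).shape)       # torch.Size([4, 10, 8])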

Requirements

Our model was trained on a Tesla P100-SXM2 GPU in an NVIDIA DGX system.

  • Python 3 (tested on 3.6.8)

  • PyTorch (tested on 0.4.1)

  • CUDA (tested on 9.0)

  • tqdm

  • unzip, wget (for downloading only)

We have released our trained model and training log in this repo. You can find the logs under the main directory and the trained model under the saved_models directory. Our released model achieves a 69.0% F1 score, as reported in the original ACL paper. Moreover, in our arXiv version we also report the mean and standard deviation of the F1 score: 68.2% ± 0.5% over 5 trained models (random seeds 0, 37, 47, 72, and 76).
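
As a small sketch, the mean and standard deviation above can be recomputed from the five runs as follows (the scores below are placeholders; fill them in from the final test F1 in each run's logs.txt):

# Aggregate F1 across the 5 seeds (0, 37, 47, 72, 76).
from statistics import mean, stdev

# Placeholder values; replace with the final test F1 from each run's logs.txt.
f1_per_seed = {0: 0.0, 37: 0.0, 47: 0.0, 72: 0.0, 76: 0.0}
scores = list(f1_per_seed.values())
print(f"mean F1 = {mean(scores):.1f}, std = {stdev(scores):.1f}")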

There is no guarantee that you will reproduce the exact released model and reported results if you run the code in a different environment (hardware or software). However, if you train with the default settings in the same environment, you should get exactly the output recorded in logs.txt.

Preparation

The code requires that you have access to the TACRED dataset (LDC license required). Once you have the TACRED data, please put the JSON files under the directory dataset/tacred.
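
Optionally, you can sanity-check that the splits are in place before preparing the vocabulary. The file and field names below assume the standard TACRED JSON format (train.json, dev.json, test.json, each example carrying a relation field); adjust if your copy differs.

# Sanity check that the TACRED splits are where the preprocessing expects them.
# File and field names are assumed from the standard TACRED JSON distribution.
import json
import os

DATA_DIR = "dataset/tacred"

for split in ("train.json", "dev.json", "test.json"):
    path = os.path.join(DATA_DIR, split)
    with open(path) as f:
        examples = json.load(f)
    print(f"{split}: {len(examples)} examples, e.g. relation = {examples[0]['relation']}")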

First, download and unzip GloVe vectors:

chmod +x download.sh; ./download.sh

Then prepare vocabulary and initial word vectors with:

python3 prepare_vocab.py dataset/tacred dataset/vocab --glove_dir dataset/glove

This will write vocabulary and word vectors as a numpy matrix into the dir dataset/vocab.
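
If you want to double-check the outputs, a small inspection script like the one below can help. The output file names vocab.pkl and embedding.npy are an assumption (following the repo this code is adapted from); check prepare_vocab.py if yours differ.

# Inspect the prepared vocabulary and embedding matrix.
# File names below are assumptions; see prepare_vocab.py for the actual paths.
import pickle
import numpy as np

with open("dataset/vocab/vocab.pkl", "rb") as f:
    vocab = pickle.load(f)                       # list/dict of vocabulary tokens
embeddings = np.load("dataset/vocab/embedding.npy")

print(f"vocab size: {len(vocab)}")
print(f"embedding matrix shape: {embeddings.shape}")  # (vocab_size, glove_dim)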

Training

To train the AGGCN model, run:

bash train_aggcn.sh 1

Model checkpoints and logs will be saved to ./saved_models/01.

For details on the use of other parameters, please refer to train.py.

Evaluation

Our pretrained model is saved under the dir saved_models/01. To run evaluation on the test set, run:

python3 eval.py saved_models/01 --dataset test

This will use the best_model.pt file by default. Use --model checkpoint_epoch_10.pt to specify a model checkpoint file.
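
If you want to compare several checkpoints instead of only best_model.pt, a small wrapper like the following can loop eval.py over them. This is a sketch that assumes the checkpoint_epoch_<N>.pt naming shown above and the eval.py flags documented here.

# Evaluate every saved checkpoint in a model directory on the test set.
import glob
import os
import subprocess

MODEL_DIR = "saved_models/01"

for ckpt in sorted(glob.glob(os.path.join(MODEL_DIR, "checkpoint_epoch_*.pt"))):
    name = os.path.basename(ckpt)
    print(f"== evaluating {name} ==")
    subprocess.run(
        ["python3", "eval.py", MODEL_DIR, "--model", name, "--dataset", "test"],
        check=True,
    )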

Retrain

To reload a pretrained model and fine-tune it, run:

python train.py --load --model_file saved_models/01/best_model.pt --optim sgd --lr 0.001

Related Repo

The paper uses the DCGCN model; for architectural details, please refer to the TACL19 paper Densely Connected Graph Convolutional Networks for Graph-to-Sequence Learning. The code is adapted from the repo of the EMNLP18 paper Graph Convolution over Pruned Dependency Trees Improves Relation Extraction.

Citation

@inproceedings{guo2019aggcn,
 author = {Guo, Zhijiang and Zhang, Yan and Lu, Wei},
 booktitle = {Proc. of ACL},
 title = {Attention Guided Graph Convolutional Networks for Relation Extraction},
 year = {2019}
}