
DFKI-NLP / Distre

License: Apache-2.0
[ACL 19] Fine-tuning Pre-Trained Transformer Language Models to Distantly Supervised Relation Extraction

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to Distre

Tre
[AKBC 19] Improving Relation Extraction by Pre-trained Language Representations
Stars: ✭ 95 (+26.67%)
Mutual labels:  relation-extraction, information-extraction, transformer
IE Paper Notes
Paper notes for Information Extraction, including Relation Extraction (RE), Named Entity Recognition (NER), Entity Linking (EL), Event Extraction (EE), Named Entity Disambiguation (NED).
Stars: ✭ 14 (-81.33%)
Mutual labels:  information-extraction, relation-extraction
Relation-Extraction-Transformer
NLP: Relation extraction with position-aware self-attention transformer
Stars: ✭ 63 (-16%)
Mutual labels:  transformer, relation-extraction
Multiple Relations Extraction Only Look Once
Multiple-Relations-Extraction-Only-Look-Once. Just look at the sentence once and extract the multiple pairs of entities and their corresponding relations. An end-to-end joint multi-relation extraction model, applicable to information extraction for http://lic2019.ccf.org.cn/kg.
Stars: ✭ 269 (+258.67%)
Mutual labels:  relation-extraction, information-extraction
DocuNet
Code and dataset for the IJCAI 2021 paper "Document-level Relation Extraction as Semantic Segmentation".
Stars: ✭ 84 (+12%)
Mutual labels:  information-extraction, relation-extraction
CogIE
CogIE: An Information Extraction Toolkit for Bridging Text and CogNet. ACL 2021
Stars: ✭ 47 (-37.33%)
Mutual labels:  information-extraction, relation-extraction
knowledge-graph-nlp-in-action
From model training to deployment: hands-on Knowledge Graph and Natural Language Processing (NLP). Uses TensorFlow, BERT+Bi-LSTM+CRF, Neo4j, etc., covering tasks such as Named Entity Recognition, Text Classification, Information Extraction, and Relation Extraction.
Stars: ✭ 58 (-22.67%)
Mutual labels:  information-extraction, relation-extraction
Information Extraction Chinese
Chinese Named Entity Recognition with IDCNN/biLSTM+CRF, and Relation Extraction with biGRU+2ATT (Chinese entity recognition and relation extraction).
Stars: ✭ 1,888 (+2417.33%)
Mutual labels:  relation-extraction, information-extraction
Oie Resources
A curated list of Open Information Extraction (OIE) resources: papers, code, data, etc.
Stars: ✭ 283 (+277.33%)
Mutual labels:  relation-extraction, information-extraction
Gcn Over Pruned Trees
Graph Convolution over Pruned Dependency Trees Improves Relation Extraction (authors' PyTorch implementation)
Stars: ✭ 312 (+316%)
Mutual labels:  relation-extraction, information-extraction
Aggcn
Attention Guided Graph Convolutional Networks for Relation Extraction (authors' PyTorch implementation for the ACL19 paper)
Stars: ✭ 318 (+324%)
Mutual labels:  relation-extraction, information-extraction
ReQuest
Indirect Supervision for Relation Extraction Using Question-Answer Pairs (WSDM'18)
Stars: ✭ 26 (-65.33%)
Mutual labels:  information-extraction, relation-extraction
lima
The Libre Multilingual Analyzer, a Natural Language Processing (NLP) C++ toolkit.
Stars: ✭ 75 (+0%)
Mutual labels:  information-extraction, relation-extraction
InformationExtractionSystem
Information Extraction System can perform NLP tasks like Named Entity Recognition, Sentence Simplification, Relation Extraction etc.
Stars: ✭ 27 (-64%)
Mutual labels:  information-extraction, relation-extraction
Open Ie Papers
Open Information Extraction (OpenIE) and Open Relation Extraction (ORE) papers and data.
Stars: ✭ 150 (+100%)
Mutual labels:  relation-extraction, information-extraction
PSPE
Pretrained Span and Span Pair Encoder, code for "Pre-training Entity Relation Encoder with Intra-span and Inter-span Information.", EMNLP 2020. It is based on our NERE toolkit (https://github.com/Receiling/NERE).
Stars: ✭ 17 (-77.33%)
Mutual labels:  information-extraction, relation-extraction
Open Entity Relation Extraction
Knowledge triples extraction and knowledge base construction based on dependency syntax for open domain text.
Stars: ✭ 350 (+366.67%)
Mutual labels:  relation-extraction, information-extraction
Pytorch multi head selection re
BERT + reproduce "Joint entity recognition and relation extraction as a multi-head selection problem" for Chinese and English IE
Stars: ✭ 105 (+40%)
Mutual labels:  relation-extraction, information-extraction
Tacred Relation
PyTorch implementation of the position-aware attention model for relation extraction
Stars: ✭ 271 (+261.33%)
Mutual labels:  relation-extraction, information-extraction
Casrel
A Novel Cascade Binary Tagging Framework for Relational Triple Extraction. Accepted by ACL 2020.
Stars: ✭ 329 (+338.67%)
Mutual labels:  relation-extraction, information-extraction

Fine-tuning Pre-Trained Transformer Language Models to Distantly Supervised Relation Extraction

This repository contains the code of our paper:
Fine-tuning Pre-Trained Transformer Language Models to Distantly Supervised Relation Extraction
Christoph Alt, Marc Hübner, Leonhard Hennig

Our code depends on Hugging Face's PyTorch reimplementation of OpenAI GPT and on AllenNLP; thanks to both teams.

The code is tested with:

  • Python 3.6.6
  • PyTorch 1.0.1
  • AllenNLP 0.7.1

Installation

First, clone the repository to your machine and install the requirements with the following command:

pip install -r requirements.txt

Second, download the OpenAI GPT archive (containing all model-related files):

wget --content-disposition https://cloud.dfki.de/owncloud/index.php/s/kKdpoaGikWnL4tn/download

Prepare the data

We evaluate our model on the NYT dataset and use the version provided by OpenNRE.

Follow the OpenNRE instructions for creating the NYT dataset in JSON format:

  1. download the nyt.tar file.
  2. extract the archive with: tar -xvf nyt.tar
  3. create the protobuf files: protoc --proto_path=. --python_out=. Document.proto
  4. convert the protobuf files to json: python protobuf2json.py .
  5. move train.json and test.json to data/open_nre_nyt/
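After step 5, a quick sanity check confirms that the converted files parse and look reasonable. The sketch below assumes the OpenNRE NYT JSON is a top-level list of instance dicts with a "relation" field; verify the field names against your actual files.

```python
import json
from collections import Counter

def summarize(path):
    """Load an OpenNRE-style JSON file and print basic statistics.

    Assumes the file is a JSON list of instance dicts whose relation
    label is stored under the key "relation" (an assumption based on
    the OpenNRE NYT release; adjust if your files differ).
    """
    with open(path) as f:
        instances = json.load(f)
    relations = Counter(inst.get("relation", "NA") for inst in instances)
    print(f"{path}: {len(instances)} instances, "
          f"{len(relations)} distinct relation labels")
    return instances

# summarize("data/open_nre_nyt/train.json")
```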

Training

For example, to train on the NYT dataset, run the following command:

CUDA_VISIBLE_DEVICES=0 allennlp train \
    experiments/configs/model_paper.json \
    -s <MODEL AND METRICS DIR> \
    --include-package tre
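The --include-package flag tells AllenNLP to import the tre package before training, so that any dataset readers or models it registers at import time become visible to AllenNLP's registry. Conceptually this is just a dynamic import, sketched below (the function name is ours, not AllenNLP's API):

```python
import importlib
import sys

def include_package(package_name):
    """Import a package by name so that classes it registers at import
    time (e.g. via AllenNLP's @Model.register decorator) are available.

    This mirrors the effect of `--include-package <name>`; it is a
    sketch for illustration, not AllenNLP's actual implementation.
    """
    importlib.import_module(package_name)
    return sys.modules[package_name]

# include_package("tre")  # the package shipped in this repository
```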

Evaluation

CUDA_VISIBLE_DEVICES=0 python ./experiments/utils/pr_curve_and_predictions.py \
    <MODEL AND METRICS DIR> \
    ./data/open_nre_nyt/test.json \
    --output-dir <RESULTS DIR> \
    --archive-filename <MODEL ARCHIVE FILENAME>

Trained Models

The model(s) we trained on NYT to produce our paper results can be found here:

Dataset   Masking Mode   AUC     Download
NYT       None           0.422   Link
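The AUC above is the area under the precision-recall curve. Given precision/recall arrays (how exactly pr_curve_and_predictions.py writes them is an assumption; check the script's output files), it can be computed with the trapezoidal rule:

```python
import numpy as np

def pr_auc(precision, recall):
    """Area under the precision-recall curve via the trapezoidal rule.

    Points are sorted by increasing recall before integration.
    """
    p = np.asarray(precision, dtype=float)
    r = np.asarray(recall, dtype=float)
    order = np.argsort(r)
    p, r = p[order], r[order]
    # sum of 0.5 * (p[i] + p[i+1]) * (r[i+1] - r[i]) over segments
    return float(np.sum(0.5 * (p[1:] + p[:-1]) * np.diff(r)))
```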

Download and extract model files

Download the archive corresponding to the model you want to evaluate (links in the table above).

wget --content-disposition <DOWNLOAD URL>

Run evaluation

For example, to evaluate the NYT model used in the paper, run the following command:

CUDA_VISIBLE_DEVICES=0 python ./experiments/utils/pr_curve_and_predictions.py \
    <DIR CONTAINING THE MODEL ARCHIVE> \
    ./data/open_nre_nyt/test.json \
    --output-dir ./results/ \
    --archive-filename model_lm05_wu2_do2_bs16_att.tar.gz

Citations

If you use our code in your research or find our repository useful, please consider citing our work.

@inproceedings{alt-etal-2019-fine,
    title = "Fine-tuning Pre-Trained Transformer Language Models to Distantly Supervised Relation Extraction",
    author = {Alt, Christoph  and
      H{\"u}bner, Marc  and
      Hennig, Leonhard},
    booktitle = "Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics",
    month = jul,
    year = "2019",
    address = "Florence, Italy",
    publisher = "Association for Computational Linguistics",
    url = "https://www.aclweb.org/anthology/P19-1134",
    pages = "1388--1398",
}

License

DISTRE is released under the Apache 2.0 license. See LICENSE for additional details.
