
thunlp / Hatt Proto

License: MIT
Code and dataset of the AAAI 2019 paper Hybrid Attention-Based Prototypical Networks for Noisy Few-Shot Relation Classification.

Programming Languages

Python

Projects that are alternatives of or similar to Hatt Proto

Knowledge Graphs
A collection of research on knowledge graphs
Stars: ✭ 845 (+467.11%)
Mutual labels:  relation-extraction
Distre
[ACL 19] Fine-tuning Pre-Trained Transformer Language Models to Distantly Supervised Relation Extraction
Stars: ✭ 75 (-49.66%)
Mutual labels:  relation-extraction
Pytorch multi head selection re
BERT + reproduce "Joint entity recognition and relation extraction as a multi-head selection problem" for Chinese and English IE
Stars: ✭ 105 (-29.53%)
Mutual labels:  relation-extraction
Rex
REx: Relation Extraction. Modernized re-write of the code in the master's thesis: "Relation Extraction using Distant Supervision, SVMs, and Probabilistic First-Order Logic"
Stars: ✭ 21 (-85.91%)
Mutual labels:  relation-extraction
Chinese Relation Extraction
Relation extraction for Chinese text
Stars: ✭ 57 (-61.74%)
Mutual labels:  relation-extraction
Relation extraction
Relation extraction using deep learning (CNN)
Stars: ✭ 96 (-35.57%)
Mutual labels:  relation-extraction
Nrepapers
Must-read papers on neural relation extraction (NRE)
Stars: ✭ 831 (+457.72%)
Mutual labels:  relation-extraction
Bertem
Implementation of the ACL 2019 paper "Matching the Blanks: Distributional Similarity for Relation Learning"
Stars: ✭ 146 (-2.01%)
Mutual labels:  relation-extraction
Rcnn Relation Extraction
Tensorflow Implementation of Recurrent Convolutional Neural Network for Relation Extraction
Stars: ✭ 64 (-57.05%)
Mutual labels:  relation-extraction
Zhopenie
Chinese Open Information Extraction (Tree-based Triple Relation Extraction Module)
Stars: ✭ 98 (-34.23%)
Mutual labels:  relation-extraction
Jointre
End-to-end neural relation extraction using deep biaffine attention (ECIR 2019)
Stars: ✭ 41 (-72.48%)
Mutual labels:  relation-extraction
Exemplar
An open relation extraction system
Stars: ✭ 46 (-69.13%)
Mutual labels:  relation-extraction
Copymtl
AAAI20 "CopyMTL: Copy Mechanism for Joint Extraction of Entities and Relations with Multi-Task Learning"
Stars: ✭ 97 (-34.9%)
Mutual labels:  relation-extraction
Pytorch Nre
Neural Relation Extraction in Pytorch
Stars: ✭ 20 (-86.58%)
Mutual labels:  relation-extraction
Atnre
Adversarial Training for Neural Relation Extraction
Stars: ✭ 108 (-27.52%)
Mutual labels:  relation-extraction
Gigabert
Zero-shot Transfer Learning from English to Arabic
Stars: ✭ 23 (-84.56%)
Mutual labels:  relation-extraction
Tre
[AKBC 19] Improving Relation Extraction by Pre-trained Language Representations
Stars: ✭ 95 (-36.24%)
Mutual labels:  relation-extraction
Information Extraction Chinese
Chinese named entity recognition with IDCNN/biLSTM+CRF, and relation extraction with biGRU+2ATT
Stars: ✭ 1,888 (+1167.11%)
Mutual labels:  relation-extraction
Bran
Full abstract relation extraction from biological texts with bi-affine relation attention networks
Stars: ✭ 111 (-25.5%)
Mutual labels:  relation-extraction
Intra Bag And Inter Bag Attentions
Code for NAACL 2019 paper: Distant Supervision Relation Extraction with Intra-Bag and Inter-Bag Attentions
Stars: ✭ 98 (-34.23%)
Mutual labels:  relation-extraction

Hybrid Attention-Based Prototypical Networks for Noisy Few-Shot Relation Classification

Code and data for the AAAI 2019 paper Hybrid Attention-Based Prototypical Networks for Noisy Few-Shot Relation Classification.

Authors: Tianyu Gao*, Xu Han*, Zhiyuan Liu, Maosong Sun (* indicates equal contribution).

Dataset and Word Embedding

We evaluate our models on FewRel, a large-scale dataset for few-shot relation classification. It contains 100 relations with 700 instances per relation. Some baseline models are available in the FewRel repository.
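As a rough illustration of the few-shot setting, the sketch below samples one N-way K-shot episode from a FewRel-style JSON file. It assumes the file maps each relation name to a list of instances (e.g. with "tokens", "h", and "t" fields); the path data/train.json, the helper name, and the query-set size are illustrative assumptions, not part of this repository.

import json
import random

def sample_episode(data_path, n_way=5, k_shot=5, n_query=5):
    # Assumption: the JSON maps each relation name to a list of instances,
    # e.g. {"P931": [{"tokens": [...], "h": [...], "t": [...]}, ...], ...}
    with open(data_path) as f:
        data = json.load(f)

    relations = random.sample(list(data.keys()), n_way)
    support, query, labels = [], [], []
    for label, rel in enumerate(relations):
        instances = random.sample(data[rel], k_shot + n_query)
        support.extend(instances[:k_shot])   # K support instances per class
        query.extend(instances[k_shot:])     # remaining instances form the query set
        labels.extend([label] * n_query)
    return support, query, labels

# Example: one 5-way 5-shot episode from a hypothetical data/train.json
# support, query, labels = sample_episode("data/train.json", n_way=5, k_shot=5)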

Because of its large size, we did not upload the GloVe file (pre-trained word embeddings). Please download glove.6B.50d.json from Tsinghua Cloud or Google Drive and put it under the data/ folder.
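If you want to sanity-check the downloaded file, a minimal sketch is shown below; it only assumes the file is valid JSON and makes no assumption about its internal layout.

import json

with open("data/glove.6B.50d.json") as f:
    glove = json.load(f)

# Print the container type and a small sample to see how entries are organized
print(type(glove))
if isinstance(glove, dict):
    print(list(glove.items())[:2])
else:
    print(glove[:2])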

Usage

To run our code, use this command for training

python train.py {MODEL_NAME} {N} {K} {NOISE_RATE}

and use this command for testing

python test.py {MODEL_NAME} {N} {K} {NOISE_RATE}

where {MODEL_NAME} can be proto or proto_hatt, {N} is the number of classes, {K} is the number of instances per class, and {NOISE_RATE} is the probability that an instance is wrongly labeled.
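For example, a 5-way 5-shot run of the hybrid attention model with a 20% noise rate (the concrete values are chosen only for illustration) would be trained with

python train.py proto_hatt 5 5 0.2

and then evaluated with

python test.py proto_hatt 5 5 0.2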
