zjunlp / DocED

Licence: other
Source code for the ACL 2021 paper "MLBiNet: A Cross-Sentence Collective Event Detection Network".

Programming Languages

Python
139,335 projects - #7 most used programming language

Projects that are alternatives of or similar to DocED

DeepEE
DeepEE: Deep Event Extraction Algorithm Gallery (a collection of open-source deep-learning-based Chinese event extraction algorithms)
Stars: ✭ 24 (+33.33%)
Mutual labels:  event-detection, event-extraction
Giveme5W
Extraction of the five journalistic W-questions (5W) from news articles
Stars: ✭ 16 (-11.11%)
Mutual labels:  event-detection, event-extraction
event-extraction-paper
Papers from top conferences and journals for event extraction in recent years
Stars: ✭ 54 (+200%)
Mutual labels:  event-detection, event-extraction
IE Paper Notes
Paper notes for Information Extraction, including Relation Extraction (RE), Named Entity Recognition (NER), Entity Linking (EL), Event Extraction (EE), Named Entity Disambiguation (NED).
Stars: ✭ 14 (-22.22%)
Mutual labels:  event-extraction
watchman
Watchman: An open-source social-media event-detection system
Stars: ✭ 18 (+0%)
Mutual labels:  event-detection
SEDTWik-Event-Detection-from-Tweets
Segmentation-based event detection from Tweets. Published at NAACL SRW 2019
Stars: ✭ 58 (+222.22%)
Mutual labels:  event-detection
CogIE
CogIE: An Information Extraction Toolkit for Bridging Text and CogNet. ACL 2021
Stars: ✭ 47 (+161.11%)
Mutual labels:  event-extraction
GEANet-BioMed-Event-Extraction
Code for the paper Biomedical Event Extraction with Hierarchical Knowledge Graphs
Stars: ✭ 52 (+188.89%)
Mutual labels:  event-extraction
text analysis tools
Chinese text analysis toolkit (including text classification, text clustering, text similarity, keyword extraction, key-phrase extraction, sentiment analysis, text error correction, text summarization, topic keywords, synonyms and near-synonyms, and event triple extraction)
Stars: ✭ 410 (+2177.78%)
Mutual labels:  event-extraction
OpenUE
OpenUE is a lightweight toolkit for knowledge graph extraction (An Open Toolkit for Universal Extraction from Text, published at EMNLP 2020: https://aclanthology.org/2020.emnlp-demos.1.pdf)
Stars: ✭ 274 (+1422.22%)
Mutual labels:  event-extraction
TradeTheEvent
Implementation of "Trade the Event: Corporate Events Detection for News-Based Event-Driven Trading", published in Findings of ACL 2021
Stars: ✭ 64 (+255.56%)
Mutual labels:  event-detection

DocED

This repository is the official implementation of the ACL 2021 paper "MLBiNet: A Cross-Sentence Collective Event Detection Network".

Requirements

To install basic requirements:

pip install -r requirements.txt

Datasets

ACE2005 can be found here: https://catalog.ldc.upenn.edu/LDC2006T06

Basic training

To evaluate a setting with several random trials, execute

python run_experiments_multi.py

Main hyperparameters in train_MLBiNet.py include the following; an example invocation is sketched after the list:

--tagging_mechanism, mechanism for modeling event inter-dependency; choose one of "forward_decoder", "backward_decoder", or "bidirectional_decoder"

--num_tag_layers, number of tagging layers; 1 performs sentence-level event detection, 2 additionally aggregates information from adjacent sentences, and so on

--max_doc_len, maximum number of consecutive sentences extracted as a mini-document; typical values are 8 or 16

--tag_dim, dimension of a unidirectional event tagging vector

--self_att_not, whether to apply a self-attention mechanism in the sentence encoder
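
For reference, a single trial can also be launched via train_MLBiNet.py directly. The invocation below is a minimal sketch: the flag names come from the list above, but the values are illustrative, and the value formats (0/1 for --self_att_not, an integer for --tag_dim) are assumptions; check the script's argument parser for the actual formats and defaults.

python train_MLBiNet.py --tagging_mechanism bidirectional_decoder --num_tag_layers 2 --max_doc_len 8 --tag_dim 100 --self_att_not 1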

Main results

Overall performance on ACE2005


Performance on detecting multiple events collectively


where 1/1 denotes sentences containing exactly one event, and 1/n denotes sentences containing multiple events.
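
As a concrete illustration of this breakdown, the hypothetical helper below partitions evaluation sentences into the 1/1 and 1/n buckets by their gold event count; the (sentence_id, event_labels) input format is assumed for illustration and is not taken from this repository.

# Hypothetical sketch: split sentences into the 1/1 and 1/n buckets
# described above. The input format is assumed, not from this repo.
from collections import defaultdict

def split_by_event_count(sentences):
    """sentences: iterable of (sentence_id, gold_event_labels) pairs."""
    buckets = defaultdict(list)
    for sent_id, events in sentences:
        if len(events) == 1:
            buckets["1/1"].append(sent_id)  # exactly one event
        elif len(events) > 1:
            buckets["1/n"].append(sent_id)  # multiple events
    return dict(buckets)

# Example: s1 falls in the 1/1 bucket, s2 in the 1/n bucket.
print(split_by_event_count([("s1", ["Attack"]), ("s2", ["Attack", "Die"])]))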

Performance of our proposed method with different multi-layer settings or decoder methods


How to Cite

@inproceedings{ACL2021_MLBiNet,
  author    = {Dongfang Lou and
               Zhilin Liao and
               Shumin Deng and
               Ningyu Zhang and
               Huajun Chen},
  title     = {MLBiNet: A Cross-Sentence Collective Event Detection Network},
  booktitle = {{ACL}},
  publisher = {Association for Computational Linguistics},
  year      = {2021}
}