
SapienzaNLP / unify-srl

Licence: other
Unifying Cross-Lingual Semantic Role Labeling with Heterogeneous Linguistic Resources (NAACL-2021).

Programming Languages

python: 139,335 projects (#7 most used programming language)
perl: 6,916 projects
shell: 77,523 projects

Projects that are alternatives of or similar to unify-srl

event-embedding-multitask
*SEM 2018: Learning Distributed Event Representations with a Multi-Task Approach
Stars: ✭ 22 (+83.33%)
Mutual labels:  semantics, semantic-role-labeling
text2text
Text2Text: Cross-lingual natural language processing and generation toolkit
Stars: ✭ 188 (+1466.67%)
Mutual labels:  cross-lingual
CLSP
Code and data for EMNLP 2018 paper "Cross-lingual Lexical Sememe Prediction"
Stars: ✭ 19 (+58.33%)
Mutual labels:  cross-lingual
m3gm
Max-Margin Markov Graph Models for WordNet (EMNLP 2018)
Stars: ✭ 40 (+233.33%)
Mutual labels:  semantics
semantic role labeling deep learning
The SRL deep learning model is based on DB-LSTM, which is described in the paper [End-to-end learning of semantic role labeling using recurrent neural networks](http://www.aclweb.org/anthology/P15-1109).
Stars: ✭ 20 (+66.67%)
Mutual labels:  semantic-role-labeling
vericert
A formally verified high-level synthesis tool based on CompCert and written in Coq.
Stars: ✭ 63 (+425%)
Mutual labels:  semantics
copycat
Modern port of Melanie Mitchell's and Douglas Hofstadter's Copycat
Stars: ✭ 84 (+600%)
Mutual labels:  semantics
watset-java
An implementation of the Watset clustering algorithm in Java.
Stars: ✭ 24 (+100%)
Mutual labels:  semantics
binary-decompilation
Extracting high level semantic information from binary code
Stars: ✭ 55 (+358.33%)
Mutual labels:  semantics
cross-lingual-struct-flow
PyTorch implementation of ACL paper https://arxiv.org/abs/1906.02656
Stars: ✭ 23 (+91.67%)
Mutual labels:  cross-lingual
Cross-Lingual-MRC
Cross-Lingual Machine Reading Comprehension (EMNLP 2019)
Stars: ✭ 66 (+450%)
Mutual labels:  cross-lingual
mixed-language-training
Attention-Informed Mixed-Language Training for Zero-shot Cross-lingual Task-oriented Dialogue Systems (AAAI-2020)
Stars: ✭ 29 (+141.67%)
Mutual labels:  cross-lingual
biomappings
🗺️ Community curated and predicted equivalences and related mappings between named biological entities that are not available from primary sources.
Stars: ✭ 24 (+100%)
Mutual labels:  semantics
sequence labeling tf
Sequence Labeling in Tensorflow
Stars: ✭ 18 (+50%)
Mutual labels:  semantic-role-labeling
BabelNet-Sememe-Prediction
Code and data of the AAAI-20 paper "Towards Building a Multilingual Sememe Knowledge Base: Predicting Sememes for BabelNet Synsets"
Stars: ✭ 18 (+50%)
Mutual labels:  semantics
exams-qa
A Multi-subject High School Examinations Dataset for Cross-lingual and Multilingual Question Answering
Stars: ✭ 25 (+108.33%)
Mutual labels:  cross-lingual
deepnlp
NLP practice projects from my early days
Stars: ✭ 11 (-8.33%)
Mutual labels:  semantic-role-labeling
zmsp
The Mingled Structured Predictor
Stars: ✭ 20 (+66.67%)
Mutual labels:  semantic-role-labeling
Compositional-Generalization-in-Natural-Language-Processing
Compositional Generalization in Natural Language Processing. A roadmap.
Stars: ✭ 26 (+116.67%)
Mutual labels:  semantics
preprocess-conll05
Scripts for preprocessing the CoNLL-2005 SRL dataset.
Stars: ✭ 17 (+41.67%)
Mutual labels:  semantic-role-labeling

Unifying Cross-Lingual Semantic Role Labeling with Heterogeneous Linguistic Resources

Paper | Conference | License: CC BY-NC 4.0

Description

This is the repository for the paper Unifying Cross-Lingual Semantic Role Labeling with Heterogeneous Linguistic Resources, presented at NAACL 2021 by Simone Conia, Andrea Bacciu and Roberto Navigli.

Abstract

While cross-lingual techniques are finding increasing success in a wide range of Natural Language Processing tasks, their application to Semantic Role Labeling (SRL) has been strongly limited by the fact that each language adopts its own linguistic formalism, from PropBank for English to AnCora for Spanish and PDT-Vallex for Czech, inter alia. In this work, we address this issue and present a unified model to perform cross-lingual SRL over heterogeneous linguistic resources. Our model implicitly learns a high-quality mapping for different formalisms across diverse languages without resorting to word alignment and/or translation techniques. We find that, not only is our cross-lingual system competitive with the current state of the art but that it is also robust to low-data scenarios. Most interestingly, our unified model is able to annotate a sentence in a single forward pass with all the inventories it was trained with, providing a tool for the analysis and comparison of linguistic theories across different languages.
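
As a rough illustration of that last point, a model that annotates a sentence with every inventory in a single forward pass can be sketched as a shared encoder feeding one classification head per inventory. The sketch below is purely illustrative and not the authors' implementation; all class and parameter names are hypothetical.

import torch
import torch.nn as nn

class MultiInventorySRL(nn.Module):
    """Hypothetical sketch: one shared encoder, one role classifier per inventory."""

    def __init__(self, encoder: nn.Module, hidden_size: int, inventories: dict):
        super().__init__()
        # Shared, language-independent sentence encoder (e.g. a multilingual transformer).
        self.encoder = encoder
        # One role classifier per linguistic inventory (e.g. PropBank, AnCora, PDT-Vallex).
        self.role_classifiers = nn.ModuleDict({
            name: nn.Linear(hidden_size, num_roles)
            for name, num_roles in inventories.items()
        })

    def forward(self, inputs: torch.Tensor) -> dict:
        # Shared token representations for the whole sentence.
        hidden = self.encoder(inputs)
        # A single forward pass yields role scores for every inventory the model was trained on.
        return {name: classifier(hidden) for name, classifier in self.role_classifiers.items()}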

Download

You can download a copy of all the files in this repository by cloning the git repository:

git clone https://github.com/SapienzaNLP/unify-srl.git

or download a zip archive.

Model Checkpoint

To install

To install the dependencies, you can use the provided environment.yml file.
To use the model with NVIDIA CUDA, remember to install a torch-scatter build that matches your CUDA version (we suggest CUDA 10.2), as described in the torch-scatter documentation.

pip install torch-scatter==2.0.5 -f https://pytorch-geometric.com/whl/torch-1.5.0+${CUDA}.html
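
In the command above, ${CUDA} stands for the CUDA tag of the PyTorch Geometric wheel index (e.g. cu102 for CUDA 10.2, or cpu for a CPU-only install). After installation, a quick sanity check along these lines (purely illustrative) confirms that PyTorch sees the GPU and that torch-scatter imports against the expected build:

# Illustrative sanity check: verify the CUDA setup and the torch-scatter install.
import torch
import torch_scatter

print("PyTorch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
print("CUDA build:", torch.version.cuda)
print("torch-scatter version:", torch_scatter.__version__)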

Cite this work

@inproceedings{conia-etal-2021-unify-srl,
    title = "Unifying Cross-Lingual Semantic Role Labeling with Heterogeneous Linguistic Resources",
    author = "Conia, Simone  and
      Bacciu, Andrea  and
      Navigli, Roberto",
    booktitle = "Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies",
    month = jun,
    year = "2021",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://www.aclweb.org/anthology/2021.naacl-main.31",
    pages = "338--351",
}