
Nealcly / Bilstm Lan

License: Apache-2.0
Hierarchically-Refined Label Attention Network for Sequence Labeling

Programming Languages

Python

Projects that are alternatives to or similar to Bilstm Lan

Ncrfpp
NCRF++, a neural sequence labeling toolkit. Easy to use for any sequence labeling task (e.g. NER, POS, segmentation). It includes character LSTM/CNN, word LSTM/CNN, and softmax/CRF components.
Stars: ✭ 1,767 (+633.2%)
Mutual labels:  named-entity-recognition, sequence-labeling, part-of-speech-tagger
Pynlp
A pythonic wrapper for Stanford CoreNLP.
Stars: ✭ 103 (-57.26%)
Mutual labels:  named-entity-recognition, part-of-speech-tagger
Seq2annotation
A general sequence labeling library based on TensorFlow & PaddlePaddle (currently including BiLSTM+CRF, Stacked-BiLSTM+CRF, and IDCNN+CRF, with more algorithms being added), implementing sequence labeling tasks such as Chinese word segmentation (tokenization), part-of-speech (POS) tagging, and named entity recognition (NER).
Stars: ✭ 70 (-70.95%)
Mutual labels:  named-entity-recognition, part-of-speech-tagger
Multi Task Nlp
multi_task_NLP is a utility toolkit enabling NLP developers to easily train and infer a single model for multiple tasks.
Stars: ✭ 221 (-8.3%)
Mutual labels:  named-entity-recognition, sequence-labeling
Seqeval
A Python framework for sequence labeling evaluation (named entity recognition, POS tagging, etc.).
Stars: ✭ 508 (+110.79%)
Mutual labels:  named-entity-recognition, sequence-labeling
Cluener2020
CLUENER2020: fine-grained named entity recognition for Chinese.
Stars: ✭ 689 (+185.89%)
Mutual labels:  named-entity-recognition, sequence-labeling
Kashgari
Kashgari is a production-level NLP transfer learning framework built on top of tf.keras for text labeling and text classification; it includes Word2Vec, BERT, and GPT2 language embeddings.
Stars: ✭ 2,235 (+827.39%)
Mutual labels:  named-entity-recognition, sequence-labeling
Slot filling and intent detection of slu
Slot filling, intent detection, joint training, ATIS & SNIPS datasets, Facebook's multilingual dataset, MIT corpus, E-commerce Shopping Assistant (ECSA) dataset, CoNLL-2003 NER, ELMo, BERT, XLNet.
Stars: ✭ 298 (+23.65%)
Mutual labels:  named-entity-recognition, sequence-labeling
Lac
Baidu NLP: word segmentation, part-of-speech tagging, named entity recognition, and word importance.
Stars: ✭ 2,792 (+1058.51%)
Mutual labels:  named-entity-recognition, part-of-speech-tagger
Ld Net
Efficient Contextualized Representation: Language Model Pruning for Sequence Labeling
Stars: ✭ 148 (-38.59%)
Mutual labels:  named-entity-recognition, sequence-labeling
Awesome Persian Nlp Ir
Curated List of Persian Natural Language Processing and Information Retrieval Tools and Resources
Stars: ✭ 460 (+90.87%)
Mutual labels:  named-entity-recognition, part-of-speech-tagger
Pyhanlp
Natural language processing for Chinese: word segmentation, part-of-speech tagging, named entity recognition, dependency parsing, new word discovery, keyphrase extraction, automatic summarization, text classification and clustering, and pinyin/simplified-traditional conversion.
Stars: ✭ 2,564 (+963.9%)
Mutual labels:  named-entity-recognition, part-of-speech-tagger
Neuronlp2
Deep neural models for core NLP tasks (Pytorch version)
Stars: ✭ 397 (+64.73%)
Mutual labels:  named-entity-recognition, sequence-labeling
Named entity recognition
Chinese named entity recognition (with concrete implementations of several models: HMM, CRF, BiLSTM, and BiLSTM+CRF).
Stars: ✭ 995 (+312.86%)
Mutual labels:  named-entity-recognition, sequence-labeling
Autoner
Learning Named Entity Tagger from Domain-Specific Dictionary
Stars: ✭ 357 (+48.13%)
Mutual labels:  named-entity-recognition, sequence-labeling
Anago
Bidirectional LSTM-CRF and ELMo for Named-Entity Recognition, Part-of-Speech Tagging and so on.
Stars: ✭ 1,392 (+477.59%)
Mutual labels:  named-entity-recognition, sequence-labeling
AlpacaTag
AlpacaTag: An Active Learning-based Crowd Annotation Framework for Sequence Tagging (ACL 2019 Demo)
Stars: ✭ 126 (-47.72%)
Mutual labels:  named-entity-recognition, sequence-labeling
CrowdLayer
A neural network layer that enables training of deep neural networks directly from crowdsourced labels (e.g. from Amazon Mechanical Turk) or, more generally, labels from multiple annotators with different biases and levels of expertise.
Stars: ✭ 45 (-81.33%)
Mutual labels:  named-entity-recognition, sequence-labeling
Flair
A very simple framework for state-of-the-art Natural Language Processing (NLP)
Stars: ✭ 11,065 (+4491.29%)
Mutual labels:  named-entity-recognition, sequence-labeling
Spark Nlp
State of the Art Natural Language Processing
Stars: ✭ 2,518 (+944.81%)
Mutual labels:  named-entity-recognition, part-of-speech-tagger

BiLSTM - Label Attention Network (BiLSTM-LAN)

Hierarchically-Refined Label Attention Network for Sequence Labeling (EMNLP 2019)

Model Structure

The model consists of two BiLSTM-LAN layers. Each BiLSTM-LAN layer is composed of a BiLSTM encoding sublayer and a label-attention inference sublayer. In particular, the former is the same as the BiLSTM layer in the baseline model, while the latter uses multi-head attention to jointly encode information from the word representation subspace and the label representation subspace, as sketched below.
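As a rough illustration, here is a minimal sketch of one such layer in modern PyTorch. The repository itself targets PyTorch 0.3, and the class name, parameter names, and the way the two subspaces are fused are illustrative assumptions rather than the repository's code:

import torch
import torch.nn as nn

class BiLSTMLANLayer(nn.Module):
    """One BiLSTM-LAN layer: a BiLSTM encoding sublayer followed by a
    label-attention inference sublayer. Illustrative sketch only."""

    def __init__(self, input_dim, hidden_dim, num_labels, num_heads=5):
        super().__init__()
        # BiLSTM encoding sublayer; hidden_dim covers both directions.
        self.bilstm = nn.LSTM(input_dim, hidden_dim // 2,
                              batch_first=True, bidirectional=True)
        # One embedding vector per output label.
        self.label_emb = nn.Embedding(num_labels, hidden_dim)
        # Label-attention inference sublayer: words attend over the label set.
        self.attn = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)

    def forward(self, x):
        # x: (batch, seq_len, input_dim) word representations.
        h, _ = self.bilstm(x)  # (batch, seq_len, hidden_dim)
        labels = self.label_emb.weight.unsqueeze(0).expand(x.size(0), -1, -1)
        # Queries are the word states; keys and values are the label
        # embeddings, so each word jointly encodes information from the
        # word and label representation subspaces.
        label_repr, attn_weights = self.attn(h, labels, labels)
        # Fuse the two subspaces (shown here as a sum for illustration;
        # the paper defines the exact composition).
        return h + label_repr, attn_weights

Stacking two such layers gives the architecture described above; the final layer reads each word's label directly off its attention distribution over the label embeddings rather than through a separate softmax/CRF output layer.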

Requirements

  • Python 3
  • PyTorch 0.3

Train models

  • Download the data and pretrained word embeddings (a sketch of loading the embeddings follows the command below)
  • Run the script:
python main.py --learning_rate 0.01 --lr_decay 0.035 --dropout 0.5 --hidden_dim 400 --lstm_layer 3 --momentum 0.9 --whether_clip_grad True --clip_grad 5.0 \
--train_dir 'wsj_pos/train.pos' --dev_dir 'wsj_pos/dev.pos' --test_dir 'wsj_pos/test.pos' --model_dir 'wsj_pos/' --word_emb_dir 'glove.6B.100d.txt'
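
For reference, here is a minimal sketch of reading a GloVe text file such as glove.6B.100d.txt into an embedding matrix. The helper name, vocabulary format, and out-of-vocabulary initialization are assumptions for illustration, not the repository's loader:

import numpy as np

def load_glove(path, word_to_id, dim=100):
    """Return a (vocab_size, dim) float32 matrix of pretrained vectors."""
    # Initialize out-of-vocabulary words randomly (an assumption; the
    # repository may handle unknown words differently).
    emb = np.random.uniform(-0.25, 0.25, (len(word_to_id), dim)).astype("float32")
    with open(path, encoding="utf-8") as f:
        for line in f:
            # GloVe format: one word followed by its vector, space-separated.
            parts = line.rstrip().split(" ")
            if parts[0] in word_to_id:
                emb[word_to_id[parts[0]]] = np.asarray(parts[1:], dtype="float32")
    return emb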

Performance

ID  Task  Dataset        Performance
1   POS   WSJ            97.65
2   POS   UD v2.2        95.59
3   NER   OntoNotes 5.0  88.16
4   CCG   CCGBank        94.7

Cite

Leyang Cui and Yue Zhang. 2019. Hierarchically-refined label attention network for sequence labeling. In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), pages 4106–4119, Hong Kong, China. Association for Computational Linguistics.

@inproceedings{cui-zhang-2019-hierarchically,
    title = "Hierarchically-Refined Label Attention Network for Sequence Labeling",
    author = "Cui, Leyang  and
      Zhang, Yue",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)",
    month = nov,
    year = "2019",
    address = "Hong Kong, China",
    publisher = "Association for Computational Linguistics",
    url = "https://www.aclweb.org/anthology/D19-1422",
    pages = "4106--4119",
}

Acknowledgments

NCRF++
