
XuezheMax / NeuroNLP2

License: GPL-3.0
Deep neural models for core NLP tasks (PyTorch version)

Programming Languages

Python

Projects that are alternatives of or similar to Neuronlp2

Anago
Bidirectional LSTM-CRF and ELMo for Named-Entity Recognition, Part-of-Speech Tagging and so on.
Stars: ✭ 1,392 (+250.63%)
Mutual labels:  natural-language-processing, named-entity-recognition, sequence-labeling
Seqeval
A Python framework for sequence labeling evaluation (named-entity recognition, POS tagging, etc.)
Stars: ✭ 508 (+27.96%)
Mutual labels:  natural-language-processing, named-entity-recognition, sequence-labeling
Flair
A very simple framework for state-of-the-art Natural Language Processing (NLP)
Stars: ✭ 11,065 (+2687.15%)
Mutual labels:  natural-language-processing, named-entity-recognition, sequence-labeling
Ner Lstm
Named Entity Recognition using multilayered bidirectional LSTM
Stars: ✭ 532 (+34.01%)
Mutual labels:  natural-language-processing, deep-neural-networks, named-entity-recognition
Ncrfpp
NCRF++, a Neural Sequence Labeling Toolkit. Easy to use for any sequence labeling task (e.g. NER, POS, segmentation). It includes character LSTM/CNN, word LSTM/CNN, and softmax/CRF components.
Stars: ✭ 1,767 (+345.09%)
Mutual labels:  natural-language-processing, named-entity-recognition, sequence-labeling
Pytorch-NLU
Pytorch-NLU, a Chinese text classification and sequence labeling toolkit. It supports multi-class and multi-label classification of long and short Chinese texts, and sequence labeling tasks such as Chinese named entity recognition, part-of-speech tagging, and word segmentation.
Stars: ✭ 151 (-61.96%)
Mutual labels:  named-entity-recognition, sequence-labeling
AlpacaTag
AlpacaTag: An Active Learning-based Crowd Annotation Framework for Sequence Tagging (ACL 2019 Demo)
Stars: ✭ 126 (-68.26%)
Mutual labels:  named-entity-recognition, sequence-labeling
Transformers Tutorials
GitHub repo with tutorials on fine-tuning Transformers for different NLP tasks
Stars: ✭ 384 (-3.27%)
Mutual labels:  natural-language-processing, named-entity-recognition
Ner
Named Entity Recognition
Stars: ✭ 288 (-27.46%)
Mutual labels:  natural-language-processing, named-entity-recognition
Pytorch Bert Crf Ner
A BERT+CRF based named-entity recognition model for Korean, built with KoBERT and CRF
Stars: ✭ 236 (-40.55%)
Mutual labels:  natural-language-processing, named-entity-recognition
Chatbot ner
chatbot_ner: Named Entity Recognition for chatbots.
Stars: ✭ 273 (-31.23%)
Mutual labels:  natural-language-processing, named-entity-recognition
Gector
Official implementation of the paper “GECToR – Grammatical Error Correction: Tag, Not Rewrite”, published at the BEA Workshop (co-located with ACL 2020): https://www.aclweb.org/anthology/2020.bea-1.16.pdf
Stars: ✭ 287 (-27.71%)
Mutual labels:  natural-language-processing, sequence-labeling
CrossNER
CrossNER: Evaluating Cross-Domain Named Entity Recognition (AAAI-2021)
Stars: ✭ 87 (-78.09%)
Mutual labels:  named-entity-recognition, sequence-labeling
sequence labeling tf
Sequence Labeling in Tensorflow
Stars: ✭ 18 (-95.47%)
Mutual labels:  named-entity-recognition, sequence-labeling
CrowdLayer
A neural network layer that enables training of deep neural networks directly from crowdsourced labels (e.g. from Amazon Mechanical Turk) or, more generally, labels from multiple annotators with different biases and levels of expertise.
Stars: ✭ 45 (-88.66%)
Mutual labels:  named-entity-recognition, sequence-labeling
pyner
🌈 Implementation of Neural Network based Named Entity Recognizer (Lample+, 2016) using Chainer.
Stars: ✭ 45 (-88.66%)
Mutual labels:  named-entity-recognition, sequence-labeling
Awesome Distributed Deep Learning
A curated list of awesome Distributed Deep Learning resources.
Stars: ✭ 277 (-30.23%)
Mutual labels:  natural-language-processing, deep-neural-networks
Ai Deadlines
⏰ AI conference deadline countdowns
Stars: ✭ 3,852 (+870.28%)
Mutual labels:  natural-language-processing, deep-neural-networks
Bytenet Tensorflow
ByteNet for character-level language modelling
Stars: ✭ 319 (-19.65%)
Mutual labels:  natural-language-processing, deep-neural-networks
Nlp Progress
Repository to track the progress in Natural Language Processing (NLP), including the datasets and the current state-of-the-art for the most common NLP tasks.
Stars: ✭ 19,518 (+4816.37%)
Mutual labels:  natural-language-processing, named-entity-recognition

NeuroNLP2

Deep neural models for core NLP tasks based on PyTorch (version 2)

This is the code we used in the following papers:

End-to-end Sequence Labeling via Bi-directional LSTM-CNNs-CRF

Xuezhe Ma, Eduard Hovy

ACL 2016
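The CRF layer in the BiLSTM-CNNs-CRF tagger decodes the best label sequence with the Viterbi algorithm over emission and transition scores. A minimal NumPy sketch of that decoding step (illustrative only, not the repo's implementation; the scores would come from the BiLSTM-CNN encoder and learned transition matrix):

```python
import numpy as np

def viterbi_decode(emissions, transitions):
    """Find the highest-scoring label sequence under a linear-chain CRF.

    emissions:   (seq_len, num_labels) per-token label scores
    transitions: (num_labels, num_labels), transitions[i, j] = score of label i -> j
    """
    seq_len, num_labels = emissions.shape
    # score[t, j] = best score of any path ending in label j at position t
    score = np.zeros((seq_len, num_labels))
    backptr = np.zeros((seq_len, num_labels), dtype=int)
    score[0] = emissions[0]
    for t in range(1, seq_len):
        # candidate[i, j] = score[t-1, i] + transitions[i, j] + emissions[t, j]
        candidate = score[t - 1][:, None] + transitions + emissions[t][None, :]
        backptr[t] = candidate.argmax(axis=0)
        score[t] = candidate.max(axis=0)
    # follow back-pointers from the best final label
    best = [int(score[-1].argmax())]
    for t in range(seq_len - 1, 0, -1):
        best.append(int(backptr[t, best[-1]]))
    return best[::-1]
```

With zero transitions this reduces to per-token argmax; with "sticky" transitions the CRF can override a locally preferred label, which is exactly what makes it useful for span schemes like BIO.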

Neural Probabilistic Model for Non-projective MST Parsing

Xuezhe Ma, Eduard Hovy

IJCNLP 2017

Stack-Pointer Networks for Dependency Parsing

Xuezhe Ma, Zecong Hu, Jingzhou Liu, Nanyun Peng, Graham Neubig and Eduard Hovy

ACL 2018

It also includes the re-implementation of the Stanford Deep BiAffine Parser:

Deep Biaffine Attention for Neural Dependency Parsing

Timothy Dozat, Christopher D. Manning

ICLR 2017
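The biaffine parser scores every head–dependent pair with a bilinear term plus a head-only bias term. A minimal NumPy sketch of the arc scorer (a simplification: in the actual model the encoder states are first passed through separate head/dependent MLPs, and the weights are learned):

```python
import numpy as np

def biaffine_arc_scores(dep, head, W, b):
    """Score every (dependent, head) arc pair.

    dep:  (n, d) dependent representations
    head: (n, d) head representations
    W:    (d, d) bilinear weight matrix
    b:    (d,)   head-bias weight vector
    Returns s with s[i, j] = score of token j being the head of token i.
    """
    # bilinear term dep_i^T W head_j, plus a head-only bias term b^T head_j
    return dep @ W @ head.T + head @ b
```

For each dependent row, a softmax over the columns then gives a distribution over candidate heads.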

Updates

  1. Upgraded the code to support PyTorch 1.3 and Python 3.6
  2. Refactored the code for better organization
  3. Implemented a batched version of the Stack-Pointer Parser decoding algorithm, about 50 times faster!
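The speedup in item 3 comes from replacing a per-sentence Python loop with a single batched tensor operation. A toy NumPy illustration of the principle (not the actual decoder):

```python
import numpy as np

def score_loop(states, weights):
    # one matrix-vector product per "sentence", in a Python loop
    return [w @ s for s, w in zip(states, weights)]

def score_batched(states, weights):
    # the same computation as one batched matrix multiply
    # states: (batch, d), weights: (batch, k, d) -> (batch, k)
    return np.einsum('bij,bj->bi', weights, states)
```

Both functions compute identical scores; the batched form amortizes Python and kernel-launch overhead across the whole batch, which is where the large constant-factor speedup comes from.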

Requirements

Python 3.6, PyTorch >=1.3.1, Gensim >= 0.12.0
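Assuming a standard pip setup, the version-pinned requirements can be installed with something like:

```shell
pip install "torch>=1.3.1" "gensim>=0.12.0"
```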

Data format

For the data format used in our implementation, please read this issue.
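Broadly, the files follow a CoNLL-style format: one token per line with whitespace-separated columns, and a blank line between sentences (the exact column schema is what the issue specifies, so the columns below are illustrative assumptions, not the repo's definitive layout). A minimal reader for such a format:

```python
def read_conll_columns(lines):
    """Group CoNLL-style lines into sentences.

    Each non-blank line holds whitespace-separated columns for one token;
    a blank line ends a sentence. Returns a list of sentences, each a
    list of per-token column tuples.
    """
    sentences, current = [], []
    for line in lines:
        line = line.strip()
        if not line:
            if current:
                sentences.append(current)
                current = []
        else:
            current.append(tuple(line.split()))
    if current:  # flush a trailing sentence with no final blank line
        sentences.append(current)
    return sentences
```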

Running the experiments

First, change to the experiments folder:

cd experiments

Sequence labeling

To train a CRF POS tagger on the PTB WSJ corpus, run

./scripts/run_pos_wsj.sh

where the arguments for the train/dev/test data, together with the path to the pretrained word embeddings, should be set up.

To train an NER model on the CoNLL-2003 English dataset, run

./scripts/run_ner_conll03.sh

Dependency Parsing

To train a Stack-Pointer parser, simply run

./scripts/run_stackptr.sh

Remember to set up the paths for the data and embeddings.

To train a Deep BiAffine parser, simply run

./scripts/run_deepbiaf.sh

Again, remember to set up the paths for the data and embeddings.

To train a Neural MST parser, run

./scripts/run_neuromst.sh
Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].