threelittlemonkeys / Lstm Crf Pytorch

LSTM-CRF in PyTorch

Programming Languages

Python

Projects that are alternatives to or similar to Lstm Crf Pytorch

Ner Pytorch
LSTM+CRF NER
Stars: ✭ 260 (-28.57%)
Mutual labels:  crf, sequence-labeling
Lm Lstm Crf
Empower Sequence Labeling with Task-Aware Language Model
Stars: ✭ 778 (+113.74%)
Mutual labels:  crf, sequence-labeling
Slot filling and intent detection of slu
Slot filling, intent detection, joint training, ATIS & SNIPS datasets, Facebook's multilingual dataset, MIT corpus, E-commerce Shopping Assistant (ECSA) dataset, CoNLL-2003 NER, ELMo, BERT, XLNet
Stars: ✭ 298 (-18.13%)
Mutual labels:  crf, sequence-labeling
A Pytorch Tutorial To Sequence Labeling
Empower Sequence Labeling with Task-Aware Neural Language Model | a PyTorch Tutorial to Sequence Labeling
Stars: ✭ 257 (-29.4%)
Mutual labels:  crf, sequence-labeling
Ncrfpp
NCRF++, a neural sequence labeling toolkit. Easy to use for any sequence labeling task (e.g., NER, POS tagging, segmentation). It includes character LSTM/CNN, word LSTM/CNN, and softmax/CRF components.
Stars: ✭ 1,767 (+385.44%)
Mutual labels:  crf, sequence-labeling
Named entity recognition
Chinese named entity recognition (including concrete implementations of several models: HMM, CRF, BiLSTM, and BiLSTM+CRF)
Stars: ✭ 995 (+173.35%)
Mutual labels:  crf, sequence-labeling
Sltk
A sequence labeling toolkit implementing the BLSTM-CNN-CRF model in PyTorch; reaches an F1 score of 91.10% on the CoNLL 2003 English NER test set (word and char features).
Stars: ✭ 338 (-7.14%)
Mutual labels:  crf, sequence-labeling
Rnnsharp
RNNSharp is a deep recurrent neural network toolkit widely used for many different kinds of tasks, such as sequence labeling and sequence-to-sequence. It is written in C# and targets .NET Framework 4.6 or above. RNNSharp supports many different types of networks, such as forward and bi-directional networks and sequence-to-sequence networks, and different types of layers, such as LSTM, softmax, and sampled softmax.
Stars: ✭ 277 (-23.9%)
Mutual labels:  crf, sequence-labeling
Hscrf Pytorch
ACL 2018: Hybrid semi-Markov CRF for Neural Sequence Labeling (http://aclweb.org/anthology/P18-2038)
Stars: ✭ 284 (-21.98%)
Mutual labels:  crf, sequence-labeling
Ntagger
Reference PyTorch code for named entity tagging
Stars: ✭ 58 (-84.07%)
Mutual labels:  crf, sequence-labeling
Pytorch ner bilstm cnn crf
End-to-end Sequence Labeling via Bi-directional LSTM-CNNs-CRF, implemented in PyTorch
Stars: ✭ 249 (-31.59%)
Mutual labels:  crf, sequence-labeling
deepseg
Chinese word segmentation in TensorFlow 2.x
Stars: ✭ 23 (-93.68%)
Mutual labels:  crf, sequence-labeling
Bert For Sequence Labeling And Text Classification
Template code for using BERT for sequence labeling and text classification, in order to make BERT easier to apply to more tasks. Currently the template code covers CoNLL-2003 named entity recognition, Snips slot filling, and intent prediction.
Stars: ✭ 293 (-19.51%)
Mutual labels:  sequence-labeling
PIE
Fast, non-autoregressive grammatical error correction using BERT. Code and pre-trained models for the paper "Parallel Iterative Edit Models for Local Sequence Transduction" (EMNLP-IJCNLP 2019): www.aclweb.org/anthology/D19-1435.pdf
Stars: ✭ 164 (-54.95%)
Mutual labels:  sequence-labeling
CrowdLayer
A neural network layer that enables training of deep neural networks directly from crowdsourced labels (e.g. from Amazon Mechanical Turk) or, more generally, labels from multiple annotators with different biases and levels of expertise.
Stars: ✭ 45 (-87.64%)
Mutual labels:  sequence-labeling
keras-bert-ner
Keras solution for the Chinese NER task using BiLSTM-CRF/BiGRU-CRF/IDCNN-CRF models with a pretrained language model, supporting BERT/RoBERTa/ALBERT
Stars: ✭ 7 (-98.08%)
Mutual labels:  crf
knowledge-graph-nlp-in-action
Hands-on knowledge graph and natural language processing (NLP), from model training to deployment. Built with TensorFlow, BERT+Bi-LSTM+CRF, Neo4j, etc., covering tasks such as named entity recognition, text classification, information extraction, and relation extraction.
Stars: ✭ 58 (-84.07%)
Mutual labels:  crf
Autoner
Learning Named Entity Tagger from Domain-Specific Dictionary
Stars: ✭ 357 (-1.92%)
Mutual labels:  sequence-labeling
Ner Lstm Crf
An easy-to-use named entity recognition (NER) toolkit implementing the Bi-LSTM+CRF model in TensorFlow.
Stars: ✭ 337 (-7.42%)
Mutual labels:  crf
Gector
Official implementation of the paper "GECToR – Grammatical Error Correction: Tag, Not Rewrite", published at the BEA workshop (co-located with ACL 2020): https://www.aclweb.org/anthology/2020.bea-1.16.pdf
Stars: ✭ 287 (-21.15%)
Mutual labels:  sequence-labeling

LSTM-CRF in PyTorch

A minimal PyTorch (1.7.1) implementation of a bidirectional LSTM-CRF for sequence labeling.

Supported features:

  • Mini-batch training with CUDA
  • Lookup, CNNs, RNNs and/or self-attention in the embedding layer
  • Hierarchical recurrent encoding (HRE)
  • A PyTorch implementation of the conditional random field (CRF)
  • Vectorized computation of the CRF loss
  • Vectorized Viterbi decoding (a sketch of both follows this list)
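
Since the vectorized CRF loss and Viterbi decoding are the heart of the model, here is a minimal sketch of what a batched linear-chain CRF can look like in PyTorch. It illustrates the technique only and is not this repository's actual code; the class and tensor names (CRF, emissions, trans) are hypothetical.

import torch
import torch.nn as nn

class CRF(nn.Module):
    def __init__(self, num_tags):
        super().__init__()
        # trans[i, j] = score of moving from tag j to tag i
        self.trans = nn.Parameter(torch.randn(num_tags, num_tags))

    def loss(self, emissions, tags, mask):
        # emissions: [batch, seq_len, num_tags] (LSTM output projected to tag space)
        # tags: [batch, seq_len] gold tag indices; mask: [batch, seq_len] float, 1 = real token
        # negative log-likelihood = log partition function - score of the gold path
        return (self._partition(emissions, mask) - self._score(emissions, tags, mask)).mean()

    def _score(self, emissions, tags, mask):
        score = emissions[:, 0].gather(1, tags[:, :1]).squeeze(1)
        for t in range(1, emissions.size(1)):
            emit = emissions[:, t].gather(1, tags[:, t:t + 1]).squeeze(1)
            trans = self.trans[tags[:, t], tags[:, t - 1]]
            score = score + (emit + trans) * mask[:, t]
        return score

    def _partition(self, emissions, mask):
        # forward algorithm, vectorized over the batch and all tag pairs at once
        alpha = emissions[:, 0]  # [batch, num_tags]
        for t in range(1, emissions.size(1)):
            # scores[b, i, j] = alpha[b, j] + trans[i, j] + emissions[b, t, i]
            scores = alpha.unsqueeze(1) + self.trans.unsqueeze(0) + emissions[:, t].unsqueeze(2)
            alpha_t = torch.logsumexp(scores, dim=2)
            m = mask[:, t].unsqueeze(1)
            alpha = alpha_t * m + alpha * (1 - m)  # freeze finished sequences
        return torch.logsumexp(alpha, dim=1)

    def decode(self, emissions, mask):
        # vectorized Viterbi: replace logsumexp with max and keep backpointers
        batch, seq_len, num_tags = emissions.shape
        score, pointers = emissions[:, 0], []
        identity = torch.arange(num_tags, device=emissions.device).expand(batch, num_tags)
        for t in range(1, seq_len):
            scores = score.unsqueeze(1) + self.trans.unsqueeze(0) + emissions[:, t].unsqueeze(2)
            best, idx = scores.max(dim=2)
            m = mask[:, t].unsqueeze(1)
            score = best * m + score * (1 - m)
            # at padded steps the backpointer is the identity map, so
            # backtracking passes through padding unchanged
            pointers.append(torch.where(m.bool(), idx, identity))
        best_tags = [score.argmax(dim=1)]
        for idx in reversed(pointers):
            best_tags.append(idx.gather(1, best_tags[-1].unsqueeze(1)).squeeze(1))
        # [batch, seq_len]; positions beyond each sequence's length should be trimmed with the mask
        return torch.stack(best_tags[::-1], dim=1)

"Vectorized" here means the log-sum-exp (training) and max (decoding) recursions run over the whole batch and all tag pairs at once, leaving a Python loop only over time steps.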

Usage

Training data should be formatted as below:

token/tag token/tag token/tag ...
token/tag token/tag token/tag ...
...
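
For example, a single line of part-of-speech-tagged training data (a purely hypothetical sample; the tag names are illustrative) could look like:

the/DET quick/ADJ brown/ADJ fox/NOUN jumps/VERB ./PUNCT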

For more detail, see README.md in each subdirectory.

To prepare data:

python3 prepare.py training_data

To train:

python3 train.py model char_to_idx word_to_idx tag_to_idx training_data.csv (validation_data) num_epoch

To predict:

python3 predict.py model.epochN word_to_idx tag_to_idx test_data

To evaluate:

python3 evaluate.py model.epochN word_to_idx tag_to_idx test_data
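
As a concrete walk-through, a full session might look like the following. The file and model names are purely illustrative, and it is assumed here that prepare.py emits the training_data.csv file and the char_to_idx, word_to_idx, and tag_to_idx vocabulary files consumed by the later scripts:

python3 prepare.py train.txt
python3 train.py my_model char_to_idx word_to_idx tag_to_idx train.txt.csv valid.txt 100
python3 predict.py my_model.epoch100 word_to_idx tag_to_idx test.txt
python3 evaluate.py my_model.epoch100 word_to_idx tag_to_idx test.txt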

References

Zhiheng Huang, Wei Xu, Kai Yu. 2015. Bidirectional LSTM-CRF Models for Sequence Tagging. arXiv:1508.01991.

Harshit Kumar, Arvind Agarwal, Riddhiman Dasgupta, Sachindra Joshi. 2018. Dialogue Act Sequence Labeling Using Hierarchical Encoder with CRF. In AAAI.

Xuezhe Ma, Eduard Hovy. 2016. End-to-end Sequence Labeling via Bi-directional LSTM-CNNs-CRF. arXiv:1603.01354.

Shotaro Misawa, Motoki Taniguchi, Yasuhide Miura, Tomoko Ohkuma. 2017. Character-based Bidirectional LSTM-CRF with Words and Characters for Japanese Named Entity Recognition. In Proceedings of the 1st Workshop on Subword and Character Level Models in NLP.

Yan Shao, Christian Hardmeier, Jörg Tiedemann, Joakim Nivre. 2017. Character-based Joint Segmentation and POS Tagging for Chinese using Bidirectional RNN-CRF. arXiv:1704.01314.

Slav Petrov, Dipanjan Das, Ryan McDonald. 2011. A Universal Part-of-Speech Tagset. arXiv:1104.2086.

Nils Reimers, Iryna Gurevych. 2017. Optimal Hyperparameters for Deep LSTM-Networks for Sequence Labeling Tasks. arXiv:1707.06799.

Feifei Zhai, Saloni Potdar, Bing Xiang, Bowen Zhou. 2017. Neural Models for Sequence Chunking. In AAAI.

Zenan Zhai, Dat Quoc Nguyen, Karin Verspoor. 2018. Comparing CNN and LSTM Character-level Embeddings in BiLSTM-CRF Models for Chemical and Disease Named Entity Recognition. arXiv:1808.08450.
