
monologg / NER-Multimodal-pytorch

License: other
PyTorch implementation of "Adaptive Co-attention Network for Named Entity Recognition in Tweets" (AAAI 2018)


Projects that are alternatives to or similar to NER-Multimodal-pytorch

CrossNER
CrossNER: Evaluating Cross-Domain Named Entity Recognition (AAAI-2021)
Stars: ✭ 87 (+107.14%)
Mutual labels:  named-entity-recognition, ner
deep-atrous-ner
Deep-Atrous-CNN-NER: Word level model for Named Entity Recognition
Stars: ✭ 35 (-16.67%)
Mutual labels:  named-entity-recognition, ner
anonymization-api
How to build and deploy an anonymization API with FastAPI
Stars: ✭ 51 (+21.43%)
Mutual labels:  named-entity-recognition, ner
SynLSTM-for-NER
Code and models for the paper titled "Better Feature Integration for Named Entity Recognition", NAACL 2021.
Stars: ✭ 26 (-38.1%)
Mutual labels:  named-entity-recognition, ner
mitie-ruby
Named-entity recognition for Ruby
Stars: ✭ 77 (+83.33%)
Mutual labels:  named-entity-recognition, ner
scikitcrf NER
Python library for custom entity recognition using Sklearn CRF
Stars: ✭ 17 (-59.52%)
Mutual labels:  named-entity-recognition, ner
lingvo--Ner-ru
Named entity recognition (NER) in Russian texts
Stars: ✭ 38 (-9.52%)
Mutual labels:  named-entity-recognition, ner
PhoNER COVID19
COVID-19 Named Entity Recognition for Vietnamese (NAACL 2021)
Stars: ✭ 55 (+30.95%)
Mutual labels:  named-entity-recognition, ner
presidio-research
This package features data-science related tasks for developing new recognizers for Presidio. It is used for the evaluation of the entire system, as well as for evaluating specific PII recognizers or PII detection models.
Stars: ✭ 62 (+47.62%)
Mutual labels:  named-entity-recognition, ner
ner-d
Python module for Named Entity Recognition (NER) using natural language processing.
Stars: ✭ 14 (-66.67%)
Mutual labels:  named-entity-recognition, ner
TweebankNLP
[LREC 2022] An off-the-shelf pre-trained Tweet NLP Toolkit (NER, tokenization, lemmatization, POS tagging, dependency parsing) + Tweebank-NER dataset
Stars: ✭ 84 (+100%)
Mutual labels:  named-entity-recognition, ner
NER corpus chinese
Chinese corpora for NER (named entity recognition), available in one place
Stars: ✭ 102 (+142.86%)
Mutual labels:  named-entity-recognition, ner
NER-and-Linking-of-Ancient-and-Historic-Places
An NER tool for ancient place names based on Pleiades and Spacy.
Stars: ✭ 26 (-38.1%)
Mutual labels:  named-entity-recognition, ner
molminer
Python library and command-line tool for extracting compounds from scientific literature. Written in Python.
Stars: ✭ 38 (-9.52%)
Mutual labels:  named-entity-recognition, ner
neural name tagging
Code for "Reliability-aware Dynamic Feature Composition for Name Tagging" (ACL2019)
Stars: ✭ 39 (-7.14%)
Mutual labels:  named-entity-recognition, ner
korean ner tagging challenge
KU_NERDY, Dongyub Lee and Heuiseok Lim (Gold Prize, 2017 Korean Language Information Processing System Contest) - Conference on Hangul and Korean Language Information Processing
Stars: ✭ 30 (-28.57%)
Mutual labels:  named-entity-recognition, ner
Ner Bert Pytorch
PyTorch solution to the named entity recognition task using Google AI's pre-trained BERT model.
Stars: ✭ 249 (+492.86%)
Mutual labels:  named-entity-recognition, ner
KoBERT-NER
NER Task with KoBERT (with Naver NLP Challenge dataset)
Stars: ✭ 76 (+80.95%)
Mutual labels:  named-entity-recognition, ner
simple NER
simple rule based named entity recognition
Stars: ✭ 29 (-30.95%)
Mutual labels:  named-entity-recognition, ner
react-taggy
A simple zero-dependency React component for tagging user-defined entities within a block of text.
Stars: ✭ 29 (-30.95%)
Mutual labels:  named-entity-recognition, ner

NER-Multimodal-pytorch

(Unofficial) PyTorch implementation of "Adaptive Co-attention Network for Named Entity Recognition in Tweets" (AAAI 2018)

Model
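The model's core mechanism is co-attention between the two modalities: words attend over image regions and regions attend over words. The sketch below illustrates that general idea in plain NumPy; it is not the paper's exact adaptive co-attention equations, and all names and dimensions are made up for the example.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def co_attention(text, image, W):
    """Toy co-attention: each modality attends over the other.

    text:  (T, d) word representations
    image: (R, d) regional image features
    W:     (d, d) affinity matrix (learned in a real model, random here)
    """
    affinity = text @ W @ image.T                 # (T, R) word-region affinity
    img_ctx = softmax(affinity, axis=1) @ image   # (T, d) image context per word
    txt_ctx = softmax(affinity, axis=0).T @ text  # (R, d) text context per region
    return img_ctx, txt_ctx

rng = np.random.default_rng(0)
text = rng.normal(size=(5, 8))    # 5 words, hidden dim 8
image = rng.normal(size=(49, 8))  # 7x7 = 49 VGG regions, projected to dim 8
img_ctx, txt_ctx = co_attention(text, image, rng.normal(size=(8, 8)))
```

In the full model, each word representation is then enriched with its image context before CRF decoding.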

Dependencies

  • python>=3.5
  • torch==1.3.1
  • torchvision==0.4.2
  • pillow==7.0.0
  • pytorch-crf==0.7.2
  • seqeval==0.0.12
  • gdown>=3.10.1
$ pip3 install -r requirements.txt

Data

            Train   Dev     Test
# of Data   4,000   1,000   3,257

1. Pretrained Word Vectors

  • The original code's pretrained word embeddings can be downloaded here.
  • However, the full file takes quite a long time to download, so I extracted only the word vectors (word_vector_200d.vec) that appear in the word vocab.
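Filtering a large .vec file down to the words in a vocab can be sketched as below, assuming the standard word2vec text format (a `count dim` header line, then one word and its vector per line). The helper name is hypothetical, not the repo's actual code.

```python
import numpy as np

def load_vectors_for_vocab(path, vocab):
    """Read a word2vec-format .vec file, keeping only words in `vocab`."""
    vectors = {}
    with open(path, encoding="utf-8") as f:
        header = f.readline().split()  # "<num_words> <dim>"
        dim = int(header[1])
        for line in f:
            parts = line.rstrip().split(" ")
            word = parts[0]
            if word in vocab:
                vectors[word] = np.asarray(parts[1:], dtype=np.float32)
    return vectors, dim
```

Writing the kept vectors back out produces a much smaller file that is fast to download.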

2. Extracted VGG Features

  • Image features are extracted from the last pooling layer of VGG16.

  • If you want to extract the features yourself, follow the steps below.

    1. Clone the repo of original code.
    2. Copy data/ner_img from original code to this repo.
    3. Run the command below. (img_vgg_features.pt will be saved in the data dir)
    $ python3 save_vgg_feature.py
  • Otherwise, the extracted features will be downloaded automatically when you run main.py.

Detail

  • There are some differences between the paper and the original code, so I tried to follow the paper's equations as closely as possible.
  • Build the vocab with the train, dev, and test datasets. (same as the original code)
    • Making the vocab with only the train dataset decreases performance considerably (by about 5%).
  • Use the Adam optimizer instead of RMSProp.
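The vocab-building step above can be sketched as follows; the function, special tokens, and threshold are hypothetical, not the repo's actual code:

```python
from collections import Counter

def build_vocab(*datasets, min_count=1, specials=("<pad>", "<unk>")):
    """Build a word->index vocab from tokenized sentences of every split.

    Per the note above, all splits (train, dev, test) are passed in
    together rather than train alone.
    """
    counts = Counter(tok for data in datasets for sent in data for tok in sent)
    vocab = {tok: i for i, tok in enumerate(specials)}
    for tok, c in counts.most_common():
        if c >= min_count and tok not in vocab:
            vocab[tok] = len(vocab)
    return vocab

train = [["RT", "@user", "Obama", "in", "NYC"]]
dev = [["NYC", "marathon"]]
test = [["Obama", "speaks"]]
vocab = build_vocab(train, dev, test)
```

Building over all splits means no test-set word maps to `<unk>`, which is where the roughly 5% gap comes from.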

How to run

$ python3 main.py --do_train --do_eval

Result

                    F1 (%)
Re-implementation   67.17
Baseline (paper)    70.69
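The F1 above is entity-level, in the style computed by the seqeval dependency: a predicted entity counts as correct only if both its span and its type exactly match a gold entity. A simplified pure-Python sketch of that metric (it handles plain BIO tags and ignores some seqeval edge cases, e.g. entities opened by a stray I- tag):

```python
def extract_entities(tags):
    """Collect (type, start, end) spans from a BIO tag sequence."""
    entities, start, etype = [], None, None
    for i, tag in enumerate(tags + ["O"]):  # "O" sentinel flushes the last entity
        if tag.startswith("B-") or tag == "O" or \
           (tag.startswith("I-") and tag[2:] != etype):
            if start is not None:
                entities.append((etype, start, i))
            start, etype = (i, tag[2:]) if tag.startswith("B-") else (None, None)
    return entities

def entity_f1(gold, pred):
    """Micro F1 over exact (type, span) matches."""
    g, p = set(extract_entities(gold)), set(extract_entities(pred))
    tp = len(g & p)
    prec = tp / len(p) if p else 0.0
    rec = tp / len(g) if g else 0.0
    return 2 * prec * rec / (prec + rec) if prec + rec else 0.0
```

For example, predicting only one of two gold entities (with nothing spurious) gives precision 1.0, recall 0.5, and F1 ≈ 0.667.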

References
