
kamalkraj / Bert Ner

Licence: agpl-3.0
Pytorch-Named-Entity-Recognition-with-BERT

Programming Languages

python
cpp11

Projects that are alternatives of or similar to Bert Ner

Bert Ner Tf
Named Entity Recognition with BERT using TensorFlow 2.0
Stars: ✭ 155 (-81.3%)
Mutual labels:  postman, named-entity-recognition, inference, pretrained-models, curl
Curlx
◼️ Supercharge curl with history, collections and more.
Stars: ✭ 169 (-79.61%)
Mutual labels:  postman, curl
Androidhttp
A powerful tool for Android HTTP network development.
Stars: ✭ 88 (-89.38%)
Mutual labels:  postman, curl
object-size-detector-python
Monitor mechanical bolts as they move down a conveyor belt. When a bolt of an irregular size is detected, this solution emits an alert.
Stars: ✭ 26 (-96.86%)
Mutual labels:  inference, pretrained-models
object-flaw-detector-cpp
Detect various irregularities of a product as it moves along a conveyor belt.
Stars: ✭ 19 (-97.71%)
Mutual labels:  inference, pretrained-models
People Counter Python
Create a smart video application using the Intel Distribution of OpenVINO toolkit. The toolkit uses models and inference to run single-class object detection.
Stars: ✭ 62 (-92.52%)
Mutual labels:  inference, pretrained-models
intruder-detector-python
Build an application that alerts you when someone enters a restricted area. Learn how to use models for multiclass object detection.
Stars: ✭ 16 (-98.07%)
Mutual labels:  inference, pretrained-models
Pytorch-NLU
Pytorch-NLU, a Chinese text classification and sequence annotation toolkit. It supports multi-class and multi-label classification of Chinese long and short texts, and sequence annotation tasks such as Chinese named entity recognition, part-of-speech tagging, and word segmentation.
Stars: ✭ 151 (-81.79%)
Mutual labels:  named-entity-recognition, pretrained-models
concurrent-video-analytic-pipeline-optimization-sample-l
Create a concurrent video analysis pipeline featuring multistream face and human pose detection, vehicle attribute detection, and the ability to encode multiple videos to local storage in a single stream.
Stars: ✭ 39 (-95.3%)
Mutual labels:  inference, pretrained-models
safety-gear-detector-python
Observe workers as they pass in front of a camera to determine if they have adequate safety protection.
Stars: ✭ 54 (-93.49%)
Mutual labels:  inference, pretrained-models
motor-defect-detector-python
Predict performance issues with manufacturing equipment motors. Perform local or cloud analytics of the issues found, and then display the data on a user interface to determine when failures might arise.
Stars: ✭ 24 (-97.1%)
Mutual labels:  inference, pretrained-models
object-flaw-detector-python
Detect various irregularities of a product as it moves along a conveyor belt.
Stars: ✭ 17 (-97.95%)
Mutual labels:  inference, pretrained-models
setup-linux-debian
Installs essential JavaScript development programs.
Stars: ✭ 16 (-98.07%)
Mutual labels:  curl, postman
Bert Multitask Learning
BERT for Multitask Learning
Stars: ✭ 380 (-54.16%)
Mutual labels:  named-entity-recognition, pretrained-models
Conv Emotion
This repo contains implementation of different architectures for emotion recognition in conversations.
Stars: ✭ 646 (-22.07%)
Mutual labels:  pretrained-models
Yedda
YEDDA: A Lightweight Collaborative Text Span Annotation Tool. Code for ACL 2018 Best Demo Paper Nomination.
Stars: ✭ 704 (-15.08%)
Mutual labels:  named-entity-recognition
Nlp Recipes
Natural Language Processing Best Practices & Examples
Stars: ✭ 5,783 (+597.59%)
Mutual labels:  pretrained-models
Faster Than Requests
Faster requests on Python 3
Stars: ✭ 639 (-22.92%)
Mutual labels:  curl
Multi Model Server
Multi Model Server is a tool for serving neural net models for inference
Stars: ✭ 770 (-7.12%)
Mutual labels:  inference
Cheat.sh
the only cheat sheet you need
Stars: ✭ 27,798 (+3253.2%)
Mutual labels:  curl

BERT NER

Use Google BERT to do CoNLL-2003 NER!

New: train the model using Python and run inference using C++.

ALBERT-TF2.0

BERT-NER-TENSORFLOW-2.0

BERT-SQuAD

Requirements

  • python3
  • pip3 install -r requirements.txt

Run

python run_ner.py --data_dir=data/ --bert_model=bert-base-cased --task_name=ner --output_dir=out_base --max_seq_length=128 --do_train --num_train_epochs 5 --do_eval --warmup_proportion=0.1
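
The training script reads CoNLL-2003-style files from data/. For reference, the standard CoNLL-2003 layout is four space-separated columns per token (word, POS tag, chunk tag, NER tag), with a blank line between sentences:

    EU NNP B-NP B-ORG
    rejects VBZ B-VP O
    German JJ B-NP B-MISC
    call NN I-NP O
    to TO B-VP O
    boycott VB I-VP O
    British JJ B-NP B-MISC
    lamb NN I-NP O
    . . O O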

Result

BERT-BASE

Validation Data

             precision    recall  f1-score   support

        PER     0.9677    0.9745    0.9711      1842
        LOC     0.9654    0.9711    0.9682      1837
       MISC     0.8851    0.9111    0.8979       922
        ORG     0.9299    0.9292    0.9295      1341

avg / total     0.9456    0.9534    0.9495      5942

Test Data

             precision    recall  f1-score   support

        PER     0.9635    0.9629    0.9632      1617
        ORG     0.8883    0.9097    0.8989      1661
        LOC     0.9272    0.9317    0.9294      1668
       MISC     0.7689    0.8248    0.7959       702

avg / total     0.9065    0.9209    0.9135      5648

The pretrained model can be downloaded from here.

BERT-LARGE

Validation Data

             precision    recall  f1-score   support

        ORG     0.9288    0.9441    0.9364      1341
        LOC     0.9754    0.9728    0.9741      1837
       MISC     0.8976    0.9219    0.9096       922
        PER     0.9762    0.9799    0.9781      1842

avg / total     0.9531    0.9606    0.9568      5942

Test Data

             precision    recall  f1-score   support

        LOC     0.9366    0.9293    0.9329      1668
        ORG     0.8881    0.9175    0.9026      1661
        PER     0.9695    0.9623    0.9659      1617
       MISC     0.7787    0.8319    0.8044       702

avg / total     0.9121    0.9232    0.9174      5648

The pretrained model can be downloaded from here.

Inference

from bert import Ner

# Load the fine-tuned model from the training output directory
model = Ner("out_base/")

# Returns one dict per word: {"word", "tag", "confidence"}
output = model.predict("Steve went to Paris")

print(output)
'''
    [
        {
            "confidence": 0.9981840252876282,
            "tag": "B-PER",
            "word": "Steve"
        },
        {
            "confidence": 0.9998939037322998,
            "tag": "O",
            "word": "went"
        },
        {
            "confidence": 0.999891996383667,
            "tag": "O",
            "word": "to"
        },
        {
            "confidence": 0.9991968274116516,
            "tag": "B-LOC",
            "word": "Paris"
        }
    ]
'''
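
The prediction is token-level, one tag per word. A small illustrative helper (not part of the repo) that merges B-/I- tagged tokens into entity spans:

def group_entities(tokens):
    """Merge consecutive B-/I- tagged tokens into (text, entity_type) spans."""
    entities, words, etype = [], [], None
    for t in tokens:
        tag = t["tag"]
        if tag.startswith("B-"):
            # A new entity starts; flush any span in progress
            if words:
                entities.append((" ".join(words), etype))
            words, etype = [t["word"]], tag[2:]
        elif tag.startswith("I-") and words:
            # Continuation of the current entity
            words.append(t["word"])
        else:
            # "O" tag (or stray I-): close any open span
            if words:
                entities.append((" ".join(words), etype))
            words, etype = [], None
    if words:
        entities.append((" ".join(words), etype))
    return entities

print(group_entities(output))  # [('Steve', 'PER'), ('Paris', 'LOC')]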

Inference C++

The pretrained and converted BERT-base model can be downloaded from here.

Download libtorch from here

  • install CMake (tested with CMake version 3.10.2)

  • unzip the downloaded model and libtorch into BERT-NER

  • Compile the C++ app:

      cd cpp-app/
      cmake -DCMAKE_PREFIX_PATH=../libtorch
    

    [screenshot: cmake output]

    make
    

    [screenshot: make output]

  • Run the app:

       ./app ../base
    

    [screenshot: inference output]

NB: The BERT-Base C++ model is split into two parts:

  • a BERT feature extractor and an NER classifier.
  • This is done because JIT trace does not support input-dependent for loops or if conditions inside the model's forward function (see the sketch below).
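
For illustration, here is a minimal, self-contained sketch of the idea behind the split. The module and file names are hypothetical, not the repo's actual classes: torch.jit.trace records a single execution path, so each branch-free half is traced on its own, and any input-dependent control flow stays in the Python or C++ host code:

import torch
import torch.nn as nn

# Toy stand-ins for the two halves of the real model; names and
# layer sizes are illustrative only.
class FeatureExtractor(nn.Module):
    def __init__(self):
        super().__init__()
        self.proj = nn.Linear(8, 8)

    def forward(self, x):
        # Branch-free forward: safe for torch.jit.trace
        return torch.relu(self.proj(x))

class NerClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.head = nn.Linear(8, 4)

    def forward(self, feats):
        # Also branch-free
        return self.head(feats)

extractor, classifier = FeatureExtractor(), NerClassifier()
example = torch.randn(1, 8)

# Trace each part separately; any input-dependent loop or if-condition
# (e.g., iterating over a variable number of word pieces) is left to
# the caller rather than being baked into the traced graph.
traced_extractor = torch.jit.trace(extractor, example)
traced_classifier = torch.jit.trace(classifier, traced_extractor(example))
traced_extractor.save("feature_extractor.pt")  # loadable from C++ via libtorch
traced_classifier.save("ner_classifier.pt")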

Deploy REST-API

The BERT NER model is deployed as a REST API.

python api.py

The API will be live at 0.0.0.0:8000, endpoint /predict.

cURL request

curl -X POST http://0.0.0.0:8000/predict -H 'Content-Type: application/json' -d '{ "text": "Steve went to Paris" }'

Output

{
    "result": [
        {
            "confidence": 0.9981840252876282,
            "tag": "B-PER",
            "word": "Steve"
        },
        {
            "confidence": 0.9998939037322998,
            "tag": "O",
            "word": "went"
        },
        {
            "confidence": 0.999891996383667,
            "tag": "O",
            "word": "to"
        },
        {
            "confidence": 0.9991968274116516,
            "tag": "B-LOC",
            "word": "Paris"
        }
    ]
}
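
The same request can be sent from Python; a small sketch using the requests library (assuming the server above is running locally):

import requests

# POST the text to the /predict endpoint and print the JSON response
response = requests.post(
    "http://0.0.0.0:8000/predict",
    json={"text": "Steve went to Paris"},
)
print(response.json())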

cURL

[screenshot: curl output]

Postman

[screenshot: Postman output]

C++ Unicode support

TensorFlow version
