BERTOverflow: A Pre-trained BERT on StackOverflow Corpus
Stars: ✭ 40 (-73.51%)
Spark NLP: State-of-the-Art Natural Language Processing
Stars: ✭ 2,518 (+1567.55%)
TorchBlocks: A PyTorch-based toolkit for natural language processing
Stars: ✭ 85 (-43.71%)
Bert Bilstm Crf Ner: TensorFlow solution for the NER task using a BiLSTM-CRF model with Google BERT fine-tuning and private server services
Stars: ✭ 3,838 (+2441.72%)
knowledge-graph-nlp-in-action: From model training to deployment, a hands-on Knowledge Graph and NLP project. Uses TensorFlow, BERT+Bi-LSTM+CRF, Neo4j, etc., and covers tasks such as Named Entity Recognition, Text Classification, Information Extraction, and Relation Extraction.
Stars: ✭ 58 (-61.59%)
Kashgari: A production-level NLP transfer learning framework built on top of tf.keras for text labeling and text classification; includes Word2Vec, BERT, and GPT2 language embeddings.
Stars: ✭ 2,235 (+1380.13%)
Mt Dnn: Multi-Task Deep Neural Networks for Natural Language Understanding
Stars: ✭ 1,871 (+1139.07%)
DeepNER: An easy-to-use, modular, and extensible package of deep-learning-based Named Entity Recognition models.
Stars: ✭ 9 (-94.04%)
Pytorch-NLU: A Chinese text classification and sequence labeling toolkit. Supports multi-class and multi-label classification of Chinese long and short texts, and sequence labeling tasks such as Chinese named entity recognition, part-of-speech tagging, and word segmentation.
Stars: ✭ 151 (+0%)
OpenUE: A lightweight toolkit for knowledge graph extraction (An Open Toolkit for Universal Extraction from Text, published at EMNLP 2020: https://aclanthology.org/2020.emnlp-demos.1.pdf)
Stars: ✭ 274 (+81.46%)
banglabert: This repository contains the official release of the model "BanglaBERT" and associated downstream fine-tuning code and datasets introduced in the paper "BanglaBERT: Language Model Pretraining and Benchmarks for Low-Resource Language Understanding Evaluation in Bangla", accepted in Findings of the Annual Conference of the North American Chap…
Stars: ✭ 186 (+23.18%)
BERT-NER: Using pre-trained BERT models for Chinese and English NER with 🤗Transformers
Stars: ✭ 114 (-24.5%)
Xpersona: Evaluating Multilingual Personalized Chatbot
Stars: ✭ 54 (-64.24%)
PDN: The official PyTorch implementation of "Pathfinder Discovery Networks for Neural Message Passing" (WebConf '21)
Stars: ✭ 44 (-70.86%)
parsbert-ner: 🤗 ParsBERT Persian NER Tasks
Stars: ✭ 15 (-90.07%)
nervaluate: Full named-entity (i.e., not tag/token) evaluation metrics based on SemEval '13
Stars: ✭ 40 (-73.51%)
neji: Flexible and powerful platform for biomedical information extraction from text
Stars: ✭ 37 (-75.5%)
ark-nlp: A private NLP coding package that quickly implements SOTA solutions.
Stars: ✭ 232 (+53.64%)
slotminer: Tool for slot extraction from text
Stars: ✭ 15 (-90.07%)
deep-atrous-ner: Deep-Atrous-CNN-NER, a word-level model for Named Entity Recognition
Stars: ✭ 35 (-76.82%)
eve-bot: EVE bot, a customer service chatbot to enhance virtual engagement for Twitter Apple Support
Stars: ✭ 31 (-79.47%)
trinity-ie: Information extraction pipeline containing coreference resolution, named entity linking, and relationship extraction
Stars: ✭ 59 (-60.93%)
AlpacaTag: An Active Learning-based Crowd Annotation Framework for Sequence Tagging (ACL 2019 Demo)
Stars: ✭ 126 (-16.56%)
ai web RISKOUT BTS: Defense risk management platform (🏅 Minister of National Defense Award)
Stars: ✭ 18 (-88.08%)
presidio-research: Data-science tasks for developing new recognizers for Presidio; used for evaluating the entire system as well as specific PII recognizers or PII detection models.
Stars: ✭ 62 (-58.94%)
textwiser: [AAAI 2021] TextWiser, a text featurization library
Stars: ✭ 26 (-82.78%)
ganbert-pytorch: Enhancing BERT training with semi-supervised Generative Adversarial Networks in PyTorch/HuggingFace
Stars: ✭ 60 (-60.26%)
tfbert: Pre-trained model wrappers based on TensorFlow 1.x, supporting single-machine multi-GPU training, gradient accumulation, XLA acceleration, and mixed precision, with flexible training, validation, and prediction.
Stars: ✭ 54 (-64.24%)
ParsBigBird: Persian BERT for Long-Range Sequences
Stars: ✭ 58 (-61.59%)
JD2Skills-BERT-XMLC: Code and dataset for Bhola et al. (2020), "Retrieving Skills from Job Descriptions: A Language Model Based Extreme Multi-label Classification Framework"
Stars: ✭ 33 (-78.15%)
WSDM-Cup-2019: [ACM WSDM] 3rd-place solution at WSDM Cup 2019, Fake News Classification on Kaggle
Stars: ✭ 62 (-58.94%)
InformationExtractionSystem: An information extraction system that can perform NLP tasks such as Named Entity Recognition, Sentence Simplification, and Relation Extraction.
Stars: ✭ 27 (-82.12%)
muse-as-service: REST API for sentence tokenization and embedding using the Multilingual Universal Sentence Encoder.
Stars: ✭ 45 (-70.2%)
lingvo--Ner-ru: Named entity recognition (NER) in Russian texts
Stars: ✭ 38 (-74.83%)
ner: A command-line utility for extracting names of people, places, and organizations from text on macOS.
Stars: ✭ 75 (-50.33%)
mirror-bert: [EMNLP 2021] Mirror-BERT, converting pretrained language models to universal text encoders without labels
Stars: ✭ 56 (-62.91%)
SentimentAnalysis: Word embeddings (BOW, TF-IDF, Word2Vec, BERT) + base classifiers (SVM, Naive Bayes, Decision Tree, Random Forest) + pre-trained BERT from TensorFlow Hub + a 1-D CNN and bidirectional LSTM, on the IMDB Movie Reviews dataset
Stars: ✭ 40 (-73.51%)
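The entry above starts its pipeline with bag-of-words (BOW) features, the simplest of the listed embeddings: each text becomes a vector of word counts over a shared vocabulary. A minimal stdlib sketch (whitespace tokenization and the toy reviews are illustrative assumptions, not code from the repo):

```python
from collections import Counter

def bag_of_words(texts):
    """Build a shared vocabulary and one count vector per text."""
    tokenized = [t.lower().split() for t in texts]
    vocab = sorted({w for toks in tokenized for w in toks})
    vectors = []
    for toks in tokenized:
        counts = Counter(toks)
        # One dimension per vocabulary word, in sorted order.
        vectors.append([counts[w] for w in vocab])
    return vocab, vectors

vocab, vectors = bag_of_words(["the movie was great", "the movie was bad"])
print(vocab)    # ['bad', 'great', 'movie', 'the', 'was']
print(vectors)  # [[0, 1, 1, 1, 1], [1, 0, 1, 1, 1]]
```

Vectors like these can be fed directly to the classical classifiers the entry names (SVM, Naive Bayes, etc.); TF-IDF replaces the raw counts with reweighted values.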
BERT-embedding: A simple wrapper class for extracting features (embeddings) and comparing them using BERT in TensorFlow
Stars: ✭ 24 (-84.11%)
bert-squeeze: 🛠️ Tools for Transformers compression using PyTorch Lightning ⚡
Stars: ✭ 56 (-62.91%)
golgotha: Contextualised embeddings and language modelling with BERT and friends in R
Stars: ✭ 39 (-74.17%)
ner-d: Python module for Named Entity Recognition (NER) using natural language processing.
Stars: ✭ 14 (-90.73%)
simple NER: Simple rule-based named entity recognition
Stars: ✭ 29 (-80.79%)
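Unlike the BERT-based entries in this list, a rule-based NER system like the one above recognizes entities with hand-written patterns and word lists rather than a trained model. A minimal sketch of the idea (the gazetteer, suffix pattern, and labels are illustrative assumptions, not the repo's actual rules):

```python
import re

# Hypothetical gazetteer and pattern; a real system would use much larger lists.
PLACES = {"Paris", "London"}
ORG_SUFFIXES = r"(?:Inc|Corp|Ltd)\.?"

def simple_ner(text):
    """Return (entity, label) pairs found by two hand-written rules."""
    entities = []
    # Rule 1: a capitalized token followed by an organization suffix.
    for m in re.finditer(r"\b([A-Z][a-zA-Z]+\s+" + ORG_SUFFIXES + r")", text):
        entities.append((m.group(1), "ORG"))
    # Rule 2: capitalized tokens found in the place gazetteer.
    for m in re.finditer(r"\b[A-Z][a-z]+\b", text):
        if m.group(0) in PLACES:
            entities.append((m.group(0), "LOC"))
    return entities

print(simple_ner("Acme Inc. opened an office in Paris."))
# [('Acme Inc.', 'ORG'), ('Paris', 'LOC')]
```

Rule-based systems trade recall for transparency: every extraction can be traced to a specific pattern, which is why they remain common as baselines and for high-precision domains.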
task-transferability: Data and code for the paper "Exploring and Predicting Transferability across NLP Tasks", to appear at EMNLP 2020.
Stars: ✭ 35 (-76.82%)
rasa milktea chatbot: Chatbot with a Chinese BERT model for intent analysis, based on the Rasa framework
Stars: ✭ 97 (-35.76%)
berserker: Berserker, a BERT-based Chinese word tokenizer (BERt chineSE woRd toKenizER)
Stars: ✭ 17 (-88.74%)
bert-tensorflow-pytorch-spacy-conversion: Instructions for converting a BERT TensorFlow model to work with HuggingFace's pytorch-transformers and spaCy. This walk-through uses DeepPavlov's RuBERT as an example.
Stars: ✭ 26 (-82.78%)
LMMS: Language Modelling Makes Sense, WSD (and more) with contextual embeddings
Stars: ✭ 79 (-47.68%)