Kashgari: a production-level NLP transfer learning framework built on top of tf.keras for text labeling and text classification, including Word2Vec, BERT, and GPT2 language embeddings.
Stars: ✭ 2,235 (+5487.5%)
Mutual labels: bert, sequence-labeling
Pytorch-NLU: a Chinese text classification and sequence labeling toolkit. Supports multi-class and multi-label classification of long and short Chinese texts, and sequence labeling tasks such as Chinese named entity recognition, part-of-speech tagging, and word segmentation.
Stars: ✭ 151 (+277.5%)
Mutual labels: bert, sequence-labeling
deepseg: Chinese word segmentation in TensorFlow 2.x.
Stars: ✭ 23 (-42.5%)
Mutual labels: sequence-labeling, bilstm-crf
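Several of the projects above (deepseg, ChineseNER) use the BiLSTM-CRF architecture, where a BiLSTM produces per-token emission scores and a CRF layer scores tag transitions; decoding then picks the highest-scoring tag path with the Viterbi algorithm. A minimal sketch of that decoding step, with purely illustrative scores and a toy B/I/O tag set:

```python
# Minimal Viterbi decoder for a CRF-style sequence tagger (stdlib only).
# All scores below are hypothetical; a real BiLSTM-CRF learns them.

def viterbi(emissions, transitions, tags):
    """emissions: list of {tag: score} per token; transitions: {(prev, cur): score}."""
    # best[t] = (score, path) of the best tag sequence ending in tag t
    best = {t: (emissions[0][t], [t]) for t in tags}
    for scores in emissions[1:]:
        new_best = {}
        for cur in tags:
            candidates = [
                (best[p][0] + transitions[(p, cur)] + scores[cur], best[p][1])
                for p in tags
            ]
            score, path = max(candidates)
            new_best[cur] = (score, path + [cur])
        best = new_best
    return max(best.values())[1]

tags = ["B", "I", "O"]
# Penalize the invalid O -> I transition; leave all others neutral.
transitions = {(p, c): (-2.0 if (p, c) == ("O", "I") else 0.0) for p in tags for c in tags}
emissions = [{"B": 2.0, "I": 0.0, "O": 0.5},
             {"B": 0.0, "I": 1.5, "O": 1.0},
             {"B": 0.0, "I": 0.2, "O": 2.0}]
print(viterbi(emissions, transitions, tags))  # ['B', 'I', 'O']
```

The transition scores are what distinguish a CRF head from independent per-token classification: they let the decoder rule out label sequences that are locally plausible but globally inconsistent.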
PIE: fast, non-autoregressive grammatical error correction using BERT. Code and pre-trained models for the paper "Parallel Iterative Edit Models for Local Sequence Transduction" (EMNLP-IJCNLP 2019): www.aclweb.org/anthology/D19-1435.pdf
Stars: ✭ 164 (+310%)
Mutual labels: bert, sequence-labeling
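PIE is non-autoregressive because it frames correction as sequence labeling: the model predicts one edit label per input token and applies all edits in parallel, rather than generating the corrected sentence token by token. A toy sketch of the apply step, with an illustrative edit vocabulary (the real PIE label set differs):

```python
# Toy edit-application step in the spirit of parallel edit models (stdlib only).
# Edit labels here (KEEP/DELETE/REPLACE/APPEND) are illustrative.

def apply_edits(tokens, edits):
    """edits[i] is 'KEEP', 'DELETE', ('REPLACE', word), or ('APPEND', word)."""
    out = []
    for tok, edit in zip(tokens, edits):
        if edit == "KEEP":
            out.append(tok)
        elif edit == "DELETE":
            continue
        elif edit[0] == "REPLACE":
            out.append(edit[1])
        elif edit[0] == "APPEND":
            out.extend([tok, edit[1]])
    return out

tokens = ["she", "go", "to", "school", "yesterday"]
edits = ["KEEP", ("REPLACE", "went"), "KEEP", "KEEP", "KEEP"]
print(" ".join(apply_edits(tokens, edits)))  # she went to school yesterday
```

Because every edit is predicted independently, one forward pass corrects the whole sentence; the "iterative" part of PIE reruns the model on its own output until no further edits are predicted.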
ChineseNER: all about Chinese NER.
Stars: ✭ 241 (+502.5%)
Mutual labels: bert, bilstm-crf
sticker2: further developed as SyntaxDot: https://github.com/tensordot/syntaxdot
Stars: ✭ 14 (-65%)
Mutual labels: bert
hard-label-attack: natural language attacks in a hard-label black-box setting.
Stars: ✭ 26 (-35%)
Mutual labels: bert
SA-BERT: Speaker-Aware BERT for Multi-Turn Response Selection in Retrieval-Based Chatbots (CIKM 2020).
Stars: ✭ 71 (+77.5%)
Mutual labels: bert
consistency: implementation of the models in the EMNLP 2019 paper "A Logic-Driven Framework for Consistency of Neural Models".
Stars: ✭ 26 (-35%)
Mutual labels: bert
Cross-Lingual-MRC: Cross-Lingual Machine Reading Comprehension (EMNLP 2019).
Stars: ✭ 66 (+65%)
Mutual labels: bert
AnnA (Anki neuronal Appendix): uses machine learning on your Anki collection to enhance scheduling via semantic clustering and semantic similarity.
Stars: ✭ 39 (-2.5%)
Mutual labels: bert
DE-LIMIT: DeEpLearning models for MultIlingual haTespeech (DELIMIT), benchmarking multilingual models across 9 languages and 16 datasets.
Stars: ✭ 90 (+125%)
Mutual labels: bert
Transformers-Tutorials: a repository of demos built with the HuggingFace Transformers library.
Stars: ✭ 2,828 (+6970%)
Mutual labels: bert
CrossNER: Evaluating Cross-Domain Named Entity Recognition (AAAI 2021).
Stars: ✭ 87 (+117.5%)
Mutual labels: sequence-labeling
NAG-BERT: non-autoregressive text generation with pretrained language models (EACL'21).
Stars: ✭ 47 (+17.5%)
Mutual labels: bert
roberta-wwm-base-distill: a RoBERTa-wwm-base model distilled from RoBERTa-wwm-large.
Stars: ✭ 61 (+52.5%)
Mutual labels: bert
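The roberta-wwm-base-distill entry relies on knowledge distillation: a smaller student model is trained to match the teacher's softened output distribution rather than only the hard labels. A hedged, stdlib-only sketch of the soft-label loss (temperature and logits are illustrative, not taken from that repository):

```python
# Sketch of the soft-label distillation objective (stdlib only).
# All values below are illustrative; real distillation uses model logits.
import math

def softmax(logits, temperature=1.0):
    # Softening with temperature > 1 spreads probability mass across classes.
    exps = [math.exp(l / temperature) for l in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distill_loss(teacher_logits, student_logits, temperature=2.0):
    """Cross-entropy of the student's softened probs under the teacher's."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q))

loss_far = distill_loss([3.0, 1.0, 0.2], [0.1, 2.0, 0.3])
loss_near = distill_loss([3.0, 1.0, 0.2], [2.9, 1.1, 0.2])
print(loss_near < loss_far)  # True: matching the teacher lowers the loss
```

In practice this soft-label term is usually combined with the ordinary hard-label cross-entropy, so the student learns both the gold labels and the teacher's relative confidence over the wrong classes.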
wechsel: code for WECHSEL, effective initialization of subword embeddings for cross-lingual transfer of monolingual language models.
Stars: ✭ 39 (-2.5%)
Mutual labels: bert