Kashgari: A production-level NLP transfer-learning framework built on top of tf.keras for text labeling and text classification, including Word2Vec, BERT, and GPT-2 language embeddings.
Conure: Code for the SIGIR 2021 paper "One Person, One Model, One World: Learning Continual User Representation without Forgetting".
PIE: Fast, non-autoregressive grammatical error correction using BERT. Code and pre-trained models for the paper "Parallel Iterative Edit Models for Local Sequence Transduction" (EMNLP-IJCNLP 2019): www.aclweb.org/anthology/D19-1435.pdf
use-cases-of-bert: Use cases of Hugging Face's BERT (e.g., paraphrase generation, unsupervised extractive summarization).
MSR2021-ProgramRepair: Code for the paper "Applying CodeBERT for Automated Program Repair of Java Simple Bugs", accepted to MSR 2021.
bert-tensorflow-pytorch-spacy-conversion: Instructions for converting a BERT TensorFlow model to work with Hugging Face's pytorch-transformers and spaCy. The walk-through uses DeepPavlov's RuBERT as an example.
RECCON: Dataset and PyTorch implementations of the models from the paper "Recognizing Emotion Cause in Conversations".
german-sentiment-lib: An easy-to-use Python package for deep-learning-based German sentiment classification.
Distill-BERT-Textgen: Research code for the ACL 2020 paper "Distilling Knowledge Learned in BERT for Text Generation".
R-BERT: PyTorch re-implementation of the R-BERT model.
FinBERT-QA: Financial-domain question answering with a pre-trained BERT language model.