bert nli: A Natural Language Inference (NLI) model based on Transformers (BERT and ALBERT)
consistency: Implementation of the models from our EMNLP 2019 paper "A Logic-Driven Framework for Consistency of Neural Models"
anonymisation: Anonymization of French legal cases based on Flair embeddings
roberta-wwm-base-distill: A RoBERTa-wwm base model distilled from RoBERTa-wwm-large
AnnA Anki neuronal Appendix: Uses machine learning on your Anki collection to enhance scheduling via semantic clustering and semantic similarity
NLP-paper: Natural Language Processing tutorials: https://dataxujing.github.io/NLP-paper/
NAG-BERT: [EACL'21] Non-Autoregressive Text Generation with Pre-trained Language Models
wechsel: Code for "WECHSEL: Effective initialization of subword embeddings for cross-lingual transfer of monolingual language models"
PromptPapers: Must-read papers on prompt-based tuning for pre-trained language models
sticker2: Further developed as SyntaxDot: https://github.com/tensordot/syntaxdot
DE-LIMIT: DeEpLearning models for MultIlingual haTespeech (DELIMIT): benchmarking multilingual models across 9 languages and 16 datasets
SA-BERT: [CIKM 2020] Speaker-Aware BERT for Multi-Turn Response Selection in Retrieval-Based Chatbots
KitanaQA: Adversarial training and data augmentation for neural question-answering models
JointIDSF: BERT-based joint intent detection and slot filling with an intent-slot attention mechanism (INTERSPEECH 2021)
banglabert: Official release of the "BanglaBERT" model and the associated downstream fine-tuning code and datasets introduced in the paper "BanglaBERT: Language Model Pretraining and Benchmarks for Low-Resource Language Understanding Evaluation in Bangla", accepted in Findings of the Annual Conference of the North American Chap…
BertSimilarity: Computing the similarity of two sentences with Google's BERT algorithm. Semantic similarity and text similarity computation.
CAIL: Competition models for the reading comprehension task of the CAIL2019 (Challenge of AI in Law) contest
backprop: Backprop makes it simple to use, finetune, and deploy state-of-the-art ML models.
ganbert: Enhancing BERT training with Semi-supervised Generative Adversarial Networks
DiscEval: Discourse-Based Evaluation of Language Understanding
oreilly-bert-nlp: Code for the O'Reilly Live Online Training on BERT
Transformer-QG-on-SQuAD: A question generator built on SOTA pre-trained language models (RoBERTa, BERT, GPT, BART, T5, etc.)
neural-ranking-kd: Improving Efficient Neural Ranking Models with Cross-Architecture Knowledge Distillation
CheXbert: Combining Automatic Labelers and Expert Annotations for Accurate Radiology Report Labeling Using BERT
BERT-QE: Code and resources for the paper "BERT-QE: Contextualized Query Expansion for Document Re-ranking"
TabFormer: Code & data for "Tabular Transformers for Modeling Multivariate Time Series" (ICASSP 2021)
ExpBERT: Code for our ACL 2020 paper "Representation Engineering with Natural Language Explanations"
R-AT: Regularized Adversarial Training
beir: A heterogeneous benchmark for information retrieval. Easy to use: evaluate your models across 15+ diverse IR datasets.
bert attn viz: Visualize BERT's self-attention layers on text classification tasks
AliceMind: ALIbaba's Collection of Encoder-decoders from MinD (Machine IntelligeNce of Damo) Lab
sister: SImple SenTence EmbeddeR
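Several of the entries above (BertSimilarity, sister, beir) revolve around comparing sentence embeddings. As a minimal sketch of the core computation these tools share, assuming embeddings have already been produced by some encoder (the vectors below are stand-ins, not output of any listed repository's API):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two sentence-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Stand-in embeddings; a real pipeline would obtain these from an encoder
# such as BERT (e.g., by mean-pooling token states) -- an assumption here.
emb_a = np.array([0.20, 0.90, 0.10])
emb_b = np.array([0.25, 0.85, 0.05])
score = cosine_similarity(emb_a, emb_b)  # closer to 1.0 means more similar
```

The actual repositories differ in how they produce the embeddings (full BERT inference, distilled models, or simpler encoders), but ranking and similarity scoring typically reduce to a dot-product or cosine comparison like this one.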