
332 open-source projects that are alternatives to, or similar to, roberta-wwm-base-distill

Clue
Chinese Language Understanding Evaluation Benchmark: datasets, baselines, pre-trained models, corpus, and leaderboard
Stars: ✭ 2,425 (+3875.41%)
Mutual labels:  pretrained-models, bert, roberta
vietnamese-roberta
A Robustly Optimized BERT Pretraining Approach for Vietnamese
Stars: ✭ 22 (-63.93%)
Mutual labels:  pretrained-models, bert, roberta
Tianchi2020ChineseMedicineQuestionGeneration
2020 Alibaba Cloud Tianchi Big Data Competition: Traditional Chinese Medicine Literature Question Generation Challenge
Stars: ✭ 20 (-67.21%)
Mutual labels:  bert, roberta
erc
Emotion recognition in conversation
Stars: ✭ 34 (-44.26%)
Mutual labels:  bert, roberta
KLUE
📖 Korean NLU Benchmark
Stars: ✭ 420 (+588.52%)
Mutual labels:  bert, roberta
HugsVision
HugsVision is an easy-to-use Hugging Face wrapper for state-of-the-art computer vision
Stars: ✭ 154 (+152.46%)
Mutual labels:  pretrained-models, bert
Transformers
🤗 Transformers: State-of-the-art machine learning for PyTorch, TensorFlow, and JAX.
Stars: ✭ 55,742 (+91280.33%)
Mutual labels:  pretrained-models, bert
syntaxdot
Neural syntax annotator, supporting sequence labeling, lemmatization, and dependency parsing.
Stars: ✭ 32 (-47.54%)
Mutual labels:  pretrained-models, bert
Transformer-QG-on-SQuAD
A question generator implemented with SOTA pre-trained language models (RoBERTa, BERT, GPT, BART, T5, etc.)
Stars: ✭ 28 (-54.1%)
Mutual labels:  bert, roberta
COVID-19-Tweet-Classification-using-Roberta-and-Bert-Simple-Transformers
Rank 1 / 216
Stars: ✭ 24 (-60.66%)
Mutual labels:  bert, roberta
Albert zh
A Lite BERT for Self-Supervised Learning of Language Representations, with large-scale pre-trained Chinese ALBERT models
Stars: ✭ 3,500 (+5637.7%)
Mutual labels:  bert, roberta
Roberta zh
Chinese pre-trained RoBERTa models: RoBERTa for Chinese
Stars: ✭ 1,953 (+3101.64%)
Mutual labels:  bert, roberta
Chinese Bert Wwm
Pre-Training with Whole Word Masking for Chinese BERT (the Chinese BERT-wwm model series)
Stars: ✭ 6,357 (+10321.31%)
Mutual labels:  bert, roberta
CLUE pytorch
PyTorch version of the CLUE baselines
Stars: ✭ 72 (+18.03%)
Mutual labels:  bert, roberta
Text-Summarization
Abstractive and extractive text summarization using Transformers.
Stars: ✭ 38 (-37.7%)
Mutual labels:  bert, roberta
transformer-models
Deep Learning Transformer models in MATLAB
Stars: ✭ 90 (+47.54%)
Mutual labels:  pretrained-models, bert
Awesome Sentence Embedding
A curated list of pretrained sentence and word embedding models
Stars: ✭ 1,973 (+3134.43%)
Mutual labels:  pretrained-models, bert
bert-squeeze
🛠️ Tools for Transformers compression using PyTorch Lightning ⚡
Stars: ✭ 56 (-8.2%)
Mutual labels:  bert, distillation
Pytorch-NLU
Pytorch-NLU, a Chinese text classification and sequence annotation toolkit: supports multi-class and multi-label classification of Chinese long and short text, and sequence labeling tasks such as Chinese named entity recognition, part-of-speech tagging, and word segmentation.
Stars: ✭ 151 (+147.54%)
Mutual labels:  pretrained-models, bert
Bertviz
Tool for visualizing attention in Transformer models (BERT, GPT-2, ALBERT, XLNet, RoBERTa, CTRL, etc.)
Stars: ✭ 3,443 (+5544.26%)
Mutual labels:  bert, roberta
AiSpace
AiSpace: Better practices for deep learning model development and deployment, for TensorFlow 2.0
Stars: ✭ 28 (-54.1%)
Mutual labels:  pretrained-models, bert
les-military-mrc-rank7
LES Cup: 2nd National "Military Intelligent Machine Reading" Challenge, Rank 7 solution
Stars: ✭ 37 (-39.34%)
Mutual labels:  bert, roberta
intruder-detector-python
Build an application that alerts you when someone enters a restricted area. Learn how to use models for multiclass object detection.
Stars: ✭ 16 (-73.77%)
Mutual labels:  pretrained-models
KitanaQA
KitanaQA: Adversarial training and data augmentation for neural question-answering models
Stars: ✭ 58 (-4.92%)
Mutual labels:  bert
bert for corrector
Chinese text error correction based on BERT
Stars: ✭ 199 (+226.23%)
Mutual labels:  bert
question generator
An NLP system for generating reading comprehension questions
Stars: ✭ 188 (+208.2%)
Mutual labels:  bert
ProteinLM
Protein Language Model
Stars: ✭ 76 (+24.59%)
Mutual labels:  pretrained-models
SA-BERT
CIKM 2020: Speaker-Aware BERT for Multi-Turn Response Selection in Retrieval-Based Chatbots
Stars: ✭ 71 (+16.39%)
Mutual labels:  bert
oreilly-bert-nlp
This repository contains code for the O'Reilly Live Online Training for BERT
Stars: ✭ 19 (-68.85%)
Mutual labels:  bert
wechsel
Code for WECHSEL: Effective initialization of subword embeddings for cross-lingual transfer of monolingual language models.
Stars: ✭ 39 (-36.07%)
Mutual labels:  bert
Romanian-Transformers
This repo is the home of Romanian Transformers.
Stars: ✭ 60 (-1.64%)
Mutual labels:  bert
PCPM
Presenting Collection of Pretrained Models: links to pretrained models in NLP and speech.
Stars: ✭ 21 (-65.57%)
Mutual labels:  pretrained-models
JointIDSF
BERT-based joint intent detection and slot filling with intent-slot attention mechanism (INTERSPEECH 2021)
Stars: ✭ 55 (-9.84%)
Mutual labels:  bert
BiaffineDependencyParsing
BERT + self-attention encoder; biaffine decoder; PyTorch implementation
Stars: ✭ 67 (+9.84%)
Mutual labels:  bert
neural-ranking-kd
Improving Efficient Neural Ranking Models with Cross-Architecture Knowledge Distillation
Stars: ✭ 74 (+21.31%)
Mutual labels:  bert
hard-label-attack
Natural Language Attacks in a Hard Label Black Box Setting.
Stars: ✭ 26 (-57.38%)
Mutual labels:  bert
CheXbert
Combining Automatic Labelers and Expert Annotations for Accurate Radiology Report Labeling Using BERT
Stars: ✭ 51 (-16.39%)
Mutual labels:  bert
banglabert
This repository contains the official release of the model "BanglaBERT" and associated downstream finetuning code and datasets introduced in the paper titled "BanglaBERT: Language Model Pretraining and Benchmarks for Low-Resource Language Understanding Evaluation in Bangla" accepted in Findings of the Annual Conference of the North American Chap…
Stars: ✭ 186 (+204.92%)
Mutual labels:  bert
ghostnet.pytorch
73.6% GhostNet 1.0x pre-trained model on ImageNet
Stars: ✭ 90 (+47.54%)
Mutual labels:  pretrained-models
BERT-QE
Code and resources for the paper "BERT-QE: Contextualized Query Expansion for Document Re-ranking".
Stars: ✭ 43 (-29.51%)
Mutual labels:  bert
PromptPapers
Must-read papers on prompt-based tuning for pre-trained language models.
Stars: ✭ 2,317 (+3698.36%)
Mutual labels:  bert
tensorflow-ml-nlp-tf2
Practice materials for "Natural Language Processing with TensorFlow 2 and Machine Learning (from Logistic Regression to BERT and GPT-3)"
Stars: ✭ 245 (+301.64%)
Mutual labels:  bert
TriB-QA
"We take bragging seriously."
Stars: ✭ 45 (-26.23%)
Mutual labels:  bert
korpatbert
KorPatBERT, a Korean AI language model specialized for the patent domain
Stars: ✭ 48 (-21.31%)
Mutual labels:  bert
BertSimilarity
Computing the similarity of two sentences with Google's BERT: semantic similarity and text similarity calculation.
Stars: ✭ 348 (+470.49%)
Mutual labels:  bert
sparsezoo
Neural network model repository for highly sparse and sparse-quantized models with matching sparsification recipes
Stars: ✭ 264 (+332.79%)
Mutual labels:  pretrained-models
BERT-chinese-text-classification-pytorch
This repo contains a PyTorch implementation of a pretrained BERT model for text classification.
Stars: ✭ 92 (+50.82%)
Mutual labels:  bert
AnnA Anki neuronal Appendix
Using machine learning on your Anki collection to enhance scheduling via semantic clustering and semantic similarity
Stars: ✭ 39 (-36.07%)
Mutual labels:  bert
NLP-paper
🎨 NLP (natural language processing) tutorial: https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (-62.3%)
Mutual labels:  bert
gender-unbiased BERT-based pronoun resolution
Source code for the ACL workshop paper and Kaggle competition by Google AI team
Stars: ✭ 42 (-31.15%)
Mutual labels:  bert
CAIL
Entry model for the CAIL 2019 (China AI and Law Challenge) reading comprehension task
Stars: ✭ 34 (-44.26%)
Mutual labels:  bert
TabFormer
Code & Data for "Tabular Transformers for Modeling Multivariate Time Series" (ICASSP, 2021)
Stars: ✭ 209 (+242.62%)
Mutual labels:  bert
ExpBERT
Code for our ACL '20 paper "Representation Engineering with Natural Language Explanations"
Stars: ✭ 28 (-54.1%)
Mutual labels:  bert
RECCON
This repository contains the dataset and the PyTorch implementations of the models from the paper Recognizing Emotion Cause in Conversations.
Stars: ✭ 126 (+106.56%)
Mutual labels:  roberta
ganimation replicate
An Out-of-the-Box Replication of GANimation using PyTorch, pretrained weights are available!
Stars: ✭ 165 (+170.49%)
Mutual labels:  pretrained-models
R-AT
Regularized Adversarial Training
Stars: ✭ 19 (-68.85%)
Mutual labels:  bert
sticker2
Further developed as SyntaxDot: https://github.com/tensordot/syntaxdot
Stars: ✭ 14 (-77.05%)
Mutual labels:  bert
object-size-detector-python
Monitor mechanical bolts as they move down a conveyor belt. When a bolt of an irregular size is detected, this solution emits an alert.
Stars: ✭ 26 (-57.38%)
Mutual labels:  pretrained-models
rasa-bert-finetune
BERT fine-tuning with rasa-nlu support
Stars: ✭ 46 (-24.59%)
Mutual labels:  bert
gap-text2sql
GAP-text2SQL: Learning Contextual Representations for Semantic Parsing with Generation-Augmented Pre-Training
Stars: ✭ 83 (+36.07%)
Mutual labels:  pretrained-models
1-60 of 332 similar projects