lonePatient / Awesome Pretrained Chinese Nlp Models

License: MIT
Awesome Pretrained Chinese NLP Models: a curated collection of high-quality Chinese pre-trained language models.


Awesome Pretrained Chinese NLP Models

Pre-trained language models have become a fundamental technology in natural language processing. This repository collects publicly available, high-quality Chinese pre-trained models (with thanks to everyone who shared these resources) and will be updated continuously.

Note: 🤗 HuggingFace model download mirrors: 1. Tsinghua University open-source mirror 2. Official site


NLU Series

BERT

  • 2018 | BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding | Jacob Devlin, et al. | arXiv | PDF
  • 2019 | Pre-Training with Whole Word Masking for Chinese BERT | Yiming Cui, et al. | arXiv | PDF
| Model | Version | TensorFlow | PyTorch | Author | Source | Domain |
| --- | --- | --- | --- | --- | --- | --- |
| BERT-Base | base | Google Drive | - | Google Research | github | General |
| BERT-wwm | base | Google Drive / 讯飞云-07Xj | Google Drive | Yiming Cui | github | General |
| BERT-wwm-ext | base | Google Drive / 讯飞云-4cMG | Google Drive | Yiming Cui | github | General |
| bert-base-民事 (civil law) | base | 阿里云 | - | THUNLP | github | Legal |
| bert-base-刑事 (criminal law) | base | 阿里云 | - | THUNLP | github | Legal |
| BAAI-JDAI-BERT | base | 京东云 | - | JDAI | github | E-commerce customer-service dialogue |
| FinBERT | base | Google Drive / 百度网盘-1cmp | Google Drive / 百度网盘-986f | Value Simplex | github | FinTech |
| EduBERT | base | 好未来AI | 好未来AI | tal-tech | github | Education |
| MC-BERT | base | Google Drive | - | Alibaba AI Research | github | Medical |
| guwenbert-base | base | - | 百度网盘-4jng / huggingface | Ethan | github | Classical Chinese |
| guwenbert-large | large | - | 百度网盘-m5sz / huggingface | Ethan | github | Classical Chinese |

Notes:

- wwm stands for **Whole Word Masking**: when any WordPiece sub-token of a word is masked, all other sub-tokens belonging to the same word are masked as well.

- ext indicates the model was trained on additional data.
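To make the wwm behaviour concrete, here is a minimal, self-contained sketch (not the reference implementation; the real pretraining code additionally runs Chinese word segmentation before choosing which pieces to mask):

```python
def expand_to_whole_words(tokens, picked):
    """Expand masked indices so whole words are masked (wwm).

    tokens: WordPiece tokens, where a "##" prefix marks continuation
    of the previous word. picked: token indices initially chosen for
    masking. Returns sorted indices covering every piece of each word
    that had at least one piece picked.
    """
    # Group token indices into words: a new word starts at every token
    # without the "##" continuation prefix.
    words, current = [], []
    for i, tok in enumerate(tokens):
        if tok.startswith("##") and current:
            current.append(i)
        else:
            if current:
                words.append(current)
            current = [i]
    if current:
        words.append(current)

    picked = set(picked)
    masked = set()
    for word in words:
        if picked & set(word):   # any piece of this word was picked...
            masked.update(word)  # ...so mask every piece of the word
    return sorted(masked)


# "信息" and "处理" each split into two pieces; picking only "##息"
# (index 1) drags in "信" (index 0) as well:
print(expand_to_whole_words(["信", "##息", "处", "##理"], picked=[1]))  # [0, 1]
```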

RoBERTa

  • 2019 | RoBERTa: A Robustly Optimized BERT Pretraining Approach | Yinhan Liu, et al. | arXiv | PDF
| Model | Version | TensorFlow | PyTorch | Author | Source | Domain |
| --- | --- | --- | --- | --- | --- | --- |
| RoBERTa-tiny-clue | tiny | Google Drive | 百度网盘-8qvb | CLUE | github | General |
| RoBERTa-tiny-pair | tiny | Google Drive | 百度网盘-8qvb | CLUE | github | General |
| RoBERTa-tiny3L768-clue | tiny | Google Drive | - | CLUE | github | General |
| RoBERTa-tiny3L312-clue | tiny | Google Drive | 百度网盘-8qvb | CLUE | github | General |
| RoBERTa-large-pair | large | Google Drive | 百度网盘-8qvb | CLUE | github | General |
| RoBERTa-large-clue | large | Google Drive | 百度网盘-8qvb | CLUE | github | General |
| RBT3 | 3-layer base | Google Drive / 讯飞云-b9nx | Google Drive | Yiming Cui | github | General |
| RBTL3 | 3-layer large | Google Drive / 讯飞云-vySW | Google Drive | Yiming Cui | github | General |
| RBTL4 | 4-layer large | 讯飞云-e8dN | - | Yiming Cui | github | General |
| RBTL6 | 6-layer large | 讯飞云-XNMA | - | Yiming Cui | github | General |
| RoBERTa-wwm-ext | base | Google Drive / 讯飞云-Xe1p | Google Drive | Yiming Cui | github | General |
| RoBERTa-wwm-ext-large | large | Google Drive / 讯飞云-u6gC | Google Drive | Yiming Cui | github | General |
| RoBERTa-base | base | Google Drive / 百度网盘 | Google Drive / 百度网盘 | brightmart | github | General |
| RoBERTa-Large | large | Google Drive / 百度网盘 | Google Drive | brightmart | github | General |
| RoBERTa-tiny | tiny | huggingface | huggingface | DBIIR @ RUC | UER | General |
| RoBERTa-mini | mini | huggingface | huggingface | DBIIR @ RUC | UER | General |
| RoBERTa-small | small | huggingface | huggingface | DBIIR @ RUC | UER | General |
| RoBERTa-medium | medium | huggingface | huggingface | DBIIR @ RUC | UER | General |
| RoBERTa-base | base | huggingface | huggingface | DBIIR @ RUC | UER | General |

ALBERT

  • 2019 | ALBERT: A Lite BERT For Self-Supervised Learning Of Language Representations | Zhenzhong Lan, et al. | arXiv | PDF
| Model | Version | TensorFlow | PyTorch | Author | Source | Domain |
| --- | --- | --- | --- | --- | --- | --- |
| Albert_tiny | tiny | Google Drive | Google Drive | brightmart | github | General |
| Albert_base_zh | base | Google Drive | Google Drive | brightmart | github | General |
| Albert_large_zh | large | Google Drive | Google Drive | brightmart | github | General |
| Albert_xlarge_zh | xlarge | Google Drive | Google Drive | brightmart | github | General |
| Albert_base | base | Google Drive | - | Google Research | github | General |
| Albert_large | large | Google Drive | - | Google Research | github | General |
| Albert_xlarge | xlarge | Google Drive | - | Google Research | github | General |
| Albert_xxlarge | xxlarge | Google Drive | - | Google Research | github | General |

NEZHA

  • 2019 | NEZHA: Neural Contextualized Representation for Chinese Language Understanding | Junqiu Wei, et al. | arXiv | PDF
| Model | Version | TensorFlow | PyTorch | Author | Source | Domain |
| --- | --- | --- | --- | --- | --- | --- |
| NEZHA-base | base | Google Drive / 百度网盘-ntn3 | lonePatient | HUAWEI | github | General |
| NEZHA-base-wwm | base | Google Drive / 百度网盘-f68o | lonePatient | HUAWEI | github | General |
| NEZHA-large | large | Google Drive / 百度网盘-7thu | lonePatient | HUAWEI | github | General |
| NEZHA-large-wwm | large | Google Drive / 百度网盘-ni4o | lonePatient | HUAWEI | github | General |
| WoNEZHA (word-based) | base | 百度网盘-qgkq | - | ZhuiyiTechnology | github | General |

MacBERT

  • 2020 | Revisiting Pre-Trained Models for Chinese Natural Language Processing | Yiming Cui, et al. | arXiv | PDF
| Model | Version | TensorFlow | PyTorch | Author | Source | Domain |
| --- | --- | --- | --- | --- | --- | --- |
| MacBERT-base | base | Google Drive / 讯飞云-E2cP | - | Yiming Cui | github | General |
| MacBERT-large | large | Google Drive / 讯飞云-3Yg3 | - | Yiming Cui | github | General |

WoBERT

  • 2020 | Faster without losing accuracy: WoBERT, a word-granularity Chinese BERT (提速不掉点:基于词颗粒度的中文WoBERT) | Jianlin Su | spaces | Blog post
| Model | Version | TensorFlow | PyTorch | Author | Source | Domain |
| --- | --- | --- | --- | --- | --- | --- |
| WoBERT | base | 百度网盘-kim2 | - | ZhuiyiTechnology | github | General |
| WoBERT-plus | base | 百度网盘-aedw | - | ZhuiyiTechnology | github | General |

XLNet

  • 2019 | XLNet: Generalized Autoregressive Pretraining for Language Understanding | Zhilin Yang, et al. | arXiv | PDF
| Model | Version | TensorFlow | PyTorch | Author | Source | Domain |
| --- | --- | --- | --- | --- | --- | --- |
| XLNet-base | base | Google Drive / 讯飞云-uCpe | Google Drive | Yiming Cui | github | General |
| XLNet-mid | middle | Google Drive / 讯飞云-68En | Google Drive | Yiming Cui | github | General |
| XLNet_zh_Large | large | 百度网盘 | - | brightmart | github | General |

ELECTRA

  • 2020 | ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators | Kevin Clark, et al. | arXiv | PDF
| Model | Version | TensorFlow | PyTorch | Author | Source | Domain |
| --- | --- | --- | --- | --- | --- | --- |
| ELECTRA-180g-large | large | Google Drive / 讯飞云-Yfcy | - | Yiming Cui | github | General |
| ELECTRA-180g-small-ex | small | Google Drive / 讯飞云-GUdp | - | Yiming Cui | github | General |
| ELECTRA-180g-base | base | Google Drive / 讯飞云-Xcvm | - | Yiming Cui | github | General |
| ELECTRA-180g-small | small | Google Drive / 讯飞云-qsHj | - | Yiming Cui | github | General |
| legal-ELECTRA-large | large | Google Drive / 讯飞云-7f7b | - | Yiming Cui | github | Legal |
| legal-ELECTRA-base | base | Google Drive / 讯飞云-7f7b | - | Yiming Cui | github | Legal |
| legal-ELECTRA-small | small | Google Drive / 讯飞云-7f7b | - | Yiming Cui | github | Legal |
| ELECTRA-tiny | tiny | Google Drive / 百度网盘-rs99 | - | CLUE | github | General |

ZEN

  • 2019 | ZEN: Pre-training Chinese Text Encoder Enhanced by N-gram Representations | Shizhe Diao, et al. | arXiv | PDF
| Model | Version | TensorFlow | PyTorch | Author | Source | Domain |
| --- | --- | --- | --- | --- | --- | --- |
| ZEN-Base | base | Google Drive / 百度网盘 | - | Sinovation Ventures AI Institute | github | General |

ERNIE

  • 2019 | ERNIE: Enhanced Representation through Knowledge Integration | Yu Sun, et al. | arXiv | PDF

  • 2020 | SKEP: Sentiment Knowledge Enhanced Pre-training for Sentiment Analysis | Hao Tian, et al. | arXiv | PDF

| Model | Version | PaddlePaddle | PyTorch | Author | Source | Domain |
| --- | --- | --- | --- | --- | --- | --- |
| ernie-1.0-base | base | link | - | PaddlePaddle | github | General |
| ernie_1.0_skep_large | large | link | - | Baidu | github | Sentiment analysis |

Notes:

- To convert PaddlePaddle weights to TensorFlow, see: tensorflow_ernie

- To convert PaddlePaddle weights to PyTorch, see: ERNIE-Pytorch

NLG Series

GPT

  • 2018 | Improving Language Understanding by Generative Pre-Training | Alec Radford, et al. | arXiv | PDF

  • 2019 | Language Models are Unsupervised Multitask Learners | Alec Radford, et al. | arXiv | PDF

| Model | Version | TensorFlow | PyTorch | Author | Source | Domain |
| --- | --- | --- | --- | --- | --- | --- |
| GPT2 | 30亿语料 (3B tokens) | Google Drive / 百度网盘-ffz6 | - | Caspar ZHANG | gpt2-ml | General |
| GPT2 | 15亿语料 (1.5B tokens) | Google Drive / 百度网盘-q9vr | - | Caspar ZHANG | gpt2-ml | General |
| CDial-GPT_LCCC-base | base | - | huggingface | thu-coai | CDial-GPT | Chinese dialogue |
| CDial-GPT2_LCCC-base | base | - | huggingface | thu-coai | CDial-GPT | Chinese dialogue |
| CDial-GPT_LCCC-large | large | - | huggingface | thu-coai | CDial-GPT | Chinese dialogue |
| GPT2-dialogue | base | - | Google Drive / 百度网盘-osi6 | yangjianxin1 | GPT2-chitchat | Chit-chat dialogue |
| GPT2-mmi | base | - | Google Drive / 百度网盘-1j88 | yangjianxin1 | GPT2-chitchat | Chit-chat dialogue |
| GPT2-散文模型 | base | - | Google Drive / 百度网盘-fpyu | Zeyao Du | GPT2-Chinese | Prose |
| GPT2-诗词模型 | base | - | Google Drive / 百度网盘-7fev | Zeyao Du | GPT2-Chinese | Poetry |
| GPT2-对联模型 | base | - | Google Drive / 百度网盘-i5n0 | Zeyao Du | GPT2-Chinese | Couplets |
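The chit-chat models above (e.g. GPT2-chitchat) are typically decoded with top-k and nucleus (top-p) sampling rather than greedy search. Below is a minimal, framework-free sketch of the filtering step, assuming plain Python lists of logits (illustrative only, not any repository's exact code):

```python
import math

def top_k_top_p_filter(logits, top_k=0, top_p=1.0):
    """Return logits with tokens outside the top-k / top-p set
    masked to -inf, so sampling only draws from the kept set."""
    logits = list(logits)
    # Token indices sorted by logit, highest first.
    order = sorted(range(len(logits)), key=lambda i: logits[i], reverse=True)

    keep = set(order)
    if top_k > 0:
        keep &= set(order[:top_k])

    if top_p < 1.0:
        # Softmax over all logits, then keep the smallest prefix of the
        # sorted tokens whose cumulative probability reaches top_p.
        m = max(logits)
        exps = [math.exp(x - m) for x in logits]
        z = sum(exps)
        cum, nucleus = 0.0, set()
        for i in order:
            nucleus.add(i)
            cum += exps[i] / z
            if cum >= top_p:
                break
        keep &= nucleus

    return [x if i in keep else float("-inf") for i, x in enumerate(logits)]


# Keep only the two most likely tokens; the rest can never be sampled:
print(top_k_top_p_filter([2.0, 1.0, 0.5, -1.0], top_k=2))
# [2.0, 1.0, -inf, -inf]
```

Sampling then proceeds by taking a softmax over the filtered logits and drawing one token; tokens set to -inf receive zero probability.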

NEZHA-Gen

  • 2019 | NEZHA: Neural Contextualized Representation for Chinese Language Understanding | Junqiu Wei, et al. | arXiv | PDF

  • 2018 | Improving Language Understanding by Generative Pre-Training | Alec Radford, et al. | arXiv | PDF

| Model | Version | TensorFlow | PyTorch | Author | Source | Domain |
| --- | --- | --- | --- | --- | --- | --- |
| NEZHA-Gen | base | Google Drive / 百度网盘-rb5m | - | HUAWEI | github | General |
| NEZHA-Gen | base | Google Drive / 百度网盘-ytim | - | HUAWEI | github | Poetry |

CPM-Generate

  • 2020 | CPM: A Large-scale Generative Chinese Pre-trained Language Model | Zhengyan Zhang, et al. | arXiv | PDF
| Model | Version | Resources | PyTorch | Author | Source | Domain |
| --- | --- | --- | --- | --- | --- | --- |
| CPM | 2.6B parameters | Project page | Model download | Tsinghua AI | github | General |

Notes:

- To convert PyTorch weights to TensorFlow, see: CPM-LM-TF2

- To convert PyTorch weights to PaddlePaddle, see: CPM-Generate-Paddle

T5

  • 2019 | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer | Colin Raffel, et al. | arXiv | PDF
| Model | Version | TensorFlow | PyTorch | Author | Source | Domain |
| --- | --- | --- | --- | --- | --- | --- |
| T5 | small | huggingface | huggingface | DBIIR @ RUC | UER | General |

T5-PEGASUS

  • 2019 | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer | Colin Raffel, et al. | arXiv | PDF

  • 2019 | PEGASUS: Pre-training with Extracted Gap-sentences for Abstractive Summarization | Jingqing Zhang, et al. | arXiv | PDF

  • 2021 | T5 PEGASUS: an open-source Chinese generative pre-trained model (T5 PEGASUS:开源一个中文生成式预训练模型) | Jianlin Su | spaces | Blog post

| Model | Version | Keras | PyTorch | Author | Source | Domain |
| --- | --- | --- | --- | --- | --- | --- |
| T5 PEGASUS | base | 百度网盘-3sfn | - | ZhuiyiTechnology | github | General |
| T5 PEGASUS | small | 百度网盘-qguk | - | ZhuiyiTechnology | github | General |

To convert Keras weights to PyTorch, see: t5-pegasus-pytorch

NLU-NLG Series

UniLM

  • 2019 | Unified Language Model Pre-training for Natural Language Understanding and Generation | Li Dong, et al. | arXiv | PDF
| Model | Version | TensorFlow | PyTorch | Author | Source | Domain |
| --- | --- | --- | --- | --- | --- | --- |
| Unilm | base | 百度网盘-tblr | 百度网盘-etwf | YunwenTechnology | github | General |
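UniLM makes a single BERT-style network serve both understanding and generation purely through its self-attention mask: source tokens attend bidirectionally to the whole source, while target tokens attend to the source plus only their own left context. A small illustrative sketch of that mask construction (a reconstruction of the idea, not the repository's code):

```python
def unilm_seq2seq_mask(src_len, tgt_len):
    """Build a (src_len+tgt_len) x (src_len+tgt_len) attention mask.

    mask[i][j] == 1 means position i may attend to position j. Source
    positions see the whole source (bidirectional); target positions
    see the whole source plus target positions up to and including
    themselves (causal).
    """
    n = src_len + tgt_len
    mask = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if j < src_len:
                mask[i][j] = 1   # everyone attends to the source
            elif i >= src_len and j <= i:
                mask[i][j] = 1   # causal attention within the target
    return mask


# 2 source tokens, 3 target tokens:
for row in unilm_seq2seq_mask(2, 3):
    print(row)
# [1, 1, 0, 0, 0]
# [1, 1, 0, 0, 0]
# [1, 1, 1, 0, 0]
# [1, 1, 1, 1, 0]
# [1, 1, 1, 1, 1]
```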

SimBERT

  • 2020 | Having it both ways: SimBERT, a model fusing retrieval and generation (鱼与熊掌兼得:融合检索和生成的SimBERT模型) | Jianlin Su | spaces | Blog post
| Model | Version | TensorFlow | PyTorch | Author | Source | Domain |
| --- | --- | --- | --- | --- | --- | --- |
| SimBERT Tiny | tiny | 百度网盘-1tp7 | - | ZhuiyiTechnology | github | General |
| SimBERT Small | small | 百度网盘-nu67 | - | ZhuiyiTechnology | github | General |
| SimBERT Base | base | 百度网盘-6xhq | - | ZhuiyiTechnology | github | General |