CLUE (中文语言理解测评基准): Chinese Language Understanding Evaluation Benchmark, with datasets, baselines, pre-trained models, corpus, and a leaderboard
Stars: ✭ 2,425 (+1027.91%)
Mutual labels: transformers, language-model, albert
backprop: Backprop makes it simple to use, fine-tune, and deploy state-of-the-art ML models.
Stars: ✭ 229 (+6.51%)
Mutual labels: transformers, language-model
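Backprop exposes task objects that wrap pretrained models. A minimal question-answering sketch, assuming the `backprop.QA` task and call signature shown in the project README (details may differ across versions):

```python
# Minimal sketch, assuming the backprop.QA task from the project README;
# the call signature may differ across versions.
import backprop

context = "Take a look at the examples folder to see use cases!"
qa = backprop.QA()  # downloads a default extractive-QA model on first use

answer = qa("Where can I see what to build?", context)
print(answer)
```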
wechsel: Code for WECHSEL: Effective initialization of subword embeddings for cross-lingual transfer of monolingual language models.
Stars: ✭ 39 (-81.86%)
Mutual labels: transformers, language-model
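A sketch of the WECHSEL initialization flow, following the project README: train a target-language tokenizer, transfer the source embeddings, and copy them into the model. The `load_embeddings` helper, the `bilingual_dictionary` argument, and the tiny Swahili corpus placeholder are assumptions based on that README.

```python
# Sketch following the WECHSEL README; swahili_texts is a placeholder corpus,
# in practice a large target-language dataset is used.
import torch
from transformers import AutoModel, AutoTokenizer
from wechsel import WECHSEL, load_embeddings

source_tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModel.from_pretrained("roberta-base")

swahili_texts = ["Habari ya leo?", "Mfano wa maandishi ya Kiswahili."]  # placeholder
target_tokenizer = source_tokenizer.train_new_from_iterator(
    swahili_texts, vocab_size=len(source_tokenizer)
)

wechsel = WECHSEL(
    load_embeddings("en"),   # fastText embeddings, source language
    load_embeddings("sw"),   # fastText embeddings, target language
    bilingual_dictionary="swahili",
)
target_embeddings, info = wechsel.apply(
    source_tokenizer,
    target_tokenizer,
    model.get_input_embeddings().weight.detach().numpy(),
)
model.get_input_embeddings().weight.data = torch.from_numpy(target_embeddings)
```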
KoBERT-Transformers: KoBERT on 🤗 Hugging Face Transformers 🤗 (with bug fixes)
Stars: ✭ 162 (-24.65%)
Mutual labels: transformers, korean-nlp
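A minimal sketch, assuming the `get_kobert_model` and `get_tokenizer` helpers documented in the repo README:

```python
# Minimal sketch, assuming the helper functions from the repo README.
import torch
from kobert_transformers import get_kobert_model, get_tokenizer

model = get_kobert_model()
model.eval()
tokenizer = get_tokenizer()

tokens = tokenizer.tokenize("[CLS] 한국어 모델을 공유합니다. [SEP]")
input_ids = torch.tensor([tokenizer.convert_tokens_to_ids(tokens)])
outputs = model(input_ids)
```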
KoELECTRA-Pipeline: Transformers pipeline with KoELECTRA
Stars: ✭ 37 (-82.79%)
Mutual labels: transformers, korean-nlp
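The idea is simply to drop a KoELECTRA checkpoint into a standard Transformers pipeline. A sketch; the fine-tuned model id below is a hypothetical placeholder:

```python
# Sketch: a Transformers pipeline over a KoELECTRA checkpoint.
# The model id below is a hypothetical placeholder; substitute a real
# KoELECTRA checkpoint fine-tuned for your task.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="monologg/koelectra-base-finetuned-nsmc",  # hypothetical id
)
print(classifier("이 영화 정말 재미있어요!"))
```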
COCO-LM: [NeurIPS 2021] Correcting and Contrasting Text Sequences for Language Model Pretraining
Stars: ✭ 109 (-49.3%)
Mutual labels: transformers, language-model
language-planner: Official code for "Language Models as Zero-Shot Planners: Extracting Actionable Knowledge for Embodied Agents"
Stars: ✭ 84 (-60.93%)
Mutual labels: transformers, language-model
minicons: Utility for analyzing Transformer-based representations of language.
Stars: ✭ 28 (-86.98%)
Mutual labels: transformers, language-model
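A sketch of scoring sequences with minicons, assuming the `scorer.IncrementalLMScorer` API from the project README (method names may vary across versions):

```python
# Sketch assuming the IncrementalLMScorer API from the minicons README.
from minicons import scorer

lm = scorer.IncrementalLMScorer("gpt2", "cpu")

# Log-probability score of each sequence under the language model.
print(lm.sequence_score(["The keys to the cabinet are on the table."]))
```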
Tokenizers: 💥 Fast state-of-the-art tokenizers optimized for research and production
Stars: ✭ 5,077 (+2261.4%)
Mutual labels: transformers, language-model
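Minimal usage: load a pretrained tokenizer definition from the Hub and encode a sentence.

```python
# Load a pretrained tokenizer from the Hub and encode a sentence.
from tokenizers import Tokenizer

tokenizer = Tokenizer.from_pretrained("bert-base-uncased")
encoding = tokenizer.encode("Fast tokenizers, optimized for production.")
print(encoding.tokens)
print(encoding.ids)
```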
Haystack: 🔍 Haystack is an open-source NLP framework that leverages Transformer models. It enables developers to implement production-ready neural search, question answering, semantic document search, and summarization for a wide range of applications.
Stars: ✭ 3,409 (+1485.58%)
Mutual labels: transformers, language-model
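An extractive-QA sketch, assuming the Haystack 1.x API; class names and arguments have shifted across releases, so treat this as illustrative:

```python
# Extractive QA sketch, assuming the Haystack 1.x API.
from haystack.document_stores import InMemoryDocumentStore
from haystack.nodes import BM25Retriever, FARMReader
from haystack.pipelines import ExtractiveQAPipeline

document_store = InMemoryDocumentStore(use_bm25=True)
document_store.write_documents([
    {"content": "Haystack is an open-source NLP framework built on Transformers."}
])

retriever = BM25Retriever(document_store=document_store)
reader = FARMReader(model_name_or_path="deepset/roberta-base-squad2")
pipeline = ExtractiveQAPipeline(reader, retriever)

result = pipeline.run(query="What is Haystack?", params={"Retriever": {"top_k": 3}})
print(result["answers"][0].answer)
```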
Spark NLP: State-of-the-art natural language processing
Stars: ✭ 2,518 (+1071.16%)
Mutual labels: transformers, albert
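A quickstart sketch using a pretrained pipeline; `explain_document_dl` is one of the pipelines documented by Spark NLP.

```python
# Spark NLP quickstart: run a documented pretrained pipeline.
import sparknlp
from sparknlp.pretrained import PretrainedPipeline

spark = sparknlp.start()  # starts a Spark session configured for Spark NLP
pipeline = PretrainedPipeline("explain_document_dl", lang="en")

result = pipeline.annotate("State-of-the-art NLP at scale with Spark.")
print(result["entities"], result["pos"])
```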
PLBART: Official code of the NAACL 2021 paper "Unified Pre-training for Program Understanding and Generation".
Stars: ✭ 151 (-29.77%)
Mutual labels: language-model
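PLBart was later upstreamed into Hugging Face Transformers, so one way to try it is through the `PLBart*` classes there. The code-to-text sketch below follows the Transformers documentation example; the checkpoint id is from the authors' release and should be verified on the Hub.

```python
# Code-summarization sketch via the PLBart classes in Transformers;
# checkpoint id per the authors' release, verify on the Hub.
from transformers import PLBartForConditionalGeneration, PLBartTokenizer

tokenizer = PLBartTokenizer.from_pretrained(
    "uclanlp/plbart-python-en_XX", src_lang="python", tgt_lang="en_XX"
)
model = PLBartForConditionalGeneration.from_pretrained("uclanlp/plbart-python-en_XX")

code = "def maximum(a,b,c):NEW_LINE_INDENTreturn max([a,b,c])"
inputs = tokenizer(code, return_tensors="pt")
summary_ids = model.generate(
    **inputs, decoder_start_token_id=tokenizer.lang_code_to_id["en_XX"]
)
print(tokenizer.batch_decode(summary_ids, skip_special_tokens=True)[0])
```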
nlp-papers: Must-read papers on Natural Language Processing (NLP)
Stars: ✭ 87 (-59.53%)
Mutual labels: transformers
NLP Architect: A model library for exploring state-of-the-art deep learning topologies and techniques for optimizing natural language processing neural networks
Stars: ✭ 2,768 (+1187.44%)
Mutual labels: transformers
LegalQA: Korean LegalQA using SentenceKoBART
Stars: ✭ 77 (-64.19%)
Mutual labels: korean-nlp
Fengshenbang-LM: Fengshenbang-LM (封神榜大模型) is an open-source family of large models led by the Cognitive Computing and Natural Language Research Center at IDEA Research, built to serve as infrastructure for Chinese AIGC and cognitive intelligence.
Stars: ✭ 1,813 (+743.26%)
Mutual labels: transformers
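Fengshenbang checkpoints are published on the Hugging Face Hub under the IDEA-CCNL organization. A loading sketch; the sentiment checkpoint id below is one example from the project and should be verified on the Hub:

```python
# Sketch: load a Fengshenbang checkpoint from the Hub via Transformers.
# The model id is one example from the project; verify it on the Hub.
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "IDEA-CCNL/Erlangshen-Roberta-110M-Sentiment"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

inputs = tokenizer("今天心情很好", return_tensors="pt")
print(model(**inputs).logits.softmax(dim=-1))
```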
PyTorch Sentiment Analysis: Tutorials on getting started with PyTorch and TorchText for sentiment analysis.
Stars: ✭ 3,209 (+1392.56%)
Mutual labels: transformers
nn: 🧑‍🏫 50+ implementations/tutorials of deep learning papers with side-by-side notes 📝, including transformers (original, XL, Switch, Feedback, ViT, ...), optimizers (Adam, AdaBelief, ...), GANs (CycleGAN, StyleGAN2, ...), 🎮 reinforcement learning (PPO, DQN), CapsNet, distillation, ... 🧠
Stars: ✭ 5,720 (+2560.47%)
Mutual labels: transformers
gpl: Powerful unsupervised domain adaptation method for dense retrieval. Requires only an unlabeled corpus and yields substantial improvements: "GPL: Generative Pseudo Labeling for Unsupervised Domain Adaptation of Dense Retrieval" (https://arxiv.org/abs/2112.07577)
Stars: ✭ 216 (+0.47%)
Mutual labels: transformers
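GPL is packaged with a single training entry point. A sketch assuming the `gpl.train` signature from the repo README; the argument names follow that README as best recalled and should be checked against the current release:

```python
# Sketch of a GPL training run; argument names per the repo README,
# verify against the installed version before use.
import gpl

gpl.train(
    path_to_generated_data="generated/fiqa",  # cache for generated queries/negatives
    base_ckpt="distilbert-base-uncased",      # dense retriever to adapt
    batch_size_gpl=32,
    gpl_steps=140_000,
    output_dir="output/fiqa",
    generator="BeIR/query-gen-msmarco-t5-base-v1",
    retrievers=["msmarco-distilbert-base-v3", "msmarco-MiniLM-L-6-v3"],
    cross_encoder="cross-encoder/ms-marco-MiniLM-L-6-v2",
    evaluation_data="./fiqa",
    evaluation_output="evaluation/fiqa",
    do_evaluation=True,
)
```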