Tokenizers: 💥 Fast State-of-the-Art Tokenizers optimized for Research and Production
Stars: ✭ 5,077 (+5541.11%)
Mutual labels: transformers, gpt
TransQuest: Transformer-based translation quality estimation
Stars: ✭ 85 (-5.56%)
Mutual labels: transformers
COCO-LM: [NeurIPS 2021] COCO-LM: Correcting and Contrasting Text Sequences for Language Model Pretraining
Stars: ✭ 109 (+21.11%)
Mutual labels: transformers
LIT: [AAAI 2022] Official PyTorch implementation of "Less is More: Pay Less Attention in Vision Transformers"
Stars: ✭ 79 (-12.22%)
Mutual labels: transformers
Fengshenbang-LM: Fengshenbang-LM (封神榜大模型) is an open-source large language model ecosystem led by the Cognitive Computing and Natural Language Research Center of the IDEA Research Institute, serving as infrastructure for Chinese AIGC and cognitive intelligence.
Stars: ✭ 1,813 (+1914.44%)
Mutual labels: transformers
react-advertising: A JavaScript library for display ads in React applications.
Stars: ✭ 50 (-44.44%)
Mutual labels: gpt
Nlp Architect: A model library for exploring state-of-the-art deep learning topologies and techniques for optimizing Natural Language Processing neural networks
Stars: ✭ 2,768 (+2975.56%)
Mutual labels: transformers
text: Using Transformers from HuggingFace in R
Stars: ✭ 66 (-26.67%)
Mutual labels: transformers
STAM-pytorch: Implementation of STAM (Space Time Attention Model), a pure and simple attention model that reaches SOTA for video classification
Stars: ✭ 109 (+21.11%)
Mutual labels: transformers
thermostat: Collection of NLP model explanations and accompanying analysis tools
Stars: ✭ 126 (+40%)
Mutual labels: transformers
SnowflakeNet: (TPAMI 2022) Snowflake Point Deconvolution for Point Cloud Completion and Generation with Skip-Transformer
Stars: ✭ 74 (-17.78%)
Mutual labels: transformers
finetune-gpt2xl: Guide to finetuning GPT2-XL (1.5 billion parameters) and GPT-Neo (2.7 billion parameters) on a single GPU with Hugging Face Transformers using DeepSpeed
Stars: ✭ 353 (+292.22%)
Mutual labels: gpt-neo
MultiOS-USB: Boot operating systems directly from ISO files
Stars: ✭ 106 (+17.78%)
Mutual labels: gpt
KoBERT-Transformers: KoBERT on 🤗 Huggingface Transformers 🤗 (with bug fixes)
Stars: ✭ 162 (+80%)
Mutual labels: transformers
jax-models: Unofficial JAX implementations of deep learning research papers
Stars: ✭ 108 (+20%)
Mutual labels: transformers
nlp-papers: Must-read papers on Natural Language Processing (NLP)
Stars: ✭ 87 (-3.33%)
Mutual labels: transformers
gpl: Unsupervised domain adaptation method for dense retrieval. Requires only an unlabeled corpus and yields large improvements: "GPL: Generative Pseudo Labeling for Unsupervised Domain Adaptation of Dense Retrieval" https://arxiv.org/abs/2112.07577
Stars: ✭ 216 (+140%)
Mutual labels: transformers
KB-ALBERT: A Korean ALBERT model specialized for the economics/finance domain, provided by KB Kookmin Bank
Stars: ✭ 215 (+138.89%)
Mutual labels: transformers
TabFormer: Code & data for "Tabular Transformers for Modeling Multivariate Time Series" (ICASSP 2021)
Stars: ✭ 209 (+132.22%)
Mutual labels: gpt
Transformer-in-PyTorch: Transformer/Transformer-XL/R-Transformer examples and explanations
Stars: ✭ 21 (-76.67%)
Mutual labels: transformers