NLP Architect: A model library for exploring state-of-the-art deep learning topologies and techniques for optimizing Natural Language Processing neural networks.
Stars: ✭ 2,768 (+3118.6%)
Mutual labels: transformers
thermostat: Collection of NLP model explanations and accompanying analysis tools (see the loading sketch after this entry).
Stars: ✭ 126 (+46.51%)
Mutual labels: transformers
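As a quick taste of how such precomputed explanations are typically pulled in, here is a minimal sketch; the `thermostat.load()` entry point and the "imdb-bert-lig" config name follow my reading of the project's README, so treat both as assumptions rather than a verified API:

```python
import thermostat

# Config string = dataset + model + explainer; "imdb-bert-lig" (IMDb, BERT,
# Layer Integrated Gradients) is assumed here and may differ in your version.
data = thermostat.load("imdb-bert-lig")
instance = data[0]  # one example: input text plus per-token attribution scores
```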
jax-models: Unofficial JAX implementations of deep learning research papers.
Stars: ✭ 108 (+25.58%)
Mutual labels: transformers
COCO-LM: [NeurIPS 2021] "COCO-LM: Correcting and Contrasting Text Sequences for Language Model Pretraining".
Stars: ✭ 109 (+26.74%)
Mutual labels: transformers
gpl: A powerful unsupervised domain adaptation method for dense retrieval. It requires only an unlabeled corpus and yields large gains: "GPL: Generative Pseudo Labeling for Unsupervised Domain Adaptation of Dense Retrieval", https://arxiv.org/abs/2112.07577
Stars: ✭ 216 (+151.16%)
Mutual labels: transformers
KB-ALBERT: A Korean ALBERT model specialized for the economics/finance domain, provided by KB Kookmin Bank.
Stars: ✭ 215 (+150%)
Mutual labels: transformers
Nn: 🧑‍🏫 50+ implementations/tutorials of deep learning papers with side-by-side notes 📝, including transformers (original, XL, Switch, Feedback, ViT, ...), optimizers (Adam, AdaBelief, ...), GANs (CycleGAN, StyleGAN2, ...), 🎮 reinforcement learning (PPO, DQN), CapsNet, distillation, ... 🧠
Stars: ✭ 5,720 (+6551.16%)
Mutual labels: transformers
Basic-UI-for-GPT-J-6B-with-low-vram: A repository for running GPT-J-6B on low-VRAM machines (minimum 4.2 GB VRAM for a 2000-token context, 3.5 GB for a 1000-token context); loading the model requires 12 GB of free RAM (a generic low-memory loading sketch follows this entry).
Stars: ✭ 90 (+4.65%)
Mutual labels: transformers
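The low-VRAM approach in repositories like this usually comes down to never materializing the full fp32 weights. As a hedged illustration, not this repository's exact loading code, the standard Hugging Face API can load GPT-J in half precision:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-j-6B")
# fp16 weights halve the memory footprint versus fp32; low_cpu_mem_usage
# avoids building a second full-size copy of the weights in host RAM.
model = AutoModelForCausalLM.from_pretrained(
    "EleutherAI/gpt-j-6B",
    torch_dtype=torch.float16,
    low_cpu_mem_usage=True,
)
```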
SnowflakeNet: (TPAMI 2022) Snowflake Point Deconvolution for Point Cloud Completion and Generation with Skip-Transformer.
Stars: ✭ 74 (-13.95%)
Mutual labels: transformers
TransQuest: Transformer-based translation quality estimation.
Stars: ✭ 85 (-1.16%)
Mutual labels: transformers
KoBERT-Transformers: KoBERT on 🤗 Hugging Face Transformers 🤗 (with bug fixes).
Stars: ✭ 162 (+88.37%)
Mutual labels: transformers
naru: Neural Relation Understanding; neural cardinality estimators for tabular data.
Stars: ✭ 76 (-11.63%)
Mutual labels: transformers
nlp-papers: Must-read papers on Natural Language Processing (NLP).
Stars: ✭ 87 (+1.16%)
Mutual labels: transformers
Transformer-in-PyTorch: Transformer/Transformer-XL/R-Transformer examples and explanations (a minimal attention sketch follows this entry).
Stars: ✭ 21 (-75.58%)
Mutual labels: transformers
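For readers skimming this list, the core operation those examples revolve around fits in a few lines. This is a generic scaled dot-product attention sketch, not code taken from the repository:

```python
import math
import torch

def scaled_dot_product_attention(q, k, v, mask=None):
    # q, k, v: (batch, heads, seq_len, head_dim)
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))
    return torch.softmax(scores, dim=-1) @ v

q = k = v = torch.randn(1, 8, 16, 64)
out = scaled_dot_product_attention(q, k, v)  # shape: (1, 8, 16, 64)
```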
Pytorch Sentiment Analysis: Tutorials on getting started with PyTorch and TorchText for sentiment analysis (a quick pipeline baseline follows this entry).
Stars: ✭ 3,209 (+3631.4%)
Mutual labels: transformers
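The tutorials build models from scratch; for a one-call baseline to compare them against (again, not the tutorials' own code), the Hugging Face pipeline API works:

```python
from transformers import pipeline

# Downloads a default sentiment model on first use
# (a DistilBERT fine-tuned on SST-2).
classifier = pipeline("sentiment-analysis")
print(classifier("These tutorials finally made attention click for me."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```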
LIT: [AAAI 2022] The official PyTorch implementation of "Less is More: Pay Less Attention in Vision Transformers".
Stars: ✭ 79 (-8.14%)
Mutual labels: transformers
clip-italian: CLIP (Contrastive Language–Image Pre-training) for Italian.
Stars: ✭ 113 (+31.4%)
Mutual labels: transformers
text: Using Transformers from Hugging Face in R.
Stars: ✭ 66 (-23.26%)
Mutual labels: transformers
STAM-pytorch: Implementation of STAM (Space Time Attention Model), a pure and simple attention model that reaches SOTA for video classification.
Stars: ✭ 109 (+26.74%)
Mutual labels: transformers