
cyk1337 / Transformer-in-PyTorch

License: Apache-2.0
Transformer/Transformer-XL/R-Transformer examples and explanations

Programming Languages

Python
139,335 projects - #7 most used programming language

Projects that are alternatives of or similar to Transformer-in-PyTorch

robustness-vit
Contains code for the paper "Vision Transformers are Robust Learners" (AAAI 2022).
Stars: ✭ 78 (+271.43%)
Mutual labels:  transformers, self-attention
iPerceive
Applying Common-Sense Reasoning to Multi-Modal Dense Video Captioning and Video Question Answering | Python3 | PyTorch | CNNs | Causality | Reasoning | LSTMs | Transformers | Multi-Head Self Attention | Published in IEEE Winter Conference on Applications of Computer Vision (WACV) 2021
Stars: ✭ 52 (+147.62%)
Mutual labels:  transformers, self-attention
Nlp Architect
A model library for exploring state-of-the-art deep learning topologies and techniques for optimizing Natural Language Processing neural networks
Stars: ✭ 2,768 (+13080.95%)
Mutual labels:  transformers
naru
Neural Relation Understanding: neural cardinality estimators for tabular data
Stars: ✭ 76 (+261.9%)
Mutual labels:  transformers
gpl
Powerful unsupervised domain adaptation method for dense retrieval. Requires only unlabeled corpus and yields massive improvement: "GPL: Generative Pseudo Labeling for Unsupervised Domain Adaptation of Dense Retrieval" https://arxiv.org/abs/2112.07577
Stars: ✭ 216 (+928.57%)
Mutual labels:  transformers
COCO-LM
[NeurIPS 2021] COCO-LM: Correcting and Contrasting Text Sequences for Language Model Pretraining
Stars: ✭ 109 (+419.05%)
Mutual labels:  transformers
MASTER-pytorch
Code for the paper "MASTER: Multi-Aspect Non-local Network for Scene Text Recognition" (Pattern Recognition 2021)
Stars: ✭ 263 (+1152.38%)
Mutual labels:  self-attention
Nn
🧑‍🏫 50+ implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, XL, Switch, Feedback, ViT, ...), optimizers (Adam, AdaBelief, ...), GANs (CycleGAN, StyleGAN2, ...), 🎮 reinforcement learning (PPO, DQN), CapsNet, distillation, ... 🧠
Stars: ✭ 5,720 (+27138.1%)
Mutual labels:  transformers
TransQuest
Transformer-based translation quality estimation
Stars: ✭ 85 (+304.76%)
Mutual labels:  transformers
query-selector
Long-term series forecasting with Query Selector – an efficient model of sparse attention
Stars: ✭ 63 (+200%)
Mutual labels:  self-attention
KB-ALBERT
A Korean ALBERT model specialized for the economics/finance domain, provided by KB Kookmin Bank
Stars: ✭ 215 (+923.81%)
Mutual labels:  transformers
Transformer-Implementations
Library - Vanilla, ViT, DeiT, BERT, GPT
Stars: ✭ 34 (+61.9%)
Mutual labels:  transformers
seq2seq-pytorch
Sequence to Sequence Models in PyTorch
Stars: ✭ 41 (+95.24%)
Mutual labels:  self-attention
thermostat
Collection of NLP model explanations and accompanying analysis tools
Stars: ✭ 126 (+500%)
Mutual labels:  transformers
nlp-papers
Must-read papers on Natural Language Processing (NLP)
Stars: ✭ 87 (+314.29%)
Mutual labels:  transformers
R-MeN
Transformer-based Memory Networks for Knowledge Graph Embeddings (ACL 2020) (Pytorch and Tensorflow)
Stars: ✭ 74 (+252.38%)
Mutual labels:  self-attention
Pytorch Sentiment Analysis
Tutorials on getting started with PyTorch and TorchText for sentiment analysis.
Stars: ✭ 3,209 (+15180.95%)
Mutual labels:  transformers
Fengshenbang-LM
Fengshenbang-LM (封神榜大模型) is an open-source large-model ecosystem led by the Cognitive Computing and Natural Language Research Center at IDEA, serving as infrastructure for Chinese AIGC and cognitive intelligence.
Stars: ✭ 1,813 (+8533.33%)
Mutual labels:  transformers
SnowflakeNet
(TPAMI 2022) Snowflake Point Deconvolution for Point Cloud Completion and Generation with Skip-Transformer
Stars: ✭ 74 (+252.38%)
Mutual labels:  transformers
jax-models
Unofficial JAX implementations of deep learning research papers
Stars: ✭ 108 (+414.29%)
Mutual labels:  transformers

Transformer-in-PyTorch

GitHub

  • This repo contains PyTorch implementations of the Transformer and its variants (Transformer / Transformer-XL / R-Transformer); a minimal sketch of the attention core they share follows below. PRs are welcome.
  • If you are unfamiliar with the Transformer and its variants, refer to my blog post: transformer explanation.
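All three variants build on scaled dot-product attention, so a minimal PyTorch sketch of that core operation is shown here for orientation. This is an illustrative standalone function with assumed tensor shapes, not the repo's actual API:

import math
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v, mask=None):
    # q, k, v: (batch, heads, seq_len, d_k). Shapes are illustrative.
    # Scale the dot products by sqrt(d_k) to keep softmax gradients stable.
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)  # (batch, heads, len_q, len_k)
    if mask is not None:
        # Masked positions (mask == 0) get -inf so softmax assigns them zero weight.
        scores = scores.masked_fill(mask == 0, float("-inf"))
    attn = F.softmax(scores, dim=-1)
    return attn @ v, attn

# Toy usage: batch=2, heads=8, seq_len=10, d_k=64.
q = k = v = torch.randn(2, 8, 10, 64)
out, attn = scaled_dot_product_attention(q, k, v)
print(out.shape)  # torch.Size([2, 8, 10, 64])

The variants differ in what they feed into this op: Transformer-XL adds segment-level recurrence with relative positional encodings, while R-Transformer combines local RNNs with multi-head attention; see the repo and the blog post above for those details.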

Citation

For attribution in academic contexts, please cite this work as:

@misc{chai2019-transformer-in-pytorch,
  author = {Chai, Yekun},
  title = {Transformer-in-PyTorch},
  year = {2019},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/cyk1337/Transformer-in-PyTorch}}
}

@misc{chai2019attn-summary,
  author = {Chai, Yekun},
  title = {{Attention in a Nutshell}},
  year = {2019},
  howpublished = {\url{http://cyk1337.github.io/notes/2019/01/22/NLP/Attention-in-a-nutshell/}},
}
