
lucidrains / transganformer

License: MIT
Implementation of TransGanFormer, an all-attention GAN that combines the findings from the recent GANsformer and TransGAN papers

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives to or similar to transganformer

long-short-transformer
Implementation of Long-Short Transformer, combining local and global inductive biases for attention over long sequences, in Pytorch
Stars: ✭ 103 (-24.82%)
Mutual labels:  transformers, attention-mechanism
Vit Pytorch
Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in Pytorch
Stars: ✭ 7,199 (+5154.74%)
Mutual labels:  transformers, attention-mechanism
Ylg
[CVPR 2020] Official Implementation: "Your Local GAN: Designing Two Dimensional Local Attention Mechanisms for Generative Models".
Stars: ✭ 109 (-20.44%)
Mutual labels:  attention-mechanism, generative-adversarial-networks
Dalle Pytorch
Implementation / replication of DALL-E, OpenAI's Text to Image Transformer, in Pytorch
Stars: ✭ 3,661 (+2572.26%)
Mutual labels:  transformers, attention-mechanism
uniformer-pytorch
Implementation of Uniformer, a simple attention and 3d convolutional net that achieved SOTA in a number of video classification tasks, debuted in ICLR 2022
Stars: ✭ 90 (-34.31%)
Mutual labels:  transformers, attention-mechanism
STAM-pytorch
Implementation of STAM (Space Time Attention Model), a pure and simple attention model that reaches SOTA for video classification
Stars: ✭ 109 (-20.44%)
Mutual labels:  transformers, attention-mechanism
Reformer Pytorch
Reformer, the efficient Transformer, in Pytorch
Stars: ✭ 1,644 (+1100%)
Mutual labels:  transformers, attention-mechanism
RETRO-pytorch
Implementation of RETRO, Deepmind's Retrieval based Attention net, in Pytorch
Stars: ✭ 473 (+245.26%)
Mutual labels:  transformers, attention-mechanism
nuwa-pytorch
Implementation of NÜWA, state of the art attention network for text to video synthesis, in Pytorch
Stars: ✭ 347 (+153.28%)
Mutual labels:  transformers, attention-mechanism
keras attention
🔖 An Attention Layer in Keras
Stars: ✭ 43 (-68.61%)
Mutual labels:  attention-mechanism
attention-guided-sparsity
Attention-Based Guided Structured Sparsity of Deep Neural Networks
Stars: ✭ 26 (-81.02%)
Mutual labels:  attention-mechanism
Diverse-Structure-Inpainting
CVPR 2021: "Generating Diverse Structure for Image Inpainting With Hierarchical VQ-VAE"
Stars: ✭ 131 (-4.38%)
Mutual labels:  generative-adversarial-networks
knowledge-neurons
A library for finding knowledge neurons in pretrained transformer models.
Stars: ✭ 72 (-47.45%)
Mutual labels:  transformers
trapper
State-of-the-art NLP through transformer models in a modular design and consistent APIs.
Stars: ✭ 28 (-79.56%)
Mutual labels:  transformers
mSRGAN-A-GAN-for-single-image-super-resolution-on-high-content-screening-microscopy-images.
Generative Adversarial Network for single image super-resolution in high content screening microscopy images
Stars: ✭ 52 (-62.04%)
Mutual labels:  generative-adversarial-networks
vista-net
Code for the paper "VistaNet: Visual Aspect Attention Network for Multimodal Sentiment Analysis", AAAI'19
Stars: ✭ 67 (-51.09%)
Mutual labels:  attention-mechanism
TermiNetwork
🌏 A zero-dependency networking solution for building modern and secure iOS, watchOS, macOS and tvOS applications.
Stars: ✭ 80 (-41.61%)
Mutual labels:  transformers
HVT
[ICCV 2021] Official implementation of "Scalable Vision Transformers with Hierarchical Pooling"
Stars: ✭ 26 (-81.02%)
Mutual labels:  transformers
BangalASR
Transformer based Bangla Speech Recognition
Stars: ✭ 20 (-85.4%)
Mutual labels:  transformers
Retinal-Disease-Diagnosis-With-Residual-Attention-Networks
Using Residual Attention Networks to diagnose retinal diseases in medical images
Stars: ✭ 14 (-89.78%)
Mutual labels:  attention-mechanism

TransGanFormer (wip)

Implementation of TransGanFormer, an all-attention GAN that combines the findings from the recent GANsformer and TransGAN papers. It will also contain a bunch of tricks I have picked up while building transformers and GANs over the last year or so, including efficient linear attention and pixel-level attention.
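
The efficient linear attention mentioned above trades the usual softmax over pairwise query-key scores for separate normalizations of queries and keys, so a fixed-size key-value summary can be computed once and reused by every query, making the cost linear rather than quadratic in sequence length. Below is a minimal, hypothetical PyTorch sketch of that idea; the module name, dimensions, and use of einops are illustrative assumptions, not this package's actual code.

```python
import torch
from torch import nn
from einops import rearrange

class LinearAttention(nn.Module):
    # Hypothetical sketch of efficient linear attention, not transganformer's
    # actual module. Keys are aggregated into a (dim_head x dim_head) context
    # before meeting queries, so cost scales linearly with sequence length.
    def __init__(self, dim, heads=8, dim_head=64):
        super().__init__()
        inner_dim = heads * dim_head
        self.heads = heads
        self.scale = dim_head ** -0.5
        self.to_qkv = nn.Linear(dim, inner_dim * 3, bias=False)
        self.to_out = nn.Linear(inner_dim, dim)

    def forward(self, x):
        # x: (batch, seq_len, dim)
        q, k, v = self.to_qkv(x).chunk(3, dim=-1)
        q, k, v = map(lambda t: rearrange(t, 'b n (h d) -> b h n d', h=self.heads), (q, k, v))

        q = q.softmax(dim=-1) * self.scale   # normalize queries over the feature dim
        k = k.softmax(dim=-2)                # normalize keys over the sequence dim

        # summarize keys/values into a per-head (d x d) context, then read it out per query
        context = torch.einsum('b h n d, b h n e -> b h d e', k, v)
        out = torch.einsum('b h n d, b h d e -> b h n e', q, context)
        out = rearrange(out, 'b h n d -> b n (h d)')
        return self.to_out(out)
```

For example, `LinearAttention(dim=256)` applied to a tensor of shape `(2, 64, 256)` returns a tensor of the same shape.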

Install

$ pip install transganformer

Usage

$ transganformer --data ./path/to/data
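
The command above is the only documented entry point while the project is a work in progress. For orientation, the sketch below shows the kind of alternating discriminator/generator update a GAN trainer like this typically performs under a hinge loss; every name in it (`generator`, `discriminator`, `latent_dim`) is a stand-in, not part of transganformer's API.

```python
import torch
import torch.nn.functional as F

# Hypothetical training step under a hinge loss; the real trainer, losses,
# and architectures in transganformer may differ. generator/discriminator
# stand in for the package's all-attention networks.
def train_step(generator, discriminator, g_opt, d_opt, real_images, latent_dim=256):
    batch = real_images.shape[0]
    device = real_images.device

    # --- discriminator update: push real scores up, fake scores down ---
    d_opt.zero_grad()
    z = torch.randn(batch, latent_dim, device=device)
    fake_images = generator(z).detach()  # no gradient to G on this pass
    d_loss = F.relu(1. - discriminator(real_images)).mean() \
           + F.relu(1. + discriminator(fake_images)).mean()
    d_loss.backward()
    d_opt.step()

    # --- generator update: raise the discriminator's score on fresh fakes ---
    g_opt.zero_grad()
    z = torch.randn(batch, latent_dim, device=device)
    g_loss = -discriminator(generator(z)).mean()
    g_loss.backward()
    g_opt.step()

    return d_loss.item(), g_loss.item()
```

A real trainer would add data loading, checkpointing, and logging around a step like this.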

Citations

@misc{jiang2021transgan,
    title   = {TransGAN: Two Transformers Can Make One Strong GAN}, 
    author  = {Yifan Jiang and Shiyu Chang and Zhangyang Wang},
    year    = {2021},
    eprint  = {2102.07074},
    archivePrefix = {arXiv},
    primaryClass = {cs.CV}
}
@misc{hudson2021generative,
    title   = {Generative Adversarial Transformers}, 
    author  = {Drew A. Hudson and C. Lawrence Zitnick},
    year    = {2021},
    eprint  = {2103.01209},
    archivePrefix = {arXiv},
    primaryClass = {cs.CV}
}