
akanimax / attn_gan_pytorch

License: MIT license
Python package for a self-attention GAN implemented as an extension of PyTorch nn.Module. Paper: https://arxiv.org/abs/1805.08318


Projects that are alternatives of or similar to attn_gan_pytorch

TensorMONK
A collection of deep learning models (PyTorch implementation)
Stars: ✭ 21 (+31.25%)
Mutual labels:  sagan

Advanced Models
Provides several well-known neural network models (DCGAN, VAE, ResNet, etc.)
Stars: ✭ 48 (+200%)
Mutual labels:  sagan

attn_gan_pytorch

Python package for a self-attention GAN implemented as an extension of PyTorch nn.Module. Paper: https://arxiv.org/abs/1805.08318

It also includes generic layers for image-based attention mechanisms, including a Full-Attention layer as proposed in another project of mine, the Fagan project (see below).
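To make the idea concrete, here is a minimal sketch of a SAGAN-style self-attention block written directly against torch.nn. It illustrates what such a layer computes; the actual class names and signatures in attn_gan_pytorch may differ, so treat SelfAttention below as a hypothetical stand-in rather than the package's API.

import torch
import torch.nn as nn


class SelfAttention(nn.Module):
    """SAGAN-style self-attention over the spatial positions of a feature map."""

    def __init__(self, in_channels):
        super().__init__()
        # 1x1 convolutions produce the query, key and value projections
        self.query = nn.Conv2d(in_channels, in_channels // 8, kernel_size=1)
        self.key = nn.Conv2d(in_channels, in_channels // 8, kernel_size=1)
        self.value = nn.Conv2d(in_channels, in_channels, kernel_size=1)
        # learnable residual gate, initialised to 0 so the block starts as an identity
        self.gamma = nn.Parameter(torch.zeros(1))

    def forward(self, x):
        b, c, h, w = x.shape
        n = h * w  # number of spatial positions
        q = self.query(x).view(b, -1, n)   # (B, C//8, N)
        k = self.key(x).view(b, -1, n)     # (B, C//8, N)
        v = self.value(x).view(b, -1, n)   # (B, C, N)
        # attention weights between every pair of spatial positions
        attn = torch.softmax(torch.bmm(q.transpose(1, 2), k), dim=-1)  # (B, N, N)
        out = torch.bmm(v, attn.transpose(1, 2)).view(b, c, h, w)
        return self.gamma * out + x

Since the attention map is N x N over the H*W spatial positions, memory grows quadratically with the feature-map resolution, which is why such blocks are usually inserted at intermediate feature maps rather than at full image resolution.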

Installation:

This is a Python package available on pypi.org, so installation is fairly straightforward. The package depends on a suitable GPU build of torch and torchvision for your architecture, so please install PyTorch before installing this package. Follow the instructions at pytorch.org to install the right version of PyTorch for your setup.
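For example, a default install of the prerequisites can look like the command below; the exact command (and wheel index URL) for a specific CUDA version should be taken from pytorch.org:

$ pip install torch torchvision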

Then install the package itself with the following commands:

$ workon [your virtual environment] 
$ pip install attn-gan-pytorch

CelebA Samples:

Some CelebA samples generated using this code for the fagan architecture:

[image: generated samples]

Head over to the Fagan project repo for more info!

This repo also contains code that uses the package to build the SAGAN architecture described in the paper. Please refer to the samples/ directory for this.
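As a rough illustration of where the attention block sits in such an architecture, here is a sketch of a small DCGAN-style generator with self-attention applied at the 16x16 feature map. It reuses the hypothetical SelfAttention module from the sketch above; the actual network definitions in samples/ will differ in detail.

import torch
import torch.nn as nn


def up_block(in_ch, out_ch):
    """Upsample 2x with a transposed convolution, BatchNorm and ReLU."""
    return nn.Sequential(
        nn.ConvTranspose2d(in_ch, out_ch, kernel_size=4, stride=2, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )


generator = nn.Sequential(
    nn.ConvTranspose2d(128, 512, kernel_size=4),  # 1x1 latent -> 4x4
    nn.BatchNorm2d(512),
    nn.ReLU(inplace=True),
    up_block(512, 256),                           # 4x4   -> 8x8
    up_block(256, 128),                           # 8x8   -> 16x16
    SelfAttention(128),                           # attention at 16x16
    up_block(128, 64),                            # 16x16 -> 32x32
    nn.ConvTranspose2d(64, 3, kernel_size=4, stride=2, padding=1),  # -> 64x64
    nn.Tanh(),
)

z = torch.randn(4, 128, 1, 1)  # batch of latent vectors
fake = generator(z)            # (4, 3, 64, 64) images in [-1, 1]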

Thanks

Please feel free to open PRs here if you train on other datasets using this package. Suggestions / Issues / Contributions are most welcome.

Best regards,
@akanimax :)
