AntixK / PyTorch-VAE

License: Apache-2.0
A Collection of Variational Autoencoders (VAE) in PyTorch.

Programming Languages

python

Projects that are alternatives to or similar to PyTorch-VAE

benchmark_VAE
Unifying Variational Autoencoder (VAE) implementations in PyTorch (NeurIPS 2022)
Stars: ✭ 1,211 (-55.21%)
Mutual labels:  reproducible-research, vae, beta-vae, vae-implementation
continuous Bernoulli
C programs for the simulator, transformation, and test statistic of the continuous Bernoulli distribution. It also covers the continuous Binomial and continuous Trinomial distributions.
Stars: ✭ 22 (-99.19%)
Mutual labels:  vae, variational-autoencoders, vae-implementation
vae-concrete
Keras implementation of a Variational Auto Encoder with a Concrete Latent Distribution
Stars: ✭ 51 (-98.11%)
Mutual labels:  vae, gumbel-softmax
probabilistic nlg
Tensorflow Implementation of Stochastic Wasserstein Autoencoder for Probabilistic Sentence Generation (NAACL 2019).
Stars: ✭ 28 (-98.96%)
Mutual labels:  vae, wae
Generative-Model
Repository for implementation of generative models with Tensorflow 1.x
Stars: ✭ 66 (-97.56%)
Mutual labels:  vae, beta-vae
nvae
An unofficial toy implementation of NVAE, "A Deep Hierarchical Variational Autoencoder"
Stars: ✭ 83 (-96.93%)
Mutual labels:  vae, pytorch-implementation
Alae
[CVPR2020] Adversarial Latent Autoencoders
Stars: ✭ 3,178 (+17.53%)
Mutual labels:  paper-implementations, pytorch-implementation
sqair
Implementation of Sequential Attend, Infer, Repeat (SQAIR)
Stars: ✭ 96 (-96.45%)
Mutual labels:  vae, iwae
VAE-Gumbel-Softmax
An implementation of a Variational Autoencoder using the Gumbel-Softmax reparametrization trick (ICLR 2017) in TensorFlow (tested on r1.5, CPU and GPU).
Stars: ✭ 66 (-97.56%)
Mutual labels:  vae, gumbel-softmax
ladder-vae-pytorch
Ladder Variational Autoencoders (LVAE) in PyTorch
Stars: ✭ 59 (-97.82%)
Mutual labels:  vae, variational-autoencoders
Disentangling Vae
Experiments for understanding disentanglement in VAE latent representations
Stars: ✭ 398 (-85.28%)
Mutual labels:  vae, reproducible-research
Android Clean Architecture Boilerplate
Apply clean architecture on Android
Stars: ✭ 141 (-94.79%)
Mutual labels:  architecture
Mosby Conductor
Plugin for conductor to integrate Mosby
Stars: ✭ 134 (-95.04%)
Mutual labels:  architecture
Deep Learning With Python
Example projects I completed to understand Deep Learning techniques with TensorFlow. Please note that I no longer maintain this repository.
Stars: ✭ 134 (-95.04%)
Mutual labels:  vae
Go Clean Architecture
👨‍💻 REST API example, built by following Uncle Bob’s clean architecture principles
Stars: ✭ 133 (-95.08%)
Mutual labels:  architecture
Beat Blender
Blend beats using machine learning to create music in a fun new way.
Stars: ✭ 147 (-94.56%)
Mutual labels:  vae
Cleanarchitecture.workerservice
A solution template using Clean Architecture for building a .NET Core Worker Service.
Stars: ✭ 142 (-94.75%)
Mutual labels:  architecture
Solution Architecture Patterns
Reusable, vendor-neutral, industry-specific, vendor-specific solution architecture patterns for enterprise
Stars: ✭ 2,541 (-6.03%)
Mutual labels:  architecture
Swift Cleanarchitecture
Simple Swift project applying concepts inspired by Clean Architecture
Stars: ✭ 133 (-95.08%)
Mutual labels:  architecture
Armscomponent
📦 A complete Android componentization solution, powered by MVPArms (the official MVPArms rapid componentization solution).
Stars: ✭ 1,664 (-38.46%)
Mutual labels:  architecture

PyTorch VAE

A collection of Variational AutoEncoders (VAEs) implemented in PyTorch, with a focus on reproducibility. The aim of this project is to provide a quick and simple working example for many of the cool VAE models out there. All the models are trained on the CelebA dataset for consistency and comparison. The architectures of all the models are kept as similar as possible, with the same layers, except where the original paper necessitates a radically different architecture (e.g., VQ-VAE uses residual layers and no batch normalization, unlike the other models). The results of each model are shown below.
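
For orientation, the objective that every model here extends is the standard VAE loss: a reconstruction term plus a KL-divergence term between the approximate posterior and the prior, scaled by a weighting coefficient. A minimal sketch in PyTorch (illustrative only; the function name and the kld_weight argument are not from this repo):

import torch
import torch.nn.functional as F

def vae_loss(recons, x, mu, log_var, kld_weight=1.0):
    # Reconstruction term: how closely the decoder output matches the input.
    recons_loss = F.mse_loss(recons, x)
    # Closed-form KL divergence between N(mu, diag(exp(log_var))) and N(0, I),
    # summed over latent dimensions and averaged over the batch.
    kld_loss = torch.mean(
        -0.5 * torch.sum(1 + log_var - mu ** 2 - log_var.exp(), dim=1)
    )
    return recons_loss + kld_weight * kld_loss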

Requirements

  • Python >= 3.5
  • PyTorch >= 1.3
  • PyTorch Lightning >= 0.6.0 (GitHub Repo)
  • CUDA-enabled computing device

Installation

$ git clone https://github.com/AntixK/PyTorch-VAE
$ cd PyTorch-VAE
$ pip install -r requirements.txt

Usage

$ cd PyTorch-VAE
$ python run.py -c configs/<config-file-name.yaml>
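
The model classes can also be used directly from Python. A minimal sketch, assuming the models package exports VanillaVAE with the constructor arguments shown in the config template below; the return convention of forward and the sample method are assumptions, so check the model source:

import torch
from models import VanillaVAE  # assumed export; see models/__init__.py

model = VanillaVAE(in_channels=3, latent_dim=128)  # hyperparameters are illustrative

x = torch.randn(16, 3, 64, 64)        # a fake batch of 64x64 RGB images
recons, inp, mu, log_var = model(x)   # assumed return: [reconstruction, input, mu, log_var]
samples = model.sample(16, x.device)  # assumed API: draw 16 images from the prior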

Config file template

model_params:
  name: "<name of VAE model>"
  in_channels: 3
  latent_dim: 
    .         # Other parameters required by the model
    .
    .

exp_params:
  data_path: "<path to the celebA dataset>"
  img_size: 64    # Models are designed to work for this size
  batch_size: 64  # Better to have a square number
  LR: 0.005
  weight_decay:
    .         # Other arguments required for training, like scheduler etc.
    .
    .

trainer_params:
  gpus: 1         
  max_nb_epochs: 50
  gradient_clip_val: 1.5
    .
    .
    .

logging_params:
  save_dir: "logs/"
  name: "<experiment name>"
  manual_seed: 
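
For concreteness, a hypothetical filled-in config for the vanilla VAE could look as follows; every value here is illustrative, and the exact parameters each model expects are defined in its file under configs/:

model_params:
  name: "VanillaVAE"
  in_channels: 3
  latent_dim: 128            # illustrative value

exp_params:
  data_path: "Data/celeba/"  # illustrative path to the CelebA dataset
  img_size: 64
  batch_size: 64
  LR: 0.005
  weight_decay: 0.0

trainer_params:
  gpus: 1
  max_nb_epochs: 50
  gradient_clip_val: 1.5

logging_params:
  save_dir: "logs/"
  name: "VanillaVAE"
  manual_seed: 42            # illustrative seed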

View TensorBoard Logs

$ cd logs/<experiment name>/version_<the version you want>
$ tensorboard --logdir tf

Results

The Reconstruction and Samples columns of the original table are images rendered in the repository's README and are omitted here; each entry links to the model's code, its config, and the paper it implements:

  • VAE (Code, Config): Link
  • Conditional VAE (Code, Config): Link
  • WAE - MMD (RBF Kernel) (Code, Config): Link
  • WAE - MMD (IMQ Kernel) (Code, Config): Link
  • Beta-VAE (Code, Config): Link
  • Disentangled Beta-VAE (Code, Config): Link
  • Beta-TC-VAE (Code, Config): Link
  • IWAE (K = 5) (Code, Config): Link
  • MIWAE (K = 5, M = 3) (Code, Config): Link
  • DFCVAE (Code, Config): Link
  • MSSIM VAE (Code, Config): Link
  • Categorical VAE (Code, Config): Link
  • Joint VAE (Code, Config): Link
  • Info VAE (Code, Config): Link
  • LogCosh VAE (Code, Config): Link
  • SWAE (200 Projections) (Code, Config): Link
  • VQ-VAE (K = 512, D = 64) (Code, Config): Link (samples: N/A)
  • DIP VAE (Code, Config): Link

Contributing

If you have trained a better model using these implementations by fine-tuning the hyperparameters in the config file, I would be happy to include your results (along with your config file) in this repo, citing your name 😊.

Additionally, if you would like to contribute some models, please submit a PR.

License

Apache License 2.0

  • Permissions: Commercial use, Modification, Distribution, Patent use, Private use
  • Limitations: Trademark use, Liability, Warranty
  • Conditions: License and copyright notice, State changes

Citation

@misc{Subramanian2020,
  author = {Subramanian, A.K},
  title = {PyTorch-VAE},
  year = {2020},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/AntixK/PyTorch-VAE}}
}
