Tensorflow Mnist Cvae: Tensorflow implementation of a conditional variational auto-encoder for MNIST
Stars: ✭ 139 (-39.3%)
Disentangling Vae: Experiments for understanding disentanglement in VAE latent representations
Stars: ✭ 398 (+73.8%)
VAE-Gumbel-Softmax: An implementation of a variational autoencoder using the Gumbel-Softmax reparametrization trick (ICLR 2017) in TensorFlow, tested on r1.5 CPU and GPU.
Stars: ✭ 66 (-71.18%)
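The Gumbel-Softmax trick this repository implements fits in a few lines. The following NumPy sketch is illustrative only (not code from the repository); the function name and temperature value are my own choices.

```python
import numpy as np

def gumbel_softmax_sample(logits, temperature=1.0, rng=None):
    """Draw a differentiable approximation of a one-hot categorical sample.

    Adds Gumbel(0, 1) noise to the logits and applies a tempered softmax;
    as temperature -> 0 the output approaches a one-hot vector.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    # Gumbel(0, 1) noise via the inverse-CDF transform of Uniform(0, 1).
    u = rng.uniform(size=np.shape(logits))
    gumbel = -np.log(-np.log(u + 1e-20) + 1e-20)
    y = (np.asarray(logits) + gumbel) / temperature
    y = y - y.max()  # subtract max for numerical stability
    e = np.exp(y)
    return e / e.sum()

sample = gumbel_softmax_sample(np.array([1.0, 2.0, 3.0]), temperature=0.5)
```

Lowering the temperature sharpens the sample toward one-hot while keeping it differentiable with respect to the logits, which is what lets the categorical latent be trained with ordinary backpropagation.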
Tensorflow Mnist Vae: Tensorflow implementation of a variational auto-encoder for MNIST
Stars: ✭ 422 (+84.28%)
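The VAE implementations in this list share two core mechanics: the reparameterization trick and the closed-form Gaussian KL term of the ELBO. A minimal NumPy sketch (illustrative only; the names are my own, not any repository's API):

```python
import numpy as np

def reparameterize(mu, log_var, rng=None):
    """Sample z = mu + sigma * eps with eps ~ N(0, I), so that the randomness
    stays in eps and gradients can flow through mu and log_var."""
    if rng is None:
        rng = np.random.default_rng(0)
    eps = rng.standard_normal(np.shape(mu))
    return mu + np.exp(0.5 * np.asarray(log_var)) * eps

def kl_to_standard_normal(mu, log_var):
    """Closed-form KL(N(mu, sigma^2) || N(0, I)), the regularizer in the ELBO."""
    return -0.5 * np.sum(1.0 + log_var - np.square(mu) - np.exp(log_var))

mu, log_var = np.zeros(4), np.zeros(4)
z = reparameterize(mu, log_var)
```

When the approximate posterior equals the prior (mu = 0, log_var = 0) the KL term is exactly zero, which is a quick sanity check for any VAE loss implementation.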
Cada Vae Pytorch: Official implementation of the paper "Generalized Zero- and Few-Shot Learning via Aligned Variational Autoencoders" (CVPR 2019)
Stars: ✭ 198 (-13.54%)
Vq Vae: Minimalist implementation of VQ-VAE in PyTorch
Stars: ✭ 224 (-2.18%)
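The central operation in VQ-VAE is snapping each encoder output to its nearest codebook vector. A hedged NumPy sketch of that lookup (illustrative, not the repository's code):

```python
import numpy as np

def quantize(z, codebook):
    """Replace each latent vector in z with its nearest codebook entry
    (squared Euclidean distance); return the quantized vectors and indices."""
    # Pairwise squared distances, shape (num_latents, num_codes).
    d = ((z[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
    idx = d.argmin(axis=1)
    return codebook[idx], idx

codebook = np.array([[0.0, 0.0], [1.0, 1.0]])
z = np.array([[0.1, 0.1], [0.9, 0.8]])
zq, idx = quantize(z, codebook)
```

In training, the argmin itself is non-differentiable, so VQ-VAE passes the decoder's gradient straight through to the encoder and learns the codebook with separate commitment and codebook losses.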
Pytorch Vae: A CNN Variational Autoencoder (CNN-VAE) implemented in PyTorch
Stars: ✭ 181 (-20.96%)
soft-intro-vae-pytorch: [CVPR 2021 Oral] Official PyTorch implementation of Soft-IntroVAE from the paper "Soft-IntroVAE: Analyzing and Improving Introspective Variational Autoencoders"
Stars: ✭ 170 (-25.76%)
benchmark VAE: Unifying Variational Autoencoder (VAE) implementations in PyTorch (NeurIPS 2022)
Stars: ✭ 1,211 (+428.82%)
Bagel: IPCCC 2018: Robust and Unsupervised KPI Anomaly Detection Based on Conditional Variational Autoencoder
Stars: ✭ 45 (-80.35%)
playing with vae: Comparing FC VAE / FCN VAE / PCA / UMAP on MNIST / FMNIST
Stars: ✭ 53 (-76.86%)
Vae protein function: Protein function prediction using a variational autoencoder
Stars: ✭ 57 (-75.11%)
S Vae Tf: Tensorflow implementation of Hyperspherical Variational Auto-Encoders
Stars: ✭ 198 (-13.54%)
Deep Learning With Python: Example projects I completed to understand deep learning techniques with Tensorflow. Please note that I no longer maintain this repository.
Stars: ✭ 134 (-41.48%)
MIDI-VAE: No description or website provided.
Stars: ✭ 56 (-75.55%)
tensorflow-mnist-AAE: Tensorflow implementation of an adversarial auto-encoder for MNIST
Stars: ✭ 86 (-62.45%)
srVAE: VAE with RealNVP prior and Super-Resolution VAE in PyTorch. Code release for https://arxiv.org/abs/2006.05218.
Stars: ✭ 56 (-75.55%)
haskell-vae: Learning about Haskell with variational autoencoders
Stars: ✭ 18 (-92.14%)
Vae Tensorflow: A Tensorflow implementation of a Variational Autoencoder for the deep learning course at the University of Southern California (USC).
Stars: ✭ 117 (-48.91%)
Pytorch Rl: This repository contains model-free deep reinforcement learning algorithms implemented in PyTorch
Stars: ✭ 394 (+72.05%)
S Vae Pytorch: PyTorch implementation of Hyperspherical Variational Auto-Encoders
Stars: ✭ 255 (+11.35%)
Awesome Vaes: A curated list of awesome work on VAEs, disentanglement, representation learning, and generative models.
Stars: ✭ 418 (+82.53%)
Variational Autoencoder: Variational autoencoder implemented in TensorFlow and PyTorch (including inverse autoregressive flow)
Stars: ✭ 807 (+252.4%)
classifying-vae-lstm: Music generation with a classifying variational autoencoder (VAE) and LSTM
Stars: ✭ 27 (-88.21%)
continuous Bernoulli: C programs for the simulator, transformation, and test statistic of the continuous Bernoulli distribution; the accompanying material also covers the continuous binomial and continuous trinomial distributions.
Stars: ✭ 22 (-90.39%)
Vae For Image Generation: Variational autoencoder generative model implemented in Keras for image generation and latent-space visualization on the MNIST and CIFAR10 datasets
Stars: ✭ 87 (-62.01%)
Smrt: Handle class imbalance intelligently by using variational auto-encoders to generate synthetic observations of your minority class.
Stars: ✭ 102 (-55.46%)
Deep Generative Models: Deep generative models implemented with TensorFlow 2.0, e.g. Restricted Boltzmann Machine (RBM), Deep Belief Network (DBN), Deep Boltzmann Machine (DBM), Convolutional Variational Auto-Encoder (CVAE), Convolutional Generative Adversarial Network (CGAN)
Stars: ✭ 34 (-85.15%)
Fun-with-MNIST: Playing with MNIST: machine learning and generative models.
Stars: ✭ 23 (-89.96%)
vae-concrete: Keras implementation of a variational autoencoder with a Concrete latent distribution
Stars: ✭ 51 (-77.73%)
VAE-Latent-Space-Explorer: Interactive exploration of an MNIST variational autoencoder latent space with React and TensorFlow.js.
Stars: ✭ 30 (-86.9%)
pyroVED: Invariant representation learning from imaging and spectral data
Stars: ✭ 23 (-89.96%)
Mojitalk: Code for "MojiTalk: Generating Emotional Responses at Scale" https://arxiv.org/abs/1711.04090
Stars: ✭ 107 (-53.28%)
Tf Vqvae: Tensorflow implementation of the paper [Neural Discrete Representation Learning](https://arxiv.org/abs/1711.00937) (VQ-VAE).
Stars: ✭ 226 (-1.31%)
Beat Blender: Blend beats using machine learning to create music in a fun new way.
Stars: ✭ 147 (-35.81%)
Adversarial video summary: Unofficial PyTorch implementation of SUM-GAN from "Unsupervised Video Summarization with Adversarial LSTM Networks" (CVPR 2017)
Stars: ✭ 187 (-18.34%)
Vae Seq: Variational auto-encoders in a sequential setting.
Stars: ✭ 145 (-36.68%)
Synthesize3dviadepthorsil: [CVPR 2017] Generation and reconstruction of 3D shapes via modeling multi-view depth maps or silhouettes
Stars: ✭ 141 (-38.43%)
Tensorflow Mnist Cnn: MNIST classification using a convolutional neural network. Various techniques such as data augmentation, dropout, and batch normalization are implemented.
Stars: ✭ 182 (-20.52%)
Nnpulearning: Non-negative Positive-Unlabeled (nnPU) and unbiased Positive-Unlabeled (uPU) learning reproduction code on MNIST and CIFAR10
Stars: ✭ 181 (-20.96%)
Vmf vae nlp: Code for the EMNLP 2018 paper "Spherical Latent Spaces for Stable Variational Autoencoders"
Stars: ✭ 140 (-38.86%)
Mnist draw: A sample project demonstrating the use of Keras (Tensorflow) to train an MNIST handwriting-recognition model, with CoreML on iOS 11 for inference.
Stars: ✭ 139 (-39.3%)
Generative adversarial networks 101: Keras implementations of Generative Adversarial Networks: GAN, DCGAN, CGAN, CCGAN, WGAN, and LSGAN models with the MNIST and CIFAR-10 datasets.
Stars: ✭ 138 (-39.74%)
Optimus: The first large-scale pre-trained VAE language model
Stars: ✭ 180 (-21.4%)
Bnaf: PyTorch implementation of Block Neural Autoregressive Flow
Stars: ✭ 138 (-39.74%)
Lingvo: Lingvo
Stars: ✭ 2,361 (+931%)
Neuraldialog Larl: PyTorch implementation of latent-space reinforcement learning for E2E dialog, published at NAACL 2019. Released by Tiancheng Zhao (Tony) from the Dialog Research Center, LTI, CMU
Stars: ✭ 127 (-44.54%)