Awesome Vaes: A curated list of awesome work on VAEs, disentanglement, representation learning, and generative models.
Stars: ✭ 418 (+416.05%)
vqvae-2: PyTorch implementation of VQ-VAE-2 from "Generating Diverse High-Fidelity Images with VQ-VAE-2".
Stars: ✭ 65 (-19.75%)
Tf Vqvae: TensorFlow implementation of the paper [Neural Discrete Representation Learning](https://arxiv.org/abs/1711.00937) (VQ-VAE).
Stars: ✭ 226 (+179.01%)
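The two VQ-VAE entries above are built around one core operation: snapping each encoder output to its nearest entry in a learned codebook (during training, gradients are copied through this step with a straight-through estimator). A minimal NumPy sketch of the quantization step, with hypothetical names, not taken from either repo:

```python
import numpy as np

def vector_quantize(z, codebook):
    """Map each latent vector in z to its nearest codebook entry.

    z:        (n, d) array of encoder outputs
    codebook: (k, d) array of learned embedding vectors
    Returns the quantized latents and the chosen code indices.
    """
    # Squared Euclidean distance from every latent to every code: (n, k)
    d2 = ((z[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
    idx = d2.argmin(axis=1)          # index of the nearest code per latent
    return codebook[idx], idx

codebook = np.array([[0.0, 0.0], [1.0, 1.0]])
zq, idx = vector_quantize(np.array([[0.9, 1.2], [0.1, -0.2]]), codebook)
# idx → [1, 0]; zq holds the corresponding codebook rows
```

In a full implementation the encoder, decoder, and codebook are trained jointly, with commitment and codebook losses attached to this quantization step.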
Sentence Vae: PyTorch re-implementation of "Generating Sentences from a Continuous Space" by Bowman et al., 2015. https://arxiv.org/abs/1511.06349
Stars: ✭ 462 (+470.37%)
Vae protein function: Protein function prediction using a variational autoencoder.
Stars: ✭ 57 (-29.63%)
score sde pytorch: PyTorch implementation of "Score-Based Generative Modeling through Stochastic Differential Equations" (ICLR 2021, Oral).
Stars: ✭ 755 (+832.1%)
Dfc Vae: Variational autoencoder trained with a feature perceptual loss.
Stars: ✭ 74 (-8.64%)
char-VAE: Inspired by the neural style algorithm from computer vision, this project proposes a high-level language model aimed at adapting linguistic style.
Stars: ✭ 18 (-77.78%)
generative deep learning: Generative Deep Learning sessions led by Anugraha Sinha (Machine Learning Tokyo).
Stars: ✭ 24 (-70.37%)
srVAE: VAE with a RealNVP prior and Super-Resolution VAE in PyTorch. Code release for https://arxiv.org/abs/2006.05218.
Stars: ✭ 56 (-30.86%)
Vae For Image Generation: Variational autoencoder generative model implemented in Keras for image generation, with latent-space visualization on the MNIST and CIFAR-10 datasets.
Stars: ✭ 87 (+7.41%)
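Several entries in this list rely on the same trick that makes VAE training possible: the reparameterization of the latent sample so that gradients flow through it. A minimal NumPy sketch (function name and shapes are illustrative, not from any listed repo):

```python
import numpy as np

def reparameterize(mu, log_var, rng):
    """Sample z ~ N(mu, sigma^2) as mu + sigma * eps with eps ~ N(0, I).

    Writing the sample this way keeps z differentiable with respect to
    mu and log_var, which is what lets a VAE's encoder be trained by
    backpropagation through the sampling step.
    """
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

rng = np.random.default_rng(0)
z = reparameterize(np.zeros(4), np.zeros(4), rng)  # z ~ N(0, I), shape (4,)
```

In the frameworks used by these repos (PyTorch, TensorFlow, Keras) the same three lines appear with tensors instead of NumPy arrays.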
Generative Models: Collection of generative models (e.g. GAN, VAE) in PyTorch and TensorFlow.
Stars: ✭ 6,701 (+8172.84%)
InpaintNet: Code accompanying the ISMIR 2019 paper "Learning to Traverse Latent Spaces for Musical Score Inpainting".
Stars: ✭ 48 (-40.74%)
style-vae: Implementation of a VAE with a StyleGAN architecture achieving state-of-the-art reconstruction.
Stars: ✭ 25 (-69.14%)
lava: Latent variable models in R. https://kkholst.github.io/lava/
Stars: ✭ 28 (-65.43%)
TriangleGAN: TriangleGAN, ACM MM 2019.
Stars: ✭ 28 (-65.43%)
gans-in-action: Code repository for the Korean edition of "GANs in Action" (Hanbit Media, 2020).
Stars: ✭ 29 (-64.2%)
adaptive-f-divergence: A TensorFlow implementation of the NIPS 2018 paper "Variational Inference with Tail-adaptive f-Divergence".
Stars: ✭ 20 (-75.31%)
Pytorch-RL-CPP: A repository with C++ implementations of reinforcement learning algorithms (PyTorch).
Stars: ✭ 73 (-9.88%)
ShapeFormer: Official repository for the ShapeFormer project.
Stars: ✭ 97 (+19.75%)
MISE: Multimodal Image Synthesis and Editing: A Survey.
Stars: ✭ 214 (+164.2%)
favorite-research-papers: Listing my favorite research papers 📝 from different fields as I read them.
Stars: ✭ 12 (-85.19%)
VAE-Gumbel-Softmax: An implementation of a variational autoencoder using the Gumbel-Softmax reparametrization trick (ICLR 2017) in TensorFlow (tested on r1.5, CPU and GPU).
Stars: ✭ 66 (-18.52%)
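The Gumbel-Softmax trick named in the entry above draws a differentiable "soft" one-hot sample from a categorical distribution: add Gumbel(0, 1) noise to the logits, divide by a temperature, and take a softmax. A minimal NumPy sketch (not taken from the repo itself):

```python
import numpy as np

def gumbel_softmax(logits, tau, rng):
    """Soft categorical sample: y = softmax((logits + g) / tau),
    with g ~ Gumbel(0, 1). As tau -> 0 the samples approach exact
    one-hot vectors; larger tau gives smoother, lower-variance samples."""
    # Gumbel(0, 1) noise via the inverse-CDF: -log(-log(U)), U ~ Uniform(0, 1)
    g = -np.log(-np.log(rng.uniform(size=np.shape(logits))))
    y = (np.asarray(logits) + g) / tau
    y = np.exp(y - y.max())          # numerically stable softmax
    return y / y.sum()

rng = np.random.default_rng(0)
y = gumbel_softmax(np.log([0.7, 0.2, 0.1]), tau=0.5, rng=rng)
# y is a probability vector (sums to 1) and is differentiable in the logits
```

In a discrete VAE this replaces non-differentiable categorical sampling in the latent layer, so the encoder can be trained by backpropagation.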
Carla-ppo: A customized PPO-based agent for CARLA. The goal of this project is to make it easier to experiment with reinforcement-learning agents in CARLA by wrapping it in a gym-like environment that supports custom reward functions, custom debug output, etc.
Stars: ✭ 122 (+50.62%)
3DCSGNet: CSGNet for voxel-based input.
Stars: ✭ 34 (-58.02%)
gcWGAN: Guided Conditional Wasserstein GAN for de novo protein design.
Stars: ✭ 38 (-53.09%)
Lr-LiVAE: TensorFlow implementation of "Disentangling Latent Space for VAE by Label Relevant/Irrelevant Dimensions" (CVPR 2019).
Stars: ✭ 29 (-64.2%)
LatentDiffEq.jl: Latent differential equation models in Julia.
Stars: ✭ 34 (-58.02%)
GraphCNN-GAN: Graph-convolutional GAN for point cloud generation. Code from the ICLR 2019 paper "Learning Localized Generative Models for 3D Point Clouds via Graph Convolution".
Stars: ✭ 50 (-38.27%)
graph-nvp: GraphNVP: An Invertible Flow Model for Generating Molecular Graphs.
Stars: ✭ 69 (-14.81%)
GrabNet: A generative model for realistic 3D hands grasping unseen objects (ECCV 2020).
Stars: ✭ 146 (+80.25%)
VAENAR-TTS: PyTorch implementation of "VAENAR-TTS: Variational Auto-Encoder Based Non-Autoregressive Text-to-Speech Synthesis".
Stars: ✭ 66 (-18.52%)
DVAE: Official implementation of dynamical VAEs.
Stars: ✭ 75 (-7.41%)
GDPP: A generator loss that reduces mode collapse and improves the quality of generated samples.
Stars: ✭ 32 (-60.49%)
contiguous-succotash: Recurrent variational autoencoder with dilated convolutions for generating sequential data, implemented in PyTorch.
Stars: ✭ 71 (-12.35%)
py-msa-kdenlive: Python script to load a Kdenlive (OSS NLE video editor) project file and conform the edit on video or NumPy arrays.
Stars: ✭ 25 (-69.14%)
Advanced Models: Provides implementations of various well-known neural network models (DCGAN, VAE, ResNet, etc.).
Stars: ✭ 48 (-40.74%)
AI Learning Hub: AI learning hub for machine learning, deep learning, computer vision, and statistics.
Stars: ✭ 53 (-34.57%)
TensorMONK: A collection of deep learning models (PyTorch implementation).
Stars: ✭ 21 (-74.07%)
nvae: An unofficial toy implementation of NVAE, "A Deep Hierarchical Variational Autoencoder".
Stars: ✭ 83 (+2.47%)
simplegan: TensorFlow-based framework to ease training of generative models.
Stars: ✭ 19 (-76.54%)
Parallel-Tacotron2: PyTorch implementation of Google's "Parallel Tacotron 2: A Non-Autoregressive Neural TTS Model with Differentiable Duration Modeling".
Stars: ✭ 149 (+83.95%)
Generative-Model: Repository for implementations of generative models with TensorFlow 1.x.
Stars: ✭ 66 (-18.52%)
pytorch-GAN: My PyTorch implementation of GANs.
Stars: ✭ 12 (-85.19%)
MidiTok: A convenient MIDI / symbolic-music tokenizer for deep learning networks, with multiple strategies 🎶.
Stars: ✭ 180 (+122.22%)
LSR: PyTorch implementation of the ACL 2020 paper "Reasoning with Latent Structure Refinement for Document-Level Relation Extraction".
Stars: ✭ 121 (+49.38%)
vae-torch: Variational autoencoder for anomaly detection (in PyTorch).
Stars: ✭ 38 (-53.09%)
continuous Bernoulli: C programs for the simulator, transformation, and test statistic of the continuous Bernoulli distribution. Beyond that, the accompanying book covers the continuous Binomial and continuous Trinomial distributions.
Stars: ✭ 22 (-72.84%)
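The continuous Bernoulli distribution in the entry above has density p(x | λ) = C(λ) λ^x (1 − λ)^(1−x) on [0, 1], where C(λ) = 2 artanh(1 − 2λ) / (1 − 2λ) for λ ≠ 1/2 and C(1/2) = 2. A small Python sketch of the density (the listed repo is in C; this is an illustration, not its code):

```python
import numpy as np

def cb_norm_const(lam):
    """Normalizing constant C(lam) of the continuous Bernoulli:
    p(x | lam) = C(lam) * lam**x * (1 - lam)**(1 - x) on [0, 1]."""
    if abs(lam - 0.5) < 1e-12:
        return 2.0                   # limit of C(lam) as lam -> 1/2
    return 2.0 * np.arctanh(1.0 - 2.0 * lam) / (1.0 - 2.0 * lam)

def cb_pdf(x, lam):
    """Continuous Bernoulli density at x in [0, 1]."""
    return cb_norm_const(lam) * lam**x * (1.0 - lam)**(1.0 - x)

# Sanity check: the density integrates to 1 (numerically, for lam = 0.3)
xs = np.linspace(0.0, 1.0, 100001)
area = np.trapz(cb_pdf(xs, 0.3), xs)   # ≈ 1.0
```

Without C(λ) the expression λ^x (1 − λ)^(1−x) does not integrate to 1 on [0, 1], which is exactly the correction the continuous Bernoulli brings to VAEs trained with a "Bernoulli" likelihood on continuous pixel data.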