continuous Bernoulli: C programs for simulating, transforming, and computing test statistics of the continuous Bernoulli distribution. The accompanying book also covers the continuous binomial and continuous trinomial distributions.
Stars: ✭ 22 (-93.37%)
Mutual labels: vae
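For context on the entry above: the continuous Bernoulli is the [0, 1]-supported distribution introduced for VAE decoders, and its density is the Bernoulli form rescaled by a closed-form normalizing constant. A minimal Python sketch of that constant (the function name is my own; the formula C(λ) = 2·artanh(1−2λ)/(1−2λ), with C(1/2) = 2, is the standard one):

```python
import math

def cb_log_norm_const(lam):
    """Log normalizing constant of the continuous Bernoulli distribution.

    The density is p(x | lam) = C(lam) * lam**x * (1 - lam)**(1 - x)
    for x in [0, 1], with C(lam) = 2*artanh(1 - 2*lam) / (1 - 2*lam)
    when lam != 1/2, and C(1/2) = 2 (the limit as lam -> 1/2).
    """
    if abs(lam - 0.5) < 1e-6:
        # Use the limit to avoid a 0/0 at lam = 1/2.
        return math.log(2.0)
    t = 1.0 - 2.0 * lam
    # atanh(t)/t is positive for t in (-1, 1), so the log is well defined.
    return math.log(2.0 * math.atanh(t) / t)

print(math.exp(cb_log_norm_const(0.5)))  # prints 2.0
```

Ignoring this constant (i.e., treating pixel values in [0, 1] as if they were Bernoulli) silently distorts the VAE's ELBO, which is the motivation for the distribution.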
contiguous-succotash: Recurrent Variational Autoencoder with dilated convolutions that generates sequential data, implemented in PyTorch.
Stars: ✭ 71 (-78.61%)
Mutual labels: vae
char-VAE: Inspired by the neural style algorithm from computer vision, this project proposes a high-level language model aimed at adapting linguistic style.
Stars: ✭ 18 (-94.58%)
Mutual labels: vae
VAE-Gumbel-Softmax: TensorFlow implementation of a Variational Autoencoder using the Gumbel-Softmax reparametrization trick (ICLR 2017; tested on TensorFlow r1.5, CPU and GPU).
Stars: ✭ 66 (-80.12%)
Mutual labels: vae
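The Gumbel-Softmax trick mentioned in the entry above makes sampling from a categorical latent differentiable: each logit is perturbed with Gumbel(0, 1) noise and a temperature-scaled softmax replaces the hard argmax. A minimal dependency-free sketch (function name and usage are illustrative, not from the repo):

```python
import math
import random

def gumbel_softmax(logits, temperature=1.0):
    """Draw a relaxed one-hot sample from a categorical distribution.

    Perturbs each logit with Gumbel(0, 1) noise, then applies a
    temperature-scaled softmax. As temperature -> 0 the output
    approaches a one-hot sample; higher temperatures give smoother,
    lower-variance gradients.
    """
    # Gumbel(0, 1) noise: -log(-log(U)) with U ~ Uniform(0, 1).
    gumbels = [-math.log(-math.log(random.random())) for _ in logits]
    scaled = [(l + g) / temperature for l, g in zip(logits, gumbels)]
    # Numerically stable softmax.
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

sample = gumbel_softmax([2.0, 0.5, -1.0], temperature=0.5)
# `sample` lies on the probability simplex; at low temperature it is
# close to one-hot, so it can stand in for a discrete latent while
# remaining differentiable with respect to the logits.
```

In a VAE this replaces the non-differentiable categorical sample in the latent layer, so gradients can flow from the decoder back into the encoder's logits.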
DiffuseVAE: Combines VAEs and diffusion models for efficient, controllable, and high-fidelity generation from low-dimensional latents.
Stars: ✭ 81 (-75.6%)
Mutual labels: vae
BERT4Rec-VAE-Pytorch: PyTorch implementation of BERT4Rec and Netflix VAE.
Stars: ✭ 212 (-36.14%)
Mutual labels: vae
Daisyrec: A recommender system under development in PyTorch, implementing KNN, LFM, SLIM, NeuMF, FM, DeepFM, VAE, and more, aimed at fair comparison across recommender-system benchmarks.
Stars: ✭ 280 (-15.66%)
Mutual labels: vae
sqair: Implementation of Sequential Attend, Infer, Repeat (SQAIR).
Stars: ✭ 96 (-71.08%)
Mutual labels: vae
srVAE: VAE with a RealNVP prior and Super-Resolution VAE in PyTorch. Code release for https://arxiv.org/abs/2006.05218.
Stars: ✭ 56 (-83.13%)
Mutual labels: vae
Parallel-Tacotron2: PyTorch implementation of Google's Parallel Tacotron 2, a non-autoregressive neural TTS model with differentiable duration modeling.
Stars: ✭ 149 (-55.12%)
Mutual labels: vae
generative deep learning: Generative Deep Learning sessions led by Anugraha Sinha (Machine Learning Tokyo).
Stars: ✭ 24 (-92.77%)
Mutual labels: vae
ladder-vae-pytorch: Ladder Variational Autoencoders (LVAE) in PyTorch.
Stars: ✭ 59 (-82.23%)
Mutual labels: vae
TensorMONK: A collection of deep learning models (PyTorch implementation).
Stars: ✭ 21 (-93.67%)
Mutual labels: vae
classifying-vae-lstm: Music generation with a classifying variational autoencoder (VAE) and LSTM.
Stars: ✭ 27 (-91.87%)
Mutual labels: vae
style-vae: Implementation of a VAE with a Style-GAN architecture achieving state-of-the-art reconstruction.
Stars: ✭ 25 (-92.47%)
Mutual labels: vae
learning-to-drive-in-5-minutes: Implementation of a reinforcement learning approach that teaches a car to drive smoothly in minutes.
Stars: ✭ 227 (-31.63%)
Mutual labels: vae
Beta Vae: PyTorch implementation of β-VAE.
Stars: ✭ 326 (-1.81%)
Mutual labels: vae
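β-VAE, listed above, modifies the standard VAE objective by weighting the KL term with a factor β > 1, which pushes the posterior toward the isotropic prior and encourages disentangled latents. A minimal sketch of the objective for a diagonal-Gaussian posterior (function name and β default are illustrative; the closed-form KL is the standard one):

```python
import math

def beta_vae_loss(recon_loss, mu, logvar, beta=4.0):
    """Compute the β-VAE objective: recon_loss + β · KL(q(z|x) ‖ N(0, I)).

    For a diagonal Gaussian posterior with per-dimension mean `mu` and
    log-variance `logvar`, the KL divergence from N(0, I) has the
    closed form 0.5 · Σ (exp(logvar) + mu² − 1 − logvar).
    β = 1 recovers the ordinary VAE ELBO; β > 1 trades reconstruction
    quality for a more factorized latent code.
    """
    kl = 0.5 * sum(math.exp(lv) + m * m - 1.0 - lv
                   for m, lv in zip(mu, logvar))
    return recon_loss + beta * kl

# With mu = 0 and logvar = 0 the posterior equals the prior, so KL = 0
# and the loss is just the reconstruction term:
print(beta_vae_loss(1.0, [0.0, 0.0], [0.0, 0.0], beta=4.0))  # prints 1.0
```

In training code the reconstruction term would typically be a per-example negative log-likelihood (e.g. binary cross-entropy) and `mu`/`logvar` would be encoder outputs.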
S Vae Pytorch: PyTorch implementation of Hyperspherical Variational Auto-Encoders.
Stars: ✭ 255 (-23.19%)
Mutual labels: vae
disent: 🧶 Modular VAE disentanglement framework for Python built with PyTorch Lightning ▸ includes metrics and datasets ▸ strongly supervised, weakly supervised, and unsupervised methods ▸ easily configured and run with Hydra config ▸ inspired by disentanglement_lib.
Stars: ✭ 41 (-87.65%)
Mutual labels: vae