SIVI: Uses neural networks to build expressive hierarchical distributions; a variational method for accurately estimating posterior uncertainty; a fast and general method for Bayesian inference. (ICML 2018)
Stars: ✭ 49 (-81.92%)
Variational Autoencoder: Variational autoencoder implemented in TensorFlow and PyTorch (including inverse autoregressive flow)
Stars: ✭ 807 (+197.79%)
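Several of the entries above and below are VAE implementations. As a minimal illustrative sketch (not code from any of these repos), the two pieces every VAE shares are the reparameterization trick and the closed-form Gaussian KL term; the function names here are made up for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

def reparameterize(mu, log_var):
    """Sample z ~ N(mu, sigma^2) via the reparameterization trick,
    which keeps the sampling step differentiable w.r.t. mu and log_var."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def kl_divergence(mu, log_var):
    """Closed-form KL(N(mu, sigma^2) || N(0, I)), summed over latent dims."""
    return -0.5 * np.sum(1.0 + log_var - mu**2 - np.exp(log_var))

# A posterior that exactly matches the N(0, I) prior has zero KL cost.
mu, log_var = np.zeros(8), np.zeros(8)
z = reparameterize(mu, log_var)   # one latent sample, shape (8,)
```

The ELBO optimized during training is the reconstruction log-likelihood minus this KL term; frameworks differ only in the encoder/decoder architectures around these two functions.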
CIKM18-LCVA: Code for the CIKM'18 paper, Linked Causal Variational Autoencoder for Inferring Paired Spillover Effects.
Stars: ✭ 13 (-95.2%)
Awesome Vaes: A curated list of awesome work on VAEs, disentanglement, representation learning, and generative models.
Stars: ✭ 418 (+54.24%)
Generative models tutorial with demo: Generative models tutorial with demos covering Bayesian classifier sampling, Variational Auto Encoder (VAE), Generative Adversarial Networks (GANs), popular GAN architectures, auto-regressive models, important generative-model papers, courses, etc.
Stars: ✭ 276 (+1.85%)
lagvae: Lagrangian VAE
Stars: ✭ 27 (-90.04%)
Kvae: Kalman Variational Auto-Encoder
Stars: ✭ 115 (-57.56%)
Tensorflow Mnist Cvae: TensorFlow implementation of a conditional variational auto-encoder for MNIST
Stars: ✭ 139 (-48.71%)
haskell-vae: Learning about Haskell with variational autoencoders
Stars: ✭ 18 (-93.36%)
soft-intro-vae-pytorch: [CVPR 2021 Oral] Official PyTorch implementation of Soft-IntroVAE from the paper "Soft-IntroVAE: Analyzing and Improving Introspective Variational Autoencoders"
Stars: ✭ 170 (-37.27%)
Bayesian Neural Networks: PyTorch implementations of Bayes by Backprop, MC Dropout, SGLD, the local reparametrization trick, KF-Laplace, SG-HMC, and more
Stars: ✭ 900 (+232.1%)
MIDI-VAE: No description or website provided.
Stars: ✭ 56 (-79.34%)
Pymc3: Probabilistic programming in Python: Bayesian modeling and probabilistic machine learning with Aesara
Stars: ✭ 6,214 (+2192.99%)
Bayes Nn: Lecture notes on Bayesian deep learning
Stars: ✭ 444 (+63.84%)
Probabilistic unet: A U-Net combined with a variational auto-encoder that can learn conditional distributions over semantic segmentations.
Stars: ✭ 427 (+57.56%)
Vbmc: Variational Bayesian Monte Carlo (VBMC) algorithm for posterior and model inference in MATLAB
Stars: ✭ 123 (-54.61%)
Probreg: Python package for point cloud registration using probabilistic models (Coherent Point Drift, GMMReg, SVR, GMMTree, FilterReg, Bayesian CPD)
Stars: ✭ 306 (+12.92%)
viabel: Efficient, lightweight variational inference and approximation bounds
Stars: ✭ 27 (-90.04%)
naru: Neural Relation Understanding: neural cardinality estimators for tabular data
Stars: ✭ 76 (-71.96%)
Bcpd: Bayesian Coherent Point Drift (BCPD/BCPD++); source code available
Stars: ✭ 116 (-57.2%)
Pyro: Deep universal probabilistic programming with Python and PyTorch
Stars: ✭ 7,224 (+2565.68%)
boundary-gp: Know Your Boundaries: Constraining Gaussian Processes by Variational Harmonic Features
Stars: ✭ 21 (-92.25%)
Pytorch Bayesiancnn: Bayesian convolutional neural network with variational inference, based on Bayes by Backprop, in PyTorch.
Stars: ✭ 779 (+187.45%)
Celeste.jl: Scalable inference for a generative model of astronomical images
Stars: ✭ 142 (-47.6%)
VAE-Latent-Space-Explorer: Interactive exploration of an MNIST variational autoencoder's latent space with React and TensorFlow.js.
Stars: ✭ 30 (-88.93%)
cmdstanr: CmdStanR, the R interface to CmdStan
Stars: ✭ 82 (-69.74%)
Gpstuff: GPstuff, Gaussian process models for Bayesian analysis
Stars: ✭ 106 (-60.89%)
sqair: Implementation of Sequential Attend, Infer, Repeat (SQAIR)
Stars: ✭ 96 (-64.58%)
Good Papers: I try my best to keep up with cutting-edge work in machine learning, deep learning, and natural language processing. These are my notes on some good papers.
Stars: ✭ 248 (-8.49%)
Dropouts: PyTorch implementations of dropout variants
Stars: ✭ 72 (-73.43%)
autoreparam: Automatic Reparameterisation of Probabilistic Programs
Stars: ✭ 29 (-89.3%)
Bayes By Backprop: PyTorch implementation of "Weight Uncertainty in Neural Networks"
Stars: ✭ 119 (-56.09%)
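For readers unfamiliar with the technique behind this entry: Bayes by Backprop places a Gaussian posterior over each weight and samples weights per forward pass via reparameterization. Below is a hedged NumPy sketch of that sampling step, not code from the repo; the names `sample_weights` and `predict` and all shapes are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_weights(mu, rho):
    """Draw one weight sample W = mu + softplus(rho) * eps, the
    reparameterized Gaussian posterior used in Bayes by Backprop."""
    sigma = np.log1p(np.exp(rho))        # softplus keeps sigma positive
    eps = rng.standard_normal(mu.shape)
    return mu + sigma * eps

def predict(x, mu, rho, n_samples=50):
    """Average stochastic forward passes over weight samples to get a
    predictive mean and an uncertainty estimate (std. dev.)."""
    outs = np.stack([x @ sample_weights(mu, rho) for _ in range(n_samples)])
    return outs.mean(axis=0), outs.std(axis=0)

# Small rho => small sigma => near-deterministic weights and low
# predictive uncertainty; training learns mu and rho via the ELBO.
mu = np.zeros((3, 2))
rho = np.full((3, 2), -5.0)
mean, std = predict(np.ones((1, 3)), mu, rho)
```

In the actual method, gradients flow through `eps`-reparameterized samples to update `mu` and `rho` against a KL-regularized loss.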
svae cf: [WSDM '19] Sequential Variational Autoencoders for Collaborative Filtering
Stars: ✭ 38 (-85.98%)
Mxfusion: Modular probabilistic programming on MXNet
Stars: ✭ 95 (-64.94%)
Probabilistic Models: Collection of probabilistic models and inference algorithms
Stars: ✭ 217 (-19.93%)
noisy-K-FAC: Natural gradient, variational inference
Stars: ✭ 29 (-89.3%)
Gpflow: Gaussian processes in TensorFlow
Stars: ✭ 1,547 (+470.85%)
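Several entries here (GPflow, GPstuff, boundary-gp, Gp Infer Net) build on Gaussian process regression. As a framework-agnostic NumPy sketch of the underlying math (standard GP posterior equations, not GPflow's API; `rbf` and `gp_posterior` are illustrative names):

```python
import numpy as np

def rbf(a, b, lengthscale=1.0, variance=1.0):
    """Squared-exponential (RBF) kernel between 1-D input arrays."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_posterior(x_train, y_train, x_test, noise=1e-2):
    """Exact GP regression posterior mean and covariance at x_test,
    computed via a Cholesky factorization of the noisy train kernel."""
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf(x_train, x_test)
    K_ss = rbf(x_test, x_test)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha              # posterior mean
    v = np.linalg.solve(L, K_s)
    cov = K_ss - v.T @ v              # posterior covariance
    return mean, cov

x_train = np.array([0.0, 1.0, 2.0])
y_train = np.sin(x_train)
mean, cov = gp_posterior(x_train, y_train, np.array([0.5, 1.5]))
```

Libraries like GPflow wrap these same equations with trainable kernel hyperparameters and sparse/variational approximations for larger datasets.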
benchmark VAE: Unifying variational autoencoder (VAE) implementations in PyTorch (NeurIPS 2022)
Stars: ✭ 1,211 (+346.86%)
Deepbayes 2018: Seminars from the DeepBayes Summer School 2018
Stars: ✭ 1,021 (+276.75%)
artificial neural networks: A collection of methods and models for various architectures of artificial neural networks
Stars: ✭ 40 (-85.24%)
prosper: A Python library for probabilistic sparse coding with non-standard priors and superpositions
Stars: ✭ 17 (-93.73%)
Inverse rl: Adversarial Imitation via Variational Inverse Reinforcement Learning
Stars: ✭ 79 (-70.85%)
probai-2021-pyro: Repo for the tutorials of Days 1-3 of the Nordic Probabilistic AI School 2021 (https://probabilistic.ai/)
Stars: ✭ 45 (-83.39%)
NanoFlow: PyTorch implementation of the paper "NanoFlow: Scalable Normalizing Flows with Sublinear Parameter Complexity" (NeurIPS 2020)
Stars: ✭ 63 (-76.75%)
Rnn Vae: Variational autoencoder with a recurrent neural network, based on Google DeepMind's "DRAW: A Recurrent Neural Network For Image Generation"
Stars: ✭ 39 (-85.61%)
VINF: Repository for a DTU special course focusing on Variational Inference using Normalizing Flows (VINF), supervised by Michael Riis Andersen
Stars: ✭ 23 (-91.51%)
Gp Infer Net: Scalable Training of Inference Networks for Gaussian-Process Models, ICML 2019
Stars: ✭ 37 (-86.35%)