NanoFlow: PyTorch implementation of the paper "NanoFlow: Scalable Normalizing Flows with Sublinear Parameter Complexity" (NeurIPS 2020).
Stars: ✭ 63 (+215%)
normalizing-flows: PyTorch implementation of normalizing flow models.
Stars: ✭ 271 (+1255%)
CIKM18-LCVA: Code for the CIKM'18 paper "Linked Causal Variational Autoencoder for Inferring Paired Spillover Effects".
Stars: ✭ 13 (-35%)
Kvae: Kalman Variational Auto-Encoder.
Stars: ✭ 115 (+475%)
Generative models tutorial with demo: Generative models tutorial with demos: Bayesian classifier sampling, variational autoencoders (VAEs), generative adversarial networks (GANs), popular GAN architectures, autoregressive models, important generative-model papers, courses, etc.
Stars: ✭ 276 (+1280%)
Gumbel-CRF: Implementation of the NeurIPS 2020 paper "Latent Template Induction with Gumbel-CRFs".
Stars: ✭ 51 (+155%)
benchmark VAE: Unifying variational autoencoder (VAE) implementations in PyTorch (NeurIPS 2022).
Stars: ✭ 1,211 (+5955%)
lagvae: Lagrangian VAE.
Stars: ✭ 27 (+35%)
Tensorflow Mnist Cvae: TensorFlow implementation of a conditional variational autoencoder for MNIST.
Stars: ✭ 139 (+595%)
Awesome Vaes: A curated list of awesome work on VAEs, disentanglement, representation learning, and generative models.
Stars: ✭ 418 (+1990%)
soft-intro-vae-pytorch: [CVPR 2021 Oral] Official PyTorch implementation of Soft-IntroVAE from the paper "Soft-IntroVAE: Analyzing and Improving Introspective Variational Autoencoders".
Stars: ✭ 170 (+750%)
naru: Neural Relation Understanding, neural cardinality estimators for tabular data.
Stars: ✭ 76 (+280%)
Variational Autoencoder: Variational autoencoder implemented in TensorFlow and PyTorch (including inverse autoregressive flow).
Stars: ✭ 807 (+3935%)
SIVI: Uses a neural network to build an expressive hierarchical distribution; a variational method for accurately estimating posterior uncertainty; a fast, general method for Bayesian inference (ICML 2018).
Stars: ✭ 49 (+145%)
haskell-vae: Learning about Haskell with variational autoencoders.
Stars: ✭ 18 (-10%)
CVAE Dial: CVAE_XGate model from the paper "Xu, Dusek, Konstas, Rieser. Better Conversations by Modeling, Filtering, and Optimizing for Coherence and Diversity".
Stars: ✭ 16 (-20%)
delfi: Density estimation likelihood-free inference. No longer actively developed; see https://github.com/mackelab/sbi instead.
Stars: ✭ 66 (+230%)
Pylians3: Libraries to analyze numerical simulations (Python 3).
Stars: ✭ 35 (+75%)
PyLDA: A latent Dirichlet allocation implementation in Python.
Stars: ✭ 51 (+155%)
noisy-K-FAC: Natural gradient, variational inference.
Stars: ✭ 29 (+45%)
SelSum: Abstractive opinion summarization system (SelSum) and the largest dataset of Amazon product summaries (AmaSum). EMNLP 2021 conference paper.
Stars: ✭ 36 (+80%)
adVAE: Implementation of "Self-Adversarial Variational Autoencoder with Gaussian Anomaly Prior Distribution for Anomaly Detection".
Stars: ✭ 17 (-15%)
playing with vae: Comparing FC VAE / FCN VAE / PCA / UMAP on MNIST / FMNIST.
Stars: ✭ 53 (+165%)
kscore: Nonparametric score estimators (ICML 2020).
Stars: ✭ 32 (+60%)
artificial neural networks: A collection of methods and models for various artificial neural network architectures.
Stars: ✭ 40 (+100%)
vaegan: An implementation of VAEGAN (variational autoencoder + generative adversarial network).
Stars: ✭ 88 (+340%)
normalizing-flows: Implementations of normalizing flows in Python and TensorFlow.
Stars: ✭ 15 (-25%)
vae-pytorch: AE and VAE playground in PyTorch.
Stars: ✭ 53 (+165%)
adaptive-f-divergence: A TensorFlow implementation of the NIPS 2018 paper "Variational Inference with Tail-adaptive f-Divergence".
Stars: ✭ 20 (+0%)
VINF: Repository for a DTU special course on Variational Inference using Normalizing Flows (VINF), supervised by Michael Riis Andersen.
Stars: ✭ 23 (+15%)
GPBoost: Combining tree boosting with Gaussian process and mixed effects models.
Stars: ✭ 360 (+1700%)
multimodal-vae-public: A PyTorch implementation of "Multimodal Generative Models for Scalable Weakly-Supervised Learning" (https://arxiv.org/abs/1802.05335).
Stars: ✭ 98 (+390%)
rss: Regression with Summary Statistics.
Stars: ✭ 42 (+110%)
AI Learning Hub: AI learning hub for machine learning, deep learning, computer vision, and statistics.
Stars: ✭ 53 (+165%)
DUN: Code for "Depth Uncertainty in Neural Networks" (https://arxiv.org/abs/2006.08437).
Stars: ✭ 65 (+225%)
VAE-Gumbel-Softmax: An implementation of a variational autoencoder using the Gumbel-Softmax reparametrization trick (ICLR 2017) in TensorFlow, tested on r1.5 CPU and GPU.
Stars: ✭ 66 (+230%)
vae-torch: Variational autoencoder for anomaly detection (in PyTorch).
Stars: ✭ 38 (+90%)
STEP: Spatial Temporal Graph Convolutional Networks for Emotion Perception from Gaits.
Stars: ✭ 39 (+95%)
KernelEstimator.jl: The Julia package for nonparametric density estimation and regression.
Stars: ✭ 25 (+25%)
linguistic-style-transfer-pytorch: Implementation of "Disentangled Representation Learning for Non-Parallel Text Style Transfer" (ACL 2019) in PyTorch.
Stars: ✭ 55 (+175%)
continuous-time-flow-process: PyTorch code for "Modeling Continuous Stochastic Processes with Dynamic Normalizing Flows" (NeurIPS 2020).
Stars: ✭ 34 (+70%)
sliced score matching: Code for reproducing results from the sliced score matching paper (UAI 2019).
Stars: ✭ 68 (+240%)
BayesByHypernet: Code for the paper "Implicit Weight Uncertainty in Neural Networks".
Stars: ✭ 63 (+215%)
lego-face-VAE: Variational autoencoder for Lego minifig faces.
Stars: ✭ 15 (-25%)
MongeAmpereFlow: Continuous-time gradient flow for generative modeling and variational inference.
Stars: ✭ 29 (+45%)
continuous Bernoulli: C programs for the simulator, transformations, and test statistics of the continuous Bernoulli distribution; also covers the continuous binomial and continuous trinomial distributions.
Stars: ✭ 22 (+10%)
deeprob-kit: A Python library for deep probabilistic modeling.
Stars: ✭ 32 (+60%)
OCDVAEContinualLearning: Open-source code for the paper "Unified Probabilistic Deep Continual Learning through Generative Replay and Open Set Recognition".
Stars: ✭ 56 (+180%)
adaboost: An implementation of the paper "A Short Introduction to Boosting".
Stars: ✭ 20 (+0%)
semi-supervised-NFs: Code for the paper "Semi-Conditional Normalizing Flows for Semi-Supervised Learning".
Stars: ✭ 23 (+15%)
cflow-ad: Official PyTorch code for the WACV 2022 paper "CFLOW-AD: Real-Time Unsupervised Anomaly Detection with Localization via Conditional Normalizing Flows".
Stars: ✭ 138 (+590%)
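Many of the entries above (NanoFlow, normalizing-flows, VINF, continuous-time-flow-process, cflow-ad) rest on the change-of-variables formula that defines a normalizing flow. As a quick orientation, here is a minimal one-dimensional sketch (illustrative only, not taken from any listed repository): a standard-normal base density pushed through an affine map x = mu + sigma * z, with the log-density recovered via the Jacobian term.

```python
import math

def std_normal_logpdf(z):
    # log density of the base distribution N(0, 1)
    return -0.5 * (z * z + math.log(2.0 * math.pi))

def affine_flow_logpdf(x, mu, sigma):
    # Invert the flow x = mu + sigma * z, then apply change of variables:
    # log p_x(x) = log p_z(z) - log |dx/dz| = log p_z(z) - log(sigma)
    z = (x - mu) / sigma
    return std_normal_logpdf(z) - math.log(sigma)

def normal_logpdf(x, mu, sigma):
    # closed-form N(mu, sigma^2) log density, for comparison
    return -0.5 * ((x - mu) / sigma) ** 2 - math.log(sigma * math.sqrt(2.0 * math.pi))
```

Because an affine flow of a standard normal is exactly N(mu, sigma^2), `affine_flow_logpdf` and `normal_logpdf` agree; real flow libraries stack many nonlinear invertible layers and sum the per-layer log-Jacobian terms in the same way.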