srVAE - VAE with RealNVP prior and Super-Resolution VAE in PyTorch. Code release for https://arxiv.org/abs/2006.05218.
Stars: ✭ 56 (+143.48%)
VAE-Gumbel-Softmax - An implementation of a variational autoencoder using the Gumbel-Softmax reparametrization trick (ICLR 2017) in TensorFlow (tested on r1.5, CPU and GPU); see the sketch below.
Stars: ✭ 66 (+186.96%)
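For reference, a minimal sketch of the Gumbel-Softmax trick itself (framework-standard, not code from the repo above): i.i.d. Gumbel(0, 1) noise is added to the category logits, and a temperature-scaled softmax yields a differentiable, approximately one-hot sample.

```python
import torch
import torch.nn.functional as F

def gumbel_softmax_sample(logits: torch.Tensor, tau: float = 1.0) -> torch.Tensor:
    """Differentiable, approximately one-hot sample from `logits` (Jang et al., ICLR 2017)."""
    # Gumbel(0, 1) noise via inverse CDF; epsilons guard against log(0).
    gumbel = -torch.log(-torch.log(torch.rand_like(logits) + 1e-20) + 1e-20)
    return F.softmax((logits + gumbel) / tau, dim=-1)

logits = torch.randn(4, 10)                 # batch of 4, 10 categories
sample = gumbel_softmax_sample(logits, 0.5)
print(sample.sum(dim=-1))                   # each row sums to 1
```

Lower temperatures `tau` push samples closer to one-hot at the cost of noisier gradients.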
benchmark VAE - Unifying Variational Autoencoder (VAE) implementations in PyTorch (NeurIPS 2022).
Stars: ✭ 1,211 (+5165.22%)
Bagel - IPCCC 2018: Robust and Unsupervised KPI Anomaly Detection Based on Conditional Variational Autoencoder.
Stars: ✭ 45 (+95.65%)
Pytorch Rl - Model-free deep reinforcement learning algorithms implemented in PyTorch.
Stars: ✭ 394 (+1613.04%)
Deep Learning With Python - Example projects completed to understand deep learning techniques with TensorFlow. Note: this repository is no longer maintained.
Stars: ✭ 134 (+482.61%)
Cada Vae Pytorch - Official implementation of the paper "Generalized Zero- and Few-Shot Learning via Aligned Variational Autoencoders" (CVPR 2019).
Stars: ✭ 198 (+760.87%)
Vae For Image Generation - A variational autoencoder generative model in Keras for image generation and latent-space visualization on the MNIST and CIFAR10 datasets.
Stars: ✭ 87 (+278.26%)
Tensorflow Mnist Vae - TensorFlow implementation of a variational autoencoder for MNIST.
Stars: ✭ 422 (+1734.78%)
Smrt - Handle class imbalance by using variational autoencoders to generate synthetic observations of your minority class (sketch of the idea below).
Stars: ✭ 102 (+343.48%)
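A hedged sketch of the general idea behind VAE-based oversampling; the `vae.encode` / `vae.decode` names here are hypothetical stand-ins, not SMRT's actual API:

```python
import torch

def oversample_minority(vae, minority_x: torch.Tensor, n_new: int,
                        noise_scale: float = 0.05) -> torch.Tensor:
    """Generate synthetic minority-class rows from a trained VAE.

    Assumes (hypothetically) that `vae.encode` returns (mu, logvar) and
    `vae.decode` maps latent codes back to data space.
    """
    with torch.no_grad():
        mu, _ = vae.encode(minority_x)                  # latent means of real minority rows
        idx = torch.randint(0, mu.size(0), (n_new,))    # resample rows with replacement
        z = mu[idx] + noise_scale * torch.randn(n_new, mu.size(1))
        return vae.decode(z)                            # decode perturbed codes into new rows
```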
Tensorflow Mnist Cvae - TensorFlow implementation of a conditional variational autoencoder for MNIST.
Stars: ✭ 139 (+504.35%)
soft-intro-vae-pytorch - [CVPR 2021 Oral] Official PyTorch implementation of Soft-IntroVAE from the paper "Soft-IntroVAE: Analyzing and Improving Introspective Variational Autoencoders".
Stars: ✭ 170 (+639.13%)
S Vae Pytorch - PyTorch implementation of Hyperspherical Variational Auto-Encoders.
Stars: ✭ 255 (+1008.7%)
continuous Bernoulli - C programs for the simulator, transformation, and test statistic of the continuous Bernoulli distribution; the book also covers the continuous Binomial and continuous Trinomial distributions. A Python sketch of the density follows below.
Stars: ✭ 22 (-4.35%)
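The repository above is in C; for illustration only, here is a short Python sketch of the continuous Bernoulli density p(x | λ) = C(λ) λ^x (1 - λ)^(1 - x) on [0, 1], whose normalizing constant is C(λ) = 2·atanh(1 - 2λ)/(1 - 2λ) for λ ≠ 1/2 and C(1/2) = 2:

```python
import math

def cb_log_norm_const(lam: float) -> float:
    """log C(lam) for the continuous Bernoulli, 0 < lam < 1."""
    if abs(lam - 0.5) < 1e-6:
        return math.log(2.0)                 # limit at lam = 1/2
    return math.log(2.0 * math.atanh(1.0 - 2.0 * lam) / (1.0 - 2.0 * lam))

def cb_log_pdf(x: float, lam: float) -> float:
    """log p(x | lam) = log C(lam) + x log(lam) + (1 - x) log(1 - lam)."""
    return cb_log_norm_const(lam) + x * math.log(lam) + (1.0 - x) * math.log(1.0 - lam)
```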
MIDI-VAE - No description or website provided.
Stars: ✭ 56 (+143.48%)
Disentangling Vae - Experiments for understanding disentanglement in VAE latent representations.
Stars: ✭ 398 (+1630.43%)
vae-concrete - Keras implementation of a variational autoencoder with a Concrete latent distribution.
Stars: ✭ 51 (+121.74%)
Variational Autoencoder - Variational autoencoder implemented in TensorFlow and PyTorch, including inverse autoregressive flow (IAF; sketched below).
Stars: ✭ 807 (+3408.7%)
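A minimal sketch of one IAF step in the style of Kingma et al. (2016); `autoregressive_nn` stands in for a masked autoregressive network (e.g. a MADE) and is an assumption here, not part of the repo above:

```python
import torch

def iaf_step(z: torch.Tensor, autoregressive_nn):
    """One inverse-autoregressive-flow step: z' = sigma * z + (1 - sigma) * m.

    `autoregressive_nn(z)` is assumed to return (m, s) where output dim i
    depends only on z[..., :i], making the Jacobian triangular.
    """
    m, s = autoregressive_nn(z)
    gate = torch.sigmoid(s)
    z_new = gate * z + (1.0 - gate) * m       # elementwise affine update
    log_det = torch.log(gate).sum(dim=-1)     # log|det J| = sum of log gates
    return z_new, log_det
```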
Pytorch Vae - A CNN variational autoencoder (CNN-VAE) implemented in PyTorch; the standard reparameterization and loss are sketched below.
Stars: ✭ 181 (+686.96%)
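The two pieces every PyTorch VAE shares, sketched for reference (framework-standard, not copied from this repo): the reparameterization trick and the negative ELBO with an analytic Gaussian KL term.

```python
import torch
import torch.nn.functional as F

def reparameterize(mu: torch.Tensor, logvar: torch.Tensor) -> torch.Tensor:
    """z = mu + sigma * eps, eps ~ N(0, I); keeps sampling differentiable."""
    std = torch.exp(0.5 * logvar)
    return mu + std * torch.randn_like(std)

def vae_loss(recon_x, x, mu, logvar):
    """Negative ELBO: reconstruction term plus analytic KL(q(z|x) || N(0, I))."""
    recon = F.binary_cross_entropy(recon_x, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + kl
```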
S Vae Tf - TensorFlow implementation of Hyperspherical Variational Auto-Encoders.
Stars: ✭ 198 (+760.87%)
Vae Tensorflow - A TensorFlow implementation of a variational autoencoder for the deep learning course at the University of Southern California (USC).
Stars: ✭ 117 (+408.7%)
Vae protein function - Protein function prediction using a variational autoencoder.
Stars: ✭ 57 (+147.83%)
Awesome Vaes - A curated list of awesome work on VAEs, disentanglement, representation learning, and generative models.
Stars: ✭ 418 (+1717.39%)
classifying-vae-lstm - Music generation with a classifying variational autoencoder (VAE) and LSTM.
Stars: ✭ 27 (+17.39%)
Mojitalk - Code for "MojiTalk: Generating Emotional Responses at Scale" (https://arxiv.org/abs/1711.04090).
Stars: ✭ 107 (+365.22%)
Vae Cvae Mnist - Variational autoencoder and conditional variational autoencoder on MNIST in PyTorch; the conditioning pattern is sketched below.
Stars: ✭ 229 (+895.65%)
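A minimal sketch of how the conditional variant typically differs from a plain VAE (layer sizes and names here are illustrative, not this repo's): the one-hot label y is concatenated to both the encoder input and the latent code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CVAE(nn.Module):
    """Minimal conditional VAE for flattened MNIST (784 pixels, 10 classes)."""

    def __init__(self, x_dim=784, y_dim=10, h_dim=400, z_dim=20):
        super().__init__()
        self.enc = nn.Linear(x_dim + y_dim, h_dim)
        self.mu = nn.Linear(h_dim, z_dim)
        self.logvar = nn.Linear(h_dim, z_dim)
        self.dec1 = nn.Linear(z_dim + y_dim, h_dim)
        self.dec2 = nn.Linear(h_dim, x_dim)

    def forward(self, x, y):
        h = F.relu(self.enc(torch.cat([x, y], dim=-1)))    # condition encoder on y
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        h = F.relu(self.dec1(torch.cat([z, y], dim=-1)))   # condition decoder on y
        return torch.sigmoid(self.dec2(h)), mu, logvar
```

At sampling time, fixing y while drawing z ~ N(0, I) generates digits of a chosen class.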
etos-facedetector - Simple and effective face detector based on Progressive Calibration Networks (PCN), an accurate rotation-invariant face detector that runs in real time on CPU (CVPR 2018).
Stars: ✭ 23 (+0%)
tape-neurips2019 - Tasks Assessing Protein Embeddings (TAPE), a set of five biologically relevant semi-supervised learning tasks spread across different domains of protein biology. (DEPRECATED)
Stars: ✭ 117 (+408.7%)
SIVI - Uses neural networks to build expressive hierarchical distributions: a variational method to accurately estimate posterior uncertainty and a fast, general method for Bayesian inference (ICML 2018).
Stars: ✭ 49 (+113.04%)
svae cf - [WSDM '19] Sequential Variational Autoencoders for Collaborative Filtering.
Stars: ✭ 38 (+65.22%)
molecular-VAE - Implementation of the paper "Automatic chemical design using a data-driven continuous representation of molecules".
Stars: ✭ 36 (+56.52%)
sinkhorn-label-allocation - Sinkhorn Label Allocation (SLA) is a label assignment method for semi-supervised self-training algorithms, described in full in this ICML 2021 paper: https://arxiv.org/abs/2102.08622.
Stars: ✭ 49 (+113.04%)
cucim - No description or website provided.
Stars: ✭ 218 (+847.83%)
pywsl - Python code for weakly-supervised learning.
Stars: ✭ 118 (+413.04%)
concept-based-xai - Library implementing state-of-the-art concept-based and disentanglement learning methods for explainable AI.
Stars: ✭ 41 (+78.26%)
VAE-Latent-Space-Explorer - Interactive exploration of an MNIST variational autoencoder's latent space with React and TensorFlow.js.
Stars: ✭ 30 (+30.43%)
GPQ - Generalized Product Quantization Network for Semi-supervised Image Retrieval (CVPR 2020).
Stars: ✭ 60 (+160.87%)
AC-VRNN - PyTorch code for the CVIU paper "AC-VRNN: Attentive Conditional-VRNN for Multi-Future Trajectory Prediction".
Stars: ✭ 21 (-8.7%)
auto-gfqg - Automatic gap-fill question generation.
Stars: ✭ 17 (-26.09%)
Fun-with-MNIST - Playing with MNIST: machine learning and generative models.
Stars: ✭ 23 (+0%)
SimPLE - Code for the paper "SimPLE: Similar Pseudo Label Exploitation for Semi-Supervised Classification".
Stars: ✭ 50 (+117.39%)
rankpruning - 🧹 Formerly for binary classification with noisy labels; replaced by cleanlab.
Stars: ✭ 81 (+252.17%)
ST-PlusPlus - [CVPR 2022] ST++: Make Self-training Work Better for Semi-supervised Semantic Segmentation.
Stars: ✭ 168 (+630.43%)
language-models - Keras implementations of three language models: a character-level RNN, a word-level RNN, and a Sentence VAE (Bowman, Vilnis et al., 2016); a KL-annealing sketch follows below.
Stars: ✭ 39 (+69.57%)
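Sentence VAEs in the Bowman et al. line are usually trained with KL cost annealing so the decoder cannot simply ignore the latent early in training; the paper uses a sigmoid schedule, and the linear variant below is a common simplification, shown here as an assumption rather than this repo's exact recipe:

```python
def kl_weight(step: int, anneal_steps: int = 10000) -> float:
    """Linearly ramp the KL weight from 0 to 1 over `anneal_steps` steps.

    Typical use: loss = recon_loss + kl_weight(step) * kl_divergence.
    """
    return min(1.0, step / anneal_steps)
```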
deep-blueberry - If you've always wanted to learn about deep learning but don't know where to start, you might have stumbled upon the right place!
Stars: ✭ 17 (-26.09%)
IAST-ECCV2020 - IAST: Instance Adaptive Self-training for Unsupervised Domain Adaptation (ECCV 2020). https://teacher.bupt.edu.cn/zhuchuang/en/index.htm
Stars: ✭ 84 (+265.22%)
AIML-Projects - Projects completed as part of Great Learning's PGP in Artificial Intelligence and Machine Learning.
Stars: ✭ 85 (+269.57%)