AI Learning Hub - AI Learning Hub for Machine Learning, Deep Learning, Computer Vision and Statistics
Stars: ✭ 53 (-17.19%)
adaptive-f-divergence - A TensorFlow implementation of the NIPS 2018 paper "Variational Inference with Tail-adaptive f-Divergence"
Stars: ✭ 20 (-68.75%)
Generative models tutorial with demo - Generative Models Tutorial with Demo: Bayesian Classifier Sampling, Variational Autoencoder (VAE), Generative Adversarial Networks (GANs), Popular GAN Architectures, Auto-Regressive Models, Important Generative Model Papers, Courses, etc.
Stars: ✭ 276 (+331.25%)
Awesome Vaes - A curated list of awesome work on VAEs, disentanglement, representation learning, and generative models.
Stars: ✭ 418 (+553.13%)
gcWGAN - Guided Conditional Wasserstein GAN for De Novo Protein Design
Stars: ✭ 38 (-40.62%)
vqvae-2 - PyTorch implementation of VQ-VAE-2 from "Generating Diverse High-Fidelity Images with VQ-VAE-2"
Stars: ✭ 65 (+1.56%)
DUN - Code for "Depth Uncertainty in Neural Networks" (https://arxiv.org/abs/2006.08437)
Stars: ✭ 65 (+1.56%)
probai-2021-pyro - Repo for the tutorials of Days 1-3 of the Nordic Probabilistic AI School 2021 (https://probabilistic.ai/)
Stars: ✭ 45 (-29.69%)
Gumbel-CRF - Implementation of the NeurIPS 2020 paper "Latent Template Induction with Gumbel-CRFs"
Stars: ✭ 51 (-20.31%)
artificial neural networks - A collection of methods and models for various architectures of artificial neural networks
Stars: ✭ 40 (-37.5%)
Lr-LiVAE - TensorFlow implementation of "Disentangling Latent Space for VAE by Label Relevant/Irrelevant Dimensions" (CVPR 2019)
Stars: ✭ 29 (-54.69%)
noisy-K-FAC - Natural Gradient, Variational Inference
Stars: ✭ 29 (-54.69%)
rss - Regression with Summary Statistics.
Stars: ✭ 42 (-34.37%)
py-msa-kdenlive - Python script to load a Kdenlive (OSS NLE video editor) project file and conform the edit on video or NumPy arrays.
Stars: ✭ 25 (-60.94%)
pytorch-GAN - My PyTorch implementation of a GAN
Stars: ✭ 12 (-81.25%)
haskell-vae - Learning about Haskell with Variational Autoencoders
Stars: ✭ 18 (-71.87%)
latent-pose-reenactment - The authors' implementation of the "Neural Head Reenactment with Latent Pose Descriptors" (CVPR 2020) paper.
Stars: ✭ 132 (+106.25%)
Generalization-Causality - Reading notes on research in domain generalization, domain adaptation, causality, robustness, prompts, optimization, and generative models
Stars: ✭ 482 (+653.13%)
coursera-gan-specialization - Programming assignments and quizzes from all courses within the GANs Specialization offered by deeplearning.ai
Stars: ✭ 277 (+332.81%)
RAVE - Official implementation of the RAVE model: a Realtime Audio Variational autoEncoder
Stars: ✭ 564 (+781.25%)
ReactiveMP.jl - Julia package for automatic Bayesian inference on a factor graph with reactive message passing
Stars: ✭ 58 (-9.37%)
ccube - Bayesian mixture models for estimating and clustering cancer cell fractions
Stars: ✭ 23 (-64.06%)
PyLDA - A Latent Dirichlet Allocation implementation in Python.
Stars: ✭ 51 (-20.31%)
gans-in-action - Code repository for "GANs in Action" (Hanbit Media, 2020).
Stars: ✭ 29 (-54.69%)
graph-nvp - GraphNVP: An Invertible Flow Model for Generating Molecular Graphs
Stars: ✭ 69 (+7.81%)
SelSum - Abstractive opinion summarization system (SelSum) and the largest dataset of Amazon product summaries (AmaSum). EMNLP 2021 conference paper.
Stars: ✭ 36 (-43.75%)
VINF - Repository for a DTU special course focusing on Variational Inference using Normalizing Flows (VINF), supervised by Michael Riis Andersen
Stars: ✭ 23 (-64.06%)
GraphCNN-GAN - Graph-convolutional GAN for point cloud generation. Code from the ICLR 2019 paper "Learning Localized Generative Models for 3D Point Clouds via Graph Convolution"
Stars: ✭ 50 (-21.87%)
GDPP - Generator loss to reduce mode collapse and improve the quality of generated samples.
Stars: ✭ 32 (-50%)
GrabNet - GrabNet: a generative model that generates realistic 3D hands grasping unseen objects (ECCV 2020)
Stars: ✭ 146 (+128.13%)
vae-torch - Variational autoencoder for anomaly detection (in PyTorch).
Stars: ✭ 38 (-40.62%)
simplegan - TensorFlow-based framework to ease training of generative models
Stars: ✭ 19 (-70.31%)
BayesByHypernet - Code for the paper "Implicit Weight Uncertainty in Neural Networks"
Stars: ✭ 63 (-1.56%)
continuous-time-flow-process - PyTorch code for "Modeling Continuous Stochastic Processes with Dynamic Normalizing Flows" (NeurIPS 2020)
Stars: ✭ 34 (-46.87%)
MidiTok - A convenient MIDI / symbolic music tokenizer for Deep Learning networks, with multiple strategies 🎶
Stars: ✭ 180 (+181.25%)
Cross-Speaker-Emotion-Transfer - PyTorch implementation of ByteDance's "Cross-Speaker Emotion Transfer Based on Speaker Condition Layer Normalization and Semi-Supervised Training in Text-To-Speech"
Stars: ✭ 107 (+67.19%)
char-VAE - Inspired by the neural style algorithm in computer vision, we propose a high-level language model with the aim of adapting linguistic style.
Stars: ✭ 18 (-71.87%)
CondGen - Conditional Structure Generation through Graph Variational Generative Adversarial Nets (NeurIPS 2019).
Stars: ✭ 46 (-28.12%)
style-vae - Implementation of a VAE and Style-GAN architecture achieving state-of-the-art reconstruction
Stars: ✭ 25 (-60.94%)
mix-stage - Official repository for the paper "Style Transfer for Co-Speech Gesture Animation: A Multi-Speaker Conditional-Mixture Approach", ECCV 2020 (https://arxiv.org/abs/2007.12553)
Stars: ✭ 22 (-65.62%)
prosper - A Python Library for Probabilistic Sparse Coding with Non-Standard Priors and Superpositions
Stars: ✭ 17 (-73.44%)
causal-semantic-generative-model - Code for the Causal Semantic Generative model (CSG) proposed in "Learning Causal Semantic Representation for Out-of-Distribution Prediction" (NeurIPS 2021)
Stars: ✭ 51 (-20.31%)
GatedPixelCNNPyTorch - PyTorch implementation of "Conditional Image Generation with PixelCNN Decoders" by van den Oord et al., 2016
Stars: ✭ 68 (+6.25%)
EVE - Official repository for the paper "Large-scale clinical interpretation of genetic variants using evolutionary data and deep learning". A joint collaboration between the Marks lab and the OATML group.
Stars: ✭ 37 (-42.19%)
cygen - Code for CyGen, the novel generative modeling framework proposed in "On the Generative Utility of Cyclic Conditionals" (NeurIPS 2021)
Stars: ✭ 44 (-31.25%)
ShapeFormer - Official repository for the ShapeFormer project
Stars: ✭ 97 (+51.56%)
MMD-GAN - Improving MMD-GAN training with a repulsive loss function
Stars: ✭ 82 (+28.13%)
vireo - Demultiplexing pooled scRNA-seq data with or without a genotype reference
Stars: ✭ 34 (-46.87%)
active-inference - A toy model of Friston's active inference in TensorFlow
Stars: ✭ 36 (-43.75%)
sqair - Implementation of Sequential Attend, Infer, Repeat (SQAIR)
Stars: ✭ 96 (+50%)