cfml tools: My collection of causal inference algorithms built on top of accessible, simple, out-of-the-box ML methods, aimed at being explainable and useful in the business context
Stars: ✭ 24 (+84.62%)
haskell-vae: Learning about Haskell with Variational Autoencoders
Stars: ✭ 18 (+38.46%)
Tensorflow Mnist Cvae: Tensorflow implementation of a conditional variational auto-encoder for MNIST
Stars: ✭ 139 (+969.23%)
Awesome Vaes: A curated list of awesome work on VAEs, disentanglement, representation learning, and generative models.
Stars: ✭ 418 (+3115.38%)
lagvae: Lagrangian VAE
Stars: ✭ 27 (+107.69%)
causeinfer: Machine learning based causal inference/uplift in Python
Stars: ✭ 45 (+246.15%)
Kvae: Kalman Variational Auto-Encoder
Stars: ✭ 115 (+784.62%)
Dowhy: DoWhy is a Python library for causal inference that supports explicit modeling and testing of causal assumptions. DoWhy is based on a unified language for causal inference, combining causal graphical models and potential outcomes frameworks.
Stars: ✭ 3,480 (+26669.23%)
Variational Autoencoder: Variational autoencoder implemented in TensorFlow and PyTorch (including inverse autoregressive flow)
Stars: ✭ 807 (+6107.69%)
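Several of the entries above implement variational autoencoders; two ingredients they all share are the reparameterization trick and the closed-form Gaussian KL term. A minimal, dependency-free Python sketch of those two pieces (an illustration, not code from any of the listed repos):

```python
import math
import random


def reparameterize(mu, log_var, rng=random.Random(0)):
    # z = mu + sigma * eps: sampling stays differentiable w.r.t. mu and sigma,
    # because the randomness is pushed into the parameter-free noise eps.
    eps = rng.gauss(0.0, 1.0)
    return mu + math.exp(0.5 * log_var) * eps


def gaussian_kl(mu, log_var):
    # Closed-form KL( N(mu, sigma^2) || N(0, 1) ),
    # the regularizer in the ELBO of a standard Gaussian VAE.
    return 0.5 * (mu ** 2 + math.exp(log_var) - log_var - 1.0)


# The KL term vanishes exactly when the encoder matches the prior N(0, 1).
assert gaussian_kl(0.0, 0.0) == 0.0
```

In a real implementation (e.g. the TensorFlow/PyTorch repos listed here) `mu` and `log_var` are encoder outputs and the KL is summed over latent dimensions.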
SIVI: Uses a neural network to build an expressive hierarchical distribution; a variational method that accurately estimates posterior uncertainty; a fast and general method for Bayesian inference (ICML 2018)
Stars: ✭ 49 (+276.92%)
Generative models tutorial with demo: Generative Models Tutorial with Demo: Bayesian classifier sampling, Variational Auto Encoder (VAE), Generative Adversarial Networks (GANs), popular GAN architectures, auto-regressive models, important generative model papers, courses, etc.
Stars: ✭ 276 (+2023.08%)
normalizing-flows: PyTorch implementation of normalizing flow models
Stars: ✭ 271 (+1984.62%)
causal-ml: Must-read papers and resources related to causal inference and machine (deep) learning
Stars: ✭ 387 (+2876.92%)
Awesome-Neural-Logic: Awesome Neural Logic and Causality: MLN, NLRL, NLM, etc. Frontier topics in causal inference, neural logic, and strong-AI logical reasoning.
Stars: ✭ 106 (+715.38%)
vaegan: An implementation of VAEGAN (variational autoencoder + generative adversarial network).
Stars: ✭ 88 (+576.92%)
linguistic-style-transfer-pytorch: Implementation of "Disentangled Representation Learning for Non-Parallel Text Style Transfer" (ACL 2019) in PyTorch
Stars: ✭ 55 (+323.08%)
BayesByHypernet: Code for the paper "Implicit Weight Uncertainty in Neural Networks"
Stars: ✭ 63 (+384.62%)
FSCNMF: An implementation of "Fusing Structure and Content via Non-negative Matrix Factorization for Embedding Information Networks".
Stars: ✭ 16 (+23.08%)
noisy-K-FAC: Natural Gradient, Variational Inference
Stars: ✭ 29 (+123.08%)
continuous Bernoulli: C programs for simulating, transforming, and computing test statistics for the continuous Bernoulli distribution; the accompanying material also covers the continuous Binomial and continuous Trinomial distributions.
Stars: ✭ 22 (+69.23%)
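For context on the entry above: the continuous Bernoulli density is p(x | λ) = C(λ) λ^x (1−λ)^(1−x) on [0, 1], with closed-form normalizing constant C(λ) = 2 tanh⁻¹(1−2λ)/(1−2λ) and the limit C(1/2) = 2. The listed repo is in C; the following is an illustrative Python translation of the constant, not its code:

```python
import math


def cb_norm_const(lam):
    """Normalizing constant C(lam) of the continuous Bernoulli on [0, 1].

    Density: p(x | lam) = C(lam) * lam**x * (1 - lam)**(1 - x).
    """
    if not 0.0 < lam < 1.0:
        raise ValueError("lam must lie strictly in (0, 1)")
    if abs(lam - 0.5) < 1e-9:
        return 2.0  # limit of the formula as lam -> 1/2
    return 2.0 * math.atanh(1.0 - 2.0 * lam) / (1.0 - 2.0 * lam)
```

A quick sanity check is that C(λ) makes the density integrate to one, e.g. by a midpoint Riemann sum over [0, 1].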
ethereum-privacy: Profiling and Deanonymizing Ethereum Users
Stars: ✭ 37 (+184.62%)
CVAE Dial: CVAE_XGate model from the paper "Xu, Dusek, Konstas, Rieser. Better Conversations by Modeling, Filtering, and Optimizing for Coherence and Diversity"
Stars: ✭ 16 (+23.08%)
CausalityTools.jl: Algorithms for causal inference and the detection of dynamical coupling from time series, and for approximation of the transfer operator and invariant measures.
Stars: ✭ 45 (+246.15%)
adVAE: Implementation of "Self-Adversarial Variational Autoencoder with Gaussian Anomaly Prior Distribution for Anomaly Detection"
Stars: ✭ 17 (+30.77%)
prosper: A Python library for probabilistic sparse coding with non-standard priors and superpositions
Stars: ✭ 17 (+30.77%)
STEP: Spatial Temporal Graph Convolutional Networks for Emotion Perception from Gaits
Stars: ✭ 39 (+200%)
cibookex-r: Causal Inference: What If. R and Stata code for the exercises
Stars: ✭ 54 (+315.38%)
FEATHER: The reference implementation of FEATHER from the CIKM '20 paper "Characteristic Functions on Graphs: Birds of a Feather, from Statistical Descriptors to Parametric Models".
Stars: ✭ 34 (+161.54%)
ReactiveMP.jl: Julia package for automatic Bayesian inference on a factor graph with reactive message passing
Stars: ✭ 58 (+346.15%)
multimodal-vae-public: A PyTorch implementation of "Multimodal Generative Models for Scalable Weakly-Supervised Learning" (https://arxiv.org/abs/1802.05335)
Stars: ✭ 98 (+653.85%)
active-inference: A toy model of Friston's active inference in Tensorflow
Stars: ✭ 36 (+176.92%)
HEER: Easing Embedding Learning by Comprehensive Transcription of Heterogeneous Information Networks (KDD '18)
Stars: ✭ 60 (+361.54%)
drtmle: Nonparametric estimators of the average treatment effect with doubly-robust confidence intervals and hypothesis tests
Stars: ✭ 14 (+7.69%)
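The "doubly robust" idea behind entries like drtmle and policytree can be illustrated with the classic AIPW (augmented inverse probability weighting) estimator of the average treatment effect: the estimate stays consistent if either the propensity model or the outcome model is correct. This is a generic toy sketch, not drtmle's API, and the field names are invented for the example:

```python
def aipw_ate(rows):
    """Doubly robust (AIPW) estimate of the average treatment effect.

    Each row is a dict with (hypothetical field names):
      y  - observed outcome
      a  - treatment indicator (0 or 1)
      e  - estimated propensity score P(A = 1 | X)
      m1 - estimated outcome model E[Y | A = 1, X]
      m0 - estimated outcome model E[Y | A = 0, X]
    """
    total = 0.0
    for r in rows:
        total += (
            r["m1"] - r["m0"]                                   # outcome-model part
            + r["a"] * (r["y"] - r["m1"]) / r["e"]              # IPW correction, treated
            - (1 - r["a"]) * (r["y"] - r["m0"]) / (1 - r["e"])  # IPW correction, control
        )
    return total / len(rows)
```

When the outcome models fit the observed outcomes exactly, the correction terms vanish and the estimate reduces to the mean of m1 − m0; packages like drtmle add the targeted-learning machinery and valid inference on top of this core.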
vae-torch: Variational autoencoder for anomaly detection (in PyTorch).
Stars: ✭ 38 (+192.31%)
sqair: Implementation of Sequential Attend, Infer, Repeat (SQAIR)
Stars: ✭ 96 (+638.46%)
TriDNR: Tri-Party Deep Network Representation (IJCAI-16)
Stars: ✭ 72 (+453.85%)
probai-2021-pyro: Repo for the tutorials of Day 1 to Day 3 of the Nordic Probabilistic AI School 2021 (https://probabilistic.ai/)
Stars: ✭ 45 (+246.15%)
PyLDA: A Latent Dirichlet Allocation implementation in Python.
Stars: ✭ 51 (+292.31%)
resolutions-2019: A list of data mining and machine learning papers that I implemented in 2019.
Stars: ✭ 19 (+46.15%)
causal-learn: Causal Discovery for Python. A translation and extension of the Tetrad Java code.
Stars: ✭ 428 (+3192.31%)
adaptive-f-divergence: A TensorFlow implementation of the NIPS 2018 paper "Variational Inference with Tail-adaptive f-Divergence"
Stars: ✭ 20 (+53.85%)
SelSum: Abstractive opinion summarization system (SelSum) and the largest dataset of Amazon product summaries (AmaSum). EMNLP 2021 conference paper.
Stars: ✭ 36 (+176.92%)
vae-pytorch: AE and VAE playground in PyTorch
Stars: ✭ 53 (+307.69%)
doubleml-for-r: DoubleML - Double Machine Learning in R
Stars: ✭ 58 (+346.15%)
policytree: Policy learning via doubly robust empirical welfare maximization over trees
Stars: ✭ 59 (+353.85%)
drnet: 💉📈 Dose response networks (DRNets) are a method for learning to estimate individual dose-response curves for multiple parametric treatments from observational data using neural networks.
Stars: ✭ 48 (+269.23%)
playing with vae: Comparing FC VAE / FCN VAE / PCA / UMAP on MNIST / FMNIST
Stars: ✭ 53 (+307.69%)
rss: Regression with Summary Statistics.
Stars: ✭ 42 (+223.08%)
VAE-Gumbel-Softmax: An implementation of a variational autoencoder using the Gumbel-Softmax reparametrization trick (ICLR 2017) in TensorFlow (tested on r1.5, CPU and GPU)
Stars: ✭ 66 (+407.69%)
doubleml-for-py: DoubleML - Double Machine Learning in Python
Stars: ✭ 129 (+892.31%)
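The "double" in Double Machine Learning refers to fitting two nuisance models, one for the outcome and one for the treatment, and then working with their residuals. A toy partialling-out sketch in plain Python with a single confounder (this is not the DoubleML API; the real packages swap the OLS nuisance fits below for arbitrary ML learners and add cross-fitting):

```python
def ols_slope(x, y):
    # Least-squares slope of y on a single regressor x (means are subtracted,
    # so this is equivalent to fitting with an intercept).
    mx = sum(x) / len(x)
    my = sum(y) / len(y)
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den


def partialling_out(y, d, x):
    """Effect of treatment d on outcome y, controlling for confounder x.

    Residualize y on x and d on x, then regress the y-residuals on the
    d-residuals (the Frisch-Waugh-Lovell partialling-out idea that the
    DoubleML score generalizes).
    """
    by, bd = ols_slope(x, y), ols_slope(x, d)
    my, md, mx = sum(y) / len(y), sum(d) / len(d), sum(x) / len(x)
    ry = [yi - my - by * (xi - mx) for yi, xi in zip(y, x)]
    rd = [di - md - bd * (xi - mx) for di, xi in zip(d, x)]
    return ols_slope(rd, ry)
```

On exactly linear data, e.g. y = 2·d + 3·x, this recovers the treatment coefficient 2 regardless of how strongly d and x are correlated; the residualization is what makes the final regression robust to regularization bias in the nuisance fits.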
AI Learning Hub: AI Learning Hub for Machine Learning, Deep Learning, Computer Vision and Statistics
Stars: ✭ 53 (+307.69%)
DUN: Code for "Depth Uncertainty in Neural Networks" (https://arxiv.org/abs/2006.08437)
Stars: ✭ 65 (+400%)