dilinwang820 / adaptive-f-divergence

License: MIT
A TensorFlow implementation of the NIPS 2018 paper "Variational Inference with Tail-adaptive f-Divergence"

Programming Languages

python
shell

Projects that are alternatives of or similar to adaptive-f-divergence

Generalized-PixelVAE
PixelVAE with or without regularization
Stars: ✭ 64 (+220%)
Mutual labels:  generative-model, variational-inference
Deep Generative Models For Natural Language Processing
DGMs for NLP. A roadmap.
Stars: ✭ 185 (+825%)
Mutual labels:  generative-model, variational-inference
Generative models tutorial with demo
Generative Models Tutorial with Demo: Bayesian Classifier Sampling, Variational Auto Encoder (VAE), Generative Adversarial Networks (GANs), Popular GAN Architectures, Auto-Regressive Models, Important Generative Model Papers, Courses, etc.
Stars: ✭ 276 (+1280%)
Mutual labels:  generative-model, variational-inference
Variational Ladder Autoencoder
Implementation of VLAE
Stars: ✭ 196 (+880%)
Mutual labels:  generative-model, variational-inference
Awesome Vaes
A curated list of awesome work on VAEs, disentanglement, representation learning, and generative models.
Stars: ✭ 418 (+1990%)
Mutual labels:  generative-model, variational-inference
AI Learning Hub
AI Learning Hub for Machine Learning, Deep Learning, Computer Vision and Statistics
Stars: ✭ 53 (+165%)
Mutual labels:  generative-model, variational-inference
CondGen
Conditional Structure Generation through Graph Variational Generative Adversarial Nets, NeurIPS 2019.
Stars: ✭ 46 (+130%)
Mutual labels:  generative-model
simplegan
Tensorflow-based framework to ease training of generative models
Stars: ✭ 19 (-5%)
Mutual labels:  generative-model
mix-stage
Official Repository for the paper Style Transfer for Co-Speech Gesture Animation: A Multi-Speaker Conditional-Mixture Approach published in ECCV 2020 (https://arxiv.org/abs/2007.12553)
Stars: ✭ 22 (+10%)
Mutual labels:  generative-model
GatedPixelCNNPyTorch
PyTorch implementation of "Conditional Image Generation with PixelCNN Decoders" by van den Oord et al. 2016
Stars: ✭ 68 (+240%)
Mutual labels:  generative-model
Lr-LiVAE
Tensorflow implementation of Disentangling Latent Space for VAE by Label Relevant/Irrelevant Dimensions (CVPR 2019)
Stars: ✭ 29 (+45%)
Mutual labels:  generative-model
GDPP
Generator loss to reduce mode collapse and to improve the quality of generated samples.
Stars: ✭ 32 (+60%)
Mutual labels:  generative-model
denoising-diffusion-pytorch
Implementation of Denoising Diffusion Probabilistic Model in Pytorch
Stars: ✭ 2,313 (+11465%)
Mutual labels:  generative-model
coursera-gan-specialization
Programming assignments and quizzes from all courses within the GANs specialization offered by deeplearning.ai
Stars: ✭ 277 (+1285%)
Mutual labels:  generative-model
DUN
Code for "Depth Uncertainty in Neural Networks" (https://arxiv.org/abs/2006.08437)
Stars: ✭ 65 (+225%)
Mutual labels:  variational-inference
RAVE
Official implementation of the RAVE model: a Realtime Audio Variational autoEncoder
Stars: ✭ 564 (+2720%)
Mutual labels:  generative-model
vqvae-2
PyTorch implementation of VQ-VAE-2 from "Generating Diverse High-Fidelity Images with VQ-VAE-2"
Stars: ✭ 65 (+225%)
Mutual labels:  generative-model
ccube
Bayesian mixture models for estimating and clustering cancer cell fractions
Stars: ✭ 23 (+15%)
Mutual labels:  variational-inference
Gumbel-CRF
Implementation of NeurIPS 20 paper: Latent Template Induction with Gumbel-CRFs
Stars: ✭ 51 (+155%)
Mutual labels:  generative-model
rss
Regression with Summary Statistics.
Stars: ✭ 42 (+110%)
Mutual labels:  variational-inference

Variational Inference with Tail-adaptive f-Divergence

This repository contains a TensorFlow implementation of the tail-adaptive f-divergence, as presented in the following paper:

Dilin Wang, Hao Liu, Qiang Liu, "Variational Inference with Tail-adaptive f-Divergence", NIPS 2018.

Our Algorithm

Our tail-adaptive weights ($\gamma_f$ in Eqn 8 of the paper) can be computed with the following function:

def get_tail_adaptive_weights(self, l_p, l_q, beta=-1.):
    """Returns the tail-adaptive weights.

    Args:
        l_p: log p(x), 1-d tensor, log probability under p
        l_q: log q(x), 1-d tensor, log probability under q
        beta: magnitude, default -1

    Returns:
        Tail-adaptive weights (self-normalized to sum to 1).
    """
    diff = l_p - l_q
    diff -= tf.reduce_max(diff)  # subtract the max for numerical stability
    dx = tf.exp(diff)  # importance ratios p(x)/q(x), up to a constant
    # Pairwise comparisons: prob[i, j] = 1 if dx[i] > dx[j], else 0
    prob = tf.sign(tf.expand_dims(dx, 1) - tf.expand_dims(dx, 0))
    prob = tf.cast(tf.greater(prob, 0.5), tf.float32)
    # Empirical rank of each ratio: the fraction of samples it exceeds
    wx = tf.reduce_sum(prob, axis=1) / tf.cast(tf.size(l_p), tf.float32)
    wx = (1. - wx) ** beta  # beta = -1; or beta = -0.5

    wx /= tf.reduce_sum(wx)  # self-normalization
    return tf.stop_gradient(wx)
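
For reference, here is a minimal standalone usage sketch, not part of the repository: it assumes TensorFlow 2.x with eager execution, wraps the logic above in a hypothetical free function tail_adaptive_weights (dropping self), and applies it to toy log-probabilities.

import tensorflow as tf

# Hypothetical standalone variant of the method above (illustrative only).
def tail_adaptive_weights(l_p, l_q, beta=-1.):
    diff = l_p - l_q
    diff -= tf.reduce_max(diff)
    dx = tf.exp(diff)
    prob = tf.sign(tf.expand_dims(dx, 1) - tf.expand_dims(dx, 0))
    prob = tf.cast(tf.greater(prob, 0.5), tf.float32)
    wx = tf.reduce_sum(prob, axis=1) / tf.cast(tf.size(l_p), tf.float32)
    wx = (1. - wx) ** beta
    wx /= tf.reduce_sum(wx)
    return tf.stop_gradient(wx)

# Toy inputs: log-densities of five samples under p and q.
l_p = tf.constant([-1.2, -0.3, -2.5, -0.8, -4.0])
l_q = tf.constant([-1.0, -1.0, -1.0, -1.0, -1.0])
w = tail_adaptive_weights(l_p, l_q)
print(w.numpy())  # sums to 1; samples with larger p/q ratio get larger weight

Note that the weights depend only on the ranks of the importance ratios p(x)/q(x), not on their raw magnitudes.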

Citation

If you find tail-adaptive f-divergence useful in your research, please consider citing:

    @article{wang2018variational,
      title={Variational Inference with Tail-adaptive f-Divergence},
      author={Wang, Dilin and Liu, Hao and Liu, Qiang},
      journal={arXiv preprint arXiv:1810.11943},
      year={2018}
    }