
miskcoo / kscore

License: MIT
Nonparametric Score Estimators, ICML 2020


Projects similar to kscore

Gumbel-CRF
Implementation of NeurIPS 20 paper: Latent Template Induction with Gumbel-CRFs
Stars: ✭ 51 (+59.38%)
Mutual labels:  density-estimation, gradient-estimator
Pylians3
Libraries to analyze numerical simulations (python3)
Stars: ✭ 35 (+9.38%)
Mutual labels:  density-estimation
randomforest-density-python
Random Forests for Density Estimation in Python
Stars: ✭ 24 (-25%)
Mutual labels:  density-estimation
Online-Category-Learning
ML algorithm for real-time classification
Stars: ✭ 67 (+109.38%)
Mutual labels:  density-estimation
KernelEstimator.jl
The julia package for nonparametric density estimate and regression
Stars: ✭ 25 (-21.87%)
Mutual labels:  density-estimation
binarygan
Code for "Training Generative Adversarial Networks with Binary Neurons by End-to-end Backpropagation"
Stars: ✭ 25 (-21.87%)
Mutual labels:  gradient-estimator
normalizing-flows
PyTorch implementation of normalizing flow models
Stars: ✭ 271 (+746.88%)
Mutual labels:  density-estimation
NanoFlow
PyTorch implementation of the paper "NanoFlow: Scalable Normalizing Flows with Sublinear Parameter Complexity." (NeurIPS 2020)
Stars: ✭ 63 (+96.88%)
Mutual labels:  density-estimation
naru
Neural Relation Understanding: neural cardinality estimators for tabular data
Stars: ✭ 76 (+137.5%)
Mutual labels:  density-estimation
soft-intro-vae-pytorch
[CVPR 2021 Oral] Official PyTorch implementation of Soft-IntroVAE from the paper "Soft-IntroVAE: Analyzing and Improving Introspective Variational Autoencoders"
Stars: ✭ 170 (+431.25%)
Mutual labels:  density-estimation
sdp
Deep nonparametric estimation of discrete conditional distributions via smoothed dyadic partitioning
Stars: ✭ 15 (-53.12%)
Mutual labels:  density-estimation
structured-volume-sampling
A clean room implementation of Structured Volume Sampling by Bowles and Zimmermann in Unity
Stars: ✭ 27 (-15.62%)
Mutual labels:  density-estimation
AverageShiftedHistograms.jl
⚡ Lightning fast density estimation in Julia ⚡
Stars: ✭ 52 (+62.5%)
Mutual labels:  density-estimation
gradient-boosted-normalizing-flows
We got a stew going!
Stars: ✭ 20 (-37.5%)
Mutual labels:  density-estimation
sliced score matching
Code for reproducing results in the sliced score matching paper (UAI 2019)
Stars: ✭ 68 (+112.5%)
Mutual labels:  density-estimation
delfi
Density-estimation likelihood-free inference. No longer actively developed; see https://github.com/mackelab/sbi instead
Stars: ✭ 66 (+106.25%)
Mutual labels:  density-estimation

Nonparametric Score Estimators

Yuhao Zhou, Jiaxin Shi, Jun Zhu. https://arxiv.org/abs/2005.10099

Toy Example

python -m examples.spiral --lam=1.0e-5 --kernel=curlfree_imq --estimator=nu

Dependencies

TensorFlow >= 1.14.0

Usage

  • Create a score estimator

    from kscore.estimators import *
    from kscore.kernels import *
    
    # Tikhonov regularization (Theorem 3.1), equivalent to KEF (Example 3.5)
    kef_estimator = Tikhonov(lam=0.0001, use_cg=False, kernel=CurlFreeIMQ())
    
    # Tikhonov regularization + Conjugate Gradient (KEF-CG, Example 3.8)
    kefcg_estimator = Tikhonov(lam=0.0001, use_cg=True, kernel=CurlFreeIMQ())
    
    # Tikhonov regularization + Nystrom approximation (Appendix C.1), 
    # equivalent to NKEF (Example C.1) using 60% samples
    nkef_estimator = Tikhonov(lam=0.0001, use_cg=False, subsample_rate=0.6, kernel=CurlFreeIMQ())
    
    # Tikhonov regularization + Nystrom approximation + Conjugate Gradient
    nkefcg_estimator = Tikhonov(lam=0.0001, use_cg=True, subsample_rate=0.6, kernel=CurlFreeIMQ())
    
    # Landweber iteration (Theorem 3.4); specify either lam or iternum
    landweber_estimator = Landweber(lam=0.00001, kernel=CurlFreeIMQ())
    landweber_estimator = Landweber(iternum=100, kernel=CurlFreeIMQ())
    
    # nu-method (Example C.4); specify either lam or iternum
    nu_estimator = NuMethod(lam=0.00001, kernel=CurlFreeIMQ())
    nu_estimator = NuMethod(iternum=100, kernel=CurlFreeIMQ())
    
    # Spectral cut-off regularization (Theorem 3.2), 
    # equivalent to SSGE (Example 3.6) using 90% eigenvalues
    ssge_estimator = SpectralCutoff(keep_rate=0.9, kernel=DiagonalIMQ())
    
    # Original Stein estimator
    stein_estimator = Stein(lam=0.001)
  • Fit the score estimator using samples

    # manually specify the hyperparameter
    estimator.fit(samples, kernel_hyperparams=kernel_width)
    
    # automatically choose the hyperparameter (using the median trick)
    estimator.fit(samples)
  • Predict the score

    gradient = estimator.compute_gradients(x)
  • Predict the energy (unnormalized log-density)

    log_p = estimator.compute_energy(x)   # only for curl-free kernels
  • Construct other curl-free kernels (see kscore/kernels/curlfree_gaussian.py)
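The automatic hyperparameter choice in `estimator.fit(samples)` relies on the median trick: set the kernel width to the median of the pairwise distances between samples. A minimal NumPy sketch of that heuristic (an illustration of the idea, not kscore's actual implementation; the function name `median_heuristic` is made up here):

```python
import numpy as np

def median_heuristic(samples):
    """Median of pairwise Euclidean distances, a common kernel-width choice."""
    samples = np.asarray(samples, dtype=float)
    # Pairwise difference tensor via broadcasting: shape (n, n, d)
    diffs = samples[:, None, :] - samples[None, :, :]
    # Euclidean distance matrix: shape (n, n)
    dists = np.linalg.norm(diffs, axis=-1)
    # Keep only the strict upper triangle (each distinct pair once)
    return np.median(dists[np.triu_indices_from(dists, k=1)])
```

For example, `median_heuristic([[0.0], [1.0], [3.0]])` computes the pairwise distances 1, 3, and 2 and returns their median, 2.0.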
