awesome-ebm: Collecting research materials on EBM/EBL (Energy-Based Models, Energy-Based Learning)
Stars: ✭ 143 (+110.29%)
delfi: Density estimation likelihood-free inference. No longer actively developed; see https://github.com/mackelab/sbi instead.
Stars: ✭ 66 (-2.94%)
SDEdit: PyTorch implementation of SDEdit: Image Synthesis and Editing with Stochastic Differential Equations
Stars: ✭ 394 (+479.41%)
WaveGrad2: PyTorch implementation of Google Brain's WaveGrad 2: Iterative Refinement for Text-to-Speech Synthesis
Stars: ✭ 55 (-19.12%)
kscore: Nonparametric Score Estimators (ICML 2020)
Stars: ✭ 32 (-52.94%)
Pylians3: Libraries to analyze numerical simulations (Python 3)
Stars: ✭ 35 (-48.53%)
KernelEstimator.jl: A Julia package for nonparametric density estimation and regression
Stars: ✭ 25 (-63.24%)
Gumbel-CRF: Implementation of the NeurIPS 2020 paper "Latent Template Induction with Gumbel-CRFs"
Stars: ✭ 51 (-25%)
score sde pytorch: PyTorch implementation of Score-Based Generative Modeling through Stochastic Differential Equations (ICLR 2021, Oral)
Stars: ✭ 755 (+1010.29%)
normalizing-flows: PyTorch implementation of normalizing flow models
Stars: ✭ 271 (+298.53%)
NanoFlow: PyTorch implementation of the paper "NanoFlow: Scalable Normalizing Flows with Sublinear Parameter Complexity" (NeurIPS 2020)
Stars: ✭ 63 (-7.35%)
naru: Neural Relation Understanding: neural cardinality estimators for tabular data
Stars: ✭ 76 (+11.76%)
soft-intro-vae-pytorch: [CVPR 2021 Oral] Official PyTorch implementation of Soft-IntroVAE from the paper "Soft-IntroVAE: Analyzing and Improving Introspective Variational Autoencoders"
Stars: ✭ 170 (+150%)
sdp: Deep nonparametric estimation of discrete conditional distributions via smoothed dyadic partitioning
Stars: ✭ 15 (-77.94%)
structured-volume-sampling: A clean-room implementation of Structured Volume Sampling by Bowles and Zimmermann in Unity
Stars: ✭ 27 (-60.29%)
score flow: Official code for "Maximum Likelihood Training of Score-Based Diffusion Models" (NeurIPS 2021, spotlight)
Stars: ✭ 49 (-27.94%)