
SamsungLabs / semi-supervised-NFs

License: BSD-2-Clause
Code for the paper Semi-Conditional Normalizing Flows for Semi-Supervised Learning

Programming Languages

python

Projects that are alternatives of or similar to semi-supervised-NFs

- constant-memory-waveglow: PyTorch implementation of NVIDIA WaveGlow with constant memory cost. (Stars: 36; label: normalizing-flows)
- DeepAtlas: Joint semi-supervised learning of image registration and segmentation. (Stars: 38; label: semi-supervised-learning)
- collective-classification-weka-package: Semi-supervised learning and collective classification. (Stars: 20; label: semi-supervised-learning)
- DualStudent: Code for the ICCV 2019 paper "Dual Student: Breaking the Limits of the Teacher in Semi-Supervised Learning". (Stars: 106; label: semi-supervised-learning)
- cflow-ad: Official PyTorch code for the WACV 2022 paper "CFLOW-AD: Real-Time Unsupervised Anomaly Detection with Localization via Conditional Normalizing Flows". (Stars: 138; label: normalizing-flows)
- NeuroAI: NeuroAI-UW seminar, a regular weekly seminar for the UW community, organized by the NeuroAI Shlizerman Lab. (Stars: 36; label: icml)
- InvertibleNetworks.jl: A Julia framework for invertible neural networks. (Stars: 86; label: normalizing-flows)
- SemiSeg-AEL: Semi-supervised semantic segmentation via adaptive equalization learning, NeurIPS 2021 (Spotlight). (Stars: 79; label: semi-supervised-learning)
- introduction to normalizing flows: Jupyter notebook accompanying "Going with the Flow: An Introduction to Normalizing Flows". (Stars: 21; label: normalizing-flows)
- normalizing-flows: Implementations of normalizing flows in Python and TensorFlow. (Stars: 15; label: normalizing-flows)
- ifl-tpp: Implementation of "Intensity-Free Learning of Temporal Point Processes" (Spotlight @ ICLR 2020). (Stars: 58; label: normalizing-flows)
- ssdg-benchmark: Benchmarks for semi-supervised domain generalization. (Stars: 46; label: semi-supervised-learning)
- semantic-parsing-dual: Source code and data for the ACL 2019 long paper "Semantic Parsing with Dual Learning". (Stars: 17; label: semi-supervised-learning)
- pyprophet: PyProphet: semi-supervised learning and scoring of OpenSWATH results. (Stars: 23; label: semi-supervised-learning)
- deviation-network: Source code for the KDD 2019 paper "Deep Anomaly Detection with Deviation Networks"; weakly/partially supervised and few-shot anomaly detection. (Stars: 94; label: semi-supervised-learning)
- probnmn-clevr: Code for the ICML 2019 paper "Probabilistic Neural-Symbolic Models for Interpretable Visual Question Answering" [long oral]. (Stars: 63; label: icml)
- JCLAL: A general-purpose framework for active learning, developed in Java. (Stars: 22; label: semi-supervised-learning)
- Context-Aware-Consistency: Semi-supervised semantic segmentation with directional context-aware consistency (CVPR 2021). (Stars: 121; label: semi-supervised-learning)
- ganbert-pytorch: Enhancing BERT training with semi-supervised generative adversarial networks in PyTorch/HuggingFace. (Stars: 60; label: semi-supervised-learning)
- Semi-Supervised-Learning-GAN: Semi-supervised learning GAN. (Stars: 72; label: semi-supervised-learning)

Semi-Supervised Flows PyTorch

Authors: Andrei Atanov, Alexandra Volokhova, Arsenii Ashukha, Ivan Sosnovik, Dmitry Vetrov

This repo contains the code for our INNF workshop paper "Semi-Conditional Normalizing Flows for Semi-Supervised Learning".

Abstract: This paper proposes a semi-conditional normalizing flow model for semi-supervised learning. The model uses both labelled and unlabelled data to learn an explicit model of the joint distribution over objects and labels. The semi-conditional architecture allows us to efficiently compute the value and gradients of the marginal likelihood for unlabelled objects. The conditional part of the model is based on a proposed conditional coupling layer. We demonstrate the performance of the model on the semi-supervised classification problem on different datasets. The model outperforms a baseline approach based on variational auto-encoders on the MNIST dataset.
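The conditional coupling layer mentioned in the abstract can be illustrated with a toy, pure-Python sketch. This is only an illustration of the general idea, not the repo's implementation: the conditioner here is a hand-wired affine function of the label y with made-up parameters w and b, where the real model uses a neural network.

```python
import math

def conditional_coupling_forward(x, y, w, b):
    """Toy conditional affine coupling layer (illustrative sketch).

    Splits x into halves (x1, x2); x1 passes through unchanged, while
    x2 is scaled and shifted by functions of x1 *and* the label y.
    Returns the transformed vector and log|det Jacobian|.
    """
    d = len(x) // 2
    x1, x2 = x[:d], x[d:]
    # Stand-in for the conditioner network: scale/shift depend on x1 and y.
    log_s = [math.tanh(w * xi + b * y) for xi in x1]
    t = [w * xi - b * y for xi in x1]
    z2 = [xi * math.exp(s) + ti for xi, s, ti in zip(x2, log_s, t)]
    return x1 + z2, sum(log_s)

def conditional_coupling_inverse(z, y, w, b):
    """Exact inverse: recompute scale/shift from the unchanged half z1."""
    d = len(z) // 2
    z1, z2 = z[:d], z[d:]
    log_s = [math.tanh(w * xi + b * y) for xi in z1]
    t = [w * xi - b * y for xi in z1]
    x2 = [(zi - ti) * math.exp(-s) for zi, s, ti in zip(z2, log_s, t)]
    return z1 + x2

# Invertibility check for a fixed label y: x is recovered exactly.
x = [0.5, -1.2, 2.0, 0.3]
z, log_det = conditional_coupling_forward(x, y=1, w=0.7, b=0.4)
x_rec = conditional_coupling_inverse(z, y=1, w=0.7, b=0.4)
print(max(abs(a - c) for a, c in zip(x, x_rec)))  # ~0 (exact up to float error)
```

Because the inverse only needs the unchanged half z1 and the label, the layer stays invertible for every y, which is what makes it usable inside a conditional flow.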

Poster

Semi-Supervised MNIST classification

Train a Semi-Conditional Normalizing Flow on MNIST with 100 labelled examples:

python train-flow-ssl.py --config config.yaml

You can then find the logs at <where-script-launched>/logs/exman-train-flow-ssl.py/runs/<id-date>
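The "efficient marginal likelihood" claim in the abstract comes down to a logsumexp over the (small, finite) set of class labels: for an unlabelled example, log p(x) is the log-sum of the joint densities log p(x, y) over all ten MNIST labels. A minimal stdlib sketch of that marginalisation, where the `log_joint` callable is a hypothetical stand-in for the model's joint log-density (not the repo's actual API):

```python
import math

def marginal_log_likelihood(log_joint, num_classes):
    """log p(x) = logsumexp over y of log p(x, y), for a finite label set.

    `log_joint(y)` returns log p(x, y) for a fixed x; any label prior is
    assumed to be folded into the joint density already.
    """
    vals = [log_joint(y) for y in range(num_classes)]
    m = max(vals)  # subtract the max for numerical stability
    return m + math.log(sum(math.exp(v - m) for v in vals))

# Sanity check: ten equally likely joint terms p(x, y) = 0.1 sum to p(x) = 1.
print(marginal_log_likelihood(lambda y: math.log(0.1), 10))  # ≈ 0.0
```

In the semi-conditional design only the small conditional part of the flow depends on y, so the expensive unconditional part is evaluated once per example rather than once per label.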

For convenience, we also provide pretrained weights at pretrained/model.torch; use the --pretrained flag to load them.

Credits

Citation

If you find this code useful, please cite our paper:

@article{atanov2019semi,
  title={Semi-conditional normalizing flows for semi-supervised learning},
  author={Atanov, Andrei and Volokhova, Alexandra and Ashukha, Arsenii and Sosnovik, Ivan and Vetrov, Dmitry},
  journal={arXiv preprint arXiv:1905.00505},
  year={2019}
}