
wohlert / Semi Supervised Pytorch

License: MIT
Implementations of various VAE-based semi-supervised and generative models in PyTorch


Semi-supervised PyTorch

A PyTorch-based package containing useful models for modern deep semi-supervised learning and deep generative models. Want to jump right into it? Look into the notebooks.

Latest additions

2018.04.17 - The Gumbel softmax notebook has been added to show how you can use discrete latent variables in VAEs.
2018.02.28 - The β-VAE notebook was added to show how VAEs can learn disentangled representations.
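The Gumbel softmax trick mentioned above can be sketched in a few lines: Gumbel noise is added to the category logits and a temperature-controlled softmax produces a differentiable relaxation of a one-hot categorical sample. This is a minimal stand-alone sketch (the notebook's own code may differ; PyTorch also ships a built-in `torch.nn.functional.gumbel_softmax`):

```python
import torch
import torch.nn.functional as F

def gumbel_softmax_sample(logits, tau=1.0):
    # Sample Gumbel(0, 1) noise via the inverse-CDF trick, add it to the
    # logits, and relax the resulting argmax into a softmax at temperature tau.
    gumbel = -torch.log(-torch.log(torch.rand_like(logits) + 1e-20) + 1e-20)
    return F.softmax((logits + gumbel) / tau, dim=-1)

logits = torch.zeros(4, 10)            # batch of 4, 10 categories
z = gumbel_softmax_sample(logits, tau=0.5)
print(z.shape)                         # torch.Size([4, 10]); each row sums to 1
```

As `tau` approaches zero the samples approach exact one-hot vectors, at the cost of higher-variance gradients.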

What is semi-supervised learning?

Semi-supervised learning tries to bridge the gap between supervised and unsupervised learning by learning from both labelled and unlabelled data.

Semi-supervised learning is typically applied in areas where data is easy to obtain but labelling is expensive. Normally, one would either use an unsupervised method or train on just the few labelled examples; both approaches are likely to yield poor results.
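The core idea of learning from both kinds of data can be illustrated with a minimal pseudo-labelling sketch (a deliberately simple baseline, not the VAE-based objective this package implements): a supervised cross-entropy term on the labelled batch is combined with a loss on confidently predicted unlabelled examples. The `model` and threshold below are hypothetical placeholders.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical classifier; any model producing class logits would do.
model = nn.Sequential(nn.Flatten(), nn.Linear(784, 10))

def semi_supervised_step(x_lab, y_lab, x_unlab, threshold=0.95):
    # Supervised term: ordinary cross-entropy on the few labelled examples.
    sup_loss = F.cross_entropy(model(x_lab), y_lab)
    # Unsupervised term: pseudo-label the unlabelled batch and keep
    # only the predictions the model is already confident about.
    with torch.no_grad():
        probs = F.softmax(model(x_unlab), dim=-1)
        conf, pseudo = probs.max(dim=-1)
        mask = conf > threshold
    if mask.any():
        unsup_loss = F.cross_entropy(model(x_unlab[mask]), pseudo[mask])
    else:
        unsup_loss = torch.tensor(0.0)
    return sup_loss + unsup_loss

loss = semi_supervised_step(torch.randn(8, 1, 28, 28),
                            torch.randint(0, 10, (8,)),
                            torch.randn(32, 1, 28, 28))
```

The models in this repository replace the pseudo-label term with a generative (variational) objective on the unlabelled data, but the overall structure of the training step is the same.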

The current state-of-the-art method in semi-supervised learning achieves an accuracy of over 99% on the MNIST dataset using just 10 labelled examples per class.

Conditional generation

Most semi-supervised models simultaneously train an inference network and a generator network. This means that it is not only possible to query these models for classification, but also to generate new data from the trained model. By separating label information from the latent code, one can generate new samples of a given digit, as shown in the image below from Kingma 2014.

Conditional generation of samples
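Conditional generation of this kind can be sketched as follows: the decoder receives a latent code concatenated with a one-hot label, so fixing the label and resampling the latent code yields the same digit in varying styles. The decoder architecture and dimensions below are illustrative assumptions, not this package's actual layers.

```python
import torch
import torch.nn as nn

latent_dim, n_classes = 32, 10

# Hypothetical decoder: maps (latent code, one-hot label) to a
# 784-dim MNIST image, in the spirit of Kingma et al.'s M2 model.
decoder = nn.Sequential(
    nn.Linear(latent_dim + n_classes, 256), nn.ReLU(),
    nn.Linear(256, 784), nn.Sigmoid(),
)

def generate(digit, n_samples=8):
    # Fix the label, vary the latent code: every sample shows the
    # requested digit, each rendered in a different "style".
    z = torch.randn(n_samples, latent_dim)
    y = torch.zeros(n_samples, n_classes)
    y[:, digit] = 1.0
    return decoder(torch.cat([z, y], dim=-1))

imgs = generate(digit=7)
print(imgs.shape)  # torch.Size([8, 784])
```

With a trained decoder, reshaping each row to 28×28 gives the grid of conditional samples shown in the figure.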

Implemented models and methods:
