
YU1ut / MixMatch-pytorch

License: MIT
Code for "MixMatch: A Holistic Approach to Semi-Supervised Learning"


Projects that are alternatives to or similar to MixMatch-pytorch

spear
SPEAR: Programmatically label and build training data quickly.
Stars: ✭ 81 (-78.57%)
Mutual labels:  semi-supervised-learning
SSL CR Histo
Official code for "Self-Supervised driven Consistency Training for Annotation Efficient Histopathology Image Analysis" Published in Medical Image Analysis (MedIA) Journal, Oct, 2021.
Stars: ✭ 32 (-91.53%)
Mutual labels:  semi-supervised-learning
Fixmatch Pytorch
Unofficial PyTorch implementation of "FixMatch: Simplifying Semi-Supervised Learning with Consistency and Confidence"
Stars: ✭ 259 (-31.48%)
Mutual labels:  semi-supervised-learning
Pseudo-Label-Keras
Pseudo-Label: Semi-Supervised Learning on CIFAR-10 in Keras
Stars: ✭ 36 (-90.48%)
Mutual labels:  semi-supervised-learning
Temporal-Ensembling-for-Semi-Supervised-Learning
Implementation of Temporal Ensembling for Semi-Supervised Learning by Laine et al. with tensorflow eager execution
Stars: ✭ 49 (-87.04%)
Mutual labels:  semi-supervised-learning
DiGCN
Implement of DiGCN, NeurIPS-2020
Stars: ✭ 25 (-93.39%)
Mutual labels:  semi-supervised-learning
semi-supervised-NFs
Code for the paper Semi-Conditional Normalizing Flows for Semi-Supervised Learning
Stars: ✭ 23 (-93.92%)
Mutual labels:  semi-supervised-learning
Ssl4mis
Semi Supervised Learning for Medical Image Segmentation, a collection of literature reviews and code implementations.
Stars: ✭ 336 (-11.11%)
Mutual labels:  semi-supervised-learning
CsiGAN
An implementation for our paper: CsiGAN: Robust Channel State Information-based Activity Recognition with GANs (IEEE Internet of Things Journal, 2019), which is the semi-supervised Generative Adversarial Network (GAN) for Channel State Information (CSI) -based activity recognition.
Stars: ✭ 23 (-93.92%)
Mutual labels:  semi-supervised-learning
L2c
Learning to Cluster. A deep clustering strategy.
Stars: ✭ 262 (-30.69%)
Mutual labels:  semi-supervised-learning
ProSelfLC-2021
noisy labels; missing labels; semi-supervised learning; entropy; uncertainty; robustness and generalisation.
Stars: ✭ 45 (-88.1%)
Mutual labels:  semi-supervised-learning
catgan pytorch
Unsupervised and Semi-supervised Learning with Categorical Generative Adversarial Networks
Stars: ✭ 50 (-86.77%)
Mutual labels:  semi-supervised-learning
SHOT-plus
code for our TPAMI 2021 paper "Source Data-absent Unsupervised Domain Adaptation through Hypothesis Transfer and Labeling Transfer"
Stars: ✭ 46 (-87.83%)
Mutual labels:  semi-supervised-learning
improving segmentation with selfsupervised depth
[CVPR21] Implementation of our work "Three Ways to Improve Semantic Segmentation with Self-Supervised Depth Estimation"
Stars: ✭ 189 (-50%)
Mutual labels:  semi-supervised-learning
Fewshot gan Unet3d
Tensorflow implementation of our paper: Few-shot 3D Multi-modal Medical Image Segmentation using Generative Adversarial Learning
Stars: ✭ 272 (-28.04%)
Mutual labels:  semi-supervised-learning
AdversarialAudioSeparation
Code accompanying the paper "Semi-supervised adversarial audio source separation applied to singing voice extraction"
Stars: ✭ 70 (-81.48%)
Mutual labels:  semi-supervised-learning
DST-CBC
Implementation of our paper "DMT: Dynamic Mutual Training for Semi-Supervised Learning"
Stars: ✭ 98 (-74.07%)
Mutual labels:  semi-supervised-learning
Imbalanced Semi Self
[NeurIPS 2020] Semi-Supervision (Unlabeled Data) & Self-Supervision Improve Class-Imbalanced / Long-Tailed Learning
Stars: ✭ 379 (+0.26%)
Mutual labels:  semi-supervised-learning
Tape
Tasks Assessing Protein Embeddings (TAPE), a set of five biologically relevant semi-supervised learning tasks spread across different domains of protein biology.
Stars: ✭ 295 (-21.96%)
Mutual labels:  semi-supervised-learning
HyperGBM
A full pipeline AutoML tool for tabular data
Stars: ✭ 172 (-54.5%)
Mutual labels:  semi-supervised-learning

MixMatch

This is an unofficial PyTorch implementation of MixMatch: A Holistic Approach to Semi-Supervised Learning. The official TensorFlow implementation is here.

Currently, only experiments on CIFAR-10 are available.

This repository carefully implements important details of the official implementation in order to reproduce its results.
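At its core, MixMatch guesses labels for unlabeled data by averaging the model's predictions over K augmentations of each image, sharpening the averaged distribution with a temperature T, and then mixing labeled and unlabeled examples with MixUp. A minimal NumPy sketch of those three steps (function names are illustrative, not taken from this repository):

```python
import numpy as np

def sharpen(p, T=0.5):
    """Sharpen a class distribution by temperature T: lower T -> more peaked."""
    p = p ** (1.0 / T)
    return p / p.sum(axis=-1, keepdims=True)

def guess_labels(preds_per_aug, T=0.5):
    """Average predictions over K augmentations of the same unlabeled batch,
    then sharpen the average to form the guessed (soft) labels."""
    avg = np.mean(preds_per_aug, axis=0)
    return sharpen(avg, T)

def mixup(x1, y1, x2, y2, alpha=0.75):
    """MixUp variant used in MixMatch: lambda is clamped to max(lam, 1 - lam)
    so the mixed example stays closer to the first input."""
    lam = np.random.beta(alpha, alpha)
    lam = max(lam, 1.0 - lam)
    return lam * x1 + (1 - lam) * x2, lam * y1 + (1 - lam) * y2
```

The paper's CIFAR-10 defaults are T = 0.5, K = 2 augmentations, and alpha = 0.75 for the Beta distribution.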

Requirements

  • Python 3.6+
  • PyTorch 1.0
  • torchvision 0.2.2 (older versions are not compatible with this code)
  • tensorboardX
  • progress
  • matplotlib
  • numpy

Usage

Train

Train the model with 250 labeled examples from the CIFAR-10 dataset:

python train.py --gpu <gpu_id> --n-labeled 250 --out cifar10@250

Train the model with 4000 labeled examples from the CIFAR-10 dataset:

python train.py --gpu <gpu_id> --n-labeled 4000 --out cifar10@4000
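The --n-labeled flag presumably selects a class-balanced labeled subset of CIFAR-10's 50,000 training images and treats the remainder as unlabeled. A sketch of such a split (the helper and its exact behavior are an assumption, not code from this repository):

```python
import numpy as np

def split_labeled(labels, n_labeled, n_classes=10, seed=0):
    """Pick n_labeled indices, balanced per class (n_labeled // n_classes each);
    all remaining indices are treated as unlabeled."""
    rng = np.random.default_rng(seed)
    labels = np.asarray(labels)
    per_class = n_labeled // n_classes
    labeled_idx = []
    for c in range(n_classes):
        idx = np.flatnonzero(labels == c)
        labeled_idx.extend(rng.choice(idx, per_class, replace=False))
    labeled_idx = np.array(labeled_idx)
    unlabeled_idx = np.setdiff1d(np.arange(len(labels)), labeled_idx)
    return labeled_idx, unlabeled_idx
```

With --n-labeled 250 this yields 25 labeled images per class; the unlabeled loader still sees every training image (without its label).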

Monitoring training progress

tensorboard.sh --port 6006 --logdir cifar10@250

Results (Accuracy)

#Labels     250           500           1000          2000          4000
Paper       88.92 ± 0.87  90.35 ± 0.94  92.25 ± 0.32  92.97 ± 0.15  93.76 ± 0.06
This code   88.71         88.96         90.52         92.23         93.52

(Results of this code were evaluated over a single run. Results averaged over 5 runs with different seeds will be added later.)
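As a quick sanity check, the gap between the paper's mean accuracy and this code's single-run numbers can be computed directly from the table above:

```python
# Mean accuracies from the results table above, keyed by label budget.
paper = {250: 88.92, 500: 90.35, 1000: 92.25, 2000: 92.97, 4000: 93.76}
this_code = {250: 88.71, 500: 88.96, 1000: 90.52, 2000: 92.23, 4000: 93.52}

# Gap (paper minus this code) per label budget, in accuracy points.
gaps = {n: round(paper[n] - this_code[n], 2) for n in paper}
worst = max(gaps, key=gaps.get)  # label budget with the largest gap
```

The single-run gaps are all under 2 accuracy points, with the largest at 1000 labels; several of them fall within the paper's reported ± ranges.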

References

@article{berthelot2019mixmatch,
  title={MixMatch: A Holistic Approach to Semi-Supervised Learning},
  author={Berthelot, David and Carlini, Nicholas and Goodfellow, Ian and Papernot, Nicolas and Oliver, Avital and Raffel, Colin},
  journal={arXiv preprint arXiv:1905.02249},
  year={2019}
}