hiram64 / temporal-ensembling-semi-supervised

Licence: MIT license
Keras implementation of temporal ensembling (semi-supervised learning)

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to temporal-ensembling-semi-supervised

Temporal-Ensembling-for-Semi-Supervised-Learning
Implementation of Temporal Ensembling for Semi-Supervised Learning by Laine et al. with tensorflow eager execution
Stars: ✭ 49 (+122.73%)
Mutual labels:  semi-supervised-learning, temporal-ensembling
Ali Pytorch
PyTorch implementation of Adversarially Learned Inference (BiGAN).
Stars: ✭ 61 (+177.27%)
Mutual labels:  semi-supervised-learning, cifar10
Virtual Adversarial Training
Pytorch implementation of Virtual Adversarial Training
Stars: ✭ 94 (+327.27%)
Mutual labels:  semi-supervised-learning, cifar10
Vosk
VOSK Speech Recognition Toolkit
Stars: ✭ 182 (+727.27%)
Mutual labels:  semi-supervised-learning
Graph Representation Learning
Autoencoders for Link Prediction and Semi-Supervised Node Classification (DSAA 2018)
Stars: ✭ 199 (+804.55%)
Mutual labels:  semi-supervised-learning
realistic-ssl-evaluation-pytorch
Reimplementation of "Realistic Evaluation of Deep Semi-Supervised Learning Algorithms"
Stars: ✭ 79 (+259.09%)
Mutual labels:  semi-supervised-learning
semi-supervised-paper-implementation
Reproduce some methods in semi-supervised papers.
Stars: ✭ 35 (+59.09%)
Mutual labels:  semi-supervised-learning
Stylealign
[ICCV 2019] Aggregation via Separation: Boosting Facial Landmark Detector with Semi-Supervised Style Transition
Stars: ✭ 172 (+681.82%)
Mutual labels:  semi-supervised-learning
pywsl
Python codes for weakly-supervised learning
Stars: ✭ 118 (+436.36%)
Mutual labels:  semi-supervised-learning
DeFMO
[CVPR 2021] DeFMO: Deblurring and Shape Recovery of Fast Moving Objects
Stars: ✭ 144 (+554.55%)
Mutual labels:  semi-supervised-learning
Good Papers
I try my best to keep updated cutting-edge knowledge in Machine Learning/Deep Learning and Natural Language Processing. These are my notes on some good papers
Stars: ✭ 248 (+1027.27%)
Mutual labels:  semi-supervised-learning
Triple Gan
See Triple-GAN-V2 in PyTorch: https://github.com/taufikxu/Triple-GAN
Stars: ✭ 203 (+822.73%)
Mutual labels:  semi-supervised-learning
ST-PlusPlus
[CVPR 2022] ST++: Make Self-training Work Better for Semi-supervised Semantic Segmentation
Stars: ✭ 168 (+663.64%)
Mutual labels:  semi-supervised-learning
Graph Adversarial Learning
A curated collection of adversarial attack and defense on graph data.
Stars: ✭ 188 (+754.55%)
Mutual labels:  semi-supervised-learning
Pro-GNN
Implementation of the KDD 2020 paper "Graph Structure Learning for Robust Graph Neural Networks"
Stars: ✭ 202 (+818.18%)
Mutual labels:  semi-supervised-learning
Cct
[CVPR 2020] Semi-Supervised Semantic Segmentation with Cross-Consistency Training.
Stars: ✭ 171 (+677.27%)
Mutual labels:  semi-supervised-learning
shake-drop pytorch
PyTorch implementation of shake-drop regularization
Stars: ✭ 50 (+127.27%)
Mutual labels:  cifar10
Tricks Of Semi Superviseddeepleanring Pytorch
PseudoLabel 2013, VAT, PI model, Tempens, MeanTeacher, ICT, MixMatch, FixMatch
Stars: ✭ 240 (+990.91%)
Mutual labels:  semi-supervised-learning
Improvedgan Pytorch
Semi-supervised GAN in "Improved Techniques for Training GANs"
Stars: ✭ 228 (+936.36%)
Mutual labels:  semi-supervised-learning
Active-learning-for-object-detection
Active learning for deep object detection using YOLO
Stars: ✭ 35 (+59.09%)
Mutual labels:  semi-supervised-learning

Temporal Ensembling (Keras)

This repository provides a Keras implementation of the paper "Temporal Ensembling for Semi-Supervised Learning" by S. Laine et al.

The implementation includes Temporal Ensembling and the PI-model, evaluated on CIFAR-10; both methods are proposed in the paper. As in the paper, semi-supervised training uses 4,000 labeled samples (400 per class) and 46,000 unlabeled samples, with the remaining 10,000 samples held out for evaluation.
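At the core of Temporal Ensembling is an exponential moving average of each sample's predictions, which (after a startup bias correction) serves as the consistency target for the unsupervised loss. The following is a minimal NumPy sketch of that update, not this repository's actual code; the function name is invented here, and `alpha=0.6` is the momentum value used in the paper:

```python
import numpy as np

def update_ensemble_targets(Z, z_epoch, epoch, alpha=0.6):
    """Accumulate an exponential moving average of per-sample predictions
    and return the bias-corrected ensemble targets.

    Z        : (num_samples, num_classes) running EMA of predictions
    z_epoch  : (num_samples, num_classes) predictions from the current epoch
    epoch    : 0-based epoch index, used for startup bias correction
    """
    Z = alpha * Z + (1.0 - alpha) * z_epoch        # EMA of predictions
    z_tilde = Z / (1.0 - alpha ** (epoch + 1))     # correct startup bias
    return Z, z_tilde
```

With constant per-epoch predictions, the bias-corrected targets equal the predictions themselves; the `1 - alpha**(epoch + 1)` term is what keeps the targets from being underestimated during early epochs, when `Z` is still close to its zero initialization.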

Dependencies

keras, tensorflow, scikit-learn

Versions of the test environment:
Keras==2.1.6, tensorflow-gpu==1.7.0, scikit-learn==0.19.1

How to use

1. Prepare data

Prepare the data and labels to use. For instance, CIFAR-10 consists of 10 classes, and each label should be an integer identifying a unique class. Place the prepared data in the "data" directory.

You can download CIFAR-10 data via :
https://www.kaggle.com/janzenliu/cifar-10-batches-py

Put them in the "data" directory and run the following code to compress them into an NPZ file.

python make_cifar10_npz.py

After running this code, cifar10.npz will be created under the "data" directory.
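For reference, each CIFAR-10 python batch file is a pickled dict containing an (N, 3072) uint8 array of channel-major pixel values under b'data' and a list of integer labels under b'labels'. A loader along the lines of what make_cifar10_npz.py does could look like this (a hedged sketch, not the script's actual code):

```python
import pickle

import numpy as np

def load_cifar_batch(path):
    """Load one CIFAR-10 python batch file into (N, 32, 32, 3) images
    and an (N,) label array."""
    with open(path, 'rb') as f:
        batch = pickle.load(f, encoding='bytes')
    # Rows are 3072 values laid out as three 32x32 planes (R, G, B);
    # reshape to (N, C, H, W) then move channels last for Keras.
    data = batch[b'data'].reshape(-1, 3, 32, 32).transpose(0, 2, 3, 1)
    labels = np.array(batch[b'labels'])
    return data, labels
```

Concatenating the five training batches this way and saving the arrays with numpy.savez would produce a file like cifar10.npz.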

2. Train & Evaluation

After the data is prepared, run one of the following scripts to train and evaluate.

Please look into the scripts for other settable parameters, or run "python main.py --help". Although most of this implementation follows the description of the paper, there are some differences; please see the note below.

# Temporal Ensembling
python main_temporal_ensembling.py

# PI-model
python main_pi_model.py

Evaluation is done at intervals of 5 epochs. In the test, Temporal Ensembling and the PI-model achieved about 87.3% and about 86.7% accuracy, respectively, by the final epoch.
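Both methods weight the unsupervised consistency loss with the Gaussian ramp-up w(t) = w_max * exp(-5 * (1 - t/T)^2) from the paper, so the consistency term contributes little during early epochs. A small sketch of that schedule; the w_max=30 and 80-epoch ramp-up length are illustrative values (the paper scales w_max by the labeled fraction), not necessarily what this repository uses:

```python
import numpy as np

def rampup_weight(epoch, w_max=30.0, rampup_epochs=80):
    """Gaussian ramp-up of the unsupervised loss weight:
    w(t) = w_max * exp(-5 * (1 - t/T)^2) for the first T epochs,
    then constant at w_max."""
    if epoch >= rampup_epochs:
        return w_max
    t = epoch / rampup_epochs
    return w_max * np.exp(-5.0 * (1.0 - t) ** 2)
```

The total training loss at epoch t would then be the supervised cross-entropy on labeled samples plus rampup_weight(t) times the mean squared error between the two predictions being made consistent.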

Note

The differences between the paper and this implementation:

  • The learning rate is changed to 0.001 instead of 0.003 because of a non-convergence issue
  • The number of training epochs is changed to 350 instead of 300 to achieve higher accuracy

Reference

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].