
semi-supervised-paper / semi-supervised-paper-implementation

Licence: other
Reproduce some methods in semi-supervised papers.

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to semi-supervised-paper-implementation

DualStudent
Code for Paper ''Dual Student: Breaking the Limits of the Teacher in Semi-Supervised Learning'' [ICCV 2019]
Stars: ✭ 106 (+202.86%)
Mutual labels:  semi-supervised-learning, pytorch-implementation
Text-Classification-LSTMs-PyTorch
The aim of this repository is to show a baseline model for text classification by implementing an LSTM-based model coded in PyTorch. To provide a better understanding of the model, a Tweets dataset provided by Kaggle is used.
Stars: ✭ 45 (+28.57%)
Mutual labels:  pytorch-implementation
CheXpert-Challenge
Code for CheXpert Challenge 2019 Top 1 && Top 2 solution
Stars: ✭ 30 (-14.29%)
Mutual labels:  pytorch-implementation
Representation-Learning-for-Information-Extraction
Pytorch implementation of Paper by Google Research - Representation Learning for Information Extraction from Form-like Documents.
Stars: ✭ 82 (+134.29%)
Mutual labels:  pytorch-implementation
realistic-ssl-evaluation-pytorch
Reimplementation of "Realistic Evaluation of Deep Semi-Supervised Learning Algorithms"
Stars: ✭ 79 (+125.71%)
Mutual labels:  semi-supervised-learning
Env-KB
A custom mechanical keyboard inspired by the CFTKB Mysterium utilizing the Raspberry Pi Pico
Stars: ✭ 203 (+480%)
Mutual labels:  pi
ResUNetPlusPlus-with-CRF-and-TTA
ResUNet++, CRF, and TTA for segmentation of medical images (IEEE JBIHI)
Stars: ✭ 98 (+180%)
Mutual labels:  pytorch-implementation
RandLA-Net-pytorch
🍀 Pytorch Implementation of RandLA-Net (https://arxiv.org/abs/1911.11236)
Stars: ✭ 69 (+97.14%)
Mutual labels:  pytorch-implementation
onn
Online Deep Learning: Learning Deep Neural Networks on the Fly / Non-linear Contextual Bandit Algorithm (ONN_THS)
Stars: ✭ 139 (+297.14%)
Mutual labels:  pytorch-implementation
cycleGAN-PyTorch
A clean and lucid implementation of cycleGAN using PyTorch
Stars: ✭ 107 (+205.71%)
Mutual labels:  pytorch-implementation
MolDQN-pytorch
A PyTorch Implementation of "Optimization of Molecules via Deep Reinforcement Learning".
Stars: ✭ 58 (+65.71%)
Mutual labels:  pytorch-implementation
openpose-pytorch
🔥 OpenPose api wrapper in PyTorch.
Stars: ✭ 52 (+48.57%)
Mutual labels:  pytorch-implementation
pywsl
Python codes for weakly-supervised learning
Stars: ✭ 118 (+237.14%)
Mutual labels:  semi-supervised-learning
vat nmt
Implementation of "Effective Adversarial Regularization for Neural Machine Translation", ACL 2019
Stars: ✭ 22 (-37.14%)
Mutual labels:  vat
Pro-GNN
Implementation of the KDD 2020 paper "Graph Structure Learning for Robust Graph Neural Networks"
Stars: ✭ 202 (+477.14%)
Mutual labels:  semi-supervised-learning
neuro-symbolic-ai-soc
Neuro-Symbolic Visual Question Answering on Sort-of-CLEVR using PyTorch
Stars: ✭ 41 (+17.14%)
Mutual labels:  pytorch-implementation
3D-UNet-PyTorch-Implementation
The implementation of 3D-UNet using PyTorch
Stars: ✭ 78 (+122.86%)
Mutual labels:  pytorch-implementation
NGCF-PyTorch
PyTorch Implementation for Neural Graph Collaborative Filtering
Stars: ✭ 200 (+471.43%)
Mutual labels:  pytorch-implementation
metric-transfer.pytorch
Deep Metric Transfer for Label Propagation with Limited Annotated Data
Stars: ✭ 49 (+40%)
Mutual labels:  semi-supervised-learning
MobileHumanPose
This repo is the official PyTorch implementation of MobileHumanPose: Toward real-time 3D human pose estimation in mobile devices (CVPRW 2021).
Stars: ✭ 206 (+488.57%)
Mutual labels:  pytorch-implementation

semi-supervised-paper-implementation

This repository is designed to reproduce the methods in some semi-supervised papers.

Before running the code, install the required packages with the following commands.

pip3 install torch==1.1.0
pip3 install torchvision==0.3.0
pip3 install tensorflow # we use tensorboard in the project

Prepare datasets

CIFAR-10

Use the following command to unpack the data and generate labeled data path files.

python3 -m semi_supervised.core.utils.cifar10
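The labeled-path generation above boils down to drawing an equal number of labeled examples per class and treating the rest as unlabeled. A minimal sketch of that split is shown below; `split_labeled` and its parameters are illustrative, not the repository's actual API.

```python
import random
from collections import defaultdict

def split_labeled(labels, n_labeled=4000, num_classes=10, seed=1000):
    """Pick an equal number of labeled indices per class; the rest are unlabeled."""
    per_class = n_labeled // num_classes
    by_class = defaultdict(list)
    for idx, y in enumerate(labels):
        by_class[y].append(idx)
    rng = random.Random(seed)
    labeled = []
    for y in range(num_classes):
        labeled.extend(rng.sample(by_class[y], per_class))
    unlabeled = sorted(set(range(len(labels))) - set(labeled))
    return sorted(labeled), unlabeled
```

Fixing the seed makes the labeled subset reproducible across runs, which matters when comparing methods on the same split.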

Run on CIFAR-10

To reproduce the Pi model and Temporal Ensembling results from Temporal Ensembling for Semi-Supervised Learning, run

CUDA_VISIBLE_DEVICES=0 python3 -m semi_supervised.experiments.tempens.cifar10_test
CUDA_VISIBLE_DEVICES=0 python3 -m semi_supervised.experiments.pi.cifar10_test

To reproduce the result in Mean teachers are better role models, run

CUDA_VISIBLE_DEVICES=0 python3 -m semi_supervised.experiments.mean_teacher.cifar10_test

Note: This code has not been tested on multiple GPUs, so there is no guarantee that results will be satisfactory when using multiple GPUs.

Results on CIFAR-10

| Number of Labeled Data | 1000 | 2000 | 4000 | All labels |
| --- | --- | --- | --- | --- |
| Pi model (from SNTG) | 68.35 ± 1.20 | 82.43 ± 0.44 | 87.64 ± 0.31 | 94.44 ± 0.10 |
| Pi model (this repository) | 69.615 ± 1.3013 | 82.92 ± 0.532 | 87.925 ± 0.227 | --- |
| Tempens model (from SNTG) | 76.69 ± 1.01 | 84.36 ± 0.39 | 87.84 ± 0.24 | 94.4 ± 0.10 |
| Tempens model (this repository) | 78.517 ± 1.1653 | 84.757 ± 0.42445 | 88.166 ± 0.24324 | 94.72 ± 0.14758 |
| Mean Teacher (from Mean teachers) | 78.45 | 84.27 | 87.69 | 94.06 |
| Mean Teacher (this repository) | 80.421 ± 1.0264 | 85.236 ± 0.655 | 88.435 ± 0.311 | 94.482 ± 0.1086 |

We report the mean and standard deviation of 10 runs using different random seeds (1000 to 1009).
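Aggregating per-seed results into a "mean ± std" entry can be done with the standard library; the accuracy values below are placeholders for illustration, not actual run outputs.

```python
import statistics

# hypothetical test accuracies from 10 runs with seeds 1000-1009 (illustrative values)
accuracies = [88.1, 88.4, 88.0, 88.3, 88.6, 87.9, 88.2, 88.5, 88.1, 88.3]

mean = statistics.mean(accuracies)
std = statistics.stdev(accuracies)  # sample standard deviation (n - 1 denominator)
print(f"{mean:.3f} ± {std:.4f}")
```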

Training strategies in semi-supervised learning

In semi-supervised learning, many papers share common training strategies. This section introduces some of them.

Learning rate

You can find how rampup_value and rampdown_value are computed in semi_supervised/core/utils/fun_utils.py.

The curve of the learning rate is shown in the figure below.

[Figure: learning rate schedule]
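The rampup/rampdown factors can be sketched as follows. The Gaussian-style rampup and cosine rampdown are the forms used in the Temporal Ensembling and Mean Teacher papers; the function names, lengths, and `base_lr` value here are illustrative and not necessarily those in fun_utils.py.

```python
import math

def sigmoid_rampup(epoch, rampup_length=80):
    # Gaussian-style rampup exp(-5 * (1 - t)^2), as in the Temporal Ensembling paper
    if rampup_length == 0:
        return 1.0
    t = max(0.0, min(1.0, epoch / rampup_length))
    return math.exp(-5.0 * (1.0 - t) ** 2)

def cosine_rampdown(epoch, rampdown_length=50, total_epochs=300):
    # cosine rampdown over the final `rampdown_length` epochs (Mean Teacher style)
    if epoch < total_epochs - rampdown_length:
        return 1.0
    t = (epoch - (total_epochs - rampdown_length)) / rampdown_length
    return 0.5 * (math.cos(math.pi * t) + 1.0)

# the effective learning rate scales the base LR by both factors
base_lr = 0.003
lr = base_lr * sigmoid_rampup(10) * cosine_rampdown(10)
```

Early in training only the rampup factor is active (the rampdown is 1.0), so the learning rate grows smoothly from near zero to `base_lr` and then decays back toward zero at the end.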

Optimizer

Many methods in semi-supervised learning use the Adam optimizer with beta1 = 0.9 and beta2 = 0.999. During training, beta1 is changed dynamically.

The curve of beta1 is shown in the figure below.

[Figure: Adam beta1 schedule]
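A common form of this schedule, following the Temporal Ensembling reference code, interpolates beta1 from 0.9 down to 0.5 during the final rampdown phase. The sketch below assumes the same cosine rampdown as the learning rate; the endpoint values and lengths are illustrative.

```python
import math

def cosine_rampdown(epoch, rampdown_length=50, total_epochs=300):
    # same cosine rampdown factor as used for the learning rate
    if epoch < total_epochs - rampdown_length:
        return 1.0
    t = (epoch - (total_epochs - rampdown_length)) / rampdown_length
    return 0.5 * (math.cos(math.pi * t) + 1.0)

def adam_beta1(epoch, beta1_max=0.9, beta1_min=0.5):
    # interpolate beta1 from beta1_max toward beta1_min as the rampdown progresses
    r = cosine_rampdown(epoch)
    return r * beta1_max + (1.0 - r) * beta1_min
```

Lowering beta1 late in training shortens the gradient momentum window, which is reported to help the final convergence in these papers.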

Consistency Weight

Some methods use a dynamically changing weight to balance the supervised and unsupervised losses.

The curve of consistency weight is shown in the figure below.

[Figure: consistency weight schedule]
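The consistency weight typically follows the same Gaussian-style rampup as the learning rate, scaled by a maximum weight. A minimal sketch, with an illustrative `max_weight` rather than the repository's actual setting:

```python
import math

def sigmoid_rampup(epoch, rampup_length=80):
    # same Gaussian-style rampup used for the learning rate
    if rampup_length == 0:
        return 1.0
    t = max(0.0, min(1.0, epoch / rampup_length))
    return math.exp(-5.0 * (1.0 - t) ** 2)

def consistency_weight(epoch, max_weight=100.0):
    # weight rises from near 0 to max_weight over the rampup period
    return max_weight * sigmoid_rampup(epoch)

# total loss = supervised_loss + consistency_weight(epoch) * consistency_loss
```

Keeping the weight near zero at the start lets the network first fit the labeled data before the consistency term begins to dominate.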

TODO list

  • Mean Teacher
  • Pi Model
  • Temporal Ensembling Model
  • VAT
  • More....

References

  1. Mean teachers are better role models
  2. Temporal Ensembling for Semi-Supervised Learning
  3. Good Semi-Supervised Learning that Requires a Bad GAN
  4. Smooth Neighbors on Teacher Graphs for Semi-supervised Learning