
vuptran / sesemi

License: MIT
Supervised and semi-supervised image classification with self-supervision (Keras)

Programming Languages

Python

Projects that are alternatives of or similar to sesemi

SSL CR Histo
Official code for "Self-Supervised driven Consistency Training for Annotation Efficient Histopathology Image Analysis", published in Medical Image Analysis (MedIA), October 2021.
Stars: ✭ 32 (-25.58%)
Mutual labels:  semi-supervised-learning, self-supervised-learning
improving segmentation with selfsupervised depth
[CVPR21] Implementation of our work "Three Ways to Improve Semantic Segmentation with Self-Supervised Depth Estimation"
Stars: ✭ 189 (+339.53%)
Mutual labels:  semi-supervised-learning, self-supervised-learning
exponential-moving-average-normalization
PyTorch implementation of EMAN for self-supervised and semi-supervised learning: https://arxiv.org/abs/2101.08482
Stars: ✭ 76 (+76.74%)
Mutual labels:  semi-supervised-learning, self-supervised-learning
SHOT-plus
code for our TPAMI 2021 paper "Source Data-absent Unsupervised Domain Adaptation through Hypothesis Transfer and Labeling Transfer"
Stars: ✭ 46 (+6.98%)
Mutual labels:  semi-supervised-learning, self-supervised-learning
sinkhorn-label-allocation
Sinkhorn Label Allocation is a label assignment method for semi-supervised self-training algorithms. The SLA algorithm is described in full in this ICML 2021 paper: https://arxiv.org/abs/2102.08622.
Stars: ✭ 49 (+13.95%)
Mutual labels:  semi-supervised-learning
video-pace
code for our ECCV-2020 paper: Self-supervised Video Representation Learning by Pace Prediction
Stars: ✭ 95 (+120.93%)
Mutual labels:  self-supervised-learning
libai
LiBai(李白): A Toolbox for Large-Scale Distributed Parallel Training
Stars: ✭ 284 (+560.47%)
Mutual labels:  self-supervised-learning
Revisiting-Contrastive-SSL
Revisiting Contrastive Methods for Unsupervised Learning of Visual Representations. [NeurIPS 2021]
Stars: ✭ 81 (+88.37%)
Mutual labels:  self-supervised-learning
barlowtwins
Implementation of Barlow Twins paper
Stars: ✭ 84 (+95.35%)
Mutual labels:  self-supervised-learning
Adversarial-Semisupervised-Semantic-Segmentation
Pytorch Implementation of "Adversarial Learning For Semi-Supervised Semantic Segmentation" for ICLR 2018 Reproducibility Challenge
Stars: ✭ 151 (+251.16%)
Mutual labels:  semi-supervised-learning
rankpruning
🧹 Formerly for binary classification with noisy labels. Replaced by cleanlab.
Stars: ✭ 81 (+88.37%)
Mutual labels:  semi-supervised-learning
S2-BNN
S2-BNN: Bridging the Gap Between Self-Supervised Real and 1-bit Neural Networks via Guided Distribution Calibration (CVPR 2021)
Stars: ✭ 53 (+23.26%)
Mutual labels:  self-supervised-learning
tape-neurips2019
Tasks Assessing Protein Embeddings (TAPE), a set of five biologically relevant semi-supervised learning tasks spread across different domains of protein biology. (DEPRECATED)
Stars: ✭ 117 (+172.09%)
Mutual labels:  semi-supervised-learning
Pro-GNN
Implementation of the KDD 2020 paper "Graph Structure Learning for Robust Graph Neural Networks"
Stars: ✭ 202 (+369.77%)
Mutual labels:  semi-supervised-learning
temporal-ensembling-semi-supervised
Keras implementation of temporal ensembling (semi-supervised learning)
Stars: ✭ 22 (-48.84%)
Mutual labels:  semi-supervised-learning
SelfGNN
A PyTorch implementation of "SelfGNN: Self-supervised Graph Neural Networks without explicit negative sampling" paper, which appeared in The International Workshop on Self-Supervised Learning for the Web (SSL'21) @ the Web Conference 2021 (WWW'21).
Stars: ✭ 24 (-44.19%)
Mutual labels:  self-supervised-learning
SimPLE
Code for the paper: "SimPLE: Similar Pseudo Label Exploitation for Semi-Supervised Classification"
Stars: ✭ 50 (+16.28%)
Mutual labels:  semi-supervised-learning
semi-supervised-paper-implementation
Reproduce some methods in semi-supervised papers.
Stars: ✭ 35 (-18.6%)
Mutual labels:  semi-supervised-learning
metric-transfer.pytorch
Deep Metric Transfer for Label Propagation with Limited Annotated Data
Stars: ✭ 49 (+13.95%)
Mutual labels:  semi-supervised-learning
ccgl
TKDE 2022. CCGL: Contrastive Cascade Graph Learning.
Stars: ✭ 20 (-53.49%)
Mutual labels:  self-supervised-learning

Learning with Self-Supervised Regularization

Update: this repository exists only to reproduce the experiments presented in the paper below. Users are encouraged to check the PyTorch version of this repository for practical semi-supervised image classification on large, realistic datasets using modern CNN backbones, along with the latest developments and features: https://github.com/FlyreelAI/sesemi

This repository contains a Keras implementation of the SESEMI architecture for supervised and semi-supervised image classification, as described in the NeurIPS'19 LIRE Workshop paper:

Tran, Phi Vu (2019) Exploring Self-Supervised Regularization for Supervised and Semi-Supervised Learning.

Approach

[Schematic of the SESEMI architecture for supervised and semi-supervised learning]

The training and evaluation of the SESEMI architecture for supervised and semi-supervised learning are summarized as follows (a minimal code sketch of steps 2, 5, and 6 appears after the list):

  1. Separate the input data into labeled and unlabeled branches. The unlabeled branch consists of all available training examples, but without ground truth label information;
  2. Perform geometric transformations on the unlabeled data to produce six proxy labels, defined as image rotations in the set {0, 90, 180, 270} degrees along with horizontal (left-right) and vertical (up-down) flips;
  3. Apply input data augmentation and noise to each branch independently;
  4. At each training step, sample two mini-batches having the same number of unlabeled and labeled examples as inputs to a shared CNN backbone. Note that labeled examples will repeat in a mini-batch because the number of unlabeled examples is much greater;
  5. Compute the supervised cross-entropy loss using ground truth labels and the self-supervised cross-entropy loss using proxy labels generated from image rotations and flips;
  6. Update CNN parameters via stochastic gradient descent by minimizing the sum of supervised and self-supervised loss components;
  7. At inference time, take the supervised branch of the network to make predictions on test data and discard the self-supervised branch.
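
The two key ingredients, generating the six proxy labels (step 2) and jointly minimizing the supervised and self-supervised cross-entropy losses (steps 5 and 6), can be illustrated with the minimal Keras sketch below. This is not the code in this repository; the helper names proxy_transform and build_sesemi are hypothetical, and it assumes backbone is a Keras model that maps an image to a flat feature vector.

import numpy as np
from keras import layers, models  # or `from tensorflow.keras import layers, models`

def proxy_transform(images, rng=np.random):
    """Assign each unlabeled image one of six proxy classes (step 2):
    rotations by 0/90/180/270 degrees (classes 0-3), horizontal flip (class 4),
    or vertical flip (class 5). Returns the transformed batch and one-hot labels."""
    transforms = [
        lambda im: np.rot90(im, k=0),   # 0 degrees
        lambda im: np.rot90(im, k=1),   # 90 degrees
        lambda im: np.rot90(im, k=2),   # 180 degrees
        lambda im: np.rot90(im, k=3),   # 270 degrees
        np.fliplr,                      # horizontal (left-right) flip
        np.flipud,                      # vertical (up-down) flip
    ]
    labels = rng.randint(0, 6, size=len(images))
    transformed = np.stack([transforms[c](im) for im, c in zip(images, labels)])
    return transformed, np.eye(6)[labels]

def build_sesemi(backbone, num_classes):
    """Shared CNN backbone with a supervised head and a self-supervised head."""
    x_labeled = layers.Input(shape=backbone.input_shape[1:], name='labeled')
    x_unlabeled = layers.Input(shape=backbone.input_shape[1:], name='unlabeled')
    supervised = layers.Dense(num_classes, activation='softmax',
                              name='supervised')(backbone(x_labeled))
    self_supervised = layers.Dense(6, activation='softmax',
                                   name='self_supervised')(backbone(x_unlabeled))
    model = models.Model([x_labeled, x_unlabeled], [supervised, self_supervised])
    # The objective is the sum of the supervised and self-supervised
    # cross-entropy losses, minimized with SGD (steps 5 and 6).
    model.compile(optimizer='sgd',
                  loss='categorical_crossentropy',
                  loss_weights=[1.0, 1.0])
    return model

# At each training step, feed a labeled mini-batch and an unlabeled mini-batch
# of equal size (step 4), e.g.:
#   x_u, y_u = proxy_transform(unlabeled_batch)
#   model.train_on_batch([labeled_batch, x_u], [labeled_onehot, y_u])
# At inference time, only the supervised branch is used (step 7).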

Requirements

The code is tested on Ubuntu 16.04 with the following components (an example environment setup follows the Software list):

Software

  • Anaconda Python 3.6;
  • Keras 2.2.4 with TensorFlow GPU 1.12.0 backend;
  • CUDA 9.1 with cuDNN 7.1 acceleration.
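
One possible way to reproduce this environment is sketched below; it is an illustrative setup rather than an official installation script, and it assumes Anaconda is already installed and that matching CUDA/cuDNN libraries are available on the system.

# Create an isolated environment with the tested library versions.
$ conda create -n sesemi python=3.6
$ conda activate sesemi
$ pip install keras==2.2.4 tensorflow-gpu==1.12.0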

Hardware

This reference implementation loads all data into system memory and utilizes a GPU for model training and evaluation. The following hardware specifications are highly recommended:

  • At least 64GB of system RAM;
  • NVIDIA GeForce GTX TITAN X GPU or better.

Usage

For training and evaluation, execute the following bash commands in the same directory where the code resides. Ensure the datasets have been downloaded into their respective directories.

# Set the PYTHONPATH environment variable.
$ export PYTHONPATH="/path/to/this/repo:$PYTHONPATH"

# Train and evaluate SESEMI.
$ python train_evaluate_sesemi.py \
    --network <network_str> \
    --dataset <dataset_str> \
    --labels <nb_labels> \
    --gpu <gpu_id>

# Train and evaluate SESEMI with unlabeled extra data from Tiny Images.
$ python train_evaluate_sesemi_tinyimages.py \
    --network <network_str> \
    --extra <nb_extra> \
    --gpu <gpu_id>

The flags are as follows (a concrete example invocation is shown after the list):

  • <network_str> refers to one of the convnet, wrn, or nin architectures;
  • <dataset_str> refers to one of the three supported datasets: svhn, cifar10, or cifar100;
  • <nb_labels> is an integer denoting the number of labeled examples;
  • <nb_extra> is an integer denoting the amount of unlabeled extra data to sample from Tiny Images;
  • <gpu_id> is a string denoting the GPU device ID; it defaults to 0 if not specified.
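
For example, a typical semi-supervised run training on CIFAR-10 with 4,000 labeled examples using the wide residual network backbone on GPU 0 (the specific values here are illustrative, not prescribed):

# Train and evaluate SESEMI on CIFAR-10 with 4,000 labels.
$ python train_evaluate_sesemi.py \
    --network wrn \
    --dataset cifar10 \
    --labels 4000 \
    --gpu 0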