
microsoft / metric-transfer.pytorch

License: MIT
Deep Metric Transfer for Label Propagation with Limited Annotated Data

Programming Languages

Jupyter Notebook
Python
Shell

Projects that are alternatives of or similar to metric-transfer.pytorch

Unsupervised Classification
SCAN: Learning to Classify Images without Labels (ECCV 2020), incl. SimCLR.
Stars: ✭ 605 (+1134.69%)
Mutual labels:  image-classification, unsupervised-learning
deepOF
TensorFlow implementation for "Guided Optical Flow Learning"
Stars: ✭ 26 (-46.94%)
Mutual labels:  semi-supervised-learning, unsupervised-learning
Marta Gan
MARTA GANs: Unsupervised Representation Learning for Remote Sensing Image Classification
Stars: ✭ 75 (+53.06%)
Mutual labels:  image-classification, unsupervised-learning
sinkhorn-label-allocation
Sinkhorn Label Allocation is a label assignment method for semi-supervised self-training algorithms. The SLA algorithm is described in full in this ICML 2021 paper: https://arxiv.org/abs/2102.08622.
Stars: ✭ 49 (+0%)
Mutual labels:  semi-supervised-learning, image-classification
Alibi Detect
Algorithms for outlier and adversarial instance detection, concept drift and metrics.
Stars: ✭ 604 (+1132.65%)
Mutual labels:  semi-supervised-learning, unsupervised-learning
ML2017FALL
Machine Learning (EE 5184) in NTU
Stars: ✭ 66 (+34.69%)
Mutual labels:  image-classification, unsupervised-learning
Billion-scale-semi-supervised-learning
Implementing Billion-scale semi-supervised learning for image classification using Pytorch
Stars: ✭ 81 (+65.31%)
Mutual labels:  semi-supervised-learning, image-classification
al-fk-self-supervision
Official PyTorch code for CVPR 2020 paper "Deep Active Learning for Biased Datasets via Fisher Kernel Self-Supervision"
Stars: ✭ 28 (-42.86%)
Mutual labels:  image-classification, unsupervised-learning
L2c
Learning to Cluster. A deep clustering strategy.
Stars: ✭ 262 (+434.69%)
Mutual labels:  semi-supervised-learning, unsupervised-learning
catgan pytorch
Unsupervised and Semi-supervised Learning with Categorical Generative Adversarial Networks
Stars: ✭ 50 (+2.04%)
Mutual labels:  semi-supervised-learning, unsupervised-learning
spear
SPEAR: Programmatically label and build training data quickly.
Stars: ✭ 81 (+65.31%)
Mutual labels:  semi-supervised-learning, unsupervised-learning
Self Supervised Speech Recognition
speech to text with self-supervised learning based on wav2vec 2.0 framework
Stars: ✭ 106 (+116.33%)
Mutual labels:  semi-supervised-learning, unsupervised-learning
Susi
SuSi: Python package for unsupervised, supervised and semi-supervised self-organizing maps (SOM)
Stars: ✭ 42 (-14.29%)
Mutual labels:  semi-supervised-learning, unsupervised-learning
Cleanlab
The standard package for machine learning with noisy labels, finding mislabeled data, and uncertainty quantification. Works with most datasets and models.
Stars: ✭ 2,526 (+5055.1%)
Mutual labels:  semi-supervised-learning, unsupervised-learning
deepvis
machine learning algorithms in Swift
Stars: ✭ 54 (+10.2%)
Mutual labels:  unsupervised-learning
volumetricPrimitives
Code release for "Learning Shape Abstractions by Assembling Volumetric Primitives " (CVPR 2017)
Stars: ✭ 137 (+179.59%)
Mutual labels:  unsupervised-learning
Image-Classification
Pre-trained VGG-Net Model for image classification using tensorflow
Stars: ✭ 29 (-40.82%)
Mutual labels:  image-classification
shake-drop pytorch
PyTorch implementation of shake-drop regularization
Stars: ✭ 50 (+2.04%)
Mutual labels:  image-classification
UnsupervisedPointCloudReconstruction
Experiments on unsupervised point cloud reconstruction.
Stars: ✭ 133 (+171.43%)
Mutual labels:  unsupervised-learning
TensorFlow-Multiclass-Image-Classification-using-CNN-s
Balanced Multiclass Image Classification with TensorFlow on Python.
Stars: ✭ 57 (+16.33%)
Mutual labels:  image-classification

Deep Metric Transfer for Label Propagation with Limited Annotated Data

This repo contains the PyTorch implementation for the semi-supervised learning paper Deep Metric Transfer for Label Propagation with Limited Annotated Data (arXiv).

Requirements

  • Python 3: Anaconda is recommended, since it already ships with most required packages
  • pytorch>=1.0: refer to https://pytorch.org/get-started/locally/
  • other packages: pip install tensorboardX tensorboard easydict scikit-image

Highlight

  • We formulate semi-supervised learning from a completely different perspective: metric transfer.
  • The method enjoys the benefits of recent advances in self-supervised learning.
  • We hope to draw more attention to unsupervised pretraining for other tasks.

Main results

The test accuracy of our method and of state-of-the-art methods on the CIFAR10 dataset with varying numbers of labeled examples.

Method 50 100 250 500 1000 2000 4000 8000
PI-model 27.36 37.20 47.07 56.30 63.70 76.50 84.17 87.30
Mean-Teacher 29.66 36.60 45.49 57.20 65.00 79.00 84.38 87.50
VAT 23.00 35.58 47.61 62.90 72.80 84.00 86.79 88.10
Pseudo-Label 21.00 34.00 45.83 60.30 68.20 78.00 84.79 86.20
Ours 56.34 63.53 71.26 74.77 79.38 82.34 84.52 87.48

Quick start

  • Clone this repo: git clone git@github.com:microsoft/metric-transfer.pytorch.git && cd metric-transfer.pytorch

  • Install pytorch and other packages listed in requirements

  • Download pretrained models and precomputed pseudo labels: bash scripts/download_model.sh. Make sure the checkpoint folder looks like this:

    checkpoint
    |-- pretrain_models
    |   |-- ckpt_instance_cifar10_wrn-28-2_82.12.pth.tar
    |   |-- ... other files
    |   `-- lemniscate_resnet50.pth.tar
    |-- pseudos
    |   |-- instance_nc_wrn-28-2
    |   |   |-- 50.pth.tar
    |   |   |-- ... other files
    |   |   `-- 8000.pth.tar
    |   `-- ... other folders 
    `-- pseudos_imagenet
        `-- instance_imagenet_nc_resnet50
            |-- num_labeled_13000
            |   |-- 10_0.pth.tar
            |   |-- ... other files
            |   `-- 10_9.pth.tar
            `-- ... other folders 
    
  • Supervised finetuning on the CIFAR10 or ImageNet dataset. The CIFAR dataset will be downloaded automatically. For ImageNet, refer to here for details of data preparation.

    # Finetune on cifar
    python cifar-semi.py \
        --gpus 0 \
        --num-labeled 250 \
        --pseudo-file checkpoint/pseudos/instance_nc_wrn-28-2/250.pth.tar \
        --resume checkpoint/pretrain_models/ckpt_instance_cifar10_wrn-28-2_82.12.pth.tar \
        --pseudo-ratio 0.2
     	
    # For imagenet
    n_labeled=13000  # 1% labeled data
    pseudo_ratio=0.1  # use top 10% pseudo label
    data_dir=/path/to/imagenet/dir
    
    python imagenet-semi.py \
        --arch resnet50 \
        --gpus 0,1,2,3 \
        --num-labeled ${n_labeled} \
        --data-dir ${data_dir} \
        --pretrained checkpoint/pretrain_models/lemniscate_resnet50.pth.tar  \
        --pseudo-dir checkpoint/pseudos_imagenet/instance_imagenet_nc_resnet50/num_labeled_${n_labeled} \
        --pseudo-ratio ${pseudo_ratio}

Usage

The proposed method contains three main steps: metric pretraining, label propagation, and supervised finetune.

Metric pretraining

The metric pretraining can be unsupervised or supervised, from the same or different dataset.

We provide code for instance discrimination, which is borrowed from the original PyTorch release of instance discrimination. You can run the following command in the root directory of the code to train instance discrimination on the cifar10 dataset:

export PYTHONPATH=$PYTHONPATH:$(pwd)
CUDA_VISIBLE_DEVICES=0 python unsupervised/cifar.py \
	--lr-scheduler cosine-with-restart \
	--epochs 1270

For other metrics or datasets, such as colorization on cifar10 or instance discrimination on imagenet, refer to the officially released code: colorization, instance discrimination. We also provide the pretrained weights. Refer to scripts/download_model.sh for more details.
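For intuition, instance discrimination treats every image as its own class and scores a batch feature against a memory bank of all image features under a temperature-scaled softmax. The sketch below is a minimal NumPy illustration of that objective; the function name and signature are ours, and it uses the full softmax rather than the NCE approximation of the original release:

```python
import numpy as np

def instance_discrimination_loss(features, memory_bank, indices, tau=0.07):
    """Non-parametric softmax loss for instance discrimination (illustrative).

    features:    (b, d) L2-normalized features of the current batch
    memory_bank: (n, d) L2-normalized features of all images in the dataset
    indices:     (b,) dataset index of each batch example (its own "class")
    tau:         softmax temperature
    """
    logits = features @ memory_bank.T / tau          # (b, n) scaled similarities
    logits -= logits.max(axis=1, keepdims=True)      # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Each feature should match its own memory-bank entry.
    return -log_probs[np.arange(len(indices)), indices].mean()
```

When a feature already matches its own memory-bank slot, the loss is near zero; mismatched features are pushed toward their slot by the gradient.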

Label propagation

We can then propagate labels using the trained metric, from the few labeled examples to a vast collection of unannotated images.

We consider two propagation algorithms: k-nearest neighbors (knn) and spectral clustering (also known as normalized cut, nc). Both are implemented as Jupyter notebooks in the notebooks folder. You can simply run a notebook to load the weights from metric pretraining and propagate labels to obtain pseudo labels.
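As a rough illustration of the knn variant, the NumPy sketch below assigns each unlabeled example the similarity-weighted majority label of its k nearest labeled neighbors; the function name, weighting scheme, and confidence definition are our assumptions, not the notebook's exact implementation:

```python
import numpy as np

def knn_propagate(feats_l, labels_l, feats_u, k=5, n_classes=10):
    """Propagate labels to unlabeled examples by similarity-weighted kNN vote.

    feats_l:  (n_l, d) L2-normalized features of labeled examples
    labels_l: (n_l,) integer labels
    feats_u:  (n_u, d) L2-normalized features of unlabeled examples
    Returns (pseudo_labels, confidences).
    """
    sim = feats_u @ feats_l.T                     # cosine similarity, (n_u, n_l)
    nn_idx = np.argsort(-sim, axis=1)[:, :k]      # k nearest labeled neighbors
    votes = np.zeros((feats_u.shape[0], n_classes))
    for i, idx in enumerate(nn_idx):
        for j in idx:
            votes[i, labels_l[j]] += sim[i, j]    # weight vote by similarity
    votes /= votes.sum(axis=1, keepdims=True)     # normalize to a distribution
    return votes.argmax(axis=1), votes.max(axis=1)
```

The returned confidence is the normalized vote mass of the winning class, which can later be used to rank pseudo labels.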

We also provide precomputed pseudo labels for the cifar10 and imagenet datasets. Refer to scripts/download_model.sh for more details.

Supervised finetune

With the estimated pseudo labels on the unlabeled data, we can train a classifier with more data. For simplicity, the current version omits confidence-weighted supervised training; instead, only the most confident portion of the pseudo labels is used for training.
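This selection step can be sketched as follows; `select_confident` is a hypothetical helper mirroring the --pseudo-ratio flag, not code from this repo:

```python
import numpy as np

def select_confident(pseudo_labels, confidences, pseudo_ratio=0.2):
    """Keep only the top `pseudo_ratio` fraction of pseudo labels by confidence.

    The remaining unlabeled examples are simply dropped rather than
    down-weighted, matching the simplified training described above.
    """
    n_keep = int(len(pseudo_labels) * pseudo_ratio)
    keep = np.argsort(-confidences)[:n_keep]      # most confident first
    return keep, pseudo_labels[keep]
```

The selected subset is then merged with the truly labeled examples for ordinary supervised finetuning.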

Refer to the Quick start section above for the full commands.

Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.

When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact [email protected] with any additional questions or comments.

Citation

If you find this paper useful in your research, please consider citing:

@article{liu2018deep,
  title={Deep Metric Transfer for Label Propagation with Limited Annotated Data},
  author={Liu, Bin and Wu, Zhirong and Hu, Han and Lin, Stephen},
  journal={arXiv preprint arXiv:1812.08781},
  year={2018}
}

Contact

For any questions, please feel free to create a new issue or reach out to the authors.
