
richardaecn / Class Balanced Loss

License: MIT
Class-Balanced Loss Based on Effective Number of Samples. CVPR 2019

Programming Languages

python

Projects that are alternatives of or similar to Class Balanced Loss

SKNet-PyTorch
Nearly Perfect & Easily Understandable PyTorch Implementation of SKNet
Stars: ✭ 62 (-85.68%)
Mutual labels:  imagenet, cvpr
Ghostnet
CV backbones including GhostNet, TinyNet and TNT, developed by Huawei Noah's Ark Lab.
Stars: ✭ 1,744 (+302.77%)
Mutual labels:  imagenet, cvpr
Tresnet
Official Pytorch Implementation of "TResNet: High-Performance GPU-Dedicated Architecture" (WACV 2021)
Stars: ✭ 314 (-27.48%)
Mutual labels:  imagenet
Mobilenetv2.pytorch
72.8% MobileNetV2 1.0 model on ImageNet and a spectrum of pre-trained MobileNetV2 models
Stars: ✭ 369 (-14.78%)
Mutual labels:  imagenet
Rexnet
Official Pytorch implementation of ReXNet (Rank eXpansion Network) with pretrained models
Stars: ✭ 319 (-26.33%)
Mutual labels:  imagenet
Natural Adv Examples
A Harder ImageNet Test Set (CVPR 2021)
Stars: ✭ 317 (-26.79%)
Mutual labels:  imagenet
Benchmarking Keras Pytorch
🔥 Reproducibly benchmarking Keras and PyTorch models
Stars: ✭ 346 (-20.09%)
Mutual labels:  imagenet
Pnasnet.pytorch
PyTorch implementation of PNASNet-5 on ImageNet
Stars: ✭ 309 (-28.64%)
Mutual labels:  imagenet
Computer Vision
Programming Assignments and Lectures for Stanford's CS 231: Convolutional Neural Networks for Visual Recognition
Stars: ✭ 408 (-5.77%)
Mutual labels:  imagenet
Php Opencv Examples
Tutorial for computer vision and machine learning in PHP 7/8 by opencv (installation + examples + documentation)
Stars: ✭ 333 (-23.09%)
Mutual labels:  imagenet
Fbrs interactive segmentation
[CVPR2020] f-BRS: Rethinking Backpropagating Refinement for Interactive Segmentation https://arxiv.org/abs/2001.10331
Stars: ✭ 366 (-15.47%)
Mutual labels:  cvpr
Mvpose
Code for "Fast and Robust Multi-Person 3D Pose Estimation from Multiple Views" in CVPR'19
Stars: ✭ 326 (-24.71%)
Mutual labels:  cvpr
Prm
Weakly Supervised Instance Segmentation using Class Peak Response, in CVPR 2018 (Spotlight)
Stars: ✭ 322 (-25.64%)
Mutual labels:  cvpr
Densenet Caffe
DenseNet Caffe Models, converted from https://github.com/liuzhuang13/DenseNet
Stars: ✭ 350 (-19.17%)
Mutual labels:  imagenet
Assembled Cnn
Tensorflow implementation of "Compounding the Performance Improvements of Assembled Techniques in a Convolutional Neural Network"
Stars: ✭ 319 (-26.33%)
Mutual labels:  imagenet
Espnetv2
A light-weight, power efficient, and general purpose convolutional neural network
Stars: ✭ 377 (-12.93%)
Mutual labels:  imagenet
Pytorch Vdsr
VDSR (CVPR2016) pytorch implementation
Stars: ✭ 313 (-27.71%)
Mutual labels:  cvpr
Pytorch Randaugment
Unofficial PyTorch Reimplementation of RandAugment.
Stars: ✭ 323 (-25.4%)
Mutual labels:  imagenet
Rectlabel Support
RectLabel - An image annotation tool to label images for bounding box object detection and segmentation.
Stars: ✭ 338 (-21.94%)
Mutual labels:  imagenet
Neural Backed Decision Trees
Making decision trees competitive with neural networks on CIFAR10, CIFAR100, TinyImagenet200, Imagenet
Stars: ✭ 411 (-5.08%)
Mutual labels:  imagenet

Class-Balanced Loss Based on Effective Number of Samples

TensorFlow code for the paper:

Class-Balanced Loss Based on Effective Number of Samples
Yin Cui, Menglin Jia, Tsung-Yi Lin, Yang Song, Serge Belongie

Dependencies:

  • Python (3.6)
  • TensorFlow (1.14)

Datasets:

  • Long-Tailed CIFAR. We provide a download link that includes all the data used in our paper in .tfrecords format. The data was converted and generated by src/generate_cifar_tfrecords.py (original CIFAR) and src/generate_cifar_tfrecords_im.py (long-tailed CIFAR).
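
As a rough illustration (a sketch of the idea, not the exact logic in src/generate_cifar_tfrecords_im.py), long-tailed CIFAR is built by letting the per-class sample count decay exponentially from the head class to the tail class, so that the ratio between the largest and smallest class equals the imbalance factor:

import numpy as np

def long_tailed_counts(num_classes=10, max_per_class=5000, imbalance_factor=100):
    # Exponential decay; largest count / smallest count == imbalance_factor.
    mu = (1.0 / imbalance_factor) ** (1.0 / (num_classes - 1))
    return [int(max_per_class * mu ** i) for i in range(num_classes)]

print(long_tailed_counts())  # [5000, 2997, 1796, ..., 83, 50] for imbalance factor 100

Function and argument names above are illustrative only; the scripts in src/ are the authoritative reference.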

Effective Number of Samples:

For a visualization of the data and effective number of samples, please take a look at data.ipynb.
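
The quantity behind the loss is the effective number of samples: for a class with n samples, E_n = (1 - beta^n) / (1 - beta), where beta in [0, 1) controls how quickly additional samples stop contributing (beta = 0 gives E_n = 1, i.e. no re-weighting; beta -> 1 recovers E_n -> n). A small self-contained sanity check, independent of the notebook:

import numpy as np

def effective_number(n, beta):
    # E_n = (1 - beta^n) / (1 - beta); approaches n as beta -> 1 and 1 as beta -> 0.
    return (1.0 - np.power(beta, n)) / (1.0 - beta)

counts = np.array([5000.0, 500.0, 50.0])
for beta in (0.9, 0.99, 0.999):
    print(beta, effective_number(counts, beta))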

Key Implementation Details:
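
The class-balanced loss weights each example's loss by the inverse effective number of its class, with the per-class weights normalized to sum to the number of classes; the same weighting term can be wrapped around softmax, sigmoid, or focal losses. The snippet below is a hedged NumPy outline of this idea, not the repository's TensorFlow implementation:

import numpy as np

def class_balanced_weights(samples_per_class, beta=0.999):
    # w_c proportional to (1 - beta) / (1 - beta^n_c), normalized to sum to num_classes.
    effective_num = 1.0 - np.power(beta, np.asarray(samples_per_class, dtype=np.float64))
    weights = (1.0 - beta) / effective_num
    return weights / weights.sum() * len(samples_per_class)

def class_balanced_softmax_ce(logits, labels, samples_per_class, beta=0.999):
    # Standard softmax cross-entropy, re-weighted by the class-balanced term of the true class.
    weights = class_balanced_weights(samples_per_class, beta)
    z = logits - logits.max(axis=1, keepdims=True)
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    per_example = -log_probs[np.arange(len(labels)), labels]
    return np.mean(weights[labels] * per_example)

For the focal-loss variant used by the pre-trained models released below, the same per-class weights multiply the per-example focal term instead of the cross-entropy.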

Training and Evaluation:

We provide 3 .sh scripts for training and evaluation.

  • On original CIFAR dataset:
./cifar_trainval.sh
  • On long-tailed CIFAR dataset (the hyperparameter IM_FACTOR is the inverse of "Imbalance Factor" in the paper):
./cifar_im_trainval.sh
  • On long-tailed CIFAR dataset using the proposed class-balanced loss (set non-zero BETA):
./cifar_im_trainval_cb.sh
  • Run TensorBoard for visualization:
tensorboard --logdir=./results --port=6006
  • The figure below shows the results of running ./cifar_im_trainval.sh and ./cifar_im_trainval_cb.sh:

Training with TPU:

We train networks on iNaturalist and ImageNet datasets using Google's Cloud TPU. The code for this section is in tpu/. Our code is based on the official implementation of Training ResNet on Cloud TPU and forked from https://github.com/tensorflow/tpu.

Data Preparation:

  • Download the datasets (except images) from this link and unzip the archive under tpu/. The unzipped directory tpu/raw_data/ contains the training and validation splits. For the raw images, please download them from the following links and put them into the corresponding folders in tpu/raw_data/:

  • Convert datasets into .tfrecords format and upload to Google Cloud Storage (gcs) using tpu/tools/datasets/dataset_to_gcs.py:

python dataset_to_gcs.py \
  --project=$PROJECT \
  --gcs_output_path=$GCS_DATA_DIR \
  --local_scratch_dir=$LOCAL_TFRECORD_DIR \
  --raw_data_dir=$LOCAL_RAWDATA_DIR

The following 3 .sh scripts in tpu/ can be used to train and evaluate models on iNaturalist and ImageNet using Cloud TPU. For more details on how to use Cloud TPU, please refer to Training ResNet on Cloud TPU.

Note that the image mean, standard deviation, and input size need to be updated accordingly for each dataset.
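
For reference, the standard ImageNet statistics used by the upstream TPU ResNet code look like the constants below (scaled to the 0-255 pixel range); the names follow the upstream convention and may differ in this repository, and the iNaturalist values should be recomputed from that training split rather than copied:

# Standard ImageNet per-channel mean / std, scaled to the 0-255 pixel range.
MEAN_RGB = [0.485 * 255, 0.456 * 255, 0.406 * 255]
STDDEV_RGB = [0.229 * 255, 0.224 * 255, 0.225 * 255]

# Input resolution expected by the network (224 for the ResNet-50 models released below).
IMAGE_SIZE = 224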

  • On ImageNet (ILSVRC 2012):
./run_ILSVRC2012.sh
  • On iNaturalist 2017:
./run_inat2017.sh
  • On iNaturalist 2018:
./run_inat2018.sh
  • The pre-trained models, including all logs viewable on TensorBoard, can be downloaded from the following links:

Dataset           Network    Loss                       Input Size  Download Link
ILSVRC 2012       ResNet-50  Class-Balanced Focal Loss  224         link
iNaturalist 2018  ResNet-50  Class-Balanced Focal Loss  224         link

Citation

If you find our work helpful in your research, please cite it as:

@inproceedings{cui2019classbalancedloss,
  title={Class-Balanced Loss Based on Effective Number of Samples},
  author={Cui, Yin and Jia, Menglin and Lin, Tsung-Yi and Song, Yang and Belongie, Serge},
  booktitle={CVPR},
  year={2019}
}