
HanxunH / Active-Passive-Losses

License: MIT
[ICML2020] Normalized Loss Functions for Deep Learning with Noisy Labels

Programming Languages

python
139335 projects - #7 most used programming language
shell
77523 projects

Projects that are alternatives to or similar to Active-Passive-Losses

Advances-in-Label-Noise-Learning
A curated list of the most recent resources for learning with noisy labels
Stars: ✭ 360 (+291.3%)
Mutual labels:  noisy-data, label-noise, noisy-labels, unreliable-labels, robust-learning
IDN
AAAI 2021: Beyond Class-Conditional Assumption: A Primary Attempt to Combat Instance-Dependent Label Noise
Stars: ✭ 21 (-77.17%)
Mutual labels:  noisy-data, label-noise, noisy-labels, robust-learning
TaLKConvolutions
Official PyTorch implementation of Time-aware Large Kernel (TaLK) Convolutions (ICML 2020)
Stars: ✭ 26 (-71.74%)
Mutual labels:  icml, icml-2020
noisy_label_understanding_utilizing
ICML 2019: Understanding and Utilizing Deep Neural Networks Trained with Noisy Labels
Stars: ✭ 82 (-10.87%)
Mutual labels:  icml, noisy-labels
rebias
Official PyTorch implementation of ReBias (Learning De-biased Representations with Biased Representations), ICML 2020
Stars: ✭ 125 (+35.87%)
Mutual labels:  icml, icml-2020
Awesome-Computer-Vision-Paper-List
This repository collects papers accepted at top computer vision conferences, making it convenient to search for related papers.
Stars: ✭ 248 (+169.57%)
Mutual labels:  icml
ACE
Code for our paper, Neural Network Attributions: A Causal Perspective (ICML 2019).
Stars: ✭ 47 (-48.91%)
Mutual labels:  icml
FairAI
This is a collection of papers and other resources related to fairness.
Stars: ✭ 55 (-40.22%)
Mutual labels:  icml
semi-supervised-NFs
Code for the paper Semi-Conditional Normalizing Flows for Semi-Supervised Learning
Stars: ✭ 23 (-75%)
Mutual labels:  icml
unicornn
Official code for UnICORNN (ICML 2021)
Stars: ✭ 21 (-77.17%)
Mutual labels:  icml
AIPaperCompleteDownload
Complete downloads of papers from various top conferences
Stars: ✭ 64 (-30.43%)
Mutual labels:  icml
wrench
WRENCH: Weak supeRvision bENCHmark
Stars: ✭ 185 (+101.09%)
Mutual labels:  robust-learning
Noisy-Labels-with-Bootstrapping
Keras implementation of "Training Deep Neural Networks on Noisy Labels with Bootstrapping" (Reed et al., 2015)
Stars: ✭ 22 (-76.09%)
Mutual labels:  noisy-labels
clean-net
Tensorflow source code for "CleanNet: Transfer Learning for Scalable Image Classifier Training with Label Noise" (CVPR 2018)
Stars: ✭ 86 (-6.52%)
Mutual labels:  label-noise
EgoCNN
Code for "Distributed, Egocentric Representations of Graphs for Detecting Critical Structures" (ICML 2019)
Stars: ✭ 16 (-82.61%)
Mutual labels:  icml
imbalanced-regression
[ICML 2021, Long Talk] Delving into Deep Imbalanced Regression
Stars: ✭ 425 (+361.96%)
Mutual labels:  icml
FedScale
FedScale is a scalable and extensible open-source federated learning (FL) platform.
Stars: ✭ 274 (+197.83%)
Mutual labels:  icml
C2D
PyTorch implementation of "Contrast to Divide: self-supervised pre-training for learning with noisy labels"
Stars: ✭ 59 (-35.87%)
Mutual labels:  noisy-labels
icml-nips-iclr-dataset
Papers, authors and author affiliations from ICML, NeurIPS and ICLR 2006-2021
Stars: ✭ 21 (-77.17%)
Mutual labels:  icml
PackIt
Code for reproducing results in ICML 2020 paper "PackIt: A Virtual Environment for Geometric Planning"
Stars: ✭ 45 (-51.09%)
Mutual labels:  icml-2020

Normalized Loss Functions - Active Passive Losses

Code for the ICML 2020 paper "Normalized Loss Functions for Deep Learning with Noisy Labels"
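
The paper combines a normalized "active" loss with a "passive" loss (an Active Passive Loss, APL). As a quick illustration, below is a minimal PyTorch sketch of the NCE+RCE combination used in the example commands further down, written from the formulas in the paper; it is a sketch only, not the repository's implementation, and alpha, beta, and the clipping constant A are hyperparameters that the configs set per dataset.

# Minimal sketch of the NCE+RCE (Active Passive) loss, written from the
# paper's formulas; the repository's own loss classes may differ in detail.
import torch
import torch.nn.functional as F

def nce_rce_loss(logits, labels, alpha=1.0, beta=1.0, A=-4.0):
    log_probs = F.log_softmax(logits, dim=1)                   # log p_k
    # Normalized Cross Entropy (active): -log p_y / (-sum_k log p_k).
    ce = -log_probs.gather(1, labels.unsqueeze(1)).squeeze(1)  # -log p_y
    nce = ce / (-log_probs.sum(dim=1))
    # Reverse Cross Entropy (passive): -sum_k p_k log q_k with log(0) := A,
    # which reduces to -A * (1 - p_y) for one-hot labels q.
    p_y = log_probs.exp().gather(1, labels.unsqueeze(1)).squeeze(1)
    rce = -A * (1.0 - p_y)
    return (alpha * nce + beta * rce).mean()

# Example: loss = nce_rce_loss(model(images), targets)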

Requirements

Python >= 3.6, PyTorch >= 1.3.1, torchvision >= 0.4.1, mlconfig

How To Run

Configs for the experiment settings

Check the '*.yaml' files in the configs folder for each experiment.
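
As an illustration, a config could be loaded with mlconfig (listed in the requirements above). The file path below follows the --config_path/--version convention described under Arguments and is an assumption for this sketch; the actual keys are whatever the repository's yaml files define.

# Hypothetical sketch: loading one experiment config with mlconfig.
import mlconfig

config = mlconfig.load('configs/cifar10/sym/nce+rce.yaml')
print(config)  # inspect the experiment settings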

Arguments
  • noise_rate: label noise rate
  • asym: set this flag for asymmetric noise; the default is symmetric noise
  • config_path: path to the configs folder
  • version: name of the config file
  • exp_name: name of the experiment (used as a note)
  • seed: random seed

A sketch of how these flags could be wired together is shown below.
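
This is a hypothetical illustration, not main.py's actual interface: it parses the flags above with argparse and resolves the config file as '{config_path}/{version}.yaml', a convention inferred from the flag descriptions and the '*.yaml' configs.

# Hypothetical sketch of the command-line interface; main.py may differ.
import argparse
import os

parser = argparse.ArgumentParser()
parser.add_argument('--exp_name', type=str, default='run')
parser.add_argument('--noise_rate', type=float, default=0.4)
parser.add_argument('--asym', action='store_true')  # asymmetric noise if set
parser.add_argument('--config_path', type=str, default='configs/cifar10/sym')
parser.add_argument('--version', type=str, default='nce+rce')
parser.add_argument('--seed', type=int, default=123)
args = parser.parse_args()

# Assumed convention: the config file is '{config_path}/{version}.yaml'.
config_file = os.path.join(args.config_path, args.version) + '.yaml'
print(config_file)  # e.g. configs/cifar10/sym/nce+rce.yaml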

Example: 0.4 symmetric noise rate with the NCE+RCE loss

# CIFAR-10
$ python3 main.py --exp_name     test_exp             \
                  --noise_rate   0.4                  \
                  --version      nce+rce              \
                  --config_path  configs/cifar10/sym  \
                  --seed         123

# CIFAR-100
$ python3 main.py --exp_name     test_exp              \
                  --noise_rate   0.4                   \
                  --version      nce+rce               \
                  --config_path  configs/cifar100/sym  \
                  --seed         123

Citing this work

If you use this code in your work, please cite the accompanying paper:

@inproceedings{ma2020normalized,
  title={Normalized Loss Functions for Deep Learning with Noisy Labels},
  author={Ma, Xingjun and Huang, Hanxun and Wang, Yisen and Romano, Simone and Erfani, Sarah and Bailey, James},
  booktitle={ICML},
  year={2020}
}