Learning-with-Label-Noise

A curated list of resources for Learning with Noisy Labels


Papers & Code

  • 2008-NIPS - Whose vote should count more: Optimal integration of labels from labelers of unknown expertise. [Paper][Code]

  • 2009-ICML - Supervised learning from multiple experts: whom to trust when everyone lies a bit. [Paper]

  • 2011-NIPS - Bayesian Bias Mitigation for Crowdsourcing. [Paper]

  • 2012-ICML - Learning to Label Aerial Images from Noisy Data. [Paper]

  • 2013-NIPS - Learning with Multiple Labels. [Paper]

  • 2013-NIPS - Learning with Noisy Labels. [Paper][Code]

  • 2014-ML - Learning from multiple annotators with varying expertise. [Paper]

  • 2014 - A Comprehensive Introduction to Label Noise. [Paper]

  • 2014-Survey - Classification in the Presence of Label Noise: a Survey. [Paper]

  • 2014 - Learning from Noisy Labels with Deep Neural Networks. [Paper]

  • 2015-ICLR_W - Training Convolutional Networks with Noisy Labels. [Paper][Code]

  • 2015-CVPR - Learning from Massive Noisy Labeled Data for Image Classification. [Paper][Code]

  • 2015-CVPR - Visual recognition by learning from web data: A weakly supervised domain generalization approach. [Paper][Code]

  • 2015-CVPR - Training Deep Neural Networks on Noisy Labels with Bootstrapping. [Paper][Loss-Code-Unofficial-1][Loss-Code-Unofficial-2][Code-Keras] (a minimal sketch of this loss appears after this list)

  • 2015-ICCV - Webly supervised learning of convolutional networks. [Paper][Project Page]

  • 2015-TPAMI - Classification with noisy labels by importance reweighting. [Paper][Code]

  • 2015-NIPS - Learning with Symmetric Label Noise: The Importance of Being Unhinged. [Paper][Loss-Code-Unofficial]

  • 2015-Arxiv - Making Risk Minimization Tolerant to Label Noise. [Paper]

  • 2015 - Learning Discriminative Reconstructions for Unsupervised Outlier Removal. [Paper][Code]

  • 2015-TNLS - Rboost: label noise-robust boosting algorithm based on a nonconvex loss function and the numerically stable base learners. [Paper]

  • 2016-AAAI - Robust semi-supervised learning through label aggregation. [Paper]

  • 2016-ICLR - Auxiliary Image Regularization for Deep CNNs with Noisy Labels. [Paper][Code]

  • 2016-CVPR - Seeing through the Human Reporting Bias: Visual Classifiers from Noisy Human-Centric Labels. [Paper][Code]

  • 2016-ICML - Loss factorization, weakly supervised learning and label noise robustness. [Paper]

  • 2016-RL - On the convergence of a family of robust losses for stochastic gradient descent. [Paper]

  • 2016-NC - Noise detection in the Meta-Learning Level. [Paper]

  • 2016-ECCV - The Unreasonable Effectiveness of Noisy Data for Fine-Grained Recognition. [Paper][Project Page]

  • 2016-ICASSP - Training deep neural-networks based on unreliable labels. [Paper][Poster][Code-Unofficial]

  • 2016-ICDM - Learning deep networks from noisy labels with dropout regularization. [Paper][Code]

  • 2016-KBS - A robust multi-class AdaBoost algorithm for mislabeled noisy data. [Paper]

  • 2017-AAAI - Robust Loss Functions under Label Noise for Deep Neural Networks. [Paper]

  • 2017-PAKDD - On the Robustness of Decision Tree Learning under Label Noise. [Paper]

  • 2017-ICLR - Training deep neural-networks using a noise adaptation layer. [Paper][Code]

  • 2017-ICLR - Who Said What: Modeling Individual Labelers Improves Classification. [Paper]

  • 2017-CVPR - Making Deep Neural Networks Robust to Label Noise: a Loss Correction Approach. [Paper] [Code] (a minimal sketch of the forward correction appears after this list)

  • 2017-CVPR - Learning From Noisy Large-Scale Datasets With Minimal Supervision. [Paper]

  • 2017-CVPR - Lean crowdsourcing: Combining humans and machines in an online system. [Paper][Code]

  • 2017-CVPR - Attend in groups: a weakly-supervised deep learning framework for learning from web data. [Paper][Code]

  • 2017-ICML - Robust Probabilistic Modeling with Bayesian Data Reweighting. [Paper][Code]

  • 2017-ICCV - Learning From Noisy Labels With Distillation. [Paper][Code]

  • 2017-NIPS - Toward Robustness against Label Noise in Training Deep Discriminative Neural Networks. [Paper]

  • 2017-NIPS - Active bias: Training more accurate neural networks by emphasizing high variance samples. [Paper][Code]

  • 2017-NIPS - Decoupling "when to update" from "how to update". [Paper][Code]

  • 2017-IEEE-TIFS - A Light CNN for Deep Face Representation with Noisy Labels. [Paper][Code-Pytorch][Code-Keras][Code-Tensorflow]

  • 2017-TNLS - Improving Crowdsourced Label Quality Using Noise Correction. [Paper]

  • 2017-ML - Learning to Learn from Weak Supervision by Full Supervision. [Paper][Code]

  • 2017-ML - Avoiding your teacher's mistakes: Training neural networks with controlled weak supervision. [Paper]

  • 2017-Arxiv - Deep Learning is Robust to Massive Label Noise. [Paper]

  • 2017-Arxiv - Fidelity-weighted learning. [Paper]

  • 2017 - Self-Error-Correcting Convolutional Neural Network for Learning with Noisy Labels. [Paper]

  • 2017-Arxiv - Learning with confident examples: Rank pruning for robust classification with noisy labels. [Paper]

  • 2017-Arxiv - Regularizing neural networks by penalizing confident output distributions. [Paper]

  • 2017 - Learning with Auxiliary Less-Noisy Labels. [Paper]

  • 2018-AAAI - Deep learning from crowds. [Paper]

  • 2018-ICLR - mixup: Beyond Empirical Risk Minimization. [Paper] [Code]

  • 2018-ICLR - Learning From Noisy Singly-labeled Data. [Paper] [Code]

  • 2018-ICLR_W - How Do Neural Networks Overcome Label Noise?. [Paper]

  • 2018-CVPR - CleanNet: Transfer Learning for Scalable Image Classifier Training with Label Noise. [Paper] [Code]

  • 2018-CVPR - Joint Optimization Framework for Learning with Noisy Labels. [Paper] [Code][Code-Unofficial-Pytorch]

  • 2018-CVPR - Iterative Learning with Open-set Noisy Labels. [Paper] [Code]

  • 2018-ICML - MentorNet: Learning Data-Driven Curriculum for Very Deep Neural Networks on Corrupted Labels. [Paper] [Code]

  • 2018-ICML - Learning to Reweight Examples for Robust Deep Learning. [Paper] [Code] [Code-Unofficial-PyTorch]

  • 2018-ICML - Dimensionality-Driven Learning with Noisy Labels. [Paper] [Code]

  • 2018-ECCV - CurriculumNet: Weakly Supervised Learning from Large-Scale Web Images. [Paper] [Code]

  • 2018-ECCV - Deep Bilevel Learning. [Paper] [Code]

  • 2018-ECCV - Learning with Biased Complementary Labels. [Paper][Code]

  • 2018-ISBI - Training a neural network based on unreliable human annotation of medical images. [Paper]

  • 2018-WACV - Iterative Cross Learning on Noisy Labels. [Paper]

  • 2018-WACV - A semi-supervised two-stage approach to learning from noisy labels. [Paper]

  • 2018-NIPS - Co-teaching: Robust Training of Deep Neural Networks with Extremely Noisy Labels. [Paper] [Code] (a minimal sketch of the small-loss selection appears after this list)

  • 2018-NIPS - Masking: A New Perspective of Noisy Supervision. [Paper] [Code]

  • 2018-NIPS - Using Trusted Data to Train Deep Networks on Labels Corrupted by Severe Noise. [Paper] [Code]

  • 2018-NIPS - Robustness of conditional GANs to noisy labels. [Paper] [Code]

  • 2018-NIPS - Generalized Cross Entropy Loss for Training Deep Neural Networks with Noisy Labels. [Paper][Loss-Code-Unofficial] (a minimal sketch of this loss appears after this list)

  • 2018-TIP - Deep learning from noisy image labels with quality embedding. [Paper]

  • 2018-TNLS - Progressive Stochastic Learning for Noisy Labels. [Paper]

  • 2018 - Multiclass Learning with Partially Corrupted Labels. [Paper]

  • 2018-Arxiv - Improving Multi-Person Pose Estimation using Label Correction. [Paper]

  • 2018 - Robust Determinantal Generative Classifier for Noisy Labels and Adversarial Attacks. [Paper]

  • 2019-AAAI - Safeguarded Dynamic Label Regression for Generalized Noisy Supervision. [Paper] [Code][Slides][Poster]

  • 2019-ICLR_W - SOSELETO: A Unified Approach to Transfer Learning and Training with Noisy Labels.[Paper][Code]

  • 2019-CVPR - Learning to Learn from Noisy Labeled Data. [Paper] [Code]

  • 2019-CVPR - Learning a Deep ConvNet for Multi-label Classification with Partial Labels. [Paper]

  • 2019-CVPR - Label-Noise Robust Generative Adversarial Networks. [Paper] [Code]

  • 2019-CVPR - Learning From Noisy Labels By Regularized Estimation Of Annotator Confusion. [Paper][Code]

  • 2019-CVPR - Probabilistic End-to-end Noise Correction for Learning with Noisy Labels. [Paper][Code]

  • 2019-CVPR - Graph Convolutional Label Noise Cleaner: Train a Plug-and-play Action Classifier for Anomaly Detection. [Paper][Code]

  • 2019-CVPR - Improving Semantic Segmentation via Video Propagation and Label Relaxation. [Paper][Code]

  • 2019-CVPR - Devil is in the Edges: Learning Semantic Boundaries from Noisy Annotations. [Paper] [Code][Project-page]

  • 2019-CVPR - Noise-Tolerant Paradigm for Training Face Recognition CNNs. [Paper] [Code]

  • 2019-CVPR - A Nonlinear, Noise-aware, Quasi-clustering Approach to Learning Deep CNNs from Noisy Labels. [Paper]

  • 2019-IJCAI - Learning Sound Events from Webly Labeled Data. [Paper] [Code]

  • 2019-ICML - Unsupervised Label Noise Modeling and Loss Correction. [Paper] [Code]

  • 2019-ICML - Understanding and Utilizing Deep Neural Networks Trained with Noisy Labels. [Paper] [Code]

  • 2019-ICML - How does Disagreement Help Generalization against Label Corruption?. [Paper] [Code]

  • 2019-ICML - Using Pre-Training Can Improve Model Robustness and Uncertainty. [Paper] [Code]

  • 2019-ICML - On Symmetric Losses for Learning from Corrupted Labels. [Paper] [Poster] [Slides] [Code]

  • 2019-ICML - Combating Label Noise in Deep Learning Using Abstention. [Paper] [Code]

  • 2019-ICML - SELFIE: Refurbishing unclean samples for robust deep learning. [Paper][Code]

  • 2019-ICASSP - Learning Sound Event Classifiers from Web Audio with Noisy Labels. [Paper] [Code]

  • 2019-TGRS - Hyperspectral Image Classification in the Presence of Noisy Labels. [Paper] [Code]

  • 2019-ICCV - NLNL: Negative Learning for Noisy Labels. [Paper][Code]

  • 2019-ICCV - Symmetric Cross Entropy for Robust Learning With Noisy Labels. [Paper][Code]

  • 2019-ICCV - Co-Mining: Deep Face Recognition With Noisy Labels.[Paper]

  • 2019-ICCV - O2U-Net: A Simple Noisy Label Detection Approach for Deep Neural Networks.[Paper][Code]

  • 2019-ICCV - Deep Self-Learning From Noisy Labels. [Paper]

  • 2019-ICCV_W - Photometric Transformer Networks and Label Adjustment for Breast Density Prediction. [Paper]

  • 2019-NIPS - Meta-Weight-Net: Learning an Explicit Mapping For Sample Weighting.[Paper][Code]

  • 2019-TPAMI - Learning from Large-scale Noisy Web Data with Ubiquitous Reweighting for Image Classification. [Paper]

  • 2019-ISBI - Robust Learning at Noisy Labeled Medical Images: Applied to Skin Lesion Classification. [Paper]

  • 2019-NIPS - Are Anchor Points Really Indispensable in Label-Noise Learning?. [Paper][Code]

  • 2019-NIPS - Noise-tolerant fair classification. [Paper][Code]

  • 2019-NIPS - Correlated Uncertainty for Learning Dense Correspondences from Noisy Labels. [Paper]

  • 2019-NIPS - Combinatorial Inference against Label Noise. [Paper][Code]

  • 2019-NIPS - L_DMI: A Novel Information-theoretic Loss Function for Training Deep Nets Robust to Label Noise. [Paper][Code]

  • 2019-Arxiv - ChoiceNet: Robust Learning by Revealing Output Correlations. [Paper]

  • 2019-Arxiv - Robust Learning Under Label Noise With Iterative Noise-Filtering. [Paper]

  • 2019-Arxiv - IMAE for Noise-Robust Learning: Mean Absolute Error Does Not Treat Examples Equally and Gradient Magnitude's Variance Matters. [Paper][Project page]

  • 2019-Arxiv - Confident Learning: Estimating Uncertainty in Dataset Labels. [Paper] [Code]

  • 2019-Arxiv - Derivative Manipulation for General Example Weighting. [Paper] [Code]

  • 2020-ICPR - Towards Robust Learning with Different Label Noise Distributions. [Paper][Code]

  • 2020-AAAI - Reinforcement Learning with Perturbed Rewards. [Paper] [Code]

  • 2020-AAAI - Less Is Better: Unweighted Data Subsampling via Influence Function. [Paper] [Code]

  • 2020-AAAI - Label Error Correction and Generation Through Label Relationships. [Paper]

  • 2020-AAAI - Self-Paced Robust Learning for Leveraging Clean Labels in Noisy Data. [Paper]

  • 2020-AAAI - Coupled-view Deep Classifier Learning from Multiple Noisy Annotators. [Paper]

  • 2020-AAAI - Partial Multi-label Learning with Noisy Label Identification. [Paper]

  • 2020-WACV - A Novel Self-Supervised Re-labeling Approach for Training with Noisy Labels. [Paper]

  • 2020-WACV - Disentangling Human Dynamics for Pedestrian Locomotion Forecasting with Noisy Supervision. [Paper]

  • 2020-WACV - Learning from Noisy Labels via Discrepant Collaborative Training. [Paper]

  • 2020-ICLR - SELF: Learning to Filter Noisy Labels with Self-Ensembling. [Paper]

  • 2020-ICLR - DivideMix: Learning with Noisy Labels as Semi-supervised Learning. [Paper][Code]

  • 2020-ICLR - Can gradient clipping mitigate label noise?. [Paper]

  • 2020-ICLR - Curriculum Loss: Robust Learning and Generalization against Label Corruption. [Paper]

  • 2020-ICLR - Simple and Effective Regularization Methods for Training on Noisily Labeled Data with Generalization Guarantee. [Paper]

  • 2020-ICLR - Learning from Rules Generalizing Labeled Exemplars. [Paper] [Code]

  • 2020-ICLR - Robust training with ensemble consensus. [Paper][Code]

  • 2020-CVPR - Combating noisy labels by agreement: A joint training method with co-regularization. [Paper][Code]

  • 2020-CVPR - Distilling Effective Supervision From Severe Label Noise. [Paper][Code]

  • 2020-CVPR - Learning From Noisy Anchors for One-Stage Object Detection. [Paper]

  • 2020-CVPR - Self-Training With Noisy Student Improves ImageNet Classification. [Paper][Code]

  • 2020-CVPR - Noise Robust Generative Adversarial Networks. [Paper][Code]

  • 2020-CVPR - Noise-Aware Fully Webly Supervised Object Detection. [Paper][Code]

  • 2020-CVPR - Global-Local GCN: Large-Scale Label Noise Cleansing for Face Recognition. [Paper]

  • 2020-CVPR - Training Noise-Robust Deep Neural Networks via Meta-Learning. [Paper]

  • 2020-ICML - Learning with Bounded Instance- and Label-dependent Label Noise. [Paper]

  • 2020-ICML - Label-Noise Robust Domain Adaptation. [Paper]

  • 2020-ICML - LTF: A Label Transformation Framework for Correcting Label Shift. [Paper]

  • 2020-ICML - Does label smoothing mitigate label noise?. [Paper]

  • 2020-ICML - Error-Bounded Correction of Noisy Labels. [Paper] [Code]

  • 2020-ICML - Deep k-NN for Noisy Labels. [Paper]

  • 2020-ICML - Searching to Exploit Memorization Effect in Learning from Noisy Labels. [Paper]

  • 2020-ICML - Beyond Synthetic Noise: Deep Learning on Controlled Noisy Labels. [Paper]

  • 2020-ICML - Peer Loss Functions: Learning from Noisy Labels without Knowing Noise Rates. [Paper]

  • 2020-ICML - Improving Generalization by Controlling Label-Noise Information in Neural Network Weights. [Paper][Code]

  • 2020-ICML - Training Binary Neural Networks through Learning with Noisy Supervision. [Paper]

  • 2020-ICML - SIGUA: Forgetting May Make Learning with Noisy Labels More Robust. [Paper][Code]

  • 2020-ICML - Normalized Loss Functions for Deep Learning with Noisy Labels. [Paper][Code]

  • 2020-ICML_W - How does Early Stopping Help Generalization against Label Noise?. [Paper]

  • 2020-IJCAI - Learning with Noise: Improving Distantly-Supervised Fine-grained Entity Typing via Automatic Relabeling. [Paper]

  • 2020-IJCAI - Can Cross Entropy Loss Be Robust to Label Noise?. [Paper]

  • 2020-ECCV - Graph convolutional networks for learning with few clean and many noisy labels. [Paper]

  • 2020-ECCV - Learning with Noisy Class Labels for Instance Segmentation. [Paper][Code]

  • 2020-ECCV - Learning Noise-Aware Encoder-Decoder from Noisy Labels by Alternating Back-Propagation for Saliency Detection. [Paper][Code]

  • 2020-ECCV - NoiseRank: Unsupervised Label Noise Reduction with Dependence Models. [Paper]

  • 2020-ECCV - Weakly-Supervised Learning with Side Information for Noisy Labeled Images. [Paper]

  • 2020-ECCV - Sub-center ArcFace: Boosting Face Recognition by Large-scale Noisy Web Faces. [Paper]

  • 2020-TASLP - Audio Tagging by Cross Filtering Noisy Labels. [Paper]

  • 2020-NIPS - Robust Optimization for Fairness with Noisy Protected Groups. [Paper] [Code]

  • 2020-NIPS - A Topological Filter for Learning with Label Noise. [Paper] [Code]

  • 2020-NIPS - Self-Adaptive Training: beyond Empirical Risk Minimization. [Paper] [Code]

  • 2020-NIPS - Parts-dependent Label Noise: Towards Instance-dependent Label Noise. [Paper]

  • 2020-NIPS - Dual T: Reducing Estimation Error for Transition Matrix in Label-noise Learning. [Paper]

  • 2020-NIPS - Early-Learning Regularization Prevents Memorization of Noisy Labels. [Paper][Code]

  • 2020-NIPS - Disentangling Human Error from the Ground Truth in Segmentation of Medical Images. [Paper] [Code]

  • 2020-NIPS - Identifying Mislabeled Data using the Area Under the Margin Ranking. [Paper][Code]

  • 2020-NIPS - Coresets for Robust Training of Neural Networks against Noisy Labels. [Paper][Code]

  • 2020-IJCNN - Temporal Calibrated Regularization for Robust Noisy Label Learning. [Paper]

  • 2020-MICCAI - Characterizing Label Errors: Confident Learning for Noisy-labeled Image Segmentation. [Paper][Code]

  • 2020-SIBGRAPI - A Survey on Deep Learning with Noisy Labels: How to train your model when you cannot trust on the annotations? [Paper][Code]

  • 2020-ICPR - Meta Soft Label Generation for Noisy Labels. [Paper][Code]

  • 2020-IJCV - Rectifying Pseudo Label Learning via Uncertainty Estimation for Domain Adaptive Semantic Segmentation. [Paper] [Code]

  • 2020-IEEEAccess - Limited Gradient Descent: Learning With Noisy Labels. [Paper]

  • 2020-Arxiv - Multi-Class Classification from Noisy-Similarity-Labeled Data. [Paper]

  • 2020-Arxiv - Learning Adaptive Loss for Robust Learning with Noisy Labels. [Paper]

  • 2020-Arxiv - Class2Simi: A New Perspective on Learning with Label Noise. [Paper]

  • 2020-Arxiv - Confidence Scores Make Instance-dependent Label-noise Learning Possible. [Paper]

  • 2020-Arxiv - ProSelfLC: Progressive Self Label Correction for Training Robust Deep Neural Networks. [Paper][Code]

  • 2020-Arxiv - Learning from Noisy Labels with Noise Modeling Network. [Paper]

  • 2020-Arxiv - ExpertNet: Adversarial Learning and Recovery Against Noisy Labels. [Paper]

  • 2020-Arxiv - A Second-Order Approach to Learning with Instance-Dependent Label Noise. [Paper]

  • 2020-Arxiv - Noisy Labels Can Induce Good Representations. [Paper]

  • 2020-Arxiv - Contrast to Divide: self-supervised pre-training for learning with noisy labels. [Paper][Code]

  • 2021-AAAI - Robustness of Accuracy Metric and its Inspirations in Learning with Noisy Labels. [Paper][Code]

  • 2021-AAAI - Beyond Class-Conditional Assumption: A Primary Attempt to Combat Instance-Dependent Label Noise. [Paper][Code]

  • 2021-AAAI - Meta Label Correction for Noisy Label Learning. [Paper]

  • 2021-WACV - Do We Really Need Gold Samples for Sample Weighting Under Label Noise? [Paper][Code]

  • 2021-WACV - EvidentialMix: Learning with Combined Open-set and Closed-set Noisy Labels. [Paper][Code][Blog]

  • 2021-CVPR - Improving Unsupervised Image Clustering With Robust Learning. [Paper]

  • 2021-CVPR - Multi-Objective Interpolation Training for Robustness to Label Noise. [Paper][Code]
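
The snippets below are unofficial, minimal PyTorch sketches of a few of the methods referenced above. They only illustrate the core idea of each paper under simplifying assumptions (fixed hyperparameter values, omitted schedules, no transition-matrix estimation); refer to the linked code for the authors' implementations.

A sketch of the soft "bootstrapping" loss (Reed et al., 2015, listed above): the training target is a convex combination of the possibly noisy one-hot label and the model's own prediction; the blending weight beta (0.95 below) is a commonly used but not mandated value.

```python
import torch
import torch.nn.functional as F

def soft_bootstrapping_loss(logits: torch.Tensor, noisy_targets: torch.Tensor,
                            beta: float = 0.95) -> torch.Tensor:
    """logits: (N, C) raw scores; noisy_targets: (N,) integer class labels."""
    log_probs = F.log_softmax(logits, dim=1)
    # Treat the model's current prediction as a fixed auxiliary target
    # (some implementations let gradients flow through it instead).
    probs = log_probs.exp().detach()
    one_hot = F.one_hot(noisy_targets, num_classes=logits.size(1)).float()
    # Blend the (possibly noisy) label with the model's own prediction.
    blended = beta * one_hot + (1.0 - beta) * probs
    return -(blended * log_probs).sum(dim=1).mean()
```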
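
A sketch of the "forward" loss correction (Patrini et al., 2017, listed above), assuming a row-stochastic noise transition matrix T with T[i, j] = P(noisy label j | clean label i) is already available; estimating T is the hard part and is not shown here.

```python
import torch
import torch.nn.functional as F

def forward_corrected_ce(logits: torch.Tensor, noisy_targets: torch.Tensor,
                         T: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """logits: (N, C); noisy_targets: (N,); T: (C, C) noise transition matrix."""
    clean_probs = F.softmax(logits, dim=1)   # model's estimate of P(clean y | x)
    noisy_probs = clean_probs @ T            # implied distribution over noisy labels
    picked = noisy_probs.gather(1, noisy_targets.unsqueeze(1)).squeeze(1)
    return -(picked + eps).log().mean()      # cross entropy against the noisy label
```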
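
A sketch of the generalized cross entropy (L_q) loss (Zhang & Sabuncu, 2018, listed above): q in (0, 1] trades off between cross entropy (q -> 0) and a scaled MAE (q = 1); the truncated variant from the paper is omitted.

```python
import torch
import torch.nn.functional as F

def generalized_cross_entropy(logits: torch.Tensor, targets: torch.Tensor,
                              q: float = 0.7) -> torch.Tensor:
    """L_q = (1 - p_y^q) / q, averaged over the batch."""
    probs = F.softmax(logits, dim=1)
    p_y = probs.gather(1, targets.unsqueeze(1)).squeeze(1).clamp(min=1e-7)
    return ((1.0 - p_y.pow(q)) / q).mean()
```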
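
A sketch of the small-loss sample selection at the heart of Co-teaching (Han et al., 2018, listed above): each network keeps the fraction of the mini-batch with the smallest loss under its own predictions and hands those indices to its peer for the parameter update; the schedule that gradually lowers keep_ratio over epochs is omitted.

```python
import torch
import torch.nn.functional as F

def small_loss_selection(logits_a: torch.Tensor, logits_b: torch.Tensor,
                         targets: torch.Tensor, keep_ratio: float = 0.8):
    """Return the sample indices each peer network should be updated on."""
    loss_a = F.cross_entropy(logits_a, targets, reduction="none")
    loss_b = F.cross_entropy(logits_b, targets, reduction="none")
    k = max(1, int(keep_ratio * targets.size(0)))
    idx_for_b = loss_a.argsort()[:k]   # network A picks clean-looking samples for B
    idx_for_a = loss_b.argsort()[:k]   # network B picks clean-looking samples for A
    return idx_for_a, idx_for_b
```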

Survey

  • 2014-TNLS - Classification in the Presence of Label Noise: a Survey. [Paper]

  • 2019 - Image Classification with Deep Learning in the Presence of Noisy Labels: A Survey. [Paper]

  • 2020 - Deep learning with noisy labels: exploring techniques and remedies in medical image analysis. [Paper]

  • 2020 - Learning from Noisy Labels with Deep Neural Networks: A Survey. [Paper]

Github

Others

Acknowledgements

Some of the above content is borrowed from Noisy-Labels-Problem-Collection.
