
28 open-source projects that are alternatives to or similar to Active-Passive-Losses

Advances-in-Label-Noise-Learning
A curated, regularly updated list of resources for learning with noisy labels
Stars: ✭ 360 (+291.3%)
IDN
AAAI 2021: Beyond Class-Conditional Assumption: A Primary Attempt to Combat Instance-Dependent Label Noise
Stars: ✭ 21 (-77.17%)
rebias
Official PyTorch implementation of ReBias (Learning De-biased Representations with Biased Representations), ICML 2020
Stars: ✭ 125 (+35.87%)
Mutual labels:  icml, icml-2020
TaLKConvolutions
Official PyTorch implementation of Time-aware Large Kernel (TaLK) Convolutions (ICML 2020)
Stars: ✭ 26 (-71.74%)
Mutual labels:  icml, icml-2020
noisy label understanding utilizing
ICML 2019: Understanding and Utilizing Deep Neural Networks Trained with Noisy Labels
Stars: ✭ 82 (-10.87%)
Mutual labels:  icml, noisy-labels
PackIt
Code for reproducing results in ICML 2020 paper "PackIt: A Virtual Environment for Geometric Planning"
Stars: ✭ 45 (-51.09%)
Mutual labels:  icml-2020
unicornn
Official code for UnICORNN (ICML 2021)
Stars: ✭ 21 (-77.17%)
Mutual labels:  icml
NeuralPull
Implementation of "Neural-Pull: Learning Signed Distance Functions from Point Clouds by Learning to Pull Space onto Surfaces" (ICML 2021)
Stars: ✭ 149 (+61.96%)
Mutual labels:  icml
EgoCNN
Code for "Distributed, Egocentric Representations of Graphs for Detecting Critical Structures" (ICML 2019)
Stars: ✭ 16 (-82.61%)
Mutual labels:  icml
AIPaperCompleteDownload
Complete downloads of papers from various top conferences
Stars: ✭ 64 (-30.43%)
Mutual labels:  icml
FedScale
FedScale is a scalable and extensible open-source federated learning (FL) platform.
Stars: ✭ 274 (+197.83%)
Mutual labels:  icml
Noisy-Labels-with-Bootstrapping
Keras implementation of "Training Deep Neural Networks on Noisy Labels with Bootstrapping" (Reed et al., 2015)
Stars: ✭ 22 (-76.09%)
Mutual labels:  noisy-labels
ACE
Code for our paper, Neural Network Attributions: A Causal Perspective (ICML 2019).
Stars: ✭ 47 (-48.91%)
Mutual labels:  icml
icml-nips-iclr-dataset
Papers, authors and author affiliations from ICML, NeurIPS and ICLR 2006-2021
Stars: ✭ 21 (-77.17%)
Mutual labels:  icml
wrench
WRENCH: Weak supeRvision bENCHmark
Stars: ✭ 185 (+101.09%)
Mutual labels:  robust-learning
clean-net
TensorFlow source code for "CleanNet: Transfer Learning for Scalable Image Classifier Training with Label Noise" (CVPR 2018)
Stars: ✭ 86 (-6.52%)
Mutual labels:  label-noise
C2D
PyTorch implementation of "Contrast to Divide: self-supervised pre-training for learning with noisy labels"
Stars: ✭ 59 (-35.87%)
Mutual labels:  noisy-labels
Pwc
Papers with code. Sorted by stars. Updated weekly.
Stars: ✭ 15,288 (+16517.39%)
Mutual labels:  icml
Awesome-Computer-Vision-Paper-List
This repository collects the papers accepted at top computer-vision conferences, making it easy to search for related papers.
Stars: ✭ 248 (+169.57%)
Mutual labels:  icml
FairAI
This is a collection of papers and other resources related to fairness.
Stars: ✭ 55 (-40.22%)
Mutual labels:  icml
imbalanced-regression
[ICML 2021, Long Talk] Delving into Deep Imbalanced Regression
Stars: ✭ 425 (+361.96%)
Mutual labels:  icml
semi-supervised-NFs
Code for the paper Semi-Conditional Normalizing Flows for Semi-Supervised Learning
Stars: ✭ 23 (-75%)
Mutual labels:  icml
NeuroAI
NeuroAI-UW seminar, a weekly seminar for the UW community, organized by the NeuroAI Shlizerman Lab.
Stars: ✭ 36 (-60.87%)
Mutual labels:  icml
probnmn-clevr
Code for ICML 2019 paper "Probabilistic Neural-symbolic Models for Interpretable Visual Question Answering" [long-oral]
Stars: ✭ 63 (-31.52%)
Mutual labels:  icml
Cleanlab
The standard package for machine learning with noisy labels, finding mislabeled data, and uncertainty quantification. Works with most datasets and models.
Stars: ✭ 2,526 (+2645.65%)
Mutual labels:  noisy-data
FactorGraph.jl
The FactorGraph package provides a set of functions for performing inference over factor graphs with continuous or discrete random variables using the belief propagation algorithm.
Stars: ✭ 17 (-81.52%)
Mutual labels:  noisy-data
NLNL-Negative-Learning-for-Noisy-Labels
NLNL: Negative Learning for Noisy Labels
Stars: ✭ 70 (-23.91%)
Mutual labels:  noisy-labels
ProSelfLC-2021
noisy labels; missing labels; semi-supervised learning; entropy; uncertainty; robustness and generalisation.
Stars: ✭ 45 (-51.09%)
Mutual labels:  noisy-labels