This project forked from gorkemalgan/deep_learning_with_noisy_labels_literature


Deep Learning with Label Noise / Noisy Labels

This repo is a collection of papers and repositories on the topic of deep learning with noisy labels. All methods listed below are briefly explained in the paper Image Classification with Deep Learning in the Presence of Noisy Labels: A Survey; more information on the topic can also be found in that survey.

Year Type Conf Repo Title
2021 ML Pt MetaLabelNet: Learning to Generate Soft-Labels from Noisy-Labels
2020 ML ICPR Pt Meta Soft Label Generation for Noisy Labels
2020 RL Learning Adaptive Loss for Robust Learning with Noisy Labels
2020 LNC ProSelfLC: Progressive Self Label Correction for Training Robust Deep Neural Networks
2020 DP Identifying Mislabeled Data using the Area Under the Margin Ranking
2020 R Limited Gradient Descent: Learning With Noisy Labels
2020 NC Dual T: Reducing Estimation Error for Transition Matrix in Label-noise Learning
2020 LNC Temporal Calibrated Regularization for Robust Noisy Label Learning
2020 NC Parts-dependent Label Noise: Towards Instance-dependent Label Noise
2020 NC Class2Simi: A New Perspective on Learning with Label Noise
2020 LNC Learning from Noisy Labels with Noise Modeling Network
2020 LNC ExpertNet: Adversarial Learning and Recovery Against Noisy Labels
2020 R Pt Early-Learning Regularization Prevents Memorization of Noisy Labels
2020 SC CVPR Pt Combating Noisy Labels by Agreement: A Joint Training Method with Co-Regularization
2020 SIW CVPR Tf Distilling Effective Supervision from Severe Label Noise
2020 NC CVPR Training Noise-Robust Deep Neural Networks via Meta-Learning
2020 LNC CVPR Global-Local GCN: Large-Scale Label Noise Cleansing for Face Recognition
2020 SIW ECCV Graph convolutional networks for learning with few clean and many noisy labels
2020 SIW ECCV NoiseRank: Unsupervised Label Noise Reduction with Dependence Models
2020 R ICLR Simple and Effective Regularization Methods for Training on Noisily Labeled Data with Generalization Guarantee
2020 R ICLR Can Gradient Clipping Mitigate Label Noise?
2020 SSL ICLR Pt DivideMix: Learning with Noisy Labels as Semi-supervised Learning
2020 SC AAAI Self-Paced Robust Learning for Leveraging Clean Labels in Noisy Data
2020 LNC IJCAI Learning with Noise: Improving Distantly-Supervised Fine-grained Entity Typing via Automatic Relabeling
2020 SIW IJCAI Label Distribution for Learning with Noisy Labels
2020 RL IJCAI Can Cross Entropy Loss Be Robust to Label Noise?
2020 SC WACV Learning from noisy labels via discrepant collaborative training
2020 LNC WACV A novel self-supervised re-labeling approach for training with noisy labels
2020 SC ICML Searching to Exploit Memorization Effect in Learning from Corrupted Labels
2020 ML ICML SIGUA: Forgetting May Make Learning with Noisy Labels More Robust
2020 R ICML Pt Improving Generalization by Controlling Label-Noise Information in Neural Network Weights
2020 RL ICML Normalized Loss Functions for Deep Learning with Noisy Labels
2020 RL ICML Peer Loss Functions: Learning from Noisy Labels without Knowing Noise Rates
2020 SC ICML Beyond Synthetic Noise: Deep Learning on Controlled Noisy Labels
2020 O ICML Deep k-NN for Noisy Labels
2020 LNC ICML Error-Bounded Correction of Noisy Labels
2020 O ICML Does label smoothing mitigate label noise?
2020 DP ICML Learning with Bounded Instance- and Label-dependent Label Noise
2020 O ICML Training Binary Neural Networks through Learning with Noisy Supervision
2019 SIW NIPS Pt Meta-Weight-Net: Learning an Explicit Mapping For Sample Weighting
2019 RL ICML On Symmetric Losses for Learning from Corrupted Labels
2019 O ICLR Pt SOSELETO: A Unified Approach to Transfer Learning and Training with Noisy Labels
2019 LNC ICLR An Energy-Based Framework for Arbitrary Label Noise Correction
2019 NC NIPS Pt Are Anchor Points Really Indispensable in Label-Noise Learning?
2019 O NIPS Pt Combinatorial Inference against Label Noise
2019 RL NIPS Pt L_DMI: A Novel Information-theoretic Loss Function for Training Deep Nets Robust to Label Noise
2019 O CVPR MetaCleaner: Learning to Hallucinate Clean Representations for Noisy-Labeled Visual Recognition
2019 LNC ICCV O2U-Net: A Simple Noisy Label Detection Approach for Deep Neural Networks
2019 SC ICCV * Co-Mining: Deep Face Recognition with Noisy Labels
2019 O NLNL: Negative Learning for Noisy Labels
2019 R Pt Using Pre-Training Can Improve Model Robustness and Uncertainty
2019 SSL Robust Learning Under Label Noise With Iterative Noise-Filtering
2019 ML CVPR Pt Learning to Learn from Noisy Labeled Data
2019 ML Pumpout: A Meta Approach for Robustly Training Deep Neural Networks with Noisy Labels
2019 RL Keras Symmetric Cross Entropy for Robust Learning with Noisy Labels
2019 RL Caffe Improved Mean Absolute Error for Learning Meaningful Patterns from Abnormal Training Data
2019 LQA CVPR Learning From Noisy Labels By Regularized Estimation Of Annotator Confusion
2019 SIW CVPR Caffe Noise-Tolerant Paradigm for Training Face Recognition CNNs
2019 SIW ICML Pt Combating Label Noise in Deep Learning Using Abstention
2019 SIW Robust Learning at Noisy Labeled Medical Images: Applied to Skin Lesion Classification
2019 SC ICML Keras Understanding and Utilizing Deep Neural Networks Trained with Noisy Labels
2019 SC ICML Pt How does Disagreement Help Generalization against Label Corruption?
2019 SC CVPR Learning a Deep ConvNet for Multi-label Classification with Partial Labels
2019 SC Curriculum Loss: Robust Learning and Generalization against Label Corruption
2019 SC SELF: Learning to Filter Noisy Labels with Self-Ensembling
2019 LNC CVPR Pt Graph Convolutional Label Noise Cleaner: Train a Plug-and-play Action Classifier for Anomaly Detection
2019 LNC ICCV Photometric Transformer Networks and Label Adjustment for Breast Density Prediction
2019 LNC CVPR Pt Probabilistic End-to-end Noise Correction for Learning with Noisy Labels
2019 LNC TGRS Matlab Hyperspectral image classification in the presence of noisy labels
2019 LNC ICCV Deep Self-Learning From Noisy Labels
2019 NC AAAI Tf Safeguarded Dynamic Label Regression for Noisy Supervision
2019 NC ICML Pt Unsupervised Label Noise Modeling and Loss Correction
2018 O ECCV Learning with Biased Complementary Labels
2018 O Robust Determinantal Generative Classifier for Noisy Labels and Adversarial Attacks
2018 R ICLR Keras Dimensionality Driven Learning for Noisy Labels
2018 R ECCV Deep bilevel learning
2018 SSL WACV A semi-supervised two-stage approach to learning from noisy labels
2018 ML Improving Multi-Person Pose Estimation using Label Correction
2018 RL NIPS Generalized cross entropy loss for training deep neural networks with noisy labels
2018 LQA ICLR Repo Learning From Noisy Singly-Labeled Data
2018 LQA AAAI Deep learning from crowds
2018 SIW CVPR Repo Iterative Learning With Open-Set Noisy Labels
2018 SIW Tf Learning to Reweight Examples for Robust Deep Learning
2018 SIW CVPR Tf Cleannet: Transfer Learning for Scalable Image Classifier Training with Label Noise
2018 SIW ChoiceNet: Robust Learning by Revealing Output Correlations
2018 SIW IEEE Multiclass Learning with Partially Corrupted Labels
2018 SC NIPS Pt Co-teaching: Robust Training of Deep Neural Networks with Extremely Noisy Labels
2018 SC IEEE Progressive Stochastic Learning for Noisy Labels
2018 SC ECCV Sklearn Curriculumnet: Weakly supervised learning from large-scale web images
2018 LNC CVPR Chainer Joint Optimization Framework for Learning with Noisy Labels
2018 LNC TIFS Pt, Caffe, Tf A light CNN for deep face representation with noisy labels
2018 LNC WACV Iterative cross learning on noisy labels
2018 NC NIPS Pt Using trusted data to train deep networks on labels corrupted by severe noise
2018 NC ISBI Training a neural network based on unreliable human annotation of medical images
2018 NC IEEE Deep learning from noisy image labels with quality embedding
2018 NC NIPS Tf Masking: A new perspective of noisy supervision
2017 O Learning with Auxiliary Less-Noisy Labels
2017 R Regularizing neural networks by penalizing confident output distributions
2017 R Pt mixup: Beyond Empirical Risk Minimization
2017 MIL CVPR Attend in groups: a weakly-supervised deep learning framework for learning from web data
2017 ML ICCV Learning from Noisy Labels with Distillation
2017 ML Avoiding your teacher's mistakes: Training neural networks with controlled weak supervision
2017 ML Learning to Learn from Weak Supervision by Full Supervision
2017 RL AAAI Robust Loss Functions under Label Noise for Deep Neural Networks
2017 LQA ICLR Who Said What: Modeling Individual Labelers Improves Classification
2017 LQA CVPR Lean crowdsourcing: Combining humans and machines in an online system
2017 SC NIPS Tf Decoupling "when to update" from "how to update"
2017 SC NIPS Tf* Active bias: Training more accurate neural networks by emphasizing high variance samples
2017 SC Tf MentorNet: Learning Data-Driven Curriculum for Very Deep Neural Networks on Corrupted Labels
2017 SC Sklearn Learning with confident examples: Rank pruning for robust classification with noisy labels
2017 SC NIPS Toward Robustness against Label Noise in Training Deep Discriminative Neural Networks
2017 LNC IEEE Self-Error-Correcting Convolutional Neural Network for Learning with Noisy Labels
2017 LNC IEEE Improving crowdsourced label quality using noise correction
2017 LNC Fidelity-weighted learning
2017 LNC CVPR Learning From Noisy Large-Scale Datasets With Minimal Supervision
2017 NC CVPR Keras Making Deep Neural Networks Robust to Label Noise: a Loss Correction Approach
2017 NC ICLR Keras Training Deep Neural-Networks Using a Noise Adaptation Layer
2016 EM KBS A robust multi-class AdaBoost algorithm for mislabeled noisy data
2016 R CVPR Rethinking the inception architecture for computer vision
2016 SSL AAAI Robust semi-supervised learning through label aggregation
2016 ML NC Noise detection in the Meta-Learning Level
2016 RL On the convergence of a family of robust losses for stochastic gradient descent
2016 RL ICML Loss factorization, weakly supervised learning and label noise robustness
2016 SIW ICLR Matlab Auxiliary image regularization for deep cnns with noisy labels
2016 SIW CVPR Caffe Seeing Through the Human Reporting Bias: Visual Classifiers From Noisy Human-Centric Labels
2016 SC ECCV Repo The Unreasonable Effectiveness of Noisy Data for Fine-Grained Recognition
2016 NC ICDM Matlab Learning deep networks from noisy labels with dropout regularization
2016 NC ICASSP Keras Training deep neural-networks based on unreliable labels
2015 O Learning discriminative reconstructions for unsupervised outlier removal
2015 EM Rboost: label noise-robust boosting algorithm based on a nonconvex loss function and the numerically stable base learners
2015 MIL CVPR Visual recognition by learning from web data: A weakly supervised domain generalization approach
2015 RL NIPS Learning with symmetric label noise: The importance of being unhinged
2015 RL NC Making risk minimization tolerant to label noise
2015 LQA Deep classifiers from image tags in the wild
2015 SIW TPAMI Pt Classification with noisy labels by importance reweighting
2015 SC ICCV Website Webly supervised learning of convolutional networks
2015 NC CVPR Caffe Learning From Massive Noisy Labeled Data for Image Classification
2015 NC ICLR Training Convolutional Networks with Noisy Labels
2014 R Explaining and harnessing adversarial examples
2014 R JMLR Dropout: a simple way to prevent neural networks from overfitting
2014 SC Keras Training Deep Neural Networks on Noisy Labels with Bootstrapping
2014 NC Learning from Noisy Labels with Deep Neural Networks
2014 LQA Learning from multiple annotators with varying expertise
2013 EM Boosting in the presence of label noise
2013 RL NIPS Learning with Noisy Labels
2013 RL IEEE Noise tolerance under risk minimization
2012 EM A noise-detection based AdaBoost algorithm for mislabeled data
2012 RL ICML Learning to Label Aerial Images from Noisy Data
2011 EM An empirical comparison of two boosting algorithms on real data sets with artificial class noise
2009 LQA Supervised learning from multiple experts: whom to trust when everyone lies a bit
2008 LQA NIPS Whose vote should count more: Optimal integration of labels from labelers of unknown expertise
2006 RL JASA Convexity, classification, and risk bounds
2000 EM An experimental comparison of three methods for constructing ensembles of decision trees: Bagging, boosting, and randomization

In order to test label-noise-robust algorithms on benchmark datasets (MNIST, Fashion-MNIST, CIFAR-10, CIFAR-100), synthetic noise generation is a necessary step. The following work provides a feature-dependent synthetic noise generation algorithm and pre-generated synthetic noisy labels for the mentioned datasets.
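To make the idea of synthetic noise generation concrete, here is a minimal sketch of the simplest variant, symmetric (uniform) label noise, where a chosen fraction of labels is flipped uniformly to a different class. The feature-dependent algorithm referenced above additionally conditions the flip on the input, which this sketch does not attempt; the function and variable names are illustrative, not taken from any of the listed works.

```python
import numpy as np

def inject_symmetric_noise(labels, noise_rate, num_classes, seed=0):
    """Flip a fraction `noise_rate` of labels uniformly to a *different* class.

    This is the classic symmetric synthetic corruption applied to clean
    benchmarks such as MNIST or CIFAR-10 before training a noise-robust model.
    """
    rng = np.random.default_rng(seed)
    labels = np.asarray(labels).copy()
    n = len(labels)
    # pick exactly noise_rate * n distinct samples to corrupt
    flip_idx = rng.choice(n, size=int(noise_rate * n), replace=False)
    for i in flip_idx:
        # draw from the other num_classes - 1 classes so the label truly changes
        choices = [c for c in range(num_classes) if c != labels[i]]
        labels[i] = rng.choice(choices)
    return labels

clean = np.zeros(1000, dtype=int)  # toy "dataset": every clean label is class 0
noisy = inject_symmetric_noise(clean, noise_rate=0.2, num_classes=10)
print((noisy != clean).mean())  # prints 0.2
```

Because the flipped label is drawn only from the other classes, the realized noise rate matches the requested one exactly, which keeps synthetic benchmarks reproducible across runs.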

List of papers that shed light on the label noise phenomenon in deep learning:

Title Year
Image Classification with Deep Learning in the Presence of Noisy Labels: A Survey 2020
Investigating CNNs' Learning Representation Under Label Noise 2019
How Do Neural Networks Overcome Label Noise? 2018
Deep Learning is Robust to Massive Label Noise 2018
A closer look at memorization in deep networks 2017
Deep Nets Don't Learn via Memorization 2017
On the robustness of convnets to training on noisy labels 2017
A study of the effect of different types of noise on the precision of supervised learning techniques 2017
Understanding deep learning requires rethinking generalization 2016
A comprehensive introduction to label noise 2014
Classification in the Presence of Label Noise: a Survey 2014
Class noise and supervised learning in medical domains: The effect of feature extraction 2006
Class noise vs. attribute noise: A quantitative study 2004

List of works on label noise beyond classification:

Title Year
Devil is in the Edges: Learning Semantic Boundaries from Noisy Annotations 2019
Improving Semantic Segmentation via Video Propagation and Label Relaxation 2018
Learning from weak and noisy labels for semantic segmentation 2016
Robustness of conditional GANs to noisy labels 2018
Label-Noise Robust Generative Adversarial Networks 2018
Label-Noise Robust Domain Adaptation 2020

Sources on the web

Clothing1M is a real-world noisy-labeled dataset that is widely used for benchmarking. Below are the test accuracies on this dataset. Note that Clothing1M also contains an additional 50k clean training samples, but most methods do not use them, for the sake of fair comparison. Therefore, only methods that do not use the extra 50k clean samples are listed here. A '?' indicates that the given work does not mention whether the 50k clean samples were used.

Title Accuracy
MetaLabelNet: Learning to Generate Soft-Labels from Noisy-Labels 78.20
Meta Soft Label Generation for Noisy Labels 76.02
DivideMix: Learning with Noisy Labels as Semi-supervised Learning? 74.76
Cleannet: Transfer Learning for Scalable Image Classifier Training with Label Noise 74.69
Deep Self-Learning From Noisy Labels 74.45
Limited Gradient Descent: Learning With Noisy Labels 74.36
Are Anchor Points Really Indispensable in Label-Noise Learning? 74.18
NoiseRank: Unsupervised Label Noise Reduction with Dependence Models 73.77
Learning Adaptive Loss for Robust Learning with Noisy Labels 73.76
Meta-Weight-Net: Learning an Explicit Mapping For Sample Weighting 73.72
Probabilistic End-to-end Noise Correction for Learning with Noisy Labels 73.49
Learning to Learn from Noisy Labeled Data 73.47
Improved Mean Absolute Error for Learning Meaningful Patterns from Abnormal Training Data 73.20
Safeguarded Dynamic Label Regression for Noisy Supervision 73.07
Temporal Calibrated Regularization for Robust Noisy Label Learning 72.54
MetaCleaner: Learning to Hallucinate Clean Representations for Noisy-Labeled Visual Recognition 72.50
L_DMI: A Novel Information-theoretic Loss Function for Training Deep Nets Robust to Label Noise 72.46
Joint Optimization Framework for Learning with Noisy Labels 72.23
Error-Bounded Correction of Noisy Labels 71.74
Parts-dependent Label Noise: Towards Instance-dependent Label Noise 71.67
Dual T: Reducing Estimation Error for Transition Matrix in Label-noise Learning? 71.49
Improving Generalization by Controlling Label-Noise Information in Neural Network Weights 71.39
Masking: A new perspective of noisy supervision 71.10
Symmetric Cross Entropy for Robust Learning with Noisy Labels 71.02
Unsupervised Label Noise Modeling and Loss Correction 71.00

Abbreviations for noise types are:

  • NC -> Noisy Channel
  • LNC -> Label Noise Cleansing
  • DP -> Dataset Pruning
  • SC -> Sample Choosing
  • SIW -> Sample Importance Weighting
  • LQA -> Labeler Quality Assessment
  • RL -> Robust Losses
  • ML -> Meta Learning
  • MIL -> Multiple Instance Learning
  • SSL -> Semi Supervised Learning
  • R -> Regularizers
  • EM -> Ensemble Methods
  • O -> Others

Other abbreviations:

  • NC -> Neurocomputing
  • Tf -> Tensorflow
  • Pt -> PyTorch

A starred (*) repo means the code is unofficial.

