
subeeshvasu / awesome-learning-with-label-noise

A curated list of resources for Learning with Noisy Labels

Topics: noisy-labels, label-noise, deep-neural-networks, noisy-data, unreliable-labels, robust-learning


Contributors

anishathalye, arghosh, baeheesun, bbdamodaran, btsmart, cgnorthcutt, chenpf1025, cocoxili, dailing, dbp1994, guixianjin, hendrycks, janus-shiau, julilien, kentonishi, layneh, layumi, mrchenfeng, nolfwin, pxiangwu, pyjulie, ragavsachdeva, rtanno21609, shikunli, smithliu95, subeeshvasu, sungbinlim, wenshuoguo, ximinng, yulv-git


awesome-learning-with-label-noise's Issues

Adding a related paper

Thanks for the outstanding repository. I think "Learning with Twin Noisy Labels for Visible-Infrared Person Re-Identification" from CVPR'22 is also relevant.

learning success rates with variable number of attempts

I am working on a problem where the labels/response variables take the form of #successes / #attempts. Clearly the quality of a label depends on the number of attempts, so I'd like to prevent the model from learning corner cases like y=0 or y=1 that occur simply because not enough attempts have been made.

We generally frame this problem either as a regression task with MSE loss and per-example weights given by #attempts, or as a classification task with labels in {0, 1}, weights equal to #attempts - #successes and #successes respectively, trained with binary cross-entropy.
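For concreteness, the second framing can be sketched as below: each observation contributes a y=1 term weighted by its success count and a y=0 term weighted by its failure count, which is the same as an attempts-weighted BCE against the empirical rate. The function name and the numbers are illustrative, not from this thread.

```python
import numpy as np

def weighted_bce(p, successes, attempts, eps=1e-12):
    """Binary cross-entropy for success-rate labels.

    Each observation contributes a y=1 term weighted by #successes and a
    y=0 term weighted by #failures; this equals an attempts-weighted BCE
    against the empirical rate successes/attempts, so observations with
    few attempts automatically carry little weight.
    """
    p = np.clip(p, eps, 1 - eps)
    failures = attempts - successes
    return -(successes * np.log(p) + failures * np.log(1 - p)).sum() / attempts.sum()

# Illustrative data: model predictions and observed counts.
p = np.array([0.9, 0.2, 0.5])
successes = np.array([9, 1, 50])
attempts = np.array([10, 10, 100])
loss = weighted_bce(p, successes, attempts)
```

Per example, this loss is minimized at the empirical rate successes/attempts, which is the behavior the weighting is meant to encode.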

Do you have any papers to recommend that tackle this problem?
Thanks in advance.

Missing references in your list

I would like to point out that our work (Amid et al. 2019a) extends the Generalized CE loss (Zhang and Sabuncu 2018) by introducing two temperatures t1 and t2, and recovers GCE when t1 = q and t2 = 1. Our more recent work, the bi-tempered loss (Amid et al. 2019b), extends these methods by introducing a proper (unbiased) generalization of the CE loss and is shown to be highly effective at reducing the effect of noisy examples. Please consider adding these two papers to your list.

Google AI blog post:
https://ai.googleblog.com/2019/08/bi-tempered-logistic-loss-for-training.html
Code:
https://github.com/google/bi-tempered-loss
Demo:
https://google.github.io/bi-tempered-loss/

(Amid et al. 2019a) Amid et al. "Two-temperature logistic regression based on the Tsallis divergence." In The 22nd International Conference on Artificial Intelligence and Statistics (AISTATS), 2019.

(Amid et al. 2019b) Amid et al. "Robust bi-tempered logistic loss based on Bregman divergences." In Advances in Neural Information Processing Systems (NeurIPS), 2019.

(Zhang and Sabuncu 2018) Zhang and Sabuncu. "Generalized Cross Entropy Loss for Training Deep Neural Networks with Noisy Labels." In Advances in Neural Information Processing Systems (NeurIPS), 2018.
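For intuition, the tempered logarithm and exponential that underlie the two-temperature and bi-tempered losses can be sketched as below. The formulas follow Amid et al.; the implementation itself is illustrative and is not the official google/bi-tempered-loss code.

```python
import math

def log_t(x, t):
    """Tempered logarithm: (x^(1-t) - 1) / (1 - t); the standard log at t = 1."""
    if t == 1.0:
        return math.log(x)
    return (x ** (1.0 - t) - 1.0) / (1.0 - t)

def exp_t(x, t):
    """Tempered exponential, the inverse of log_t on its range."""
    if t == 1.0:
        return math.exp(x)
    return max(1.0 + (1.0 - t) * x, 0.0) ** (1.0 / (1.0 - t))
```

In the bi-tempered loss, t1 < 1 replaces the log in cross-entropy (making the loss bounded) while t2 > 1 replaces the exp in the softmax (giving it heavier tails); both reduce to standard CE at t1 = t2 = 1.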

Add a paper from NeurIPS 2022

Thanks for your nice repository!
I would like to share our recent work on learning from noisy labels in the multi-view object classification setting.
The paper is titled 'MVP-N: A Dataset and Benchmark for Real-World Multi-View Object Classification' (link) and was accepted at NeurIPS 2022.

Thanks

Recommend our ICLR oral paper

Hi,

Thanks for your wonderful collection of awesome papers on learning with label noise! We would like to recommend our newly released paper PiCO: Contrastive Label Disambiguation for Partial Label Learning, which has been accepted by ICLR 2022 as an oral presentation. Our work studies an important corrupted-label learning problem: mitigating the inherent label ambiguity introduced during the annotation procedure. We believe our work is appropriate for inclusion in this repo!

Best.
Haobo

What are the simplest methods for the label noise problem?

Suppose I have plenty of low-quality labeled data produced by unsupervised or rule-based methods.

In detail, I am dealing with a multi-label classification task. First I crawl web pages such as Wikipedia and use regex-based rules to assign labels. The model input is the wiki title and the model output is the rule-matched labels extracted from the wiki content; the task is to predict the labels from the title alone.

Do you think removing the data flagged as wrong by a trained model is a simple but effective method?
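That idea, often called confidence-based filtering and related in spirit to small-loss selection and confident learning, can be sketched as below: train a model on the noisy labels, then drop the positive label assignments the model itself finds implausible. The function, threshold, and data here are purely illustrative.

```python
import numpy as np

def filter_noisy(pred_probs, noisy_labels, threshold=0.2):
    """Drop (example, label) assignments the trained model finds implausible.

    pred_probs:   (n_examples, n_labels) per-label probabilities from a
                  model trained on the noisy labels (sigmoid outputs,
                  since the task is multi-label).
    noisy_labels: (n_examples, n_labels) 0/1 matrix of rule-matched labels.
    Returns a cleaned 0/1 matrix: a positive label is kept only if the
    model assigns it at least `threshold` probability.
    """
    keep = pred_probs >= threshold
    return noisy_labels * keep

# Illustrative: 3 examples, 2 labels; example 3's second label looks wrong.
probs = np.array([[0.9, 0.1], [0.8, 0.7], [0.6, 0.05]])
noisy = np.array([[1, 0], [1, 1], [1, 1]])
cleaned = filter_noisy(probs, noisy)
```

Two caveats: the probabilities should come from out-of-sample predictions (e.g. cross-validation), or the model simply memorizes its own noisy labels; and a fixed threshold risks discarding hard-but-correct examples, which is why methods like confident learning estimate per-class thresholds instead.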

@subeeshvasu Thank you very much!
