subeeshvasu / awesome-learning-with-label-noise
A curated list of resources for Learning with Noisy Labels
Thanks for the outstanding repository. I think "Learning with Twin Noisy Labels for Visible-Infrared Person Re-Identification" from CVPR'22 is also relevant.
I am working on a problem where the labels/response variables take the form of #successes / #attempts. Clearly the reliability of a label depends on the number of attempts, so I'd like to prevent the model from learning corner cases like y=0 or y=1 that essentially occur because not enough attempts have been made.
We generally frame this problem either as a regression task with MSE loss and weights given by #attempts, or as a classification task with labels in {0, 1} and per-example weights equal to #attempts - #successes and #successes respectively, trained with binary cross entropy.
Do you have any papers to recommend that tackle this problem?
Thanks in advance
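For concreteness, here is a minimal numpy sketch of the weighted-BCE framing I described (the function name and the toy numbers are only illustrative):

```python
import numpy as np

def weighted_bce(p, successes, attempts, eps=1e-7):
    """Binary cross entropy where each example contributes `successes`
    positive votes and `attempts - successes` negative votes.

    p: predicted success probability per example, shape (n,).
    Equivalent to weighting BCE(p, successes/attempts) by `attempts`,
    so low-attempt examples automatically carry little total weight.
    """
    p = np.clip(p, eps, 1.0 - eps)
    failures = attempts - successes
    total = -(successes * np.log(p) + failures * np.log(1.0 - p)).sum()
    return total / attempts.sum()

# Toy example: a 1/1 label contributes far less than a 90/100 label.
p = np.array([0.9, 0.9])
successes = np.array([1.0, 90.0])
attempts = np.array([1.0, 100.0])
loss = weighted_bce(p, successes, attempts)
```

This is exactly the "weights equal to #attempts - #successes and #successes" classification view: the y=1-from-one-attempt corner case still pulls the model, but only with weight 1.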
Code for ICCV 2019 "Symmetric Cross Entropy for Robust Learning with Noisy Labels": https://github.com/YisenWang/symmetric_cross_entropy_for_noisy_labels
I think it could be a useful addition in your next update!
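To save others a click, a minimal numpy sketch of the loss from that paper (the alpha/beta/A values here are illustrative; see the linked repo for the reference implementation):

```python
import numpy as np

def symmetric_cross_entropy(pred, onehot, alpha=0.1, beta=1.0, A=-4.0):
    """Sketch of Symmetric CE (Wang et al., ICCV 2019): standard CE plus
    a Reverse CE term in which log(0) from the one-hot label is clipped
    to a constant A, which bounds the loss on (possibly wrong) labels.

    pred: softmax probabilities, shape (n, k); onehot: labels, shape (n, k).
    """
    pred = np.clip(pred, 1e-7, 1.0)
    ce = -(onehot * np.log(pred)).sum(axis=1)
    log_label = np.where(onehot > 0, 0.0, A)   # log(1) = 0, log(0) -> A
    rce = -(pred * log_label).sum(axis=1)
    return (alpha * ce + beta * rce).mean()
```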
I think "Learning with Neighbor Consistency for Noisy Labels" from CVPR'22 is also relevant.
I would like to point out that our work (Amid et al. 2019a) extends the Generalized CE loss (Zhang and Sabuncu 2018) by introducing two temperatures t1 and t2, recovering GCE when t1 = q and t2 = 1. Our more recent bi-tempered loss (Amid et al. 2019b) extends these methods further by introducing a proper (unbiased) generalization of the CE loss, and is shown to be extremely effective in reducing the effect of noisy examples. Please consider adding these two papers to your list.
Google AI blog post:
https://ai.googleblog.com/2019/08/bi-tempered-logistic-loss-for-training.html
Code:
https://github.com/google/bi-tempered-loss
Demo:
https://google.github.io/bi-tempered-loss/
(Amid et al. 2019a) Amid et al. "Two-temperature logistic regression based on the Tsallis divergence." In The 22nd International Conference on Artificial Intelligence and Statistics (AISTATS), 2019.
(Amid et al. 2019b) Amid et al. "Robust bi-tempered logistic loss based on Bregman divergences." In Advances in Neural Information Processing Systems (NeurIPS), 2019.
(Zhang and Sabuncu 2018) Zhang and Sabuncu. "Generalized Cross Entropy Loss for Training Deep Neural Networks with Noisy Labels." In Advances in Neural Information Processing Systems (NeurIPS), 2018.
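For readers comparing these losses, a short numpy sketch of the GCE loss that the two-temperature work generalizes (q = 0.7 is the paper's recommended default; the function name is mine):

```python
import numpy as np

def generalized_ce(p_true, q=0.7):
    """Generalized CE (Zhang & Sabuncu, 2018): L_q = (1 - p_y^q) / q.
    Interpolates between CE (as q -> 0) and a bounded MAE-like loss
    (q = 1), which is what gives it robustness to noisy labels.

    p_true: predicted probability of the (possibly noisy) label, shape (n,).
    """
    p_true = np.clip(p_true, 1e-7, 1.0)
    return ((1.0 - p_true ** q) / q).mean()
```

The two-temperature and bi-tempered losses replace the log/exp pair inside CE with tempered versions, so they are not recovered by this one-liner; it only shows the q-parameterized baseline being extended.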
Hi, do you have a paper list for noisy-supervision learning methods for object detection?
Most noisy-supervision learning methods I found can only handle classification tasks.
Many thanks
Thanks for your nice repository!
I would like to share our recent work investigating learning from noisy labels in the multi-view object classification setting.
The paper, titled 'MVP-N: A Dataset and Benchmark for Real-World Multi-View Object Classification' (link), was accepted at NeurIPS 2022.
Thanks
Hi,
Thanks for your wonderful collection of awesome papers on learning with label noise! We would like to recommend our newly released paper PiCO: Contrastive Label Disambiguation for Partial Label Learning, which has been accepted by ICLR 2022 as an oral presentation. Our work studies an important corrupted-label learning problem that mitigates the inherent label ambiguity of the annotation procedure. We believe our work is appropriate for inclusion in this repo!
Best.
Haobo
Hi, we want to recommend our recent paper Co-learning: Learning from Noisy Labels with Self-supervision, which has been accepted as an oral paper at ACM Multimedia 2021.
Suppose I have plenty of low-quality labeled data from unsupervised or rule-based methods.
In detail, I deal with a multi-label classification task. First I crawl web pages such as wiki articles and use regex-based rules to assign labels: the model input is the wiki title and the targets are the rule-matched labels extracted from the wiki content. My task is to predict the labels from the wiki title alone.
Do you think removing the data whose labels a trained model predicts to be wrong is a simple but effective method?
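To make the idea concrete, a rough numpy sketch of the filtering step I have in mind (the 0.5 cutoffs and the function name are just illustrative assumptions):

```python
import numpy as np

def keep_mask(pred_probs, rule_labels, max_disagreement=0.5):
    """Drop examples whose rule-based multi-label annotation strongly
    disagrees with a model trained on those same noisy labels.

    pred_probs: model probabilities, shape (n, k);
    rule_labels: 0/1 regex-matched labels, shape (n, k).
    Returns a boolean mask of examples to keep. Both 0.5 cutoffs would
    need tuning on a small hand-checked validation set.
    """
    hard_pred = (pred_probs >= 0.5).astype(int)
    # Fraction of label bits where model and regex rules disagree.
    disagreement = (hard_pred != rule_labels).mean(axis=1)
    return disagreement < max_disagreement
```

One caveat I am aware of: the model is trained on the same noisy labels, so it may be confidently wrong on exactly the systematic errors the regex rules make; I would spot-check the dropped examples.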
@subeeshvasu Thank you very much!
Hi,
Thank you for maintaining this wonderful repository.
I have found that the link for the following paper in your repository is broken. Could you please fix it with the correct link below?
Paper - Noise-Robust Learning from Multiple Unsupervised Sources of Inferred Labels
Link - https://ojs.aaai.org/index.php/AAAI/article/view/20806/20565
Thank you.