Label-Noise Learning Beyond Class-Conditional Noise
[CFAR Rising Star Lecture Series]
The label-noise problem belongs to inaccurate supervision, one of the three typical types of weak supervision. Label noise arises in many real-world applications where the budget for labeling raw data is limited. However, the well-known class-conditional noise (CCN) model, which assumes that the label corruption process (namely, the class-label flipping probability that corrupts the class-posterior probability) is instance-independent and depends only on the class, is not expressive enough to model real-world label noise, so we need to go beyond it.
In this talk, Dr Niu Gang will introduce his recent advances in robust learning against label noise that is significantly harder than CCN. The first noise model is instance-dependent noise (IDN), where the label flipping probability is conditioned not only on the true label but also on the instance itself; IDN is non-identifiable and must be approximated with additional assumptions and/or information. The second noise model is mutually contaminated distributions (MCD), where what is corrupted is the class-conditional density for sampling instances rather than the class-posterior probability for labeling instances. The learning methods for handling IDN and MCD show that label-noise learning beyond CCN is at least possible, and new methods will hopefully make it increasingly practical.
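To make the three noise models concrete, here is a minimal illustrative sketch on a toy one-dimensional binary problem. This is not the speaker's method; the flip rates, the boundary-dependent IDN flip function, and the MCD mixing proportion are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary data: instances x in [-1, 1], clean labels y in {0, 1}.
n = 10000
x = rng.uniform(-1.0, 1.0, size=n)
y = (x > 0).astype(int)

# CCN: the flip probability depends only on the true class,
# e.g. rho_0 = P(noisy y = 1 | y = 0), rho_1 = P(noisy y = 0 | y = 1).
rho = np.array([0.2, 0.3])              # illustrative per-class flip rates
flip_ccn = rng.random(n) < rho[y]
y_ccn = np.where(flip_ccn, 1 - y, y)

# IDN: the flip probability depends on the instance as well; here,
# instances near the decision boundary (|x| small) are noisier.
flip_prob_idn = 0.45 * np.exp(-5.0 * np.abs(x))
flip_idn = rng.random(n) < flip_prob_idn
y_idn = np.where(flip_idn, 1 - y, y)

# MCD: instead of flipping labels, the class-conditional densities are
# contaminated; the "positive" sample is a mixture of instances drawn
# from both clean class-conditionals.
pi = 0.7                                 # proportion of truly positive draws
pos_clean = rng.uniform(0.0, 1.0, size=int(pi * n))       # from class 1
neg_leak = rng.uniform(-1.0, 0.0, size=n - int(pi * n))   # contamination
x_mcd_pos = np.concatenate([pos_clean, neg_leak])

print("CCN flip rate, class 0:", flip_ccn[y == 0].mean())
print("CCN flip rate, class 1:", flip_ccn[y == 1].mean())
print("IDN flip rate, |x| < 0.1:", flip_idn[np.abs(x) < 0.1].mean())
print("IDN flip rate, |x| > 0.9:", flip_idn[np.abs(x) > 0.9].mean())
print("truly positive fraction in MCD sample:", (x_mcd_pos > 0).mean())
```

Under CCN, the empirical flip rate is flat within each class; under IDN it concentrates near the boundary, which is exactly the instance dependence that makes the model non-identifiable without extra assumptions; under MCD the corruption shows up in where the instances come from, not in which labels were flipped.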
16 Sep 2022 | 2.00pm (Singapore Time)
SPEAKER
Dr Niu Gang
Research Scientist, RIKEN Center for Advanced Intelligence Project
Adjunct Professor, School of Computer Science and Engineering
Southeast University
Niu Gang is currently an indefinite-term research scientist at the RIKEN Center for Advanced Intelligence Project. He received his PhD degree in computer science from the Tokyo Institute of Technology in 2013. Before joining RIKEN as a research scientist, he was a senior software engineer at Baidu and then an assistant professor at the University of Tokyo. He has published more than 90 journal articles and conference papers, including 31 ICML, 17 NeurIPS (1 oral and 3 spotlights), and 11 ICLR (1 outstanding paper honourable mention, 2 orals, and 1 spotlight) papers. He has co-authored the book "Machine Learning from Weak Supervision: An Empirical Risk Minimisation Approach" (the MIT Press). In professional service, he has served as an area chair 18 times, including for ICLR 2021-2022, ICML 2019-2022, and NeurIPS 2019-2022. He also serves as an action editor of TMLR and as a guest editor of a special issue at MLJ. Moreover, he served as a publication chair for ICML 2022, and has co-organised 9 workshops, 1 competition, and 2 tutorials.