
Dataset Information


On Robustness of Neural Architecture Search Under Label Noise.


ABSTRACT: Neural architecture search (NAS), which aims at automatically seeking proper neural architectures for a specific task, has recently attracted extensive attention in supervised learning applications. In most real-world situations, the class labels provided in the training data can be noisy for many reasons, such as subjective judgments, inadequate information, and random human errors. Existing work has demonstrated the adverse effects of label noise on the learning of the weights of neural networks. These effects could become more critical in NAS, since the architectures are not only trained with noisy labels but are also compared based on their performance on noisy validation sets. In this paper, we systematically explore the robustness of NAS under label noise. We show that label noise in the training and/or validation data can lead to various degrees of performance variation. Through empirical experiments, we show that using robust loss functions can mitigate the performance degradation under symmetric label noise as well as under a simple model of class-conditional label noise, and we provide a theoretical justification for this observation. Both the empirical and theoretical results provide a strong argument in favor of employing robust loss functions in NAS under high levels of noise.
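The symmetric label noise model discussed in the abstract flips each training label, with some fixed probability, to a different class chosen uniformly at random; robust losses such as the mean absolute error (MAE) are known to tolerate this noise because their per-example loss summed over all candidate classes is a constant. The sketch below is not from the paper's code; it is a minimal illustration of these two ideas, with all function names chosen here for clarity.

```python
import random

def symmetric_noise(labels, num_classes, noise_rate, rng):
    """Flip each label with probability noise_rate to a uniformly
    chosen *different* class (the symmetric noise model)."""
    noisy = []
    for y in labels:
        if rng.random() < noise_rate:
            noisy.append(rng.choice([c for c in range(num_classes) if c != y]))
        else:
            noisy.append(y)
    return noisy

def mae_loss(probs, y):
    """Mean absolute error between the one-hot target for class y and a
    predicted distribution. Summed over all classes y, this loss is a
    constant (2K - 2 for K classes), the symmetry property that robust
    losses rely on under symmetric label noise."""
    return sum(abs((1.0 if c == y else 0.0) - p) for c, p in enumerate(probs))

rng = random.Random(0)
clean = [0, 1, 2, 3] * 250
noisy = symmetric_noise(clean, num_classes=4, noise_rate=0.3, rng=rng)
flipped = sum(a != b for a, b in zip(clean, noisy))
print(f"flipped fraction ~ {flipped / len(clean):.2f}")  # close to 0.3
```

An unbounded loss such as cross-entropy can be dominated by the mislabeled examples, whereas `mae_loss` is bounded in [0, 2] per example, which is one intuition for its robustness here.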

SUBMITTER: Chen YW 

PROVIDER: S-EPMC7931895 | biostudies-literature | 2020

REPOSITORIES: biostudies-literature


Publications

On Robustness of Neural Architecture Search Under Label Noise.

Chen Yi-Wei, Song Qingquan, Liu Xi, Sastry P S, Hu Xia

Frontiers in Big Data, 2020-02-11



Similar Datasets

| S-EPMC4364615 | biostudies-literature
| S-EPMC9714572 | biostudies-literature
| S-EPMC3280957 | biostudies-literature
| S-EPMC7412935 | biostudies-literature
| S-EPMC7774963 | biostudies-literature
| S-EPMC9884270 | biostudies-literature
| S-EPMC9522011 | biostudies-literature
| S-EPMC10483879 | biostudies-literature
| S-EPMC8043624 | biostudies-literature
| S-EPMC117364 | biostudies-literature