Behaviour of Sample Selection Techniques Under Explicit Regularization

EasyChair Preprint 6763
11 pages • Date: October 3, 2021

Abstract

A multitude of sample selection-based learning strategies have been developed for learning with noisy labels. However, the literature also suggests that early stopping may yield better performance than fully training the model. This leads us to ask how sample selection strategies behave under explicit regularization. To this end, we consider four of the most fundamental sample selection-based models: MentorNet, Co-teaching, Co-teaching+, and JoCoR. We provide empirical results of applying explicit L2 regularization to these approaches. We also compare the results against a baseline: a vanilla CNN trained with only explicit regularization. We show that under explicit regularization, the preconceived ranking of the approaches can change. We also show several instances where the baseline outperforms some or all of the existing approaches. Moreover, we show that explicit regularization can also narrow the performance gap between the approaches.

Keyphrases: noisy labels, regularization, sample selection
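For concreteness, the sketch below shows one way explicit L2 regularization can be combined with the small-loss sample selection step that methods such as MentorNet and Co-teaching rely on. This is a minimal PyTorch sketch under assumed settings: the model architecture, the l2_lambda coefficient, and the keep_ratio selection fraction are illustrative placeholders, not the configuration used in the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical model and hyperparameters -- illustrative assumptions,
# not the paper's actual architecture or settings.
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(784, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
l2_lambda = 1e-4   # assumed explicit L2 regularization strength
keep_ratio = 0.7   # assumed fraction of small-loss samples kept per batch

def train_step(x, y):
    logits = model(x)
    # Per-sample losses, so samples can be ranked by loss
    # (the "small-loss trick" behind Co-teaching-style selection).
    per_sample = F.cross_entropy(logits, y, reduction="none")
    k = max(1, int(keep_ratio * x.size(0)))
    # Indices of the k smallest per-sample losses.
    keep = torch.topk(-per_sample, k).indices
    selected_loss = per_sample[keep].mean()
    # Explicit L2 penalty over all parameters, added to the selected loss.
    l2_penalty = sum((p ** 2).sum() for p in model.parameters())
    loss = selected_loss + l2_lambda * l2_penalty
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

The vanilla baseline in the comparison above would correspond to dropping the small-loss selection and applying the same L2 penalty to the mean loss over the full batch; equivalently, the penalty term can be folded into the optimizer via SGD's weight_decay argument.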