Analysis of Softmax Approximation for Deep Classifiers under Input-Dependent Label Noise

Mark Patrick Collier
Basil Mustafa
Jesse Berent
(2020)

Abstract

Modelling uncertainty arising from input-dependent label noise is an increasingly important problem. A state-of-the-art approach \cite{kendall2017uncertainties} places a normal distribution over the softmax logits, where the mean and variance of this distribution are learned functions of the inputs. This approach achieves impressive empirical performance but lacks theoretical justification. We show that this model is in fact a special case of a well-known and theoretically understood model in the econometrics literature. Under this view, the softmax over the logit distribution is a smooth approximation to an argmax, where the approximation becomes exact in the zero-temperature limit. We illustrate that the softmax temperature controls a bias-variance trade-off and that the optimal point on this trade-off is not always found at $1.0$. By tuning the temperature and the corresponding bias-variance trade-off, we achieve improved performance on well-known image classification benchmarks with synthetically introduced noisy labels. For image segmentation, where input-dependent label noise naturally arises, we show that tuning the temperature increases the mean IoU on the PASCAL VOC and Cityscapes datasets by more than 1\% over the state-of-the-art model and a strong baseline that does not model this noise source.
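
To make the abstract's construction concrete, the following is a minimal, illustrative sketch (not the authors' code) of the tempered Monte Carlo softmax over Gaussian logits it describes: logits are sampled from a per-class normal distribution with input-dependent mean and standard deviation, divided by a temperature, and passed through a softmax, so that lowering the temperature pushes the smooth approximation toward an argmax over the sampled logits. The function name, class count, and parameter values below are assumptions chosen for illustration only.

```python
import numpy as np

def mc_softmax_probs(mu, sigma, temperature=1.0, n_samples=1000, rng=None):
    """Monte Carlo estimate of E[softmax(u / temperature)] with u ~ N(mu, diag(sigma**2)).

    mu, sigma: arrays of shape (num_classes,), standing in for the
        input-dependent mean and standard deviation heads of a network.
    temperature: softmax temperature; as it approaches zero the tempered
        softmax approaches an argmax over the sampled logits.
    """
    rng = np.random.default_rng() if rng is None else rng
    eps = rng.standard_normal((n_samples, mu.shape[0]))
    logits = (mu + sigma * eps) / temperature      # sampled, tempered logits
    logits -= logits.max(axis=1, keepdims=True)    # stabilise the exponentials
    probs = np.exp(logits)
    probs /= probs.sum(axis=1, keepdims=True)
    return probs.mean(axis=0)                      # averaged class probabilities

# Hypothetical example: class 1 has a slightly larger mean logit but a much
# noisier (higher-variance) logit, so its probability shrinks as the
# temperature is lowered and the argmax behaviour dominates.
mu = np.array([2.0, 2.5, -1.0])
sigma = np.array([0.1, 3.0, 0.1])
for t in (1.0, 0.5, 0.1):
    print(t, mc_softmax_probs(mu, sigma, temperature=t))
```

In this reading, the temperature is simply another hyperparameter of the approximation, which is why the abstract argues it should be tuned rather than fixed at $1.0$.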