Google Research

Analysis of Softmax Approximation for Deep Classifiers under Input-Dependent Label Noise



Modelling uncertainty arising from input-dependent label noise is an increasingly important problem. A state-of-the-art approach (Kendall & Gal, 2017) places a normal distribution over the softmax logits, where the mean and variance of this distribution are learned functions of the inputs. This approach achieves impressive empirical performance but lacks theoretical justification. We show that this model is in fact a special case of a well-known and theoretically understood model from the econometrics literature. Under this view, the softmax over the logit distribution is a smooth approximation to an argmax, and the approximation becomes exact in the zero-temperature limit. We illustrate that the softmax temperature controls a bias-variance trade-off, and that the optimal point on this trade-off is not always found at 1.0. By tuning the temperature, and hence the corresponding bias-variance trade-off, we achieve improved performance on well-known image classification benchmarks with synthetically introduced noisy labels. For image segmentation, where input-dependent label noise arises naturally, we show that tuning the temperature increases the mean IoU on the PASCAL VOC and Cityscapes datasets by more than 1% over both the state-of-the-art model and a strong baseline that does not model this noise source.
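The model described above can be sketched in a few lines: sample logits from a per-input Gaussian, push each sample through a temperature-scaled softmax, and average the resulting probabilities. The sketch below is illustrative only (function names, the Monte Carlo sample count, and the example values are our own, not from the paper); it shows how a low temperature makes the averaged softmax concentrate on the argmax class.

```python
import numpy as np

def softmax(z, temperature=1.0):
    """Temperature-scaled softmax; approaches a hard argmax as temperature -> 0."""
    z = z / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def mc_softmax_probs(mu, sigma, temperature=1.0, n_samples=100, seed=None):
    """Monte Carlo class probabilities under a Gaussian logit distribution.

    mu, sigma: per-class logit mean and standard deviation (in the model
    discussed above, these are learned functions of the input). Draws
    n_samples logit vectors, applies a temperature-scaled softmax to each,
    and averages the resulting probability vectors.
    """
    rng = np.random.default_rng(seed)
    eps = rng.standard_normal((n_samples, mu.shape[-1]))
    logits = mu + sigma * eps  # one Gaussian logit sample per row
    return softmax(logits, temperature).mean(axis=0)

# Example: with a low temperature, probability mass concentrates on the
# class with the highest mean logit (here, class 0).
mu = np.array([2.0, 0.5, -1.0])
sigma = np.array([0.1, 0.1, 0.1])
probs = mc_softmax_probs(mu, sigma, temperature=0.05, n_samples=1000, seed=0)
```

Sweeping `temperature` in such a sketch is one way to see the bias-variance trade-off the abstract refers to: small temperatures sharpen each sample toward an argmax (low bias, high Monte Carlo variance), while large temperatures smooth the per-sample probabilities.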
