Nonparametric Uncertainty Quantification for Single Deterministic Neural Network

Abstract

This paper proposes a fast and scalable method for uncertainty quantification of machine learning models’ predictions. First, we show a principled way to measure the uncertainty of a classifier's predictions based on the Nadaraya-Watson nonparametric estimate of the conditional label distribution. Importantly, the approach allows one to explicitly disentangle aleatoric and epistemic uncertainties. The resulting method works directly in the feature space; however, one can apply it to any neural network by considering the embedding of the data induced by the network. We demonstrate the strong performance of the method on uncertainty estimation tasks for text classification problems and a variety of real-world image datasets, such as MNIST, SVHN, CIFAR-100, and several versions of ImageNet.
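The core idea can be illustrated with a short sketch. This is not the paper's exact estimator, just a minimal illustration under simple assumptions: a Gaussian kernel in the embedding space, the entropy of the estimated label distribution as an aleatoric signal, and the total kernel mass (how close the query lies to training data) as an epistemic proxy; all function and parameter names are hypothetical.

```python
import numpy as np

def nadaraya_watson_probs(x, X_train, y_train, n_classes, bandwidth=1.0):
    """Nadaraya-Watson estimate of the conditional label distribution
    p(y = c | x), using a Gaussian (RBF) kernel over embeddings."""
    # Squared distances from the query to every training embedding.
    d2 = np.sum((X_train - x) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))   # kernel weights
    mass = w.sum()                             # total kernel mass at x
    probs = np.array([w[y_train == c].sum() for c in range(n_classes)])
    probs = probs / max(mass, 1e-12)
    return probs, mass

def uncertainties(probs, mass, n_train):
    # Aleatoric: entropy of the estimated label distribution
    # (irreducible label noise where classes overlap).
    aleatoric = -np.sum(probs * np.log(probs + 1e-12))
    # Epistemic proxy (illustrative only): low kernel mass means the query
    # is far from the training data, so the estimate itself is unreliable.
    density = mass / n_train
    epistemic = 1.0 / (1.0 + density)
    return aleatoric, epistemic
```

In practice, `X_train` would hold embeddings produced by the trained network rather than raw features, which is what makes the approach applicable to any deterministic neural classifier.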

Publication
In NeurIPS 2021 Workshop on Distribution Shifts: Connecting Methods and Applications
Aleksandr Petiushko
Director, Head of ML Research / Adjunct Professor / PhD

Principal R&D Researcher (15+ years of experience), R&D Technical Leader (10+ years of experience), and R&D Manager (8+ years of experience). Runs and manages industrial research and academic collaborations (35+ publications, 30+ patents). Inspired by theoretical computer science and how it changes the world.