
Current Research in Statistics & Mathematics (CRSM)

ISSN: 2994-9459 | DOI: 10.33140/CRSM

A Robust Approach to Uncertainty Quantification in Deep Learning

Pierpaolo Massoli

Abstract

This study proposes a novel approach to quantifying the uncertainty of a deep learning model by investigating both the coverage and the adaptivity of its prediction intervals within a Conformal Prediction framework. The model under study is designed to impute the equivalent household income, taking into account specific household group characteristics as well as relevant features of the main income earner, whose correlations with income are well documented in the literature. Imputing this variable is critical when outliers occur or when the information required to compute it is not fully available. Given the relevance of income in socio-economic policy contexts, the reliability of its imputation is a key concern. Conformalized Quantile Regression is adopted to evaluate the prediction intervals by incorporating this approach into the model. An improved assessment of model uncertainty is achieved by separating the aleatoric component from the epistemic one; to this end, an appropriate selection of the training data is proposed. This non-random selection introduces bias that may distort the model estimates and thus impair the uncertainty quantification. Consequently, a correction for selection bias is integrated into the uncertainty evaluation process. A real-world case study demonstrates the potential of the proposed quantification approach.
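As a brief illustration of the interval-construction step, the sketch below shows how Conformalized Quantile Regression works in the split-conformal setting: two quantile regressors are fit on a training split, conformity scores are computed on a calibration split, and the estimated quantile band is widened by a finite-sample quantile of those scores. The gradient-boosting regressors, the train/calibration split, and the miscoverage level alpha are illustrative assumptions, not the paper's actual model, data, or deep learning architecture.

```python
# Minimal sketch of Conformalized Quantile Regression (CQR) on generic
# tabular data; the base learners and split are illustrative assumptions.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split


def cqr_intervals(X, y, X_test, alpha=0.1, random_state=0):
    # Split the data into a proper training set and a calibration set.
    X_train, X_cal, y_train, y_cal = train_test_split(
        X, y, test_size=0.3, random_state=random_state)

    # Fit lower and upper quantile regressors (pinball loss).
    lo = GradientBoostingRegressor(loss="quantile", alpha=alpha / 2)
    hi = GradientBoostingRegressor(loss="quantile", alpha=1 - alpha / 2)
    lo.fit(X_train, y_train)
    hi.fit(X_train, y_train)

    # Conformity scores: how far each calibration point falls outside
    # the estimated quantile band (negative if it lies inside).
    scores = np.maximum(lo.predict(X_cal) - y_cal,
                        y_cal - hi.predict(X_cal))

    # Finite-sample corrected (1 - alpha) empirical quantile of the scores.
    n = len(y_cal)
    level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
    q_hat = np.quantile(scores, level, method="higher")

    # Widen the quantile band by q_hat to obtain intervals with
    # marginal coverage of at least 1 - alpha.
    return lo.predict(X_test) - q_hat, hi.predict(X_test) + q_hat
```

Because the correction q_hat is added on top of conditional quantile estimates, the resulting intervals adapt their width to the local spread of the response while retaining the distribution-free coverage guarantee of split conformal prediction.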
