On double-descent in uncertainty quantification in overparametrized models (code)

Lucas Clarté¹*, Bruno Loureiro², Florent Krzakala³, Lenka Zdeborová¹

1 École Polytechnique Fédérale de Lausanne (EPFL), Statistical Physics of Computation lab., CH-1015 Lausanne, Switzerland

2 Département d’Informatique, École Normale Supérieure - PSL & CNRS, Paris, France

3 École Polytechnique Fédérale de Lausanne (EPFL), Information, Learning and Physics lab., CH-1015 Lausanne, Switzerland

* Corresponding author email: lucas.clarte@epfl.ch

DOI: 10.24435/materialscloud:zb-71 (version v1)

Publication date: Sep 19, 2023

How to cite this record

Lucas Clarte, Bruno Loureiro, Florent Krzakala, Lenka Zdeborova, On double-descent in uncertainty quantification in overparametrized models (code), Materials Cloud Archive 2023.145 (2023), https://doi.org/10.24435/materialscloud:zb-71


Uncertainty quantification is a central challenge in reliable and trustworthy machine learning. Naive measures such as last-layer scores are well known to yield overconfident estimates in the context of overparametrized neural networks. Several methods, ranging from temperature scaling to different Bayesian treatments of neural networks, have been proposed to mitigate overconfidence, most often supported by the numerical observation that they yield better-calibrated uncertainty measures. In this work, we provide a sharp comparison between popular uncertainty measures for binary classification in a mathematically tractable model for overparametrized neural networks: the random features model. We discuss a trade-off between classification accuracy and calibration, unveiling a double-descent-like behavior in the calibration curve of optimally regularized estimators as a function of overparametrization. This is in contrast with the empirical Bayes method, which we show to be well calibrated in our setting despite its higher generalization error and overparametrization. This record provides the code to reproduce the numerical experiments of the related paper "On double-descent in uncertainty quantification in overparametrized models".
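The setup compared in the paper can be sketched numerically: project the data through fixed random features, fit a regularized logistic classifier on top, and compare the last-layer confidence scores against the empirical accuracy in confidence bins (a standard expected-calibration-error estimate). The dimensions, teacher model, nonlinearity, and training loop below are illustrative assumptions for a minimal sketch, not the experiment code from the repository.

```python
import numpy as np

rng = np.random.default_rng(0)

d, p, n = 50, 200, 400                      # input dim, random features, samples (illustrative)
w_star = rng.normal(size=d) / np.sqrt(d)    # hypothetical logistic "teacher" weights

# Synthetic data: Gaussian inputs, binary labels drawn from the teacher
X = rng.normal(size=(n, d))
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-X @ w_star))).astype(float)

# Random features: fixed Gaussian projection F followed by a tanh nonlinearity
F = rng.normal(size=(p, d)) / np.sqrt(d)
Phi = np.tanh(X @ F.T)

# Ridge-regularized logistic regression on the features, trained by gradient descent
lam = 1e-2
w = np.zeros(p)
for _ in range(2000):
    pred = 1.0 / (1.0 + np.exp(-Phi @ w))
    grad = Phi.T @ (pred - y) / n + lam * w
    w -= 0.5 * grad

# Last-layer confidence scores (the "naive" uncertainty measure)
conf = 1.0 / (1.0 + np.exp(-Phi @ w))

def ece(conf, y, bins=10):
    """Binned expected calibration error: weighted gap between
    mean confidence and empirical accuracy in each bin."""
    edges = np.linspace(0.0, 1.0, bins + 1)
    err = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (conf > lo) & (conf <= hi)
        if mask.any():
            err += mask.mean() * abs(conf[mask].mean() - y[mask].mean())
    return err
```

Varying the overparametrization ratio p/n (and re-optimizing the regularization lam at each ratio) is what traces out the calibration curve discussed in the abstract.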



Files:
- 520.1 KiB: compressed files contained in the repository https://github.com/SPOC-group/double_descent_uncertainty
- 1.2 KiB: README file describing the structure of the code


Files and data are licensed under the terms of the following license: Creative Commons Attribution 4.0 International.
Metadata, except for email addresses, are licensed under the Creative Commons Attribution-ShareAlike 4.0 International license.


Keywords: MARVEL/P2, uncertainty quantification, neural networks, numerical simulation

Version history:

2023.145 (version v1) [This version] Sep 19, 2023 — DOI: 10.24435/materialscloud:zb-71