vekteur / probabilistic-calibration-study
Implementation of "A Large-Scale Study of Probabilistic Calibration in Neural Network Regression" (ICML 2023)

License: GNU General Public License v3.0

Languages: Jupyter Notebook 18.51%, Python 81.49%
Topics: calibration, neural-network, post-hoc, regression, regularization, uncertainty, uncertainty-quantification

Introduction

Accompanying repository for the paper "A Large-Scale Study of Probabilistic Calibration in Neural Network Regression" (ICML 2023).

Overview

This repository includes the implementation of:

  • Various calibration methods (quantile recalibration, conformalized quantile prediction, quantile regularization, PCE-KDE, PCE-Sort...) with hyperparameter tuning.
  • Various metrics (NLL, CRPS, quantile score, probabilistic calibration error...); a minimal sketch of the probabilistic calibration error follows this list.
  • The pipeline to download a total of 57 tabular regression datasets from the UCI Machine Learning Repository and OpenML.
  • Various plots (reliability diagrams, boxen plots, CD diagrams...).
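
As a concrete illustration of the probabilistic calibration error mentioned above, here is a minimal, self-contained sketch. It is not this repository's actual API; the function names and the grid of levels are assumptions made for illustration. Probabilistic calibration means that the probability integral transform (PIT) values of the observations are uniformly distributed, so the sketch measures the average gap between the empirical coverage of the PIT values and the nominal level:

# Minimal illustrative sketch (not this repository's API): probabilistic
# calibration error for a Gaussian predictive distribution. The function
# names and the grid of 99 levels are assumptions made for illustration.
import numpy as np
from scipy.stats import norm

def pit_values(y, mu, sigma):
    # PIT values F_i(y_i) of the observations under Gaussian predictions.
    return norm.cdf(y, loc=mu, scale=sigma)

def probabilistic_calibration_error(pit, n_levels=100):
    # Average absolute gap between the empirical coverage of the PIT values
    # and the nominal level, over a uniform grid of levels in (0, 1).
    levels = np.linspace(0, 1, n_levels + 1)[1:-1]
    coverage = (pit[:, None] <= levels[None, :]).mean(axis=0)
    return np.abs(coverage - levels).mean()

# A well-specified model: PIT values are uniform, so the PCE is close to zero.
rng = np.random.default_rng(0)
mu, sigma = np.zeros(10_000), np.ones(10_000)
y = rng.normal(mu, sigma)
print(probabilistic_calibration_error(pit_values(y, mu, sigma)))

For a well-specified model the printed value is close to zero; systematically over- or under-dispersed predictions inflate it.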

Abstract

Accurate probabilistic predictions are essential for optimal decision making. While neural network miscalibration has been studied primarily in classification, we investigate this in the less-explored domain of regression. We conduct the largest empirical study to date to assess the probabilistic calibration of neural networks. We also analyze the performance of recalibration, conformal, and regularization methods to enhance probabilistic calibration. Additionally, we introduce novel differentiable recalibration and regularization methods, uncovering new insights into their effectiveness. Our findings reveal that regularization methods offer a favorable tradeoff between calibration and sharpness. Post-hoc methods exhibit superior probabilistic calibration, which we attribute to the finite-sample coverage guarantee of conformal prediction. Furthermore, we demonstrate that quantile recalibration can be considered as a specific case of conformal prediction. Our study is fully reproducible and implemented in a common code base for fair comparisons.
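
The abstract's observation that quantile recalibration can be seen as a specific case of conformal prediction can be made concrete with a short sketch. The code below is an illustrative assumption, not the method as implemented in this repository: it recalibrates a Gaussian predictive quantile by replacing the nominal level with an empirical quantile of calibration-set PIT values, using the (n + 1) finite-sample correction familiar from split conformal prediction:

# Hypothetical sketch (not the repository's implementation) of quantile
# recalibration for a Gaussian predictive distribution, written in the
# split-conformal style: the nominal level tau is replaced by an empirical
# quantile of calibration PIT values with the usual (n + 1) correction.
import numpy as np
from scipy.stats import norm

def recalibrated_quantile(tau, mu_test, sigma_test, pit_calib):
    # Adjusted level: order statistic of the calibration PIT values.
    n = len(pit_calib)
    k = min(int(np.ceil((n + 1) * tau)), n)
    adjusted_level = np.sort(pit_calib)[k - 1]
    # Invert the predictive CDF at the adjusted level.
    return norm.ppf(adjusted_level, loc=mu_test, scale=sigma_test)

# Deliberately overconfident model: predicts N(0, 1) while the data are N(0, 2).
rng = np.random.default_rng(0)
mu_cal, sigma_cal = np.zeros(2_000), np.ones(2_000)
y_cal = rng.normal(mu_cal, 2.0 * sigma_cal)
pit_cal = norm.cdf(y_cal, loc=mu_cal, scale=sigma_cal)
q90 = recalibrated_quantile(0.9, mu_test=0.0, sigma_test=1.0, pit_calib=pit_cal)
print(q90)  # noticeably larger than the uncalibrated norm.ppf(0.9) = 1.28

On the deliberately overconfident model in the example, the recalibrated 90% quantile is noticeably wider than the uncalibrated one, which is the intended correction.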

Installation

The experiments were run with Python 3.9 and the package versions listed in requirements.txt.

The dependencies can be installed with:

pip install -r requirements.txt

Main experiments

The main experiments can be run using:

python run.py name="full" nb_workers=1 repeat_tuning=5 \
        log_base_dir="logs" progress_bar=False \
        save_train_metrics=False save_val_metrics=False remove_checkpoints=True \
        selected_dataset_groups=["uci","oml_297","oml_299","oml_269"] \
        tuning_type="regul_and_posthoc"

Note that if a run is interrupted, running the same command again will skip the models that have already been computed. The nb_workers parameter allows multiple experiments to run in parallel.
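
For instance, a smaller parallel run restricted to the UCI datasets might look like the following (the flag values are illustrative; only the flags appearing in the command above are assumed to be supported):

python run.py name="full" nb_workers=4 repeat_tuning=1 \
        log_base_dir="logs" progress_bar=False \
        save_train_metrics=False save_val_metrics=False remove_checkpoints=True \
        selected_dataset_groups=["uci"] \
        tuning_type="regul_and_posthoc"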

Then, the corresponding figures can be created in the notebook main_figures.ipynb.

Experiments related to tuning

Experiments related to tuning can be run using:

python run.py name="hparam_tuning" nb_workers=1 repeat_tuning=5 \
        log_base_dir="logs" progress_bar=False \
        save_train_metrics=False save_val_metrics=False remove_checkpoints=True \
        selected_dataset_groups=["uci","oml_297","oml_299","oml_269"] \
        tuning_type="hparam_tuning"

Then, the corresponding figures can be created in the notebook hparam_figures.ipynb.

Cite

Please cite our paper if you use this code in your own work:

@inproceedings{DheurICML2023,
  title     = {A Large-Scale Study of Probabilistic Calibration in Neural Network Regression},
  author    = {Dheur, Victor and Ben Taieb, Souhaib},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  year      = {2023},
}
