ensta-u2is-ai / torch-uncertainty

Open-source framework for uncertainty and deep learning models in PyTorch 🌱

Home Page: https://torch-uncertainty.github.io

License: Apache License 2.0

Python 100.00%
ensembles uncertainty-quantification predictive-uncertainty uncertainty reliable-ai pytorch bayesian-network neural-networks trustworthy-machine-learning mixup computer-vision

torch-uncertainty's Introduction


TorchUncertainty is a package designed to help you leverage uncertainty quantification techniques and make your deep neural networks more reliable. It aims to be collaborative and to include as many methods as possible, so reach out to add yours!

🚧 TorchUncertainty is in early development 🚧 - expect changes, but reach out and contribute if you are interested in the project! Please raise an issue if you encounter any bugs or difficulties, and join the Discord server.

📚 Our webpage and documentation are available here: torch-uncertainty.github.io. 📚

TorchUncertainty contains the official implementations of multiple papers from major machine-learning and computer vision conferences and was/will be featured in tutorials at WACV 2024, HAICON 2024 and ECCV 2024.


This package provides a multi-level API, including:

  • easy-to-use ⚡ Lightning uncertainty-aware training & evaluation routines for 4 tasks: classification, probabilistic and pointwise regression, and segmentation
  • ready-to-train baselines on research datasets, such as ImageNet and CIFAR
  • pretrained weights for these baselines on ImageNet and CIFAR (🚧 work in progress 🚧)
  • layers, models, metrics, & losses available for use in your networks
  • scikit-learn-style post-processing methods, such as temperature scaling

Have a look at the Reference page or the API reference for a more exhaustive list of the implemented methods, datasets, metrics, etc.
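To give a flavor of the scikit-learn-style post-processing mentioned above, here is a minimal temperature-scaling sketch. This is an illustration only, not TorchUncertainty's actual API: the `TemperatureScaler` class and its `fit`/`transform` names are hypothetical.

```python
import torch
from torch import nn


class TemperatureScaler(nn.Module):
    """Hypothetical minimal temperature scaler: fit() learns a single
    temperature T on held-out logits and labels, transform() rescales
    logits by 1/T. The temperature is parameterized in log-space so it
    stays positive during optimization."""

    def __init__(self) -> None:
        super().__init__()
        self.log_temperature = nn.Parameter(torch.zeros(1))

    def fit(self, logits: torch.Tensor, labels: torch.Tensor) -> "TemperatureScaler":
        optimizer = torch.optim.LBFGS([self.log_temperature], lr=0.1, max_iter=50)
        nll = nn.CrossEntropyLoss()

        def closure():
            optimizer.zero_grad()
            loss = nll(logits / self.log_temperature.exp(), labels)
            loss.backward()
            return loss

        optimizer.step(closure)
        return self

    @torch.no_grad()
    def transform(self, logits: torch.Tensor) -> torch.Tensor:
        return logits / self.log_temperature.exp()
```

TorchUncertainty's own post-processing classes may expose a different interface; see the API reference for the real signatures.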

βš™οΈ Installation

TorchUncertainty requires Python 3.10 or greater. Install the desired PyTorch version in your environment. Then, install the package from PyPI:

pip install torch-uncertainty

The installation procedure for contributors is different: have a look at the contribution page.

🐎 Quickstart

We make a quickstart available at torch-uncertainty.github.io/quickstart.

📚 Implemented methods

TorchUncertainty currently supports classification, probabilistic and pointwise regression, segmentation and pixelwise regression (such as monocular depth estimation). It includes the official codes of the following papers:

  • LP-BNN: Encoding the latent posterior of Bayesian Neural Networks for uncertainty quantification - IEEE TPAMI
  • Packed-Ensembles for Efficient Uncertainty Estimation - ICLR 2023 - Tutorial
  • MUAD: Multiple Uncertainties for Autonomous Driving, a benchmark for multiple uncertainty types and tasks - BMVC 2022

We also provide the following methods:

Baselines

To date, the following deep learning baselines have been implemented. Click 📥 on a method for its tutorial:

Augmentation methods

The following data augmentation methods have been implemented:

  • Mixup, MixupIO, RegMixup, WarpingMixup

Post-processing methods

To date, the following post-processing methods have been implemented:

Tutorials

Check out our tutorials at torch-uncertainty.github.io/auto_tutorials.

🔭 Projects using TorchUncertainty

The following projects use TorchUncertainty:

  • A Symmetry-Aware Exploration of Bayesian Neural Network Posteriors - ICLR 2024

If you are using TorchUncertainty in your project, please let us know, we will add your project to this list!

torch-uncertainty's People

Contributors

alafage · badrmarani · dependabot[bot] · gaetanbrison · nshdesai · o-laurent · pomonam · qbouniot · xuanlongorz


torch-uncertainty's Issues

✨ Enable custom distributions for probabilistic regression

The regression routines should allow the user to choose the distribution of their liking. For now, we cannot distinguish Laplace from Gaussian distributions, as we rely on the number of predicted parameters to select the distribution.

  • Implement at least the following distributions: Dirac (standard), Gaussian, Laplace, NIG

A potential solution was highlighted in this discussion:

"""
I had the same kind of problem in another project of mine; I would suggest creating a function like this:

def get_distribution(dist_name: str, dist_params: Tensor) -> Distribution:
    ...

I think it could be interesting to use the Distribution class from PyTorch, but it might be too much. A dictionary with the parameters should do nicely too.
"""

Originally posted by @alafage in #46 (comment)
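The suggestion above could look roughly like the following sketch. This is not TorchUncertainty's API: the parameter-dictionary keys (`"loc"`, `"scale"`) and the set of supported names are assumptions for illustration.

```python
import torch
from torch import Tensor
from torch.distributions import Distribution, Laplace, Normal


def get_distribution(dist_name: str, dist_params: dict[str, Tensor]) -> Distribution:
    """Hypothetical factory mapping a distribution name and a parameter
    dictionary to a torch.distributions.Distribution, as suggested in
    the discussion quoted above."""
    if dist_name == "normal":
        return Normal(loc=dist_params["loc"], scale=dist_params["scale"])
    if dist_name == "laplace":
        return Laplace(loc=dist_params["loc"], scale=dist_params["scale"])
    raise NotImplementedError(f"Unsupported distribution: {dist_name}")
```

With such a factory, the routine no longer has to infer the distribution from the number of predicted parameters, which is what currently prevents distinguishing Laplace from Gaussian.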

Fine-tuning techniques

Could you add a tutorial on adding uncertainty estimation using only fine-tuning of pretrained checkpoints?

Pretrained parameters

Hi, thanks for your excellent work and for sharing the code. I have a small question: if I apply Packed-Ensembles, the ImageNet-pretrained parameters of a ResNet cannot be used, right?

🐛 Calibration plot error

In one of my latest trainings, I hit the following error in plotting_utils.py, line 95:

val_oh = torch.nn.functional.one_hot(val.long(), num_classes=10)
RuntimeError: Class values must be non-negative.

I don't have much more information about this, but we should check which cases lead to such an error.
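For reference, `torch.nn.functional.one_hot` does reject negative class indices, so a defensive check before the call above would surface the offending values. A sketch (the `safe_one_hot` helper is hypothetical, assuming `val` holds class indices):

```python
import torch


def safe_one_hot(val: torch.Tensor, num_classes: int) -> torch.Tensor:
    """Raise an informative error instead of the opaque RuntimeError when
    class indices fall outside [0, num_classes)."""
    val = val.long()
    if val.min() < 0 or val.max() >= num_classes:
        raise ValueError(
            f"Class indices must lie in [0, {num_classes}), "
            f"got range [{val.min().item()}, {val.max().item()}]"
        )
    return torch.nn.functional.one_hot(val, num_classes=num_classes)
```

Logging the offending range this way would make it easier to track down which inputs trigger the error.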

Update pip version

Hello, thank you for your open-source code; it has been a great inspiration and help for my research. One small question: PackedLinear and PackedConv2d in packed_layers.py don't seem to match the latest GitHub repository, and the associated folders don't seem to be updated after a pip installation. For example, PackedLinear in the pip-installed source has no alpha parameter. Could you update the PyPI release to make pip installations easier to use?

🐛 Issue concerning the interaction of MC-Dropout and the other methods

I'm running the following command on CIFAR-10:

python3 resnet.py --version mimo --arch 18 --accelerator gpu --device 1 --benchmark True --max_epochs 75 --precision 16 --root "./data/" --num_estimators 4

but it raises RuntimeError: The size of tensor a (512) must match the size of tensor b (128) at non-singleton dimension 0. I dove into the code and found that it may be related to this line; should we remove it? @o-laurent, could you help with this?

Edit by maintainer:

  • Either add an "mc_dropout" parameter to the CLI or to the models (we could then combine dropout during training with MIMO or other methods, especially for WideResNets)

Add Deep Evidential Regression

Now that we support regression, it would be interesting to implement simple methods, such as Deep Evidential Regression (DER).

This method would be relatively straightforward to implement. Here is a possible roadmap:

  • Add Deep Evidential Regression's loss
  • Change the "dist_estimation" parameter of the regression routine to an integer choosing the number of distribution parameters to predict
  • Add tests
  • Add a tutorial to explain how to use DER
  • Add the reference to the documentation
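The loss from the first roadmap item could be sketched as below, following the Normal-Inverse-Gamma negative log-likelihood and evidence regularizer from the DER paper (Amini et al., 2020). This is a sketch under that reading of the paper, not the final implementation; the regularizer weight `coeff` is a free hyperparameter.

```python
import math

import torch


def der_loss(y, gamma, nu, alpha, beta, coeff=0.01):
    """Sketch of the Deep Evidential Regression loss: the Student-t
    negative log-likelihood induced by the Normal-Inverse-Gamma
    evidential distribution, plus an evidence regularizer scaled by
    the absolute prediction error."""
    omega = 2.0 * beta * (1.0 + nu)
    nll = (
        0.5 * torch.log(math.pi / nu)
        - alpha * torch.log(omega)
        + (alpha + 0.5) * torch.log(nu * (y - gamma) ** 2 + omega)
        + torch.lgamma(alpha)
        - torch.lgamma(alpha + 0.5)
    )
    reg = torch.abs(y - gamma) * (2.0 * nu + alpha)
    return (nll + coeff * reg).mean()
```

The network would predict the four NIG parameters (gamma, nu, alpha, beta) per target, which ties into the roadmap item about replacing "dist_estimation" with the number of predicted parameters.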

Pytorch Lightning Model Question

Hi!

I have built my model using PyTorch Lightning, so it has the functions training_step, validation_step, etc. I attempted to follow the tutorial here: https://torch-uncertainty.github.io/auto_tutorials/tutorial_der_cubic.html#gathering-everything-and-training-the-model

But it fails with NotImplementedError: Module [CMPNNModel] is missing the required "forward" function (which I guess may be obvious). So does this mean that to use this package I need to change my model from a PyTorch Lightning one to a plain torch one? Or have I done something incorrectly?

Thank you!

🐛 About the learning rate

Hello, after studying your paper, which has been very inspiring for my work, I still have some questions I would like to ask you.

Regarding the learning rate during training: in your code, the loss is the average of the losses of all experts. With four experts, each expert's loss is effectively divided by four, so the gradients computed during backpropagation are divided by four as well. Should we then set an initial learning rate four times larger than for a single model?
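The effect described in the question can be checked directly, independently of the library: averaging the per-expert losses scales each expert's gradient by 1/N compared to optimizing that expert alone. A toy check:

```python
import torch

# Two independent "experts", each reduced to a single parameter w_i with
# individual loss w_i ** 2 (so each expert's standalone gradient is 2 * w_i).
num_experts = 2
w = torch.ones(num_experts, requires_grad=True)

# Average of the per-expert losses, as described in the question.
loss = (w ** 2).sum() / num_experts
loss.backward()

# Each expert's gradient is 2 * w_i / num_experts instead of 2 * w_i.
grad_avg = w.grad.clone()
```

Whether to compensate with a larger learning rate then depends on the optimizer: with plain SGD, multiplying the learning rate by N exactly undoes the 1/N factor, while adaptive optimizers such as Adam largely normalize the gradient scale away.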

✨ Add an MC-Dropout baseline

I added dropout to the standard ResNet in de31452. It would be interesting to add an MC-Dropout ensemble baseline. To do this, the following steps could be considered:

  • Create a new baseline in torch-uncertainty/torch_uncertainty/baselines/classification/resnet.py in cls.ensemble
  • Find a solution to apply dropout in eval mode (potentially change nn.Dropout to F.dropout and replace self.training by True when the version is MC-Dropout)
  • Add tests
  • Add a tutorial to explain how to use the baseline
  • Add the reference to the documentation
  • Add a parameter to enable last-layer dropout
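One way to address the "dropout in eval mode" bullet without swapping nn.Dropout for F.dropout is to flip only the dropout modules back to train mode after model.eval(). A sketch (the `enable_mc_dropout` helper is hypothetical, not necessarily the approach the baseline will adopt):

```python
import torch
from torch import nn


def enable_mc_dropout(model: nn.Module) -> None:
    """Put every dropout module in train mode so it keeps sampling masks
    at inference time, while the rest of the model (e.g. batch norm)
    stays in eval mode."""
    for module in model.modules():
        if isinstance(module, (nn.Dropout, nn.Dropout2d, nn.Dropout3d)):
            module.train()
```

Calling `model.eval()` followed by `enable_mc_dropout(model)` then lets repeated forward passes produce the stochastic predictions that MC-Dropout averages over.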

Add support for CIFAR-N

More information on this dataset is available on PapersWithCode.

This dataset is straightforward to implement. Here is a possible roadmap:

  • Add the dataset to the datasets folder, taking inspiration from CIFAR-H and CIFAR-C, for instance
  • Check that the dataset is downloaded correctly & automatically
  • Add these datasets to the testing alternatives of the CIFAR datamodule
  • Add the reference to the documentation
  • Add torch-uncertainty in PapersWithCode's dataloaders
