
gpleiss / Temperature_scaling

License: MIT
A simple way to calibrate your neural network.


Projects that are alternatives to or similar to Temperature scaling

foosball
Analysis of table soccer games based on video recordings
Stars: ✭ 22 (-95.72%)
Mutual labels:  calibration
extrinsic calibration
Motion Based Multi-Sensor Extrinsic Calibration
Stars: ✭ 49 (-90.47%)
Mutual labels:  calibration
Ddcutil
Query and change Linux monitor settings using DDC/CI and USB
Stars: ✭ 351 (-31.71%)
Mutual labels:  calibration
RGBDCameraExtrinsicCalibration
RGB-D extrinsic parameter automatic calibration based on plane detection
Stars: ✭ 32 (-93.77%)
Mutual labels:  calibration
dvbcss-synctiming
Measuring synchronisation timing accuracy for DVB Companion Screen Synchronisation TVs and Companions
Stars: ✭ 17 (-96.69%)
Mutual labels:  calibration
plenopticam
Light-field imaging application for plenoptic cameras
Stars: ✭ 111 (-78.4%)
Mutual labels:  calibration
libcalib
calibrate stereo cameras
Stars: ✭ 37 (-92.8%)
Mutual labels:  calibration
Imucalibration Gesture
calibration for Imu and show gesture
Stars: ✭ 382 (-25.68%)
Mutual labels:  calibration
Dell-S2716DGR-Calibration-Guide
Calibration guide for the Dell S2716DG and S2716DGR to get the best picture quality and colors
Stars: ✭ 33 (-93.58%)
Mutual labels:  calibration
Kalibr allan
IMU Allan standard deviation charts for use with Kalibr and inertial kalman filters.
Stars: ✭ 344 (-33.07%)
Mutual labels:  calibration
MetaMorpheus
Proteomics search software with integrated calibration, PTM discovery, bottom-up, top-down and LFQ capabilities
Stars: ✭ 59 (-88.52%)
Mutual labels:  calibration
remote-calibrator
Measure screen size, track viewing distance and gaze, and more!
Stars: ✭ 12 (-97.67%)
Mutual labels:  calibration
oomact
Object Oriented Modular Abstract Calibration Toolbox
Stars: ✭ 21 (-95.91%)
Mutual labels:  calibration
hexrd
A cross-platform, open-source library for the analysis of X-ray diffraction data.
Stars: ✭ 30 (-94.16%)
Mutual labels:  calibration
Easy handeye
Automated, hardware-independent Hand-Eye Calibration
Stars: ✭ 355 (-30.93%)
Mutual labels:  calibration
python-dts-calibration
A Python package to load raw Distributed Temperature Sensing (DTS) files, perform a calibration, and plot the result.
Stars: ✭ 21 (-95.91%)
Mutual labels:  calibration
uncertainty-calibration
A collection of research and application papers of (uncertainty) calibration techniques.
Stars: ✭ 120 (-76.65%)
Mutual labels:  calibration
Autoware.ai
Open-source software for self-driving vehicles
Stars: ✭ 5,044 (+881.32%)
Mutual labels:  calibration
Handeye calib camodocal
Easy to use and accurate hand eye calibration which has been working reliably for years (2016-present) with kinect, kinectv2, rgbd cameras, optical trackers, and several robots including the ur5 and kuka iiwa.
Stars: ✭ 364 (-29.18%)
Mutual labels:  calibration
Outlier Exposure
Deep Anomaly Detection with Outlier Exposure (ICLR 2019)
Stars: ✭ 343 (-33.27%)
Mutual labels:  calibration

Temperature Scaling

A simple way to calibrate your neural network. The temperature_scaling.py module can easily be used to calibrate any trained model.

Based on results from On Calibration of Modern Neural Networks.

Motivation

TLDR: Neural networks tend to output overconfident probabilities. Temperature scaling is a post-processing method that corrects this.

Long:

Neural networks output "confidence" scores along with predictions in classification. Ideally, these confidence scores should match the true correctness likelihood. For example, if we assign 80% confidence to 100 predictions, then we'd expect that 80% of the predictions are actually correct. If this is the case, we say the network is calibrated.

A simple way to visualize calibration is to plot accuracy as a function of confidence. Since confidence should reflect accuracy, we'd like the plot to follow the identity line. If accuracy falls below the diagonal, the network is overconfident. This is the case for most modern neural networks, such as this ResNet trained on CIFAR100.
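This confidence–accuracy gap can also be summarized as a single number, the Expected Calibration Error (ECE): bin predictions by confidence and average the gap between accuracy and confidence across bins. A self-contained numpy sketch (the binning scheme and toy numbers below are illustrative, not the repo's implementation):

```python
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=15):
    """Bin predictions by confidence and compare average confidence
    to accuracy in each bin (a numeric form of the reliability plot)."""
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            acc = correct[mask].mean()          # accuracy in this bin
            conf = confidences[mask].mean()     # average confidence in this bin
            ece += mask.mean() * abs(acc - conf)
    return ece

# Overconfident toy model: 90% confidence, but only 60% of predictions correct
conf = np.full(100, 0.9)
correct = np.array([1] * 60 + [0] * 40)
print(expected_calibration_error(conf, correct))  # ~0.30, the confidence-accuracy gap
```

A perfectly calibrated model would score an ECE of 0.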

[Figure: reliability diagram of the uncalibrated ResNet]

Temperature scaling is a post-processing technique to make neural networks calibrated. After temperature scaling, you can trust the probabilities output by a neural network:

[Figure: reliability diagram of the same ResNet after temperature scaling]

Temperature scaling divides the logits (inputs to the softmax function) by a learned scalar parameter. I.e.

softmax_i = e^(z_i / T) / sum_j e^(z_j / T)

where z_i is the logit for class i, and T is the learned temperature parameter. T is chosen to minimize NLL on a validation set.
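In code, this is just a division of the logits before exponentiating. A minimal numpy sketch (the logit values are illustrative); note that T > 1 softens the distribution without changing the predicted class:

```python
import numpy as np

def softmax(z, T=1.0):
    # softmax_i = exp(z_i / T) / sum_j exp(z_j / T)
    z = np.asarray(z, dtype=float) / T
    z = z - z.max()                 # stabilize against overflow
    e = np.exp(z)
    return e / e.sum()

logits = np.array([4.0, 1.0, 0.0])
print(softmax(logits, T=1.0))  # sharp: top probability ~0.94
print(softmax(logits, T=2.0))  # softened: top probability ~0.74, same argmax
```

Because every logit is divided by the same positive scalar, the ranking of classes (and hence accuracy) is unchanged; only the confidence changes.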

Demo

First train a DenseNet on CIFAR100, and save the validation indices:

python train.py --data <path_to_data> --save <save_folder_dest>

Then temperature scale it:

python demo.py --data <path_to_data> --save <save_folder_dest>

To use in a project

Copy the file temperature_scaling.py to your repo. Train a model, and save the validation set. (You must use the same validation set for temperature scaling that was used during training.) Then you can do something like this:

from temperature_scaling import ModelWithTemperature

orig_model = ... # create an uncalibrated model somehow
valid_loader = ... # Create a DataLoader from the SAME VALIDATION SET used to train orig_model

scaled_model = ModelWithTemperature(orig_model)
scaled_model.set_temperature(valid_loader)
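Under the hood, set_temperature fits the single scalar T by minimizing NLL on the validation set. The same optimization can be sketched in plain numpy (the toy data and grid search below are illustrative; the repo optimizes T with a gradient-based optimizer over the real validation logits):

```python
import numpy as np

def nll(logits, labels, T):
    """Average negative log-likelihood of the true labels at temperature T."""
    z = logits / T
    z = z - z.max(axis=1, keepdims=True)          # numerical stability
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()

# Toy "validation set": the model always predicts class 0 with a large
# margin (confidence ~0.98), but it is only right 75% of the time.
logits = np.tile([4.0, 0.0], (100, 1))
labels = np.array([0] * 75 + [1] * 25)

# Temperature scaling solves T* = argmin_T NLL(logits / T)
grid = np.linspace(0.5, 8.0, 751)
T_star = grid[np.argmin([nll(logits, labels, T) for T in grid])]
print(T_star)  # ~3.64: shrinks confidence from ~0.98 toward the true 0.75
```

Because the optimal T here is greater than 1, the calibrated confidences are softer than the raw ones, matching the 75% empirical accuracy.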