
Commit af5298f

DOC: Update The conformity (or "calibration") set page, with an emphasis on MAPIE methods rather than the internal works (#625)
1 parent 7cc0efa


4 files changed: +16 -15 lines changed


doc/images/cp_cross.png (5.21 KB)

doc/images/cp_prefit.png (6.63 KB)

doc/images/cp_split.png (15.9 KB)

doc/split_cross_conformal.rst

Lines changed: 16 additions & 15 deletions
@@ -1,45 +1,46 @@
 ################################################################
-The calibration (or "conformity") set
+The conformity (or "calibration") set
 ################################################################
 
-**MAPIE** is based on two types of techniques:
+**MAPIE** is based on two types of techniques for measuring uncertainty in regression and classification:
 
 - the split-conformal predictions,
 - the cross-conformal predictions.
 
-In all cases, the training/calibration process can be broken down as follows:
+In all cases, the training/conformalization process can be broken down as follows:
 
-- Identify a basic model (or pre-trained model).
-- Wrap it with the MAPIE class.
-- Fit new model to calibration data (or full data if cross-validation) to estimate conformity scores.
-- Predict target on test data to obtain prediction intervals/sets based on conformity scores.
+- Train a model using the training set (or full dataset if cross-conformal).
+- Estimate conformity scores using the conformity set (or full dataset if cross-conformal).
+- Predict target on test data to obtain prediction intervals/sets based on these conformity scores.
 
 
 1. Split conformal predictions
 ==============================
 
-- Construction of a conformity score.
-- Calibration of the conformity score on a calibration set not seen by the model during training.
+- Compute conformity scores ("conformalization") on a conformity set not seen by the model during training.
 
-**MAPIE** then uses the calibrated conformity scores to estimate sets associated with the desired coverage on new data with strong theoretical guarantees.
+**MAPIE** then uses the conformity scores to estimate sets associated with the desired coverage on new data with strong theoretical guarantees.
 
-.. image:: images/cp_split.png
+Split conformal predictions with a pre-trained model
+------------------------------------------------------------------------------------
+
+.. image:: images/cp_prefit.png
    :width: 600
    :align: center
 
 
-Prefit mode of split conformal predictions
-------------------------------------------
+Split conformal predictions with an untrained model
+------------------------------------------------------------------------------------
 
-.. image:: images/cp_prefit.png
+.. image:: images/cp_split.png
    :width: 600
    :align: center
 
 
 2. Cross conformal predictions
 ==============================
 
-- Conformity scores on the whole training set obtained by cross-validation,
+- Conformity scores on the whole dataset obtained by cross-validation,
 - Perturbed models generated during the cross-validation.
 
 **MAPIE** then combines all these elements in a way that provides prediction intervals on new data with strong theoretical guarantees.
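To make the train/conformalize/predict workflow in the updated page concrete, here is a minimal sketch of split conformal prediction with a pre-trained model. It is not part of the commit and assumes the scikit-learn-style MapieRegressor API with cv="prefit"; class and parameter names may differ across MAPIE versions.

    from sklearn.datasets import make_regression
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split
    from mapie.regression import MapieRegressor

    X, y = make_regression(n_samples=1000, n_features=5, noise=10, random_state=42)
    X_train, X_rest, y_train, y_rest = train_test_split(X, y, random_state=42)
    X_conf, X_test, y_conf, y_test = train_test_split(X_rest, y_rest, random_state=42)

    # 1. Train a model on the training set (here, outside of MAPIE).
    model = RandomForestRegressor(random_state=42).fit(X_train, y_train)

    # 2. Estimate conformity scores on the conformity set; cv="prefit" tells
    #    MAPIE not to refit the wrapped estimator.
    mapie = MapieRegressor(estimator=model, cv="prefit")
    mapie.fit(X_conf, y_conf)

    # 3. Predict on new data to get point predictions and 90% prediction intervals.
    y_pred, y_intervals = mapie.predict(X_test, alpha=0.1)

For the "untrained model" variant described in the page, an unfitted estimator can be passed instead so that MAPIE performs the training/conformity split itself (for example via a cv="split" option in recent releases); this is an assumption about the API rather than something shown in the commit.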

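Similarly, a hedged sketch of cross conformal prediction, again assuming the MapieRegressor API: MAPIE fits perturbed models by cross-validation and uses conformity scores computed on the whole dataset, as the updated page describes.

    from sklearn.datasets import make_regression
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import train_test_split
    from mapie.regression import MapieRegressor

    X, y = make_regression(n_samples=1000, n_features=5, noise=10, random_state=42)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

    # cv=5 fits one model per fold ("perturbed models") and computes conformity
    # scores on the held-out folds, i.e. on the whole dataset.
    mapie = MapieRegressor(estimator=LinearRegression(), method="plus", cv=5)
    mapie.fit(X_train, y_train)

    # method="plus" (CV+) aggregates the fold models into intervals at 90% target coverage.
    y_pred, y_intervals = mapie.predict(X_test, alpha=0.1)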