A more sophisticated approach looks at the distribution of predictions, and makes an informed trade-off between true positive (in this context also known as recall and hit rate), and accuracy (i.e., the false positive rate).
https://en.wikipedia.org/wiki/Sensitivity_and_specificity
I think you mean specificity, not accuracy. Accuracy is not the false positive rate (accuracy = (TP + TN) / (TP + TN + FP + FN)). The trade-off when selecting a threshold is between the TPR (sensitivity) and the TNR (specificity).
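For concreteness, here is a minimal sketch (assuming a binary classifier that outputs scores; the data and the `rates_at_threshold` helper are purely illustrative, not part of the library) showing that accuracy and the false positive rate are different quantities, and how the TPR/TNR trade-off shifts with the threshold:

```python
import numpy as np

def rates_at_threshold(y_true, scores, threshold):
    """Compute TPR, TNR, FPR, and accuracy for a given decision threshold."""
    y_pred = (scores >= threshold).astype(int)
    tp = np.sum((y_pred == 1) & (y_true == 1))
    tn = np.sum((y_pred == 0) & (y_true == 0))
    fp = np.sum((y_pred == 1) & (y_true == 0))
    fn = np.sum((y_pred == 0) & (y_true == 1))
    tpr = tp / (tp + fn)                   # sensitivity / recall / hit rate
    tnr = tn / (tn + fp)                   # specificity
    fpr = fp / (fp + tn)                   # = 1 - specificity, not accuracy
    acc = (tp + tn) / (tp + tn + fp + fn)  # accuracy uses all four cells
    return tpr, tnr, fpr, acc

# Illustrative labels and scores; sweeping the threshold shows the trade-off.
y_true = np.array([0, 0, 0, 1, 1, 1, 1, 0, 1, 0])
scores = np.array([0.1, 0.4, 0.35, 0.8, 0.65, 0.9, 0.5, 0.3, 0.7, 0.6])
for t in (0.3, 0.5, 0.7):
    tpr, tnr, fpr, acc = rates_at_threshold(y_true, scores, t)
    print(f"threshold={t:.1f}  TPR={tpr:.2f}  TNR={tnr:.2f}  FPR={fpr:.2f}  accuracy={acc:.2f}")
```

Raising the threshold trades TPR for TNR (and vice versa), while accuracy can stay flat or move independently, which is why it should not be conflated with the false positive rate.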