Performance Metrics

Calculate classification metrics from confusion matrix values.

Confusion Matrix

Inputs: 105 total samples, of which 55 are actual positives and 50 are actual negatives; the model predicts 60 positives. The cell counts consistent with these totals and the metrics below are:

|          | Predicted + | Predicted - | Total |
|----------|-------------|-------------|-------|
| Actual + | TP = 50     | FN = 5      | 55    |
| Actual - | FP = 10     | TN = 40     | 50    |
| Total    | 60          | 45          | 105   |
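The marginal totals follow directly from the four cell counts. A quick consistency check (variable names are illustrative):

```python
# Cell counts from the confusion matrix:
# 55 actual positives split 50/5, 50 actual negatives split 10/40
tp, fn = 50, 5
fp, tn = 10, 40

total = tp + fn + fp + tn        # all samples
actual_positive = tp + fn        # row total for the positive class
actual_negative = fp + tn        # row total for the negative class
predicted_positive = tp + fp     # column total for positive predictions
```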

Key Metrics

Accuracy = (TP + TN) / Total = 85.71%
Error Rate = (FP + FN) / Total = 14.29%
Precision = TP / (TP + FP) = 83.33%
Recall (TPR) = TP / (TP + FN) = 90.91%
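The four headline metrics can be computed directly from the raw counts. A minimal sketch (the function name is illustrative):

```python
def key_metrics(tp, tn, fp, fn):
    """Headline metrics from raw confusion-matrix counts."""
    total = tp + tn + fp + fn
    return {
        "accuracy": (tp + tn) / total,    # overall correctness
        "error_rate": (fp + fn) / total,  # overall error
        "precision": tp / (tp + fp),      # positive predictive value
        "recall": tp / (tp + fn),         # true positive rate
    }

m = key_metrics(tp=50, tn=40, fp=10, fn=5)
```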

F-Scores

F1 Score: 86.96%
F0.5 Score: 84.75%
F2 Score: 89.29%
Fβ (β = 1): 86.96%

β controls the precision/recall trade-off: β < 1 weights precision more heavily, β > 1 weights recall, and β = 1 balances the two (the F1 score).
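All three F-scores are instances of the general Fβ formula, so a single function covers them. A minimal sketch:

```python
def f_beta(precision, recall, beta=1.0):
    """General F-beta score: (1 + b^2) * P * R / (b^2 * P + R)."""
    b2 = beta * beta
    return (1 + b2) * precision * recall / (b2 * precision + recall)

p = 50 / 60   # precision from the confusion matrix above
r = 50 / 55   # recall

f1 = f_beta(p, r)            # harmonic mean of P and R
f_half = f_beta(p, r, 0.5)   # precision-weighted
f2 = f_beta(p, r, 2.0)       # recall-weighted
```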

All Metrics

| Metric | Value | Percentage | Formula | Description |
|---|---|---|---|---|
| Accuracy | 0.8571 | 85.71% | (TP + TN) / Total | Overall correctness |
| Error Rate | 0.1429 | 14.29% | (FP + FN) / Total | Overall error rate |
| Precision | 0.8333 | 83.33% | TP / (TP + FP) | Positive predictive value |
| Recall (TPR) | 0.9091 | 90.91% | TP / (TP + FN) | Sensitivity / true positive rate |
| Specificity (TNR) | 0.8000 | 80.00% | TN / (TN + FP) | True negative rate |
| FPR | 0.2000 | 20.00% | FP / (FP + TN) | False positive rate |
| FNR | 0.0909 | 9.09% | FN / (FN + TP) | False negative rate |
| F1 Score | 0.8696 | 86.96% | 2·(P·R) / (P + R) | Harmonic mean of P and R |
| F0.5 Score | 0.8475 | 84.75% | 1.25·(P·R) / (0.25·P + R) | Precision-weighted F |
| F2 Score | 0.8929 | 89.29% | 5·(P·R) / (4·P + R) | Recall-weighted F |
| Fβ (β = 1) | 0.8696 | 86.96% | (1 + β²)·(P·R) / (β²·P + R) | Custom-β F-score |
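The rate metrics in the table are complements of one another: FPR = 1 − specificity and FNR = 1 − recall. A small sketch that computes them from raw counts (function name is illustrative):

```python
def error_rates(tp, tn, fp, fn):
    """Specificity and the two error rates from raw confusion-matrix counts."""
    return {
        "specificity": tn / (tn + fp),  # TNR
        "fpr": fp / (fp + tn),          # Type I error rate, = 1 - specificity
        "fnr": fn / (fn + tp),          # Type II error rate, = 1 - recall
    }

r = error_rates(tp=50, tn=40, fp=10, fn=5)
```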

Definitions

True Positive (TP)

Model correctly predicts positive class

True Negative (TN)

Model correctly predicts negative class

False Positive (FP)

Model incorrectly predicts positive (Type I Error)

False Negative (FN)

Model incorrectly predicts negative (Type II Error)
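These four definitions amount to a pairwise comparison of true and predicted labels. A minimal sketch of the tally (the function name and sample labels are illustrative):

```python
def confusion_counts(y_true, y_pred, positive=1):
    """Tally TP/TN/FP/FN by comparing true and predicted labels pairwise."""
    tp = tn = fp = fn = 0
    for t, p in zip(y_true, y_pred):
        if t == positive and p == positive:
            tp += 1   # correctly predicted positive
        elif t != positive and p != positive:
            tn += 1   # correctly predicted negative
        elif p == positive:
            fp += 1   # incorrectly predicted positive (Type I error)
        else:
            fn += 1   # incorrectly predicted negative (Type II error)
    return tp, tn, fp, fn

counts = confusion_counts([1, 1, 1, 0, 0], [1, 1, 0, 0, 1])
```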