Calculate classification metrics from the four confusion matrix values: true positives (TP), true negatives (TN), false positives (FP), and false negatives (FN).
| Metric | Value | Percentage | Formula | Description |
|---|---|---|---|---|
| Accuracy | 0.8571 | 85.71% | (TP + TN) / Total | Overall correctness |
| Error Rate | 0.1429 | 14.29% | (FP + FN) / Total | Overall error |
| Precision | 0.8333 | 83.33% | TP / (TP + FP) | Positive predictive value |
| Recall (TPR) | 0.9091 | 90.91% | TP / (TP + FN) | Sensitivity / True Positive Rate |
| Specificity (TNR) | 0.8000 | 80.00% | TN / (TN + FP) | True Negative Rate |
| FPR | 0.2000 | 20.00% | FP / (FP + TN) | False Positive Rate |
| FNR | 0.0909 | 9.09% | FN / (FN + TP) | False Negative Rate |
| F1 Score | 0.8696 | 86.96% | 2·(P·R)/(P+R) | Harmonic mean of P & R |
| F0.5 Score | 0.8475 | 84.75% | 1.25·(P·R)/(0.25P+R) | Precision-weighted F |
| F2 Score | 0.8929 | 89.29% | 5·(P·R)/(4P+R) | Recall-weighted F |
| Fβ (β=1) | 0.8696 | 86.96% | (1+β²)·(P·R)/(β²P+R) | Custom β F-score |
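The formulas above can be sketched as a single function. This is a minimal illustration, not the tool's actual implementation; the function name and the sample counts (TP=10, TN=8, FP=2, FN=1, one set of values consistent with the table's ratios) are assumptions.

```python
def classification_metrics(tp, tn, fp, fn, beta=1.0):
    """Compute standard classification metrics from confusion matrix counts.

    Hypothetical helper; counts below are assumed example values.
    """
    total = tp + tn + fp + fn
    precision = tp / (tp + fp)   # positive predictive value
    recall = tp / (tp + fn)      # sensitivity / TPR
    b2 = beta ** 2
    return {
        "accuracy": (tp + tn) / total,
        "error_rate": (fp + fn) / total,
        "precision": precision,
        "recall": recall,
        "specificity": tn / (tn + fp),   # TNR
        "fpr": fp / (fp + tn),
        "fnr": fn / (fn + tp),
        "f1": 2 * precision * recall / (precision + recall),
        # General Fβ; β=0.5 weights precision, β=2 weights recall
        "fbeta": (1 + b2) * precision * recall / (b2 * precision + recall),
    }

# Example counts consistent with the table: TP=10, TN=8, FP=2, FN=1
m = classification_metrics(10, 8, 2, 1)
print(round(m["accuracy"], 4))   # 0.8571
print(round(m["f1"], 4))         # 0.8696
```

With `beta=1`, `fbeta` reduces to the F1 score, matching the Fβ (β=1) row above.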
- **TP (True Positive):** Model correctly predicts positive class
- **TN (True Negative):** Model correctly predicts negative class
- **FP (False Positive):** Model incorrectly predicts positive (Type I Error)
- **FN (False Negative):** Model incorrectly predicts negative (Type II Error)
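The four cases above can be tallied directly from paired true/predicted labels. A minimal sketch, assuming binary labels and a hypothetical helper name:

```python
def confusion_counts(y_true, y_pred, positive=1):
    """Tally TP, TN, FP, FN from paired true/predicted labels."""
    tp = tn = fp = fn = 0
    for t, p in zip(y_true, y_pred):
        if p == positive:
            if t == positive:
                tp += 1   # correct positive prediction
            else:
                fp += 1   # Type I error: predicted positive, actually negative
        else:
            if t == positive:
                fn += 1   # Type II error: predicted negative, actually positive
            else:
                tn += 1   # correct negative prediction
    return tp, tn, fp, fn

# Toy example with made-up labels
print(confusion_counts([1, 0, 1, 1, 0], [1, 1, 1, 0, 0]))  # (2, 1, 1, 1)
```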