Accuracy = (TP + TN) / (TP + TN + FP + FN) — overall correctness.
Precision = TP / (TP + FP) — of predicted positives, how many are actually positive.
Recall / Sensitivity / TPR = TP / (TP + FN) — of actual positives, how many were caught.
Specificity / TNR = TN / (TN + FP) — of actual negatives, how many were correctly rejected.
F1 Score = 2 · (Precision · Recall) / (Precision + Recall) — harmonic mean of precision and recall.
MCC (Matthews correlation coefficient) = (TP · TN − FP · FN) / √((TP + FP)(TP + FN)(TN + FP)(TN + FN)) — balanced measure that works well even with class imbalance; ranges from −1 to +1.
Balanced Accuracy = (TPR + TNR) / 2 — useful when classes are imbalanced.
Prevalence = (TP + FN) / total — fraction of actual positives in the data.
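The definitions above can be sketched as a small helper that derives every metric from the four confusion-matrix counts. This is a minimal illustration in plain Python; the function name and the example counts (TP=40, TN=45, FP=5, FN=10) are made up for the demo, and degenerate inputs (any zero denominator, e.g. no predicted positives) are not guarded against.

```python
from math import sqrt

def classification_metrics(tp, tn, fp, fn):
    """Derive all of the listed metrics from raw confusion-matrix counts."""
    total = tp + tn + fp + fn
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)        # sensitivity / TPR
    specificity = tn / (tn + fp)   # TNR
    return {
        "accuracy": (tp + tn) / total,
        "precision": precision,
        "recall": recall,
        "specificity": specificity,
        "f1": 2 * precision * recall / (precision + recall),
        "mcc": (tp * tn - fp * fn)
               / sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)),
        "balanced_accuracy": (recall + specificity) / 2,
        "prevalence": (tp + fn) / total,
    }

# Hypothetical counts: 50 actual positives, 50 actual negatives.
m = classification_metrics(tp=40, tn=45, fp=5, fn=10)
print(round(m["accuracy"], 2))   # 0.85
print(round(m["recall"], 2))     # 0.8
```

Note that with these counts accuracy and balanced accuracy coincide (0.85) because the classes are perfectly balanced; under imbalance the two diverge, which is exactly when balanced accuracy and MCC earn their keep.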