Matthews Correlation Coefficient Formula:
Definition: The MCC is a measure of the quality of binary classifications that returns a value between -1 and +1.
Purpose: It provides a balanced evaluation metric even when the classes are of very different sizes, unlike simpler metrics like accuracy.
The calculator uses the formula:
MCC = (TP × TN − FP × FN) / √((TP + FP)(TP + FN)(TN + FP)(TN + FN))
Where: TP = true positives, TN = true negatives, FP = false positives, FN = false negatives.
Explanation: The MCC is essentially a correlation coefficient between the observed and predicted binary classifications.
Tips: Enter the counts from your confusion matrix (TP, TN, FP, FN). All values must be ≥ 0.
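The calculation described above can be sketched in a few lines of Python (the function name `mcc` is illustrative, not the calculator's actual code):

```python
import math

def mcc(tp, tn, fp, fn):
    """Matthews correlation coefficient from the four confusion-matrix counts."""
    if min(tp, tn, fp, fn) < 0:
        raise ValueError("all counts must be >= 0")
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    if denom == 0:
        # Convention used by the calculator: return 0 when the denominator is zero
        return 0.0
    return (tp * tn - fp * fn) / denom
```

For example, `mcc(50, 50, 0, 0)` (a perfect classifier) returns 1.0, and `mcc(0, 0, 50, 50)` (every prediction wrong) returns -1.0.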
Q1: Why use MCC instead of accuracy?
A: MCC is more informative than accuracy when classes are imbalanced, as it considers all four confusion matrix categories.
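A quick numeric illustration of this point, using hypothetical counts for a 95%-negative dataset where the classifier simply predicts "negative" for everything:

```python
import math

def mcc(tp, tn, fp, fn):
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0

# 95 negatives, 5 positives; classifier predicts "negative" for every sample
tp, tn, fp, fn = 0, 95, 0, 5
accuracy = (tp + tn) / (tp + tn + fp + fn)   # 0.95 -- looks impressive
score = mcc(tp, tn, fp, fn)                  # 0.0  -- no better than chance
```

Accuracy rewards the majority-class guess; MCC exposes it as uninformative.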
Q2: What's considered a "good" MCC value?
A: Generally, MCC > 0.5 is good, > 0.7 is strong, and > 0.9 is excellent, but this depends on your field.
Q3: Can MCC be used for multi-class problems?
A: Yes, through generalizations like the Rₖ statistic, but this calculator handles binary classification only.
Q4: How does MCC compare to F1 score?
A: MCC considers all four confusion matrix categories, while F1 score focuses only on the positive class (TP, FP, FN).
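The difference shows up when the negative class is handled badly. A hypothetical case where every true negative is misclassified:

```python
import math

# 95 positives correctly found, but all 5 true negatives misclassified as positive
tp, tn, fp, fn = 95, 0, 5, 5

f1 = 2 * tp / (2 * tp + fp + fn)   # 0.95 -- F1 never looks at TN
num = tp * tn - fp * fn
denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
mcc = num / denom if denom else 0.0   # -0.05 -- MCC penalizes the missed negatives
```

F1 stays high because it ignores true negatives entirely, while MCC drops to roughly chance level.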
Q5: What if my denominator is zero?
A: The calculator returns 0 in this case. The denominator is zero whenever any of the four sums (TP + FP, TP + FN, TN + FP, TN + FN) is zero, which happens when the classifier predicts only one class or the data contains only one class.
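A quick check of this convention, using hypothetical counts for a classifier that predicts "positive" for every sample:

```python
import math

# All predictions are "positive", so TN = FN = 0 and the factor (TN + FN)
# makes the whole denominator zero.
tp, tn, fp, fn = 40, 0, 10, 0
denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
score = (tp * tn - fp * fn) / denom if denom else 0.0   # falls back to 0.0
```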