Matthews Correlation Coefficient Formula:
Definition: MCC is a measure of the quality of binary classifications that takes into account true and false positives and negatives.
Purpose: It provides a balanced evaluation metric even when the classes are of very different sizes, ranging from -1 (perfect inverse prediction) to +1 (perfect prediction).
The calculator uses the formula:

$$\mathrm{MCC} = \frac{TP \times TN - FP \times FN}{\sqrt{(TP + FP)(TP + FN)(TN + FP)(TN + FN)}}$$

Where:
- TP = true positives (positive cases correctly predicted as positive)
- TN = true negatives (negative cases correctly predicted as negative)
- FP = false positives (negative cases incorrectly predicted as positive)
- FN = false negatives (positive cases incorrectly predicted as negative)
Explanation: MCC considers all four categories in the confusion matrix, making it more informative than metrics that only consider one or two categories.
Details: MCC is particularly useful when classes are imbalanced, because a trivial classifier that always predicts the majority class can still achieve high accuracy or F1; MCC stays near zero in that case, making it the more reliable metric (see the worked comparison under Q4 below).
Tips: Enter the counts from your confusion matrix (TP, TN, FP, FN). All values must be non-negative integers.
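For readers who want to reproduce the calculation, here is a minimal Python sketch of the formula above. The function name `mcc` and the choice to return 0.0 when the denominator vanishes are assumptions for illustration, not part of the calculator:

```python
import math

def mcc(tp: int, tn: int, fp: int, fn: int) -> float:
    """Matthews Correlation Coefficient from confusion-matrix counts.

    Returns 0.0 when the denominator is zero (a common convention;
    strictly speaking MCC is undefined in that case -- see Q3 below).
    """
    numerator = tp * tn - fp * fn
    denominator = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return numerator / denominator if denominator else 0.0

# Example: a reasonably good classifier on a balanced-ish test set.
print(mcc(tp=90, tn=80, fp=20, fn=10))  # ~0.70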
Q1: What does an MCC value of 0 mean?
A: An MCC of 0 indicates that the classifier is no better than random prediction.
Q2: What's considered a good MCC value?
A: Generally, MCC > 0.3 is fair, > 0.5 is moderate, > 0.7 is strong, and > 0.9 is excellent.
Q3: When would MCC be undefined?
A: MCC is undefined when any term in the denominator is zero, which happens when any row or column in the confusion matrix sums to zero.
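For instance, a classifier that labels every sample positive has TN = FN = 0, so the denominator factor (TN + FN) is zero. The `mcc` sketch above falls back to 0.0 in that case, which matches the convention used by libraries such as scikit-learn:

```python
# Degenerate case: all predictions are positive, so TN = FN = 0 and
# the denominator factor (TN + FN) is zero; mcc() returns 0.0 by convention.
print(mcc(tp=95, tn=0, fp=5, fn=0))  # 0.0
```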
Q4: How does MCC compare to F1 score?
A: MCC uses all four confusion-matrix categories, while F1 is built from precision and recall alone (TP, FP, FN) and ignores true negatives entirely.
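A hypothetical worked example (counts invented for illustration, reusing the `mcc` sketch above) shows how the two can diverge on imbalanced data:

```python
# 95 positive and 5 negative samples; the classifier leans heavily positive.
tp, tn, fp, fn = 90, 0, 5, 5

precision = tp / (tp + fp)                          # 90/95 ~ 0.947
recall = tp / (tp + fn)                             # 90/95 ~ 0.947
f1 = 2 * precision * recall / (precision + recall)  # ~0.947

print(f"F1  = {f1:.3f}")                   # 0.947: looks excellent
print(f"MCC = {mcc(tp, tn, fp, fn):.3f}")  # -0.053: worse than random
```

Because every true negative was missed (TN = 0), MCC comes out slightly negative even though F1 is near its maximum.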
Q5: Can MCC be used for multi-class problems?
A: Yes. The standard generalization is the R_K statistic (Gorodkin, 2004), which reduces to MCC for two classes and is implemented, for example, by scikit-learn's matthews_corrcoef.
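If scikit-learn is available, matthews_corrcoef computes this generalization directly from label vectors; the three-class labels below are toy data for illustration:

```python
from sklearn.metrics import matthews_corrcoef

# Three-class ground truth and predictions (toy labels for illustration).
y_true = [0, 1, 2, 0, 1, 2, 0, 1, 2]
y_pred = [0, 1, 2, 0, 1, 1, 0, 2, 2]

print(matthews_corrcoef(y_true, y_pred))  # ~0.667 for these toy labels
```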