Matthews Correlation Coefficient Calculator
Use our Matthews correlation coefficient (MCC) calculator to measure the quality of a binary classifier using true positives, true negatives, false positives, and false negatives. Includes the Matthews correlation coefficient formula, how to interpret MCC, and guidance on what counts as a good score.
What Is Matthews Correlation Coefficient?
The Matthews correlation coefficient (MCC) is a single-number score that evaluates a binary classification model using all four outcomes from a confusion matrix: true positives (TP), true negatives (TN), false positives (FP), and false negatives (FN).
MCC is often preferred over accuracy when classes are imbalanced because it accounts for both positive and negative predictions in a balanced way.
The MCC value ranges from -1 to +1: +1 indicates perfect prediction, 0 indicates performance no better than random guessing, and -1 indicates total disagreement between predictions and actual labels.
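The two ends of that range can be reproduced directly from the confusion-matrix counts. A minimal sketch (the counts below are invented for illustration):

```python
import math

def mcc(tp, tn, fp, fn):
    """Matthews correlation coefficient from confusion-matrix counts."""
    num = tp * tn - fp * fn
    den = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return num / den if den else 0.0

# Perfect prediction: every positive and every negative classified correctly.
print(mcc(tp=50, tn=50, fp=0, fn=0))   # 1.0

# Total disagreement: every label flipped.
print(mcc(tp=0, tn=0, fp=50, fn=50))   # -1.0
```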
Matthews Correlation Coefficient Formula
MCC is computed from the confusion matrix counts TP, TN, FP, and FN:

MCC = (TP × TN − FP × FN) / √((TP + FP)(TP + FN)(TN + FP)(TN + FN))

The result is a correlation-style score between predicted and actual binary labels.
If any of the four sums in the denominator is 0, the denominator is 0 and MCC is undefined; many calculators return 0 in that edge case by convention.
Higher is better for classification performance.
Accuracy can look high even with poor minority-class performance; MCC penalizes that.
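To see this in action, here is a sketch with made-up counts: an imbalanced dataset of 95 negatives and 5 positives, and a degenerate classifier that predicts negative for everything. Accuracy looks strong while MCC exposes the model as uninformative:

```python
import math

def mcc(tp, tn, fp, fn):
    """Matthews correlation coefficient; returns 0 when the denominator is 0."""
    num = tp * tn - fp * fn
    den = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return num / den if den else 0.0

# Hypothetical counts: 95 actual negatives, 5 actual positives,
# and a model that always predicts "negative".
tp, tn, fp, fn = 0, 95, 0, 5

accuracy = (tp + tn) / (tp + tn + fp + fn)
print(accuracy)              # 0.95 -- looks impressive
print(mcc(tp, tn, fp, fn))   # 0.0  -- no better than guessing
```

The denominator factor (TP + FP) is 0 here, so the function falls back to the conventional value of 0.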
Definitions
MCC is based on the four counts in a confusion matrix.
- True positive (TP): cases where the model predicted positive and the actual label is positive.
- True negative (TN): cases where the model predicted negative and the actual label is negative.
- False positive (FP): cases where the model predicted positive but the actual label is negative (Type I error).
- False negative (FN): cases where the model predicted negative but the actual label is positive (Type II error).
- Confusion matrix: a 2×2 table that summarizes the four prediction outcomes (TP, TN, FP, FN) for binary classification.
How to Calculate MCC
1. Enter True Positives (TP): predicted positive and actually positive.
2. Enter True Negatives (TN): predicted negative and actually negative.
3. Enter False Positives (FP): predicted positive but actually negative.
4. Enter False Negatives (FN): predicted negative but actually positive.
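The four steps above can be worked through by hand. A sketch with hypothetical counts, computing the numerator and denominator of the formula separately:

```python
import math

# Steps 1-4: the four confusion-matrix counts (hypothetical values).
tp = 90   # predicted positive, actually positive
tn = 85   # predicted negative, actually negative
fp = 15   # predicted positive, actually negative
fn = 10   # predicted negative, actually positive

numerator = tp * tn - fp * fn                       # 90*85 - 15*10 = 7500
denominator = math.sqrt((tp + fp) * (tp + fn) *
                        (tn + fp) * (tn + fn))      # sqrt(105*100*100*95)
mcc = numerator / denominator
print(round(mcc, 3))   # 0.751
```

A score around 0.75 would usually indicate a strong classifier, though (as noted below) what counts as "good" depends on the problem.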
Frequently Asked Questions
What is the Matthews correlation coefficient?
Matthews correlation coefficient (MCC) is a metric for binary classification that summarizes performance using TP, TN, FP, and FN, producing a score from -1 to 1.
What is a good Matthews correlation coefficient?
In general, closer to 1 is better. Roughly speaking, values above ~0.5 often indicate a useful model, but what’s “good” depends on your dataset, class imbalance, and the real-world costs of FP vs FN.
What does an MCC of 0 mean?
MCC = 0 means your predictions are about as informative as random guessing (no overall correlation between predicted and actual labels).
Is MCC a probability?
No. MCC is a correlation-style score, not a probability. It summarizes classification quality and can be negative.
How is MCC different from a regular correlation coefficient?
Many calculators show a correlation coefficient (often r) for linear relationships between two numeric variables. MCC is different: it’s a correlation coefficient specifically designed for binary classification outcomes using TP/TN/FP/FN.
How do you calculate MCC by hand?
Compute TP·TN − FP·FN, then divide by the square root of (TP+FP)(TP+FN)(TN+FP)(TN+FN).
Do I need the raw predictions or probability scores to compute MCC?
No. You only need the confusion matrix counts (TP, TN, FP, FN).