Matthews Correlation Coefficient Calculator

Analyze binary outcomes from confusion matrix counts. See MCC, accuracy, precision, recall, and specificity together. Download clean reports and inspect trends through interactive charts.

Enter Confusion Matrix Values



Formula Used

The Matthews correlation coefficient (MCC) measures classification quality using all four confusion matrix cells. It stays informative even when classes are imbalanced.

MCC = (TP × TN − FP × FN) / √[(TP + FP)(TP + FN)(TN + FP)(TN + FN)]
  • TP: actual positives predicted as positive.
  • TN: actual negatives predicted as negative.
  • FP: actual negatives predicted as positive.
  • FN: actual positives predicted as negative.
  • MCC = 1: perfect agreement.
  • MCC = 0: no useful association.
  • MCC = -1: complete inverse agreement.
  • Undefined: one denominator factor equals zero.
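The formula and the undefined case above can be sketched in Python (the function name is illustrative, not part of this site's tooling):

```python
import math

def mcc(tp, tn, fp, fn):
    """Matthews correlation coefficient from the four confusion matrix counts.

    Returns None when any denominator factor is zero, i.e. when an entire
    row or column of the confusion matrix has no observations.
    """
    denom = (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)
    if denom == 0:
        return None  # MCC is undefined for this matrix
    return (tp * tn - fp * fn) / math.sqrt(denom)

print(mcc(5, 5, 0, 0))    # perfect agreement: 1.0
print(mcc(42, 50, 8, 6))  # strong positive agreement, ≈ 0.7350
print(mcc(10, 0, 0, 5))   # no actual negatives, so None (undefined)
```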

How to Use This Calculator

  1. Enter a scenario name for your report.
  2. Fill in TP, TN, FP, and FN from your confusion matrix.
  3. Select how many decimal places you want displayed.
  4. Click Calculate MCC to see the result above the form.
  5. Review MCC, accuracy, recall, precision, specificity, and related measures.
  6. Use the CSV or PDF buttons to export the calculated summary.
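The companion measures listed in step 5 all come from the same four counts. A minimal sketch of how such a summary could be derived (names are illustrative, not this calculator's internals):

```python
def summary(tp, tn, fp, fn):
    """Accuracy, precision, recall, and specificity from confusion matrix counts.

    Each ratio is None when its own denominator is zero.
    """
    def ratio(num, den):
        return num / den if den else None

    total = tp + tn + fp + fn
    return {
        "accuracy": ratio(tp + tn, total),
        "precision": ratio(tp, tp + fp),    # of predicted positives, share correct
        "recall": ratio(tp, tp + fn),       # of actual positives, share found
        "specificity": ratio(tn, tn + fp),  # of actual negatives, share found
    }

print(summary(42, 50, 8, 6))
```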

Example Data Table

This sample shows how different confusion matrix combinations change MCC and related performance signals.

Scenario                 TP  TN  FP  FN  Total  MCC     Accuracy
Fraud Detection Model A  42  50   8   6    106  0.7350  86.7925%
Fraud Detection Model B  31  58  17  10    116  0.5139  76.7241%
Fraud Detection Model C  18  61  21  20    120  0.2161  65.8333%
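The MCC and accuracy columns can be re-derived from the raw counts, which is a useful way to double-check any row:

```python
import math

rows = [
    ("Fraud Detection Model A", 42, 50, 8, 6),
    ("Fraud Detection Model B", 31, 58, 17, 10),
    ("Fraud Detection Model C", 18, 61, 21, 20),
]

results = {}
for name, tp, tn, fp, fn in rows:
    total = tp + tn + fp + fn
    mcc = (tp * tn - fp * fn) / math.sqrt(
        (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    accuracy = (tp + tn) / total
    results[name] = (total, mcc, accuracy)
    print(f"{name}: total={total} MCC={mcc:.4f} accuracy={accuracy:.4%}")
```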

Frequently Asked Questions

1. What does MCC measure?

MCC measures the quality of a binary classifier using all four confusion matrix values. It summarizes balanced agreement between predictions and actual outcomes on a scale from -1 to 1.

2. Why is MCC useful for imbalanced datasets?

Accuracy can look strong when one class dominates. MCC stays more reliable because it includes true positives, true negatives, false positives, and false negatives together.
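A small hypothetical example makes the contrast concrete. With a 1%-positive dataset, a model that finds only 1 of 10 positives and raises 1 false alarm still scores 99% accuracy, while MCC stays low:

```python
import math

def mcc(tp, tn, fp, fn):
    denom = (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)
    return (tp * tn - fp * fn) / math.sqrt(denom) if denom else None

# 1,000 samples, only 10 actual positives: accuracy flatters the model.
tp, tn, fp, fn = 1, 989, 1, 9
accuracy = (tp + tn) / (tp + tn + fp + fn)
print(f"accuracy = {accuracy:.1%}")             # 99.0%
print(f"MCC      = {mcc(tp, tn, fp, fn):.4f}")  # ≈ 0.22, far less flattering
```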

3. What is a good MCC score?

Higher positive values are better. A score near 1 shows excellent agreement, around 0 suggests little useful association, and negative values imply systematically poor prediction direction.

4. Can MCC be negative?

Yes. Negative MCC means the prediction pattern moves opposite to the true labels more often than expected. Strongly negative values indicate consistent inverse agreement.

5. Why does the calculator sometimes show undefined MCC?

MCC becomes undefined when a denominator factor is zero. This happens when one row or one column in the confusion matrix has no observations.
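Each denominator factor corresponds to one row or column of the matrix, so the empty one can be named explicitly (a sketch, not this calculator's actual error handling):

```python
def undefined_reasons(tp, tn, fp, fn):
    """List the empty rows/columns that make MCC undefined, if any."""
    factors = {
        "no predicted positives (TP+FP)": tp + fp,
        "no actual positives (TP+FN)": tp + fn,
        "no predicted negatives (TN+FN)": tn + fn,
        "no actual negatives (TN+FP)": tn + fp,
    }
    return [name for name, value in factors.items() if value == 0]

# A model that labels every sample positive: TN and FN are both 0.
print(undefined_reasons(tp=10, tn=0, fp=90, fn=0))
print(undefined_reasons(tp=42, tn=50, fp=8, fn=6))  # [] means MCC is defined
```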

6. Is MCC better than accuracy?

Not always, but it is often more informative. Accuracy reports overall correctness, while MCC captures balance across both classes and both error types.

7. Does MCC work for multiclass models?

Yes, extended versions exist for multiclass settings. This page focuses on the binary confusion matrix form because it is the most common manual calculation.
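One such extension is Gorodkin's R_K statistic, which works on a K × K confusion matrix and reduces to the binary MCC when K = 2. A stdlib-only sketch:

```python
import math

def multiclass_mcc(cm):
    """R_K statistic (a multiclass generalization of MCC) from a K x K
    confusion matrix where cm[i][j] counts true class i predicted as j.
    Returns None when the statistic is undefined."""
    s = sum(sum(row) for row in cm)            # total samples
    c = sum(cm[k][k] for k in range(len(cm)))  # correct predictions (trace)
    t = [sum(row) for row in cm]               # true counts per class
    p = [sum(col) for col in zip(*cm)]         # predicted counts per class
    cov = c * s - sum(tk * pk for tk, pk in zip(t, p))
    denom = math.sqrt(s * s - sum(pk * pk for pk in p)) * \
            math.sqrt(s * s - sum(tk * tk for tk in t))
    return cov / denom if denom else None

# Binary sanity check with the counts from the formula section
# (TN=50, FP=8, FN=6, TP=42): matches the binary MCC, ≈ 0.7350.
print(multiclass_mcc([[50, 8], [6, 42]]))
# A perfect 3-class classifier scores 1.0.
print(multiclass_mcc([[5, 0, 0], [0, 5, 0], [0, 0, 5]]))
```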

8. Which inputs do I need before using this tool?

You need four counts: true positives, true negatives, false positives, and false negatives. These values usually come from a confusion matrix or model evaluation report.

Related Calculators

Important Note: All the calculators listed on this site are for educational purposes only, and we do not guarantee the accuracy of the results. Please consult other sources as well.