
Cohen's Kappa (Agreement) Calculator

Measure of inter-rater agreement that accounts for chance.



Overview

Cohen's Kappa is a statistical measure used to assess the reliability of agreement between two raters who classify items into mutually exclusive categories. Unlike simple percent agreement, it accounts for the agreement that would occur by random chance, providing a more conservative and accurate estimate of inter-rater consistency.
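
Kappa compares the observed proportion of agreement with the proportion expected by chance:

\[
\kappa = \frac{p_o - p_e}{1 - p_e}
\]

A value of 1 indicates perfect agreement, 0 indicates agreement no better than chance, and negative values indicate agreement worse than chance.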

Symbols

\kappa = Cohen's Kappa
p_o = observed agreement
p_e = expected agreement by chance


When To Use

When to use: This statistic applies when two independent observers categorize the same set of items and you need to check that their observations are reliable. It is designed for nominal (categorical) data rather than ordinal or continuous scales.

Why it matters: In fields like clinical psychology, high inter-rater reliability ensures that diagnoses are consistent regardless of the clinician. Without adjusting for chance, researchers might overestimate the reliability of their observational data, leading to flawed conclusions.
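
As a concrete illustration, here is a minimal Python sketch of the calculation (the function name and example labels are invented for illustration; this is not the site's calculator code):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's Kappa for two raters' nominal labels (equal-length sequences)."""
    if len(rater_a) != len(rater_b) or not rater_a:
        raise ValueError("need two equal-length, non-empty label sequences")
    n = len(rater_a)

    # Observed agreement: proportion of items both raters labelled identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # Expected agreement by chance: for each category, the probability that
    # both raters pick it independently, based on their marginal proportions.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    p_e = sum((counts_a[c] / n) * (counts_b[c] / n)
              for c in set(counts_a) | set(counts_b))

    return (p_o - p_e) / (1 - p_e)

# Two assistants coding ten clips as aggressive ("A") or not ("N").
rater_1 = ["A", "A", "N", "N", "A", "N", "A", "N", "N", "A"]
rater_2 = ["A", "N", "N", "N", "A", "N", "A", "A", "N", "A"]
print(round(cohens_kappa(rater_1, rater_2), 2))  # p_o = 0.8, p_e = 0.5, kappa = 0.6
```

For real analyses, an established implementation such as sklearn.metrics.cohen_kappa_score is usually preferable to a hand-rolled version.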


Common Mistakes

  • Relying on simple percent agreement when chance agreement is high; it overstates reliability (see the worked example below).
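
For example, two raters coding a rare behavior might agree on 90% of clips largely because both usually code it as absent. With a chance agreement of 80%, the corrected statistic is only moderate:

\[
\kappa = \frac{0.90 - 0.80}{1 - 0.80} = 0.50
\]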


Practice Problem

Two research assistants are coding video clips for aggressive behavior. They agree on 85% of the clips (p_o = 0.85). Given the frequency of the behaviors, the expected agreement by chance is calculated to be 40% (p_e = 0.40). What is Cohen's Kappa for these raters?

Given: p_o = 0.85, p_e = 0.40

Solve for: \kappa (Cohen's Kappa)

Hint: Subtract the chance agreement from the observed agreement, then divide by the difference between 1 and the chance agreement.

The full worked solution stays in the interactive walkthrough.
