

How-to


Cohen's Kappa is a measure of interrater reliability. If you want to calculate Cohen's Kappa with DATAtab, you only need to select two nominal variables; Cohen's Kappa will then be calculated online. For the weighted Cohen's Kappa, please select two ordinal variables. You can easily change the scale level in the first row.

Cohen’s Kappa Calculator

Calculate Cohen's Kappa online

Cohen's Kappa is used to perform an inter-rater reliability analysis between the dependent samples Rater 1 and Rater 2. It is a measure of the agreement between two dependent categorical samples.

After Cohen's Kappa has been calculated, you can use the following table for its interpretation (Landis & Koch, 1977).

Kappa    Level of agreement
> 0.8    Almost perfect
> 0.6    Substantial
> 0.4    Moderate
> 0.2    Fair
> 0      Slight
< 0      No agreement
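
If you want to apply this classification programmatically, a small helper function might look like the following sketch (the function name is our own and not part of DATAtab):

```python
def interpret_kappa(kappa):
    """Return the Landis & Koch (1977) label for a given kappa value."""
    if kappa > 0.8:
        return "Almost perfect"
    if kappa > 0.6:
        return "Substantial"
    if kappa > 0.4:
        return "Moderate"
    if kappa > 0.2:
        return "Fair"
    if kappa > 0:
        return "Slight"
    return "No agreement"

print(interpret_kappa(0.65))  # Substantial
```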

Cohen's kappa

Cohen's Kappa shows how high the agreement between Rater 1 and Rater 2 is.

Cohen's kappa (often denoted simply as κ) is a statistic used to measure the agreement between two raters who each classify items into categorical classes, beyond what is expected by chance. It is especially useful because some agreement will occur just by chance. Mathematically, it is given by the formula:

κ = (PO − PE) / (1 − PE)

  • PO is the observed proportion of agreement.
  • PE is the expected proportion of agreement by chance.

The value of kappa ranges from -1 to 1:

  • A value of 1 indicates perfect agreement.
  • A value of 0 indicates agreement no better than chance.
  • A value less than 0 indicates agreement less than chance.
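
As a minimal sketch, the formula above can be translated directly into Python. The data and the function name below are hypothetical; in practice a library function such as scikit-learn's cohen_kappa_score gives the same result.

```python
from collections import Counter

def cohen_kappa(rater1, rater2):
    """Unweighted Cohen's kappa for two equally long lists of category labels."""
    n = len(rater1)

    # Observed proportion of agreement (PO)
    po = sum(a == b for a, b in zip(rater1, rater2)) / n

    # Expected proportion of agreement by chance (PE),
    # based on each rater's marginal category frequencies
    freq1, freq2 = Counter(rater1), Counter(rater2)
    categories = set(freq1) | set(freq2)
    pe = sum((freq1[c] / n) * (freq2[c] / n) for c in categories)

    return (po - pe) / (1 - pe)

# Hypothetical example: two raters classify 10 items as "yes"/"no"
r1 = ["yes", "yes", "no", "yes", "no", "no", "yes", "yes", "no", "yes"]
r2 = ["yes", "no",  "no", "yes", "no", "yes", "yes", "yes", "no", "no"]
print(round(cohen_kappa(r1, r2), 3))  # PO = 0.7, PE = 0.5, kappa = 0.4
```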

Standard error Cohen's kappa

The standard error for Cohen's Kappa is calculated according to Cohen (1960) with the following formula:

SE(κ) = √( PO · (1 − PO) / ( n · (1 − PE)² ) )

where n is the number of rated subjects.
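
Assuming the formula above, a quick numerical check might look like this; the values are taken from the hypothetical two-rater example shown earlier, and the helper name is our own:

```python
import math

def kappa_standard_error(po, pe, n):
    """Approximate standard error of Cohen's kappa following Cohen (1960)."""
    return math.sqrt(po * (1 - po) / (n * (1 - pe) ** 2))

# Hypothetical values: PO = 0.7, PE = 0.5, n = 10 rated subjects
kappa = (0.7 - 0.5) / (1 - 0.5)
se = kappa_standard_error(0.7, 0.5, 10)

# Approximate 95% confidence interval for kappa
lower, upper = kappa - 1.96 * se, kappa + 1.96 * se
print(round(se, 3), round(lower, 3), round(upper, 3))
```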

Calculate weighted Cohen's Kappa

Cohen's Kappa only takes into account whether two raters agree; the degree of disagreement is not considered. However, if the variables are ordinal, it is desirable to include the rank distance in the calculation. This is exactly what the weighted Cohen's Kappa does.

If you select two ordinal variables, the weighted Cohen's Kappa is automatically calculated. You can then choose whether you want a linear or a quadratic weighting.
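
To illustrate the idea (this is a sketch of the standard weighted kappa, not DATAtab's implementation; the data and function name are hypothetical), the disagreement between two ordinal ratings can be penalised linearly or quadratically:

```python
import numpy as np

def weighted_kappa(rater1, rater2, categories, weighting="linear"):
    """Weighted Cohen's kappa for ordinal ratings from two raters."""
    k = len(categories)
    index = {c: i for i, c in enumerate(categories)}
    n = len(rater1)

    # Observed proportion for every pair of categories (rater 1 vs. rater 2)
    observed = np.zeros((k, k))
    for a, b in zip(rater1, rater2):
        observed[index[a], index[b]] += 1 / n

    # Proportions expected by chance from the two marginal distributions
    expected = np.outer(observed.sum(axis=1), observed.sum(axis=0))

    # Disagreement weights scaled to [0, 1]: linear |i - j| or quadratic (i - j)^2
    i, j = np.indices((k, k))
    if weighting == "linear":
        weights = np.abs(i - j) / (k - 1)
    else:
        weights = ((i - j) / (k - 1)) ** 2

    return 1 - (weights * observed).sum() / (weights * expected).sum()

# Hypothetical ordinal ratings on a 3-point scale
r1 = [1, 2, 3, 2, 1, 3, 2, 2]
r2 = [1, 3, 3, 2, 2, 3, 1, 2]
print(round(weighted_kappa(r1, r2, [1, 2, 3], weighting="quadratic"), 3))
```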

Kappa Online Calculator

Cohen's Kappa is calculated in statistics to determine interrater reliability. On DATAtab you can calculate either Cohen's Kappa or Fleiss Kappa online. If you want to calculate Cohen's Kappa, simply select two categorical variables; if you want to calculate Fleiss Kappa, select three or more variables.

References

Cohen, J. (1960). A coefficient of agreement for nominal scales. Educational and Psychological Measurement, 20(1), 37–46.
