Kappa is a coefficient that measures the proportion of agreement above that which is expected by chance. The kappa calculations are described below; a sketch of the standard formulas in explicit notation follows each list.
P_observed = the proportion of items on which the appraisers agree
P_expected = the sum of the products of each classification proportion (the agreement expected by chance)
kappa_max = the maximum value of kappa attainable given the observed lack of symmetry in the marginal totals
SE_0 = the standard error used to test whether kappa is equal to zero (no agreement)
z = the z-score used to test the significance of kappa
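A hedged sketch of the standard two-appraiser formulas behind these descriptions, assuming k categories, n items, cell proportions p_ij (Appraiser A places the item in category i, Appraiser B in category j), and marginal proportions p_{i.} and p_{.j}; this notation does not appear above and is an assumption:

% Observed agreement: the sum of the diagonal cell proportions
P_{observed} = \sum_{i=1}^{k} p_{ii}

% Chance agreement: the sum of the products of the marginal proportions
P_{expected} = \sum_{i=1}^{k} p_{i \cdot}\, p_{\cdot i}

% Kappa: agreement above chance, scaled by the maximum possible above-chance agreement
\kappa = \frac{P_{observed} - P_{expected}}{1 - P_{expected}}

% Maximum attainable kappa given the observed marginal asymmetry
\kappa_{max} = \frac{P_{max} - P_{expected}}{1 - P_{expected}},
\qquad P_{max} = \sum_{i=1}^{k} \min\left(p_{i \cdot},\, p_{\cdot i}\right)

% Standard error under H0: kappa = 0 (Fleiss, Cohen & Everitt, 1969)
SE_0 = \frac{\sqrt{P_{expected} + P_{expected}^{2}
        - \sum_{i=1}^{k} p_{i \cdot}\, p_{\cdot i}\left(p_{i \cdot} + p_{\cdot i}\right)}}
       {\left(1 - P_{expected}\right)\sqrt{n}}

% z-test of H0: kappa = 0, compared to the standard normal distribution
z = \frac{\kappa}{SE_0}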
Kappa Confidence Interval Calculations
P_observed = the proportion of items on which the appraisers agree (as above)
P_expected = the sum of the products of each classification proportion (as above)
P_max = the quantity used to calculate kappa_max
SE_0 = the standard error used to test whether kappa is equal to zero (no agreement), as above
SE_kappa = the standard error used for the kappa confidence intervals
CI = the kappa confidence interval
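Under the same assumed notation, a sketch of the confidence-interval pieces, using Fleiss, Cohen, and Everitt's large-sample standard error; the grouping into A, B, and C terms is for readability only and is not taken from the text above:

% Pieces of the large-sample variance of kappa
A = \sum_{i=1}^{k} p_{ii}\left[1 - \left(p_{i \cdot} + p_{\cdot i}\right)\left(1 - \kappa\right)\right]^{2}

B = \left(1 - \kappa\right)^{2} \sum_{i \neq j} p_{ij}\left(p_{\cdot i} + p_{j \cdot}\right)^{2}

C = \left[\kappa - P_{expected}\left(1 - \kappa\right)\right]^{2}

% Standard error used for the kappa confidence interval
SE_{\kappa} = \frac{\sqrt{A + B - C}}{\left(1 - P_{expected}\right)\sqrt{n}}

% 100(1 - alpha)% confidence interval
\kappa \pm z_{\alpha/2}\, SE_{\kappa}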
n_xy = the count of Category x for Appraiser y (or for the Standard)
This is the statistic used to test for overall concordance across appraisers. It is evaluated as a z test statistic, as sketched below.
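A minimal sketch of one common overall test, assuming the concordance statistic is Fleiss' kappa for m appraisers rating N items into k categories; here x_ij counts the appraisers who place item i in category j, and p_j is the pooled proportion of all ratings falling in category j. Both the mapping of n_xy onto x_ij and the use of Fleiss' (1971) null-hypothesis variance are assumptions:

% Observed agreement averaged over items
\bar{P} = \frac{1}{N m \left(m - 1\right)}
          \left( \sum_{i=1}^{N} \sum_{j=1}^{k} x_{ij}^{2} - N m \right)

% Chance agreement from the pooled category proportions
\bar{P}_e = \sum_{j=1}^{k} p_j^{2}

% Overall (Fleiss') kappa
\hat{\kappa} = \frac{\bar{P} - \bar{P}_e}{1 - \bar{P}_e}

% Null-hypothesis variance (Fleiss, 1971)
Var_0(\hat{\kappa}) = \frac{2}{N m \left(m - 1\right)} \cdot
  \frac{\bar{P}_e - \left(2m - 3\right)\bar{P}_e^{2} + 2\left(m - 2\right)\sum_{j=1}^{k} p_j^{3}}
       {\left(1 - \bar{P}_e\right)^{2}}

% z-test of H0: no agreement beyond chance
z = \frac{\hat{\kappa}}{\sqrt{Var_0(\hat{\kappa})}}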