Cohen’s kappa coefficient

Chrystal M. Reed, Kurtis G. Birch, Jan Kamiński, Shannon Sullivan, Jeffrey M. Chung, Adam N. Mamelak, Ueli Rutishauser

The Cohen’s kappa coefficient (κ) is a statistical measure of inter-rater reliability between two raters7 and was calculated as follows30:

κ = (Po − Pc) / (1 − Pc)

where Po is the proportion of observed agreements and Pc is the proportion of agreements expected by chance30. Here, Pc = 0.5, since our algorithm distinguishes only two possibilities (SWS and non-SWS). Kappa is considered more robust than a simple percent-agreement calculation because κ credits only the agreement that occurs above chance30. Following the sleep staging literature, kappa coefficients are interpreted as follows: values of ≤0.00 indicate poor agreement; 0.00–0.20, slight agreement; 0.21–0.40, fair agreement; 0.41–0.60, moderate agreement; 0.61–0.80, substantial agreement; and values above 0.80, excellent agreement18.
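To illustrate the calculation, the following is a minimal Python sketch, not part of the original protocol: the function names and the epoch-by-epoch example labels are hypothetical, and the chance term is fixed at Pc = 0.5 to mirror the two-category assumption above (the general form of Cohen's kappa instead estimates Pc from each rater's marginal label frequencies).

```python
import numpy as np


def cohens_kappa(rater_a, rater_b, p_chance=0.5):
    """Cohen's kappa for two raters' labels.

    p_chance defaults to 0.5, matching the two-category
    (SWS vs. non-SWS) assumption described in the text.
    """
    rater_a = np.asarray(rater_a)
    rater_b = np.asarray(rater_b)
    p_observed = np.mean(rater_a == rater_b)  # proportion of observed agreements (Po)
    return (p_observed - p_chance) / (1.0 - p_chance)


def interpret_kappa(kappa):
    """Map kappa onto the agreement bands cited in the text (ref. 18)."""
    if kappa <= 0.00:
        return "poor"
    if kappa <= 0.20:
        return "slight"
    if kappa <= 0.40:
        return "fair"
    if kappa <= 0.60:
        return "moderate"
    if kappa <= 0.80:
        return "substantial"
    return "excellent"


# Hypothetical epoch-by-epoch labels (1 = SWS, 0 = non-SWS) from two scorers
human = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0]
algo = [1, 1, 0, 1, 1, 0, 1, 0, 0, 0]
k = cohens_kappa(human, algo)
print(f"kappa = {k:.2f} ({interpret_kappa(k)} agreement)")
```

In this toy example the raters agree on 8 of 10 epochs (Po = 0.8), giving κ = (0.8 − 0.5) / (1 − 0.5) = 0.6, i.e., moderate agreement.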
