Cohen's kappa is a widely used association coefficient for summarizing interrater agreement on a nominal scale. Kappa reduces the ratings of the two observers to a single number. With three or more categories it is more informative to summarize the ratings with category coefficients that describe the agreement for each category separately. Examples of category coefficients are the sensitivity or specificity of a category and the Bloch-Kraemer weighted kappa. However, in many research studies one is interested only in a single overall number that roughly summarizes the agreement. It is shown that both the overall observed agreement and Cohen's kappa are weighted averages of various category coefficients and can therefore be used to summarize these coefficients.
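
The weighted-average claim can be made concrete with a short numerical sketch. The Python code below is an illustration, not the paper's own derivation: it computes Cohen's kappa for a hypothetical 3x3 agreement table, together with per-category kappas obtained by collapsing the table to 2x2 ("category i" versus all other categories), one standard choice of category coefficient. The function names and the example table are invented for the demonstration.

import numpy as np

def cohens_kappa(counts):
    """Cohen's kappa for a square agreement table of raw counts.

    counts[i, j] = number of objects assigned to category i by
    rater 1 and to category j by rater 2.
    """
    p = counts / counts.sum()      # joint relative frequencies
    p_row = p.sum(axis=1)          # marginal proportions of rater 1
    p_col = p.sum(axis=0)          # marginal proportions of rater 2
    p_o = np.trace(p)              # observed agreement
    p_e = p_row @ p_col            # agreement expected under chance
    return (p_o - p_e) / (1.0 - p_e)

def category_kappas(counts):
    """Per-category kappas: collapse the table to 2x2
    ("category i" versus all other categories) and compute
    Cohen's kappa of each dichotomized table. Algebraically,
    kappa_i = 2(p_ii - p_i+ p_+i) / (p_i+ + p_+i - 2 p_i+ p_+i)."""
    p = counts / counts.sum()
    p_row = p.sum(axis=1)
    p_col = p.sum(axis=0)
    diag = np.diag(p)
    return 2 * (diag - p_row * p_col) / (p_row + p_col - 2 * p_row * p_col)

# Hypothetical ratings of 100 objects on a 3-category nominal scale.
table = np.array([[30,  5,  2],
                  [ 4, 25,  6],
                  [ 1,  7, 20]])

print("overall kappa:   ", round(float(cohens_kappa(table)), 3))
print("category kappas: ", np.round(category_kappas(table), 3))

# Weighted-average check: with weights proportional to the
# denominators of the category kappas, the weighted average of the
# category kappas equals the overall kappa exactly.
p = table / table.sum()
p_row, p_col = p.sum(axis=1), p.sum(axis=0)
w = p_row + p_col - 2 * p_row * p_col
print("weighted average:", round(float(w @ category_kappas(table) / w.sum()), 3))

For this table the overall kappa and the weighted average both print 0.623, which is one instance of the weighted-average interpretation the abstract describes; the paper's own choice of category coefficients and weights may differ.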
CITATION STYLE
Warrens, M. J. (2014). New interpretations of Cohen's kappa. Journal of Mathematics, 2014. https://doi.org/10.1155/2014/203907