High Agreement and High Prevalence: The Paradox of Cohen’s Kappa

  • Zec S
  • Soriani N
  • Comoretto R
  • Baldi I

Abstract

Background: Cohen’s Kappa is the most widely used agreement statistic in the literature. Under certain conditions, however, it is affected by a paradox that yields biased estimates of the statistic itself.

Objective: The aim of the study is to give the reader enough information to make an informed choice of agreement measure, by highlighting some optimal properties of Gwet’s AC1 in comparison to Cohen’s Kappa, using a real data example.

Method: During a literature review, we asked a panel of three evaluators to judge the quality of 57 randomized controlled trials, assigning a score to each trial using the Jadad scale. Quality was evaluated along the following dimensions: adopted design, randomization unit, and type of primary endpoint. For each of these features, the agreement between the three evaluators was calculated with Cohen’s Kappa statistic and Gwet’s AC1 statistic, and the values were then compared with the observed agreement.

Results: The values of Cohen’s Kappa would lead one to believe that the agreement levels for the variables Unit, Design, and Primary Endpoint are entirely unsatisfactory. The AC1 statistic, by contrast, yields plausible values that are in line with the corresponding observed agreement.

Conclusion: We conclude that it is always appropriate to adopt the AC1 statistic, thereby avoiding the risk of incurring the paradox and drawing wrong conclusions from the agreement analysis.
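The paradox the abstract describes can be illustrated numerically. Cohen’s Kappa estimates chance agreement from the product of the two raters’ marginal distributions, so when one category is highly prevalent, estimated chance agreement approaches the observed agreement and Kappa collapses even though the raters almost always agree; Gwet’s AC1 uses a different chance-agreement term that stays small in that situation. The paper itself analyzes three evaluators; the following is only a minimal two-rater, binary-rating sketch, with illustrative counts and a hypothetical function name not taken from the paper:

```python
def agreement_stats(a, b, c, d):
    """Two raters, binary ratings.

    a = both positive, b = rater1 positive / rater2 negative,
    c = rater1 negative / rater2 positive, d = both negative.
    Returns (observed agreement, Cohen's kappa, Gwet's AC1).
    """
    n = a + b + c + d
    po = (a + d) / n                      # observed agreement
    p1 = (a + b) / n                      # rater 1's marginal "positive" rate
    p2 = (a + c) / n                      # rater 2's marginal "positive" rate
    # Cohen's chance agreement: product of the raters' marginals
    pe_kappa = p1 * p2 + (1 - p1) * (1 - p2)
    kappa = (po - pe_kappa) / (1 - pe_kappa)
    # Gwet's chance agreement: based on the mean marginal probability
    pi = (p1 + p2) / 2
    pe_ac1 = 2 * pi * (1 - pi)
    ac1 = (po - pe_ac1) / (1 - pe_ac1)
    return po, kappa, ac1

# High-prevalence example: the raters agree on 95 of 100 items,
# yet Kappa is slightly negative while AC1 tracks the observed agreement.
po, kappa, ac1 = agreement_stats(a=95, b=2, c=3, d=0)
print(f"observed={po:.2f}, kappa={kappa:.3f}, AC1={ac1:.3f}")
# → observed=0.95, kappa=-0.025, AC1=0.947
```

With 95% observed agreement but nearly all items in one category, Kappa drops below zero while AC1 remains close to the observed value — the pattern the authors report for their Unit, Design, and Primary Endpoint variables.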

References

  • Cohen, J. (1960). A Coefficient of Agreement for Nominal Scales.
  • Shrout, P. E., & Fleiss, J. L. (1979). Intraclass correlations: Uses in assessing rater reliability.
  • Jadad, A. R., et al. (1996). Assessing the quality of reports of randomized clinical trials: Is blinding necessary?


Citation (APA)

Zec, S., Soriani, N., Comoretto, R., & Baldi, I. (2017). High Agreement and High Prevalence: The Paradox of Cohen’s Kappa. The Open Nursing Journal, 11(1), 211–218. https://doi.org/10.2174/1874434601711010211
