Inter-observer reliability and intra-observer reproducibility of the Weber classification of ankle fractures


Abstract

Our aim was to assess the reproducibility and the reliability of the Weber classification system for fractures of the ankle based on anteroposterior and lateral radiographs. Five observers with varying clinical experience reviewed 50 sets of blinded radiographs. The same observers reviewed the same radiographs again after an interval of four weeks. Inter- and intra-observer agreement was assessed based on the proportion of agreement and the values of the kappa coefficient. For inter-observer agreement, the mean kappa value was 0.61 (0.59 to 0.63) and the proportion of agreement was 78% (76% to 79%) and for intra-observer agreement the mean kappa value was 0.74 (0.39 to 0.86) with an 85% (60% to 93%) observed agreement. These results show that the Weber classification of fractures of the ankle based on two radiological views has substantial inter-observer reliability and intra-observer reproducibility. © 2006 British Editorial Society of Bone and Joint Surgery.
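The agreement statistics reported above can be illustrated with a short sketch. The code below computes the proportion of observed agreement and Cohen's kappa (the standard chance-corrected agreement coefficient for two raters) from hypothetical Weber gradings (A/B/C); the observer labels and data are invented for illustration, not taken from the study. By the common Landis and Koch convention, kappa values between 0.61 and 0.80 are described as "substantial" agreement, which is the sense used in the abstract.

```python
from collections import Counter

def proportion_agreement(r1, r2):
    """Fraction of cases on which the two raters assign the same grade."""
    return sum(a == b for a, b in zip(r1, r2)) / len(r1)

def cohen_kappa(r1, r2):
    """Cohen's kappa: (observed - chance) / (1 - chance) agreement."""
    n = len(r1)
    po = proportion_agreement(r1, r2)
    c1, c2 = Counter(r1), Counter(r2)
    # Expected chance agreement from each rater's marginal grade frequencies
    pe = sum(c1[k] * c2[k] for k in set(r1) | set(r2)) / (n * n)
    return (po - pe) / (1 - pe)

# Hypothetical Weber gradings of 10 radiographs by two observers
obs1 = ["A", "B", "B", "C", "B", "A", "C", "B", "A", "B"]
obs2 = ["A", "B", "C", "C", "B", "A", "C", "B", "B", "B"]

print(proportion_agreement(obs1, obs2))       # 0.8
print(round(cohen_kappa(obs1, obs2), 2))      # 0.68 -> "substantial"
```

Note that kappa is always lower than the raw proportion of agreement, because some matches are expected by chance alone; this is why the paper reports both figures side by side.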

Citation (APA)

Malek, I. A., Machani, B., Mevcha, A. M., & Hyder, N. H. (2006, September). Inter-observer reliability and intra-observer reproducibility of the Weber classification of ankle fractures. Journal of Bone and Joint Surgery - Series B. https://doi.org/10.1302/0301-620X.88B9.17954
