Abstract
Objective To evaluate the inter- and intraobserver agreement of the Walch classification
system for shoulder arthritis.
Methods Computed tomography scans of the shoulder joint of adult patients, obtained between
2012 and 2016, were selected and classified by physicians with different levels of expertise
in orthopedics. The images were examined on three separate occasions, and the ratings
were analyzed with the Fleiss kappa statistic to assess intra- and interobserver agreement.
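As a point of reference, the Fleiss kappa used in this study compares observed agreement among multiple raters with the agreement expected by chance. The sketch below is illustrative only and assumes a hypothetical rating matrix (each row one CT scan, each column one Walch category, each cell the number of observers choosing that category); it does not reproduce the study's data.

```python
def fleiss_kappa(ratings):
    """Fleiss' kappa for a subjects-by-categories count matrix.

    Each row sums to the (constant) number of raters per subject.
    The input below is a made-up example, not the study's data.
    """
    n_subjects = len(ratings)
    n_raters = sum(ratings[0])  # raters per subject, assumed constant

    # Per-subject observed agreement P_i
    p_i = [
        (sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
        for row in ratings
    ]
    p_bar = sum(p_i) / n_subjects

    # Chance agreement P_e from the marginal category proportions
    totals = [sum(col) for col in zip(*ratings)]
    p_j = [t / (n_subjects * n_raters) for t in totals]
    p_e = sum(p * p for p in p_j)

    return (p_bar - p_e) / (1 - p_e)

# Hypothetical example: 4 scans, 5 Walch categories (A1, A2, B1, B2, C),
# 4 observers per scan.
example = [
    [3, 1, 0, 0, 0],  # 3 observers chose A1, 1 chose A2
    [0, 2, 2, 0, 0],
    [0, 0, 0, 4, 0],
    [1, 1, 1, 1, 0],
]
kappa = fleiss_kappa(example)
```

Values near 1 indicate near-perfect agreement, values near 0 agreement no better than chance; the thresholds used to label agreement "poor", "moderate", and so on follow conventional interpretation scales.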
Results The kappa index for intraobserver agreement ranged from 0.305 to 0.545. The interobserver
agreement was very low across the three evaluations (κ = 0.132).
Conclusion The intraobserver agreement for the modified Walch classification ranged from
moderate to poor. The interobserver agreement was low.
Keywords
shoulder joint - osteoarthritis/classification - reproducibility of results