DOI: 10.1016/j.rbo.2018.04.002
Intra and Interobserver Agreement Regarding the Walch Classification System for Shoulder Joint Arthritis[*]
Article available in: Portuguese | English

Publication History
13 October 2017
03 April 2018
Publication Date: 13 December 2019 (online)
Abstract
Objective To evaluate the inter- and intraobserver agreement regarding the Walch classification system for shoulder arthritis.
Methods Computed tomography scans of the shoulder joints of adult patients, obtained between 2012 and 2016, were selected and classified by physicians with different levels of expertise in orthopedics. The images were examined at three different times, and the analyses were evaluated using the Fleiss kappa index to verify intra- and interobserver agreement.
Results The kappa index for the intraobserver agreement ranged from 0.305 to 0.545. The interobserver agreement was very low at the end of the three evaluations (κ = 0.132).
Conclusion The intraobserver agreement regarding the modified Walch classification varied from moderate to poor. The interobserver agreement was low.
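For readers unfamiliar with the statistic, Fleiss' kappa measures agreement among a fixed number of raters assigning categorical ratings (here, Walch glenoid types) to a set of subjects, correcting for agreement expected by chance. The following is a minimal illustrative sketch, not the authors' analysis code, and the rating matrices shown are hypothetical:

```python
import numpy as np

def fleiss_kappa(ratings):
    """Fleiss' kappa for a matrix of shape (n_subjects, n_categories),
    where ratings[i, j] is the number of raters who assigned subject i
    to category j. Assumes every subject is rated by the same number
    of raters (as when all observers classify every CT scan)."""
    ratings = np.asarray(ratings, dtype=float)
    n_subjects, _ = ratings.shape
    n_raters = ratings[0].sum()
    # Proportion of all assignments falling into each category
    p_j = ratings.sum(axis=0) / (n_subjects * n_raters)
    # Observed agreement for each subject, then averaged over subjects
    P_i = ((ratings ** 2).sum(axis=1) - n_raters) / (n_raters * (n_raters - 1))
    P_bar = P_i.mean()
    # Agreement expected by chance alone
    P_e = (p_j ** 2).sum()
    return (P_bar - P_e) / (1 - P_e)

# Hypothetical example: 2 subjects, 2 categories, 3 raters.
# Perfect agreement yields kappa = 1.
print(fleiss_kappa([[3, 0], [0, 3]]))  # → 1.0
```

On the commonly used interpretive scales cited by the authors, values below about 0.2 indicate poor or slight agreement, so an interobserver κ of 0.132 sits at the low end of the range.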
* Work developed at Hospital da Pontifícia Universidade Católica de Campinas, Campinas, SP, Brazil. Originally Published by Elsevier.