DOI: 10.1055/s-0044-1792121
Assessment of Reproducibility and Agreement of the IDEAL Classification for Distal Radius Fractures
Financial Support The authors declare that they did not receive financial support from agencies in the public, private, or non-profit sectors to conduct the present study.
Abstract
Objective To analyze the reproducibility and intra- and interobserver agreement of the IDEAL classification for distal radius fractures.
Methods This qualitative, analytical study evaluated 50 pairs of two-view radiographs from patients with distal radius fractures. Ten observers with different levels of orthopedic training assessed the radiographs in three separate evaluation rounds. We applied the Cohen and Fleiss Kappa tests to the results to determine intra- and interobserver agreement levels. Statistical calculations used Excel and SPSS, version 26.0.
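The abstract reports that the Kappa statistics were calculated in Excel and SPSS; as an illustration only, the same two agreement measures can be reproduced with open-source tools. The following is a minimal sketch, assuming hypothetical IDEAL-group ratings (the real data are not shown here), that computes Cohen's Kappa for one observer's repeated readings (intraobserver) and Fleiss' Kappa for a group of observers (interobserver) using scikit-learn and statsmodels.

```python
# Minimal sketch (not the authors' SPSS workflow): Cohen's and Fleiss' Kappa
# computed on hypothetical IDEAL classification ratings for ten radiographs.
import numpy as np
from sklearn.metrics import cohen_kappa_score
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# Intraobserver agreement: one observer classifies the same 10 radiographs twice.
# Categories 1, 2, 3 stand in for IDEAL groups (illustrative values only).
first_reading = [1, 2, 2, 3, 1, 2, 3, 3, 1, 2]
second_reading = [1, 2, 3, 3, 1, 1, 3, 3, 2, 2]
print("Cohen Kappa (intraobserver):",
      round(cohen_kappa_score(first_reading, second_reading), 3))

# Interobserver agreement: 4 observers each classify the same 10 radiographs once.
# Rows = radiographs, columns = observers (again, made-up ratings).
ratings = np.array([
    [1, 1, 2, 1],
    [2, 2, 2, 3],
    [3, 3, 3, 3],
    [1, 2, 1, 1],
    [2, 2, 3, 2],
    [3, 3, 2, 3],
    [1, 1, 1, 2],
    [2, 3, 3, 3],
    [1, 1, 1, 1],
    [2, 2, 2, 2],
])
# aggregate_raters converts subject-by-rater data into subject-by-category counts,
# the input format expected by fleiss_kappa.
table, _ = aggregate_raters(ratings)
print("Fleiss Kappa (interobserver):",
      round(fleiss_kappa(table, method="fleiss"), 3))
```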
Results The Cohen Kappa index values for intraobserver evaluation indicated poor to little agreement (-0.177 to 0.259), with statistical significance in only one instance. The Fleiss Kappa index values revealed little agreement among the resident group (0.277–0.383) with statistical significance, poor to little agreement among the general orthopedists (0.114–0.225) with statistical significance in most instances, and moderate agreement among hand surgeons (0.449–0.533) with statistical significance.
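The abstract does not state the exact cut-off values used to label agreement as poor, little, or moderate. The sketch below only illustrates how Kappa values are commonly banded into qualitative labels; the thresholds follow the widely used Landis and Koch convention and are an assumption for illustration, not the authors' scheme.

```python
# Hypothetical banding of a Kappa value into a qualitative agreement label.
# Thresholds follow the Landis & Koch convention, assumed for illustration only.
def agreement_label(kappa: float) -> str:
    if kappa < 0.0:
        return "poor"
    if kappa <= 0.20:
        return "slight"
    if kappa <= 0.40:
        return "fair"
    if kappa <= 0.60:
        return "moderate"
    if kappa <= 0.80:
        return "substantial"
    return "almost perfect"

# Boundary values taken from the ranges reported in the Results.
for k in (-0.177, 0.259, 0.383, 0.533):
    print(k, "->", agreement_label(k))
```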
Conclusion The IDEAL classification had interobserver agreement levels ranging from poor to moderate, influenced by the physicians' level of training. The intraobserver agreement levels ranged from poor to little, largely without statistical significance.
Publication History
Received: 22 February 2024
Accepted: 05 September 2024
Article published online:
21 December 2024
© 2024. The Author(s). This is an open access article published by Thieme under the terms of the Creative Commons Attribution 4.0 International License, permitting copying and reproduction so long as the original work is given appropriate credit (https://creativecommons.org/licenses/by/4.0/)
Thieme Revinter Publicações Ltda.
Rua do Matoso 170, Rio de Janeiro, RJ, CEP 20270-135, Brazil