Endoscopy 2023; 55(09): 847-856
DOI: 10.1055/a-2041-7546
Systematic Review

Validity evidence for observational ERCP competency assessment tools: a systematic review

 1   Division of Gastroenterology and Hepatology, Department of Medicine, University of Toronto, Toronto, Canada
,
 2   Schulich School of Medicine and Dentistry, Western University, London, Canada
,
Nikko Gimpaya
 3   Division of Gastroenterology, St. Michael’s Hospital, Toronto, Canada
,
James Lisondra
 3   Division of Gastroenterology, St. Michael’s Hospital, Toronto, Canada
,
Nasruddin Sabrie
 4   Department of Medicine, University of Toronto, Toronto, Canada
,
Reza Gholami
 1   Division of Gastroenterology and Hepatology, Department of Medicine, University of Toronto, Toronto, Canada
 3   Division of Gastroenterology, St. Michael’s Hospital, Toronto, Canada
,
Rishi Bansal
 5   Michael G. DeGroote School of Medicine, McMaster University, Hamilton, Canada
,
 6   Department of Medicine, Queen’s University, Kingston, Canada
,
David Lightfoot
 7   Health Science Library, Unity Health Toronto, St. Michael’s Hospital, Toronto, Canada
,
Paul D. James
 8   Division of Gastroenterology, University Health Network, Toronto, Canada
,
 9   Joint Advisory Group on Gastrointestinal Endoscopy, Royal College of Physicians, London, United Kingdom
10   Immunology and Immunotherapy, University of Birmingham College of Medical and Dental Sciences, Birmingham, United Kingdom
,
11   Department of Medicine, Division of Gastroenterology and Hepatology, University of Calgary, Calgary, Alberta, Canada
12   Department of Community Health Sciences, University of Calgary, Calgary, Alberta, Canada
,
Sachin Wani
13   Division of Gastroenterology and Hepatology, University of Colorado Anschutz Medical Campus, Aurora, Colorado, USA
,
Rajesh N. Keswani
14   Division of Gastroenterology, Department of Medicine, Northwestern University, Chicago, Illinois, United States
,
Catharine M. Walsh
15   The Wilson Centre, University of Toronto, Toronto, Canada
16   SickKids Research and Learning Institute, The Hospital for Sick Children, Toronto, Canada
17   Division of Gastroenterology, Hepatology, and Nutrition, The Hospital for Sick Children, Toronto, Canada
18   Department of Paediatrics, Temerty Faculty of Medicine, University of Toronto, Toronto, Canada
,
Samir C. Grover
 1   Division of Gastroenterology and Hepatology, Department of Medicine, University of Toronto, Toronto, Canada
 3   Division of Gastroenterology, St. Michael’s Hospital, Toronto, Canada
 4   Department of Medicine, University of Toronto, Toronto, Canada
19   Li Ka Shing Knowledge Institute, Toronto, Canada


Abstract

Background Assessment of competence in endoscopic retrograde cholangiopancreatography (ERCP) is critical for supporting learning and documenting attainment of skill. Validity evidence supporting ERCP observational assessment tools has not been systematically evaluated.

Methods We conducted a systematic search of electronic databases, supplemented by hand-searching, from inception to August 2021 for studies evaluating tools for observational assessment of ERCP performance. We used a unified validity framework to characterize validity evidence from five sources: content, response process, internal structure, relations to other variables, and consequences. Each source was assigned a score of 0–3 (maximum total score 15). We assessed educational utility using the Accreditation Council for Graduate Medical Education framework and methodological quality using the Medical Education Research Study Quality Instrument.

Results From 2769 records, we included 17 studies evaluating 7 assessment tools. Five tools were studied for clinical ERCP, one for simulated ERCP, and one for simulated and clinical ERCP. Validity evidence scores ranged from 2 to 12. The Bethesda ERCP Skills Assessment Tool (BESAT), ERCP Direct Observation of Procedural Skills Tool (ERCP DOPS), and The Endoscopic Ultrasound (EUS) and ERCP Skills Assessment Tool (TEESAT) had the strongest validity evidence, with scores of 10, 12, and 11, respectively. Regarding educational utility, most tools were easy to use and interpret, and required minimal additional resources. Overall methodological quality (maximum score 13.5) was strong, with scores ranging from 10 to 12.5.

Conclusions The BESAT, ERCP DOPS, and TEESAT had stronger validity evidence than the other assessment tools identified. Integrating these tools into training may help drive learners' development and support competency decision making.

These two senior authors contributed equally to this work.


Supplementary material: Tables 1s–4s, Fig. 1s



Publication History

Received: 31 August 2022

Accepted after revision: 23 February 2023

Accepted Manuscript online: 23 February 2023

Article published online: 18 April 2023

© 2023. Thieme. All rights reserved.

Georg Thieme Verlag KG
Rüdigerstraße 14, 70469 Stuttgart, Germany

 