CC BY-NC-ND 4.0 · Endosc Int Open 2024; 12(12): E1465-E1475
DOI: 10.1055/a-2465-7283
Review

Validity evidence for endoscopic ultrasound competency assessment tools: Systematic review

Authors: Harneet Hothi (2), Rishad Khan (3), Nikko Gimpaya (4), Brian P.H. Chan (1, 3, 5), Nauzer Forbes (6), Paul James (1, 3, 7), Jeffrey Mosko (1, 3, 8, 9), Elaine T. Yeung (1, 3, 5)

Author Affiliations

1   Department of Medicine, University of Toronto, Toronto, Canada
2   Temerty Faculty of Medicine, University of Toronto, Toronto, Canada
3   Division of Gastroenterology and Hepatology, University of Toronto, Toronto, Canada
4   Scarborough Health Network Research Institute, Scarborough Health Network, Scarborough, Canada
5   Division of Gastroenterology, Scarborough Health Network, Scarborough, Canada
6   Division of Gastroenterology and Hepatology, University of Calgary, Calgary, Canada
7   Division of Gastroenterology, University Health Network, Toronto, Canada
8   Division of Gastroenterology, St Michael's Hospital, Toronto, Canada
9   Li Ka Shing Knowledge Institute, Unity Health Toronto, Toronto, Canada
10   Division of Gastroenterology, Hepatology, and Nutrition, and the Research and Learning Institutes, The Hospital for Sick Children, Toronto, Canada
11   Department of Pediatrics and the Wilson Centre, University of Toronto Temerty Faculty of Medicine, Toronto, Canada

Abstract

Background and study aims Competent endoscopic ultrasound (EUS) performance requires a combination of technical, cognitive, and non-technical skills. Direct observation assessment tools can be employed to enhance learning and ascertain clinical competence; however, there is a need to systematically evaluate validity evidence supporting their use. We aimed to evaluate the validity evidence of competency assessment tools for EUS and examine their educational utility.

Methods We systematically searched five databases and gray literature for studies investigating EUS competency assessment tools from inception to May 2023. Data on validity evidence across five domains (content, response process, internal structure, relations to other variables, and consequences) were extracted and graded (maximum score 15). We evaluated educational utility using the Accreditation Council for Graduate Medical Education framework and methodological quality using the Medical Education Research Quality Instrument (MERSQI).

Results From 2081 records, we identified five EUS assessment tools from 10 studies. All tools are formative assessments intended to guide learning, with four employed in clinical settings. Validity evidence scores ranged from 3 to 12. The EUS and ERCP Skills Assessment Tool (TEESAT), Global Assessment of Performance and Skills in EUS (GAPS-EUS), and the EUS Assessment Tool (EUSAT) had the strongest validity evidence with scores of 12, 10, and 10, respectively. Overall educational utility was high given ease of tool use. MERSQI scores ranged from 9.5 to 12 (maximum score 13.5).

Conclusions The TEESAT, GAPS-EUS, and EUSAT demonstrate strong validity evidence for formative assessment of EUS and are easily implemented in educational settings to monitor progress and support learning.

Publication History

Received: 19 October 2024

Accepted: 05 November 2024

Accepted Manuscript online: 11 November 2024

Article published online: 17 December 2024

© 2024. The Author(s). This is an open access article published by Thieme under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License (CC BY-NC-ND 4.0), permitting copying and reproduction so long as the original work is given appropriate credit. Contents may not be used for commercial purposes, or adapted, remixed, transformed, or built upon. (https://creativecommons.org/licenses/by-nc-nd/4.0/).

Georg Thieme Verlag KG
Oswald-Hesse-Straße 50, 70469 Stuttgart, Germany
