DOI: 10.1055/a-0576-6667
Changes in scoring of Direct Observation of Procedural Skills (DOPS) forms and the impact on competence assessment
Publication History
Submitted 6 November 2017
Accepted after revision 23 January 2018
Publication date:
3 April 2018 (online)
Abstract
Background Direct Observation of Procedural Skills (DOPS) is an established competence assessment tool in endoscopy. In July 2016, the DOPS scoring format changed from a performance-based scale to a supervision-based scale. We aimed to evaluate the impact of changes to the DOPS scale format on the distribution of scores in novice trainees and on competence assessment.
Methods We performed a prospective, multicenter (n = 276), observational study of formative DOPS assessments in endoscopy trainees with ≤ 100 lifetime procedures. DOPS were submitted in the 6 months before July 2016 (old scale) and the 6 months after (new scale) for gastroscopy (n = 2998), sigmoidoscopy (n = 1310), colonoscopy (n = 3280), and polypectomy (n = 631). Scores for old and new DOPS were aligned to a 4-point scale and compared.
Results 8219 DOPS (43 % new and 57 % old) submitted for 1300 trainees were analyzed. Compared with old DOPS, the use of the new DOPS was associated with greater utilization of the lowest score (2.4 % vs. 0.9 %; P < 0.001), a broader range of scores, and a reduction in competent scores (60.8 % vs. 86.9 %; P < 0.001). The reduction in competent scores was evident on subgroup analysis across all procedure types (P < 0.001) and for each quartile of endoscopy experience. The new DOPS was superior in characterizing the endoscopy learning curve by demonstrating progression of competent scores across quartiles of procedural experience.
Conclusions In two cohorts of trainees matched for experience, endoscopy assessors applied a greater range of scores when using the new, supervision-based DOPS scale. Our study provides construct validity evidence in support of the new scale format.
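The headline comparison in the Results (60.8 % vs. 86.9 % competent scores; P < 0.001) can be illustrated with a chi-square test on a 2 × 2 table of competent vs. non-competent assessments per cohort. The sketch below is not the study's analysis: the counts are approximations back-calculated from the reported totals and percentages (8219 DOPS, 43 % new, 57 % old), and the use of scipy.stats.chi2_contingency is an assumption, as the abstract does not state which test or software was used.

```python
# Illustrative sketch only: comparing the proportion of "competent" DOPS scores
# between the new- and old-scale cohorts with a chi-square test.
# Counts are approximations derived from the abstract's percentages, not raw data.
from scipy.stats import chi2_contingency

total_new = round(8219 * 0.43)   # ~3534 DOPS scored on the new scale
total_old = round(8219 * 0.57)   # ~4685 DOPS scored on the old scale

competent_new = round(total_new * 0.608)  # 60.8 % competent (new scale)
competent_old = round(total_old * 0.869)  # 86.9 % competent (old scale)

# 2 x 2 contingency table: rows = scale cohort, columns = competent / not competent
table = [
    [competent_new, total_new - competent_new],
    [competent_old, total_old - competent_old],
]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.3g}")
```

With these approximate counts the test returns a vanishingly small p-value, consistent with the P < 0.001 reported for the reduction in competent scores.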