DOI: 10.1055/s-0041-107976
Collecting Validity Evidence for the Assessment of Mastery Learning in Simulation-Based Ultrasound Training
Publication History
27 March 2015
15 June 2015
Publication Date:
01 March 2016 (online)
Abstract
Purpose: To collect validity evidence for the assessment of mastery learning on a virtual reality transabdominal ultrasound simulator.
Materials and Methods: We assessed validity evidence using Messick’s framework for validity. The study included 20 novices and 9 ultrasound experts, all of whom completed 10 obstetric training modules on a transabdominal ultrasound simulator that provided automated measures of performance for each completed module (i.e., simulator metrics). Differences in the performance of the two groups were used to identify simulator metrics with validity evidence for the assessment of mastery learning (see the metric-selection sketch after the abstract). The novices continued to practice until they had attained the mastery learning level.
Results: One-third of the simulator metrics discriminated between the two groups. The median simulator scores, out of a maximum of 40 metrics, were 17.5 percent (range 0 – 45.0 percent) for novices and 90.0 percent (range 85.0 – 97.5 percent) for experts, p < 0.001. Internal consistency was high, with a Cronbach’s alpha of 0.98. Test-retest reliability yielded an intraclass correlation coefficient (ICC) of 0.62 for novices who reached the mastery learning level twice (see the reliability sketch after the abstract). Novices reached the mastery learning level within a median of 4 attempts (range 3 – 8), corresponding to a median of 252 minutes of simulator training (range 211 – 394 minutes).
Conclusion: This study demonstrated validity evidence for the assessment of mastery learning in simulation-based ultrasound training and showed that ultrasound novices can attain mastery learning levels with less than 5 hours of training. Only one-third of the standard simulator metrics discriminated between different levels of competence.
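The abstract does not state how the discriminating metrics were identified. As a rough illustration only, the sketch below assumes binary pass/fail outcomes per metric and flags a metric as discriminating when its pass rate differs significantly between novices and experts, here via Fisher’s exact test; the group sizes match the study, but the pass counts and the significance threshold are hypothetical.

```python
# Rough illustration (assumption, not the authors' procedure): flag a simulator
# metric as "discriminating" if the proportion of examinees passing it differs
# between novices and experts, tested with Fisher's exact test on a 2x2 table.
from scipy.stats import fisher_exact

def discriminates(novice_pass, novice_total, expert_pass, expert_total, alpha=0.05):
    """True if the metric's pass rate differs significantly between the groups."""
    table = [
        [novice_pass, novice_total - novice_pass],
        [expert_pass, expert_total - expert_pass],
    ]
    _, p_value = fisher_exact(table)
    return p_value < alpha

# Hypothetical metric: passed by 3 of 20 novices and 9 of 9 experts.
print(discriminates(novice_pass=3, novice_total=20, expert_pass=9, expert_total=9))
```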
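The reliability and comparison figures reported in the results (Cronbach’s alpha, test-retest ICC, p value) can be computed from a score matrix. The sketch below uses standard textbook formulas and hypothetical example data; the choice of a one-way random-effects ICC(1,1) and a Mann-Whitney U test is an assumption made for illustration, not necessarily the authors’ analysis.

```python
# Rough illustration with textbook formulas (not the authors' exact computation):
# percent score over up to 40 binary metrics, Cronbach's alpha for internal
# consistency, a one-way ICC(1,1) for test-retest reliability, and a
# Mann-Whitney U test comparing novices with experts. Example data are hypothetical.
import numpy as np
from scipy.stats import mannwhitneyu

def percent_score(metrics_passed, total_metrics=40):
    """Share of the binary simulator metrics passed, in percent."""
    return 100.0 * metrics_passed / total_metrics

def cronbach_alpha(items):
    """Cronbach's alpha for an (examinees x metrics) matrix of 0/1 outcomes."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_variances / total_variance)

def icc_oneway(scores):
    """One-way random-effects ICC(1,1) for an (examinees x repetitions) matrix,
    e.g. two mastery-level scores per novice who reached the level twice."""
    scores = np.asarray(scores, dtype=float)
    n, k = scores.shape
    grand_mean = scores.mean()
    ms_between = k * ((scores.mean(axis=1) - grand_mean) ** 2).sum() / (n - 1)
    ms_within = ((scores - scores.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Hypothetical percent scores for a handful of novices and experts.
novice_scores = np.array([17.5, 10.0, 0.0, 45.0, 22.5])
expert_scores = np.array([90.0, 85.0, 97.5, 92.5])
u_statistic, p_value = mannwhitneyu(novice_scores, expert_scores, alternative="two-sided")
```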
Summary
Aim: To collect validity evidence for the assessment of mastery learning on a transabdominal virtual reality ultrasound simulator.
Materials and Methods: We assessed validity evidence using Messick’s framework of validity. The study included 20 novices and 9 ultrasound experts, all of whom completed 10 obstetric training modules on a transabdominal ultrasound simulator that provided automated measures of performance (simulator metrics) for each completed module. Differences in the performance of the two groups were used to identify simulator metrics with validity evidence for the assessment of mastery learning. The novices continued practicing until they reached the mastery learning level.
Results: One-third of the simulator metrics discriminated between the two groups. The median simulator scores, out of a maximum of 40 metrics, were 17.5 percent (range 0 – 45.0 percent) for novices and 90.0 percent (range 85.0 – 97.5 percent) for experts (p < 0.001). Internal consistency was high, with a Cronbach’s alpha of 0.98. Test-retest reliability yielded an intraclass correlation coefficient (ICC) of 0.62 for novices who reached the mastery learning level twice. The novices reached the mastery learning level within a median of 4 attempts (range 3 – 8), corresponding to a median of 252 minutes of simulator training (range 211 – 394 minutes).
Conclusion: This study demonstrated validity evidence for the assessment of mastery learning in simulation-based ultrasound training and showed that ultrasound novices can reach mastery learning levels with less than 5 hours of training. Only one-third of the standard simulator metrics discriminated between different levels of competence.