DOI: 10.1055/s-0042-1744388
Design, Usability, and Acceptability of a Needs-Based, Automated Dashboard to Provide Individualized Patient-Care Data to Pediatric Residents
Funding: None.

Abstract
Background and Objectives Pediatric residency programs are required by the Accreditation Council for Graduate Medical Education to provide residents with patient-care and quality metrics so that residents can identify knowledge gaps and prioritize improvement efforts. Trainees are interested in receiving these data, but this need remains largely unmet. Our objectives were to (1) design and implement an automated dashboard providing individualized data to residents and (2) examine the usability and acceptability of the dashboard among pediatric residents.
Methods We developed a dashboard containing individualized patient-care data for pediatric residents, with emphasis on needs identified by residents and residency leadership. To build the dashboard, we created a connection from a clinical data warehouse to data visualization software. We allocated patients to residents based on note authorship and created individualized reports with masked identities to preserve anonymity. After development, we conducted usability and acceptability testing with 11 resident users, following a mixed-methods approach. Interviews and anonymous surveys evaluated the technical features of the application, its ease of use, and users' attitudes toward using the dashboard. Categories and subcategories from the usability interviews were identified using a content analysis approach.
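A minimal sketch of the note-authorship attribution and identity-masking steps is shown below. This is an illustration only, not the published implementation (the authors connected a clinical data warehouse to data visualization software); the table layout, column names, and salt value are all hypothetical.

```python
import hashlib

import pandas as pd

# Hypothetical warehouse extract: one row per signed inpatient note.
notes = pd.DataFrame({
    "encounter_id": [101, 101, 102, 103],
    "author_id": ["res_a", "res_b", "res_a", "res_c"],
    "author_role": ["resident", "attending", "resident", "resident"],
})

def mask_identity(author_id: str, salt: str = "demo-salt") -> str:
    """Map a resident identifier to a stable pseudonym so individualized
    reports can be distributed without revealing who is who."""
    digest = hashlib.sha256((salt + author_id).encode()).hexdigest()
    return f"Resident-{digest[:6]}"

# Attribute each encounter to the resident(s) who authored notes on it,
# then replace real identifiers with masked pseudonyms.
attributed = (
    notes[notes["author_role"] == "resident"]
    .drop_duplicates(subset=["encounter_id", "author_id"])
    .assign(masked_id=lambda df: df["author_id"].map(mask_identity))
)
print(attributed[["encounter_id", "masked_id"]])
```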
Results Our dashboard provides individualized metrics including diagnosis exposure counts, procedure counts, efficiency metrics, and quality metrics. In content analysis of the usability testing interviews, the most frequently mentioned use of the dashboard was to aid a resident's self-directed learning. Residents had few concerns about the dashboard overall. Surveyed residents found the dashboard easy to use and expressed intention to use the dashboard in the future.
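To make the metric derivation concrete, the sketch below (continuing the hypothetical tables above) counts diagnosis exposures per masked resident. The diagnosis table and codes are illustrative assumptions, not the study's data model.

```python
# Hypothetical diagnosis table keyed by encounter (ICD-10-style codes).
diagnoses = pd.DataFrame({
    "encounter_id": [101, 102, 103],
    "dx_code": ["J45.909", "J21.9", "J45.909"],  # asthma, bronchiolitis, asthma
})

# Join attributed encounters to their diagnoses and count each
# resident's exposures per diagnosis code.
exposure_counts = (
    attributed.merge(diagnoses, on="encounter_id")
    .groupby(["masked_id", "dx_code"])
    .size()
    .reset_index(name="n_encounters")
)
print(exposure_counts)
```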
Conclusion Automated dashboards may be a solution to the current challenge of providing trainees with individualized patient-care data. Our usability testing revealed that residents found our dashboard to be useful and that they intended to use this tool to facilitate development of self-directed learning plans.
Keywords
data visualization - interface and usability - dashboard - testing and evaluation - graduate medical education - quality improvement

Protection of Human and Animal Subjects
Our institutional review board reviewed and approved this study.
Publication History
Received: 03 October 2021
Accepted: 05 February 2022
Article published online: 16 March 2022
© 2022. Thieme. All rights reserved.
Georg Thieme Verlag KG
Rüdigerstraße 14, 70469 Stuttgart, Germany