DOI: 10.1055/a-2150-8523
Governance of Electronic Health Record Modification at U.S. Academic Medical Centers
- Abstract
- Background and Significance
- Objectives
- Methods
- Results
- Discussion
- Conclusion
- Clinical Relevance Statement
- Multiple-Choice Questions
- References
Abstract
Objectives A key aspect of electronic health record (EHR) governance involves the approach to EHR modification. We report a descriptive study to characterize EHR governance at academic medical centers (AMCs) across the United States.
Methods We conducted interviews with the Chief Medical Information Officers of 18 AMCs about the process of EHR modification for standard requests. Recordings of the interviews were analyzed to identify categories within prespecified domains. Responses were then assigned to categories for each domain.
Results At our AMCs, EHR requests were governed variably, with a similar number of sites using quantitative scoring systems (7, 38.9%), qualitative systems (5, 27.8%), or no scoring system (6, 33.3%). Two (11%) organizations formally review all requests for their impact on health equity. Although 14 (78%) organizations have trained physician builders/architects, their primary role was not EHR build. The most commonly reported governance challenges included request volume (11, 61%), integrating diverse clinician input (3, 17%), and stakeholder buy-in (3, 17%). The slowest step in the process was clarifying end user requests (14, 78%). Few leaders had identified metrics for the success of EHR governance.
Conclusion Governance approaches for managing EHR modification at AMCs are highly variable, which suggests ongoing efforts to balance EHR standardization and maintenance burden, while dealing with a high volume of requests. Developing metrics to capture the performance of governance and quantify problems may be a key step in identifying best practices.
Background and Significance
Governance of information technology (IT) systems is the process by which an enterprise steers effective and efficient use of IT systems to achieve institutional goals.[1] [2] As the U.S. health care system digitized, the first wave of health IT governance efforts and associated research focused on electronic health record (EHR) implementation and establishing oversight for enterprise-wide information systems project management including the funding and approval of new integrated hardware or software systems.[3] As EHRs have matured, an emergent, pressing need in governance is the approach to managing requests for modifying EHR systems to meet patient and provider needs as well as overall enterprise goals. Given the complex and interconnected nature of enterprise EHRs, managing modification requests requires balancing the differing priorities of frontline users, analysts and IT teams, leaders at various levels within the organization, and external stakeholders, in the context of finite EHR resources. These priorities are shaped by diverse organizational goals including patient safety, quality of care, regulatory compliance, and revenue capture, and constrained by practical limitations due to shortages in the nursing/provider informaticist workforce.
While there is a robust literature examining health IT governance as it relates to implementation,[4] [5] few studies focus on the major current challenge of managing the deluge of requests for modifying complex EHRs.[2] [3] [6] [7] [8] [9] [10] Although organizations like Healthcare Information and Management Systems Society may individually collect data about health record governance for their Electronic Medical Record Adoption Model staging certifications, these data are not publicly available for review or comparison.[11] Existing work describes single-institution approaches or panel reflections on the EHR modification process; however, this work does not systematically characterize and compare approaches.[12] [13] [14] [15] Indeed, we are not aware of any key performance index (KPI) or other metrics to compare strategies, which may be key to understanding best practices, and, ultimately, streamlining the process of improving patient care through the EHR.
Objectives
We sought to take the first step toward filling these gaps by characterizing the EHR modification process at 18 academic medical centers (AMCs) across the country as well as leaders' impressions of challenges and opportunities in IT governance at their sites.
Methods
Setting and Participants
We selected AMCs because of their relative similarities with regard to size, setting, institutional mission, health IT maturity, and anticipated complexity of EHR governance needs. Other settings like community hospital systems or county hospital systems were not included because they may have a different scope of priorities and scale of resources. AMCs were defined as health centers with affiliated residency training programs. We further restricted our cohort to 25 AMCs with Clinical Informatics fellowship programs for two reasons: first, AMCs with an informatics fellowship are more likely to have a more mature EHR system and IT resources; and second, clinical informatics fellows at these programs could facilitate connections with Chief Medical Information Officers (CMIOs) who might otherwise be inaccessible. We excluded county health systems, private for-profit health systems, community hospitals, and Veterans Affairs medical centers.
We targeted CMIOs for interviews because this role would have the greatest overall insight into the EHR modification process. To mitigate the possibility that CMIOs might be unfamiliar with the minutiae of governance, we instructed CMIOs to delegate the interview to another staff member with more knowledge of the subject if appropriate, provided CMIOs the interview questions in advance so they could prepare any specific details, and allowed CMIOs to gather necessary information and follow up via email after the interview if there were any details they felt unequipped to answer in the moment. Interview outreach was conducted by email, either directly or via hand-off through contacts within the American Medical Informatics Association of Clinical Informatics Fellows.
Protocol
Our interview protocol was developed in collaboration with experts in informatics, grouped into themes identified by individuals with experience in EHR governance and pretested with two physician–informaticists who were not connected to the project to ensure content validity. Interview questions were organized in the order in which requests might be handled. Each interview began with a standard scenario to focus the scope of the study:
“A physician at your institution has noticed that most other providers in their practice have not been ordering standard surveillance labs for patients admitted for inpatient administration of a particular chemotherapy. They would like to create a new orderset that groups and pre-selects these labs in addition to the drug.”
We then asked structured and open-ended questions organized into the following areas as they pertain to the governance process impacting their primary/core AMC: (1) the process for requesting a change to the EHR, (2) prioritization and evaluation of each request, (3) building the request and communication with the requester on the status of the build, and (4) postbuild monitoring. We also asked each respondent to provide a summary assessment of the strengths and weaknesses of the organization's EHR modification process, along with supporting documents like charters and organizational charts. Given the growing attention to socioeconomic disparities in health care as well as a recognition of the role that the EHR can play in this area, health equity considerations were specifically evaluated to better understand the ways organizations may or may not consider equity during the EHR modification process.[16] [17]
Together, 16 dimensions of EHR governance were addressed. Our full protocol is included in [Supplemental Table S1] (available in online version). Of note, not all interview questions correlated directly to a single dimension of governance as some questions were open-ended ([Supplemental Table S1], available in online version, question 21), some questions were aimed at soliciting supplementary documentation ([Supplemental Table S1], available in online version, questions 19–20), and some questions were ultimately not included in the final analysis because of an inability to standardize responses across organizations ([Supplemental Table S1], available in online version, questions 12–14).
Interviews were conducted and recorded by three separate interviewers via videoconferences between February 2022 and July 2022. Recordings were transcribed and analyzed to identify specific categories of responses within each predefined interview dimension.
Identification of categories of responses included a three-step process to ensure consistency of interpretation. First, the primary interviewer summarized each dimension into one or more categories based on the interview transcript, followed by a second review of the transcript by a different interviewer (not on the given interview) to flag any disagreements. Then, any discrepancies between the first two individuals were reconciled after a review of the original video recording and consensus from all three study interviewers. We selected quotes for each dimension that illustrated the different categories, which we integrated into our results reporting. Finally, after all potential categories of responses for a dimension of governance were identified, each interview transcript was reviewed to assign one or more categories per dimension.
For some dimensions (“Method of Intake,” “Type of Scoring System,” “Health Equity Consideration,” etc.), categories of responses were mutually exclusive, and therefore, each AMC could fit into a single category; for other dimensions (“Members of Governance Team,” “Elements of Scoring System,” “Challenges to Governance,” etc.), each AMC could be assigned to multiple categories.
For example, for the dimension Type of Scoring System ([Table 1]), the three-step review process resulted in a categorization of how each AMC approached the governance task: (1) a quantitative scoring system, (2) a qualitative scoring system, or (3) no scoring system. We then assigned each AMC to the category that best represented the description of their approach.
Table 1

| Section of interview | Dimension of governance | Categories of response | No. of AMCs, n (%) |
|---|---|---|---|
| Request intake | Method of intake | Online portal | 15 (83.3) |
| | | – ServiceNow | 11 (73.3)[a] |
| | | Direct communication (email/phone alone) | 3 (16.7) |
| Request evaluation | Type of scoring system | Quantitative | 7 (38.9) |
| | | Qualitative | 5 (27.8) |
| | | No scoring system | 6 (33.3) |
| | Previously had scoring system | Yes | 5 (45)[b] |
| | Health equity consideration | Formally considered | 2 (11.1) |
| | | Informally considered | 9 (50) |
| | | Not considered | 7 (38.9) |
| | Top 3 members of governance team | Physician informaticists | 18 (100) |
| | | Nursing informaticists | 17 (94) |
| | | IT staff | 17 (94) |
| | Top 3 elements of scoring system | Implementation time | 9 (75)[c] |
| | | Patient safety | 8 (66.7)[c] |
| | | Scale of providers, hospitals, areas impacted | 8 (66.7)[c] |
| Build and build communication | Method of communication with requesters | Online portal | 14 (77.8) |
| | | – ServiceNow | 12 (85.7)[a] |
| | | Email/phone alone | 3 (16.7) |
| | | No standard communication | 1 (5.6) |
| | Use of SLAs | Break–fix response time | 7 (38.9) |
| | | Request to build time | 5 (27.8) |
| | | None | 8 (44.4) |
| | Trained physician builders/architects | Yes | 14 (77.8) |
| | Regularly use physician builders/architects for builds | Yes | 0 (0) |
| Monitoring and feedback | Monitoring builds | Standard monitoring | 11 (61.1) |
| | | Conditional monitoring | 5 (27.8) |
| | | Regular review of ordersets | 5 (27.8) |
| | | No monitoring | 3 (16.7) |
| | Feedback on builds | General feedback | 11 (61.1) |
| | | – In-process feedback | 2 (11.1) |
| | | – Surveys | 8 (44.4) |
| | | – EHR demonstration "road shows" | 1 (5.6) |
| | | Specific feedback | 10 (55.6) |
| | | – In-process feedback | 8 (44.4) |
| | | – Individual/group solicitation | 2 (11.1) |
| | | No channels for feedback | 2 (11.1) |
| Summary | Top 3 challenges to governance | Supply/demand | 11 (61.1) |
| | | – Not enough staff | 4 (36.3)[d] |
| | | – Excess of requests | 6 (54.5)[d] |
| | | – Unspecified | 1 (9.1)[d] |
| | | Diverse clinician representation/input | 3 (16.7) |
| | | Stakeholder buy-in to governance process | 3 (16.7) |
| | Top 3 strengths | Experience/institutional memory | 8 (44.4) |
| | | Relationships with SMEs/users | 4 (22.2) |
| | | Relationships with leadership and IT | 3 (16.7) |
| | Top 3 rate-limiting steps | Clarifying requests | 14 (77.8) |
| | | Negotiation between stakeholders | 3 (16.7) |
| | | Governance process | 2 (11.1) |
| | Measures of the success of governance | Outcome metrics | 11 (61.1) |
| | | Process metrics | 13 (72.2) |
| | | Not measured | 4 (30.8) |
Abbreviations: AMC, academic medical center; EHR, electronic health record; IT, information technology; SLA, service level agreement.
a Percentage is out of the AMCs that use an online portal.
b Percentage is out of the AMCs that use a qualitative system or no scoring system.
c Percentage is out of the AMCs that use a qualitative or quantitative scoring system.
d Percentage is out of the AMCs that reported supply and demand as a challenge to governance.
We used data from the American Hospital Association Data & Insights survey and IT supplement survey to characterize participating hospitals.[18] All study procedures were approved by the University of California, San Francisco (UCSF) Institutional Review Board. Qualitative data analysis and summarization were conducted using ATLAS.ti version 22 and R version 4.1.2.
Results
Of the 25 CMIOs invited, 18 (72%) responded and completed an interview ([Table 2]). The majority (n = 14, 78%) of AMCs were classified as Not-for-Profit and used an EHR from Epic Systems (n = 13, 72%). Geographic representation included AMCs from the West, Midwest, Southwest, Southeast, and Northeast of the United States. The median annual hospital admissions and outpatient visits were 33,100 (interquartile range [IQR]: 27,900–43,400) and 973,100 (IQR: 534,800–1,306,700), respectively. Interview results and sample illustrative quotes are summarized by section in [Tables 1] and [3], respectively.
Request Intake
Requests were most often made through an online portal (n = 15, 83.3%), although some sites relied on direct communication with IT or an informaticist via email or phone. Of the sites that used an online portal, the majority used ServiceNow (Santa Clara, California, United States; n = 11, 73%). One unique method of request intake was to work with users to reframe requests into a user story: "We'll talk to them and then redo their request with them as a user story, which basically says which type of user is it as a certain type of user, and then what they want, and then what the reason is, what value they plan to get from it."
Request Evaluation
Physician informaticists (n = 18, 100%), nursing informaticists (n = 17, 94%), and IT staff (n = 17, 94%) were the most commonly reported members of the teams evaluating requests. Twelve (67%) reported using standard scoring systems to evaluate their requests. Scoring systems were classified as quantitative (n = 7, 38.9%) if they produced a numerical score and qualitative (n = 5, 27.8%) if they had clear criteria but did not require a numerical score. The most commonly reported categories considered in scoring systems were implementation time (n = 9, 75%), patient safety (n = 8, 66.7%), and the scale of providers/hospitals/areas impacted (n = 8, 66.7%). Of the 11 CMIOs that reported having a qualitative system or no scoring system, 45% (n = 5) further stated that they previously used a numerical scoring system that was ultimately abandoned. One unique scoring system incorporated a two-tiered methodology: "We assign a designation of A (Safety), B (Regulatory), C (Revenue), or O (Other) for the first part of our priority methodology…and then [for] the second part we do an impact score…[which] allows an item to gather up points from all areas [including direct patient care, compliance, safety, financial impact, efficiency, patient satisfaction], even if it's not receiving a primary designation [in that area]."
Two sites (11.1%) reported standards for reviewing each request's health equity impact; nine used informal methods to improve equity; the remaining sites had no established mechanism in place to evaluate health equity. One CMIO whose site did not have a standard process in place mentioned: “I'd be hard pressed to think about ways in which health equity might be either positively or negatively affected by…the workflow. So in many cases, this is not relevant to the work….”
Build and Build Communication
Most CMIOs (n = 14, 77.8%) reported having trained physician builders/architects on staff, but none used them frequently (>5% of builds) for building EHR modifications. One CMIO reported that even if not carrying out EHR build requests, "Physician builders are very helpful to be advisers to the build analysts."
Most CMIOs reported using an online portal (n = 14, 77.8%) to communicate with the requester on the status of approved requests. Nearly half (8, 44%) of CMIOs reported that their system did not have service level agreements (SLAs) specifying a specific timeframe for communications with their requesters; SLAs were infrequently used for break–fix response times (n = 7, 38.9%) or estimated time from EHR modification request to build completion (n = 5, 27.8%).
Monitoring and Feedback
A majority reported some form of standard monitoring (n = 11, 61.1%), which was defined as a post hoc standardized evaluation of the usage of or user feedback about an EHR modification (e.g., how often the modification was used, user feedback about errors). Five (27.8%) performed ad hoc evaluation of the usage of an EHR modification based on the scenario/situation/requester. One unique approach to monitoring was shared responsibility with requester champions: "[We] would typically want this enhancement to be linked to how we would measure it if we were going to approve it, because it's supposed to fix a problem and then expect that the champion…own[s] [the oversight] and assure[s] that the metric improves." CMIOs used a combination of survey-based general feedback (e.g., broad EHR usefulness) and specific feedback (e.g., a channel for feedback built into an EHR task) about an individual EHR modification build.
Strengths and Weaknesses of EHR Governance Systems
Managing the mismatch between demand for EHR modifications and supply of IT resources to manage requests was a substantial concern (n = 11, 61.1%). Clarifying and understanding new requests was identified as the slowest or most resource-intensive aspect of governance. Commonly reported strengths were related to the informatics team's experience and institutional memory, and strong relationships with subject matter experts, end users, and health system leadership.
Leaders reported using both process metrics (e.g., request volumes, time to completion) and outcome metrics (e.g., effects on patients or end users) to track governance success. Outcome metrics were less clearly defined and included several informal measures, such as a lack of complaints or voluntary attendance at governance meetings, to indicate engagement and provider satisfaction: "If I hear noise and I hear complaints, then we make changes…but when [our governance] meetings are running smoothly and they're getting work done; they're building stuff and their attendance is not reducing, then it's successful." However, the use of metrics to evaluate the impact of builds or the governance process itself was rare.
Discussion
Our cross-sectional study of large AMCs suggests that the approach to managing EHR modification requests is highly variable. Nonetheless, certain shared challenges exist, including the mismatch between demand for EHR modifications and supply of informatics time, and the clarification of details needed for new requests. Reasons for this variability in governance approaches remain unclear but may stem from a lack of best practices and KPIs that leaders and health systems might use to define successful governance for these modern challenges. We highlight unique solutions used by some AMCs and a proposal for an EHR governance KPI.
While prior published studies have described governance for implementing EHR systems, standing up IT projects, clinical decision support, and evaluating these systems for patient safety concerns, our study fills a key gap by examining the more recent governance need for the intake and management of requests for any EHR customization across organizations.[2] [3] [10] [12] [13] [14] [15] With this broader view, we find that despite discussions of governance for over two decades, CMIOs still view the optimal management of EHR modification requests to be a major challenge, which is demonstrated by the stark variability in governance practices across organizations.
While this heterogeneity in approaches may be due to differing needs across organizations, our finding that CMIOs still experience similar challenges in governance argues that these variations are more likely due to the scarcity of data describing best practices.[12] [19] This is further supported by our finding that nearly half of the CMIOs who reported a qualitative evaluation system or no scoring system had abandoned a previous quantitative approach, demonstrating that organizations continue to modify and experiment with their governance processes. Optimization of the EHR modification process will require a better understanding of the landscape of approaches, styles, and strategies across a larger number of sites, and linking these programs' features to outcomes of governance. For example, one of our sites reported the use of a unique two-tiered scoring system that first assigns requests into categories based on organizational priority, and then uses a second tier to quantify the impact of requests within each category. Leaders at this system felt that this model could allow for quantitative prioritization while also adapting to changing organizational needs, but further comparison of this system to other models in terms of effective request triage, personnel time, modification request turnaround time, and user satisfaction with the EHR's operations and governance is needed to identify best practices. Describing these and other governance strategies is the first step toward discovering optimal approaches and stewarding scarce resources for EHR maintenance while permitting health innovation.
EHR customization through clinical decision support, ordersets, and other tools is thought to improve EHR usability and effectiveness[20] [21] [22] [23] but may exacerbate the most common problem reported by CMIOs: the mismatch between high demand for EHR modification requests and low supply of clinical informaticists or EHR analyst time.[19] From the demand side, we find that requests for customization often involve bottlenecks during request intake, where requestors have high levels of specific domain expertise but limited informatics knowledge/experience. This may lead to inappropriate, infeasible, or unrefined requests and was reflected in a majority of CMIOs reporting that clarifying requests was the slowest step in the governance process. One solution to this issue was modeling the request intake process after “user stories” from Agile software development, in which each request is framed in a narrative format around who the modification is for, what the goal of the modification is, and why this goal is meaningful.[9] [24] From the supply side, although customization is thought to be key, the value of governance on the road to modification may be opaque to both end users and operational leaders. For example, Tokazewski et al describe an innovation to improve medication refill protocols, in which the end user may see a significant improvement in their experience, but not realize the governance effort required to create and monitor such systems.[23] Furthermore, a lack of metrics around the effects of governance in terms of patient or provider outcomes makes it difficult for health systems to justify allocating additional resources to overcome these bottlenecks. One system empowers end users submitting EHR modification requests to be champions throughout the governance process and requires these champions to define and evaluate metrics for success for each such request. 
Defining, monitoring, and comparing these KPIs may be one solution to justifying the value of this work and addressing the supply side of this mismatch.
Our data cannot yet provide the full picture of what successful EHR governance looks like, but we can provide some initial examples of feasible KPIs that span both process and outcome domains. Although we highlight common challenges, more focused study into subproblems like the specifics of informatics staffing, team composition, organization by service line, or approaches to informatics literacy among end users may provide additional granularity and insight into potential solutions. After examining these challenges, an important next step will be the creation of governance performance metrics. Based on our respondents' answers, one option might combine the process of governance for a request (time to request clarification, time to governance, and person-hours to build) with a valuation of the change effected by that process. For example, a request for a new clinical decision support alert that took 5 months of governance prior to build handoff because of significant changes to the build specifications, allowing a more tailored and effective build, may have higher governance value than a request that required 3 months to build with minimal changes. Such an approach could not only allow organizations to justify clinical informaticist/IT staff time for EHR modification; it could also allow them to regularly evaluate their governance process, diagnose problems, and localize the specific steps involved. Such a KPI could inform leaders about the cost/benefit of including a health equity consideration and other steps within the governance process. Standardization of a KPI across systems would also enable comparisons and insights into best practices and streamline the process of improving patient care through EHR modification.
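As a rough illustration only, the kind of composite KPI described above could be sketched as a ratio of the organization-assigned value of a change to a weighted process cost (clarification time, governance time, and build effort). All field names, weights, and the 0–10 valuation scale below are hypothetical placeholders for discussion, not measures proposed or validated in this study.

```python
from dataclasses import dataclass

@dataclass
class RequestRecord:
    """Per-request governance data (all fields hypothetical)."""
    days_to_clarify: float      # time spent clarifying the end user request
    days_in_governance: float   # time from intake to build hand-off
    build_person_hours: float   # analyst/builder effort
    change_value: float         # organization-assigned value of the change, e.g., 0-10

def governance_kpi(r: RequestRecord) -> float:
    """Hypothetical 'governance value': value delivered per unit of
    process cost. The weights here are purely illustrative."""
    process_cost = (
        1.0 * r.days_to_clarify
        + 1.0 * r.days_in_governance
        + 0.5 * r.build_person_hours
    )
    return r.change_value / process_cost if process_cost else 0.0

# Example: a heavily refined CDS alert request (5 months of governance)
# vs. a lightly changed request (3 months, minimal refinement).
refined = RequestRecord(days_to_clarify=30, days_in_governance=150,
                        build_person_hours=40, change_value=9.0)
minimal = RequestRecord(days_to_clarify=10, days_in_governance=90,
                        build_person_hours=20, change_value=2.0)
```

Under these illustrative weights, the heavily refined request scores higher than the lightly changed one, matching the intuition above; in practice, each organization would need to define and validate its own valuation scale and weights.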
Our study has several limitations. First, given the sample population interviewed, our findings may not be generalizable to non-AMCs or possibly to AMCs without a clinical informatics fellowship program, including community medical centers, county medical centers, and Veterans Affairs medical centers. Such organizations may have more limited informatics resources or different priorities/constraints for use of those resources. Future studies aimed at these organizations will help provide a more complete understanding of EHR governance practices. Second, given the facilitated interview format of the study, responses are subject to recall bias. Third, interviews were conducted with CMIOs of an organization who may be more familiar with the overarching structure of the EHR modification process but may not know about detailed aspects of governance. Future studies delving deeper into the pain points identified above, investigating the current landscape of EHR governance and differences in resource allocation at non-AMCs, and developing governance performance metrics and incentives could help achieve a more successful EHR modification process for patients, end users, and IT staff.
Conclusion
EHR modification processes at large AMCs are marked by substantial variation in the face of common challenges. We highlight several novel solutions to these challenges including a two-tiered scoring system, a more rigorous request intake process, and unique user partnerships. Our study is an important first step in understanding the EHR customization process, a need which will only grow as the digital transformation continues and the breadth of electronic and digital health tools becomes more complex. To meet this need, future studies aimed at further investigating the problems highlighted here, developing governance-related KPIs, and analyzing high-performing AMCs can lead the way to systematic identification of best practices and speed improvements in care.
Clinical Relevance Statement
We describe the results from interviews of CMIOs from AMCs across the United States to provide foundational insights into the landscape of EHR modification approaches, styles, and strategies. Key results include wide variations in governance practice patterns despite common challenges, some unique solutions to these problems, and a lack of formalized EHR governance related metrics to help organizations compare strategies to streamline the process of improving patient care. We believe our study is an important first step in understanding the EHR modification process, a need that will only grow as the digital transformation in health care continues.
Multiple-Choice Questions
-
Which of the following was not reported as one of the top 3 challenges to EHR governance?
-
Bringing all of the necessary stakeholders to the table and aligning around the governance process
-
Getting representation and input from a diverse array of clinicians into the EHR governance process
-
Negotiating with EHR vendors for access and ability to make certain modifications
-
Low supply of informaticist and analyst time paired with high demand for EHR modifications requests
Correct Answer: The correct answer is option c. According to the interview data summarized in [Table 1], the mismatch between demand for requests and supply of informaticist time was the most commonly mentioned challenge to governance by a large margin. Other commonly mentioned challenges include bringing stakeholders together to buy into the governance process and encouraging a diverse range of clinicians to participate in the governance process. However, working with EHR vendors was NOT a commonly described problem.
-
-
Why would a governance-related metrics or KPI help organizations shape their EHR modification process?
-
Governance metrics could also for more standardized comparison of EHR modification practices across organizations and could help identify best practices
-
Metrics for governance could help better track and identify the value of governance to the EHR modification process to justify allocating more resources to overcome bottlenecks
-
Metrics that are monitored live could help identify problems within the governance process and aim modifications or solutions to the appropriate step in the process
-
All of the above
Correct Answer: The correct answer is option d. A governance metric or KPI could serve multiple purposes within an organization and across organizations. Within an organization, governance metrics could help demonstrate and quantify the value of the EHR modification process to the health system at large to equitably allocate resources to the process. It could also help organizations diagnose problems within their process and identify key steps where bottlenecks arise relative to their value or addition to the process. Finally, having common metrics across organizations would allow for more appropriate comparisons to identify best practices. As all of these are potential uses for a governance metrics, all of the above is correct.
Conflict of Interest
A.A. reports that he is the founder of Kuretic Inc, which has no relationship to this work. S.A. reports receiving consulting fees from AstraZeneca, Diazyme, and Agilent Biotechnologies; none of which have any relationship to the contents of this work. R.K. reports receiving royalties from HillRom, which has no relationship to the contents of this work. The remainder of the authors declare that they have no conflict of interest in the research.
Protection of Human and Animal Subjects
The study was performed in compliance with the World Medical Association Declaration of Helsinki on Ethical Principles for Medical Research Involving Human Subjects and was reviewed by the UCSF Institutional Review Board.
References
- 1 Weill P, Ross JW. IT Governance: How Top Performers Manage IT Decision Rights for Superior Results. 1st ed. Harvard Business Review Press; 2004
- 2 Ash JS, Singh H, Wright A, Chase D, Sittig DF. Essential activities for electronic health record safety: a qualitative study. Health Informatics J 2020; 26 (04) 3140-3151
- 3 Shabot MM, Polaschek JX, Duncan RG, Langberg ML, Jones DT. A novel governance system for enterprise information services. Proc AMIA Symp 1999; 619-623
- 4 Fragidis LL, Chatzoglou PD. Implementation of a nationwide electronic health record (EHR). Int J Health Care Qual Assur 2018; 31 (02) 116-130
- 5 Aguirre RR, Suarez O, Fuentes M, Sanchez-Gonzalez MA. Electronic health record implementation: a review of resources and tools. Cureus 2019; 11 (09) e5649
- 6 Auerbach AD, Neinstein A, Khanna R. Balancing innovation and safety when integrating digital tools into health care. Ann Intern Med 2018; 168 (10) 733-734
- 7 Auerbach A, Burke K, Khanna R. Effective and collegial governance of digital tools in health care. ACI Open 2021; 05 (01) e13-e16
- 8 McKeeby JW, Coffey PS, Houston SM. et al. The evolution of information technology governance at the NIH clinical center. Perspect Health Inf Manag 2021; 18 (03) 1c
- 9 Kannan V, Basit MA, Bajaj P. et al. User stories as lightweight requirements for agile clinical decision support development. J Am Med Inform Assoc 2019; 26 (11) 1344-1354
- 10 Chaparro JD, Beus JM, Dziorny AC. et al. Clinical decision support stewardship: best practices and techniques to monitor and improve interruptive alerts. Appl Clin Inform 2022; 13 (03) 560-568
- 11 Electronic Medical Record Adoption Model (EMRAM) | HIMSS. Accessed February 16, 2023 at: https://www.himss.org/what-we-do-solutions/digital-health-transformation/maturity-models/electronic-medical-record-adoption-model-emram
- 12 Chawla N, Lang R, Michael D, McPeek Hinz E. Understanding clinical governance. Presented at: Epic UGM; 2019; Verona, WI
- 13 Skelton J, Wilson W. Implementing successful prioritization and governance processes for change requests. Presented at: Epic XGM; 2018; Verona, WI
- 14 Feen J, Bechard L. Request intake governance that defies the eyes. Presented at: Epic UGM; 2017; Verona, WI
- 15 Jung J, Nelsen D. Planning the governance and communication intake process. Presented at: Epic UGM; 2018; Verona, WI
- 16 Acholonu RG, Raphael JL. The influence of the electronic health record on achieving equity and eliminating health disparities for children. Pediatr Ann 2022; 51 (03) e112-e117
- 17 James TG, Sullivan MK, Butler JD, McKee MM. Promoting health equity for deaf patients through the electronic health record. J Am Med Inform Assoc 2021; 29 (01) 213-216
- 18 AHA Annual Survey Database™ | AHA Data. Accessed January 29, 2023 at: https://www.ahadata.com/aha-annual-survey-database
- 19 Bansler JP. Challenges in user-driven optimization of EHR: a case study of a large Epic implementation in Denmark. Int J Med Inform 2021; 148: 104394
- 20 Najafi N, Robinson A, Pletcher MJ, Patel S. Effectiveness of an analytics-based intervention for reducing sleep interruption in hospitalized patients: a randomized clinical trial. JAMA Intern Med 2022; 182 (02) 172-177
- 21 Kharbanda EO, Asche SE, Sinaiko AR. et al. Clinical decision support for recognition and management of hypertension: a randomized trial. Pediatrics 2018; 141 (02) e20172954
- 22 Tran AV, Rushakoff RJ, Prasad P, Murray SG, Monash B, Macmaster H. Decreasing hypoglycemia following insulin administration for inpatient hyperkalemia. J Hosp Med 2020; 15 (02) 368-370
- 23 Tokazewski JT, Peifer M, Howell III JT. Leveraging and improving refill protocols at your health system. Appl Clin Inform 2022; 13 (05) 1063-1069
- 24 Cockburn A. Agile software development. Addison-Wesley; 2002
Publication History
Submitted: April 6, 2023
Accepted: August 7, 2023
Accepted Manuscript online: August 8, 2023
Article published online: October 25, 2023
© 2023. Thieme. All rights reserved.
Georg Thieme Verlag KG
Rüdigerstraße 14, 70469 Stuttgart, Germany