DOI: 10.1055/s-0040-1713421
Using Electronic Health Record Data to Support Research and Quality Improvement: Practical Guidance from a Qualitative Investigation
- Abstract
- Background and Significance
- Objective
- Methods
- Results
- Discussion
- Conclusion
- Clinical Relevance Statement
- References
Abstract
Objective The aim of the study is to identify how academic health centers (AHCs) have established infrastructures to leverage electronic health record (EHR) data to support research and quality improvement (QI).
Methods Phone interviews of 18 clinical informaticians with expertise gained over three decades at 24 AHCs were transcribed for qualitative analysis on three levels. In Level I, investigators independently used NVivo software to code and identify themes expressed in the transcripts. In Level II, investigators reexamined coded transcripts and notes and contextualized themes in the learning health system paradigm. In Level III, an informant subsample validated and supplemented findings.
Results Level I analysis yielded six key “determinants”—Institutional Relationships, Resource Availability, Data Strategy, Response to Change, Leadership Support, and Degree of Mission Alignment—which, according to local context, affect use of EHR data for research and QI. Level II analysis contextualized these determinants in a practical frame of reference, yielding a model of learning health system maturation, over-arching key concepts, and self-assessment questions to guide AHC progress toward becoming a learning health system. Level III informants validated and supplemented findings.
Discussion Drawn from the collective knowledge of experienced informatics professionals, the findings and tools described offer practical support to help clinical informaticians leverage EHR data for research and QI in AHCs.
Conclusion The learning health system model builds on the tripartite AHC mission of research, education, and patient care. AHCs must deliberately transform into learning health systems to capitalize fully on EHR data as a staple of health learning.
Keywords
electronic health records - institutional relationships - secondary use - clinical data management - clinical research informatics - quality improvement - academic health center
Background and Significance
In addition to transforming the day-to-day work of 21st century health care, electronic health records (EHRs) have made enormous amounts of data available for secondary uses. When the HITECH Act of 2009 provided extensive federal incentive dollars to promote broad adoption of EHRs,[1] a key rationale for offering funds was making EHR data readily accessible for use by researchers to support their studies and by health system analysts and practitioners to improve the quality of care. In fact, the National Academy of Medicine declared in 2011 that electronic clinical data should become the basic staple of what it termed “health learning.”[2]
Traditionally, academic health centers (AHCs) have been hubs of health learning. Here defined as medical schools with owned or affiliated hospital/health systems,[3] AHCs' roles in spearheading health learning have followed naturally from their long-established tripartite mission of research, clinical care, and education. Learning at AHCs consequently translates into the generation of new knowledge through clinical or health services research and leads to improvement of AHC-delivered care through quality improvement (QI).[4] Given this definition, it is not surprising that AHCs are the work locale for many clinical informaticians—clinicians who attempt to integrate health care delivery, technology, and data in a meaningful way.
Since the first decade of this century, the “rapid deployment of technology and the development of new sources and uses of health data” have greatly challenged clinical informaticians and others working at AHCs, presenting issues of interoperability, usability, privacy, security, and data stewardship at a scale beyond any they had previously seen.[5] While groups such as the Office of the National Coordinator for Health Information Technology have attempted to guide implementations (for example, with its 216-page “Health IT Playbook”[6]), concise, practical information is lacking on how AHCs can best set up the structures and processes that would optimize EHR data for learning. In addition to their day-to-day work keeping EHR systems running and up to date, clinical informaticians also have an important role as part of the larger system of health care learning. However, the “learning about the learning” from AHCs that have succeeded in making EHR data available for research and QI has not been made readily or succinctly available for others to apply in their local contexts.
Objective
We undertook a qualitative study to identify how AHCs have established infrastructures to use EHR data to support research and QI.
Methods
The investigation was guided by accepted standards of qualitative research methodology.[7]
Sample Recruitment
Investigators interviewed 18 individuals who had informatics, research, and QI expertise gained at 24 institutions over the course of more than 30 years. An initial group of informants was recruited by email via the American Medical Informatics Association Clinical Research Informatics Working Group mailing list and a subsequent group via chain referral sampling.[8] Informant experiences included the roles of chief medical information officer, informatician, informatics researcher, health services researcher, and QI leader. Interviews were 45 to 70 minutes long and conducted via telephone from May to December 2017. Recruitment ceased when data saturation[9] was judged to have been achieved. The University of Vermont and Children’s Hospital of Philadelphia Institutional Review Boards approved this study.
Interview Preparation and Process
A semi-structured interview guide was prepared based on a review of relevant publications about clinical informatics and informatics research, supplemented by the investigators' collective knowledge of clinical informatics, research, and organizational behavior. Using qualitative methods, the investigators performed an initial reflexivity exercise[10] to identify themes anticipated to be discovered through the interviews, a list that would serve as a template both for the interviews and for coding at the initial analysis level described below. This list included the concepts of human, technical, and organizational infrastructures as well as other sociotechnical concepts (e.g., clinical decision support, end-user focus) frequently mentioned in relation to clinical informatics and health care delivery. Informed consent for those interviewed promised anonymity and was obtained at the beginning of the telephone call. Interviews were recorded and transcribed. Prior to analysis, transcriptions were reviewed by informants to verify accuracy.
Data Analysis
Analysis was done on three levels and was conducted first sequentially, and then in reciprocating fashion (i.e., analysis on one level sometimes was revised based on analysis at another level). Level I analysis, the most basic, employed grounded theory,[11] involving iterative reviews of transcripts to identify themes beyond those identified in the interview preparation process described above. The investigators independently coded transcripts using NVivo software. Coding was interrupted regularly to document theoretical notes—ideas that occurred to investigators about more abstract concepts triggered by the comments in the transcript.[12] Some concepts initially considered as themes were discarded, while others were distilled into a first set of key themes through the qualitative data analysis style known as editing.[13] Disagreements in coding were resolved by discussion.
Level II was immersion/crystallization analysis.[14] In immersion, the investigators spent hours reexamining the coded transcripts and theoretical notes taken during Level I and revisiting the Level I themes. In crystallization, the investigators periodically suspended immersion to reflect on the analysis and identify patterns that had been noted during immersion.
Level III analysis consisted of informant checking[15]—sharing a six-page summary of results with a subsample of informants who, without prompting, had expressed interest in reviewing the initial findings. This was done with the intent of potentially validating, rejecting, or enhancing both Level I and Level II results. Nine of 18 participants expressed unprompted interest, of whom six provided feedback (two in writing, four via a 30-minute phone call).
Results
Level I Results
The investigators' initial assumptions about the three infrastructure categories (human, technical, and organizational) underlying the use of EHR data for research and QI were validated throughout all levels of analysis. Six themes emerged in Level I analysis, which the investigators named determinants because of their effect on underlying AHC infrastructures. Since these determinants always operated according to the local context, they were referred to as local determinants. The local determinants, none of which takes priority over the others due to variations in local contexts, are: Institutional (Intra- or Inter-) Relationships, Resource Availability, Data Strategy, Response to Change, Leadership Support, and Degree of Mission Alignment. Each determinant is discussed below, accompanied by one or more representative quotes from interview transcripts. Quotes were selected for their aptness in illustrating thematic elements of the results. Informants have been assigned one of the letters A through R according to the order of interview, and a letter in brackets follows each quote to indicate its source (e.g., [A]). The presence or absence of quotes from any one informant reflects the suitability of quotes to the expressed theme, rather than the degree to which an informant's interview influenced the findings. [Table 1] provides additional quotes from interview transcripts illustrating how the determinants either positively or negatively affected the use of EHR data. In some cases, quotes have been edited to enhance readability while retaining the speaker's intent.
Abbreviations: CEO, Chief Executive Officer; CMO, Chief Medical Officer; CT, computed tomography; PHI, protected health information.
Institutional Relationships details the degree to which groups within the AHCs can work effectively with one another around EHR data, negotiating within the constraints of organizational structure. For example, developing an IRB agreement or a single IRB was frequently described.
Example 1: “We have this concept of a reliant IRB. … But it's the idea of if we approve something, the [other] IRB is just going to look at it really quickly [or vice versa].” [A]
Example 2: “So we're blessed. Well over 15 years ago, the organization…agreed to have one combined IRB….” [I]
Some informants described their AHCs as single institutions, with a medical school owning the health system. Unified AHCs more readily managed conflicts over data access and other matters than did AHCs composed of separate but affiliated medical schools and health systems. Even so, solutions were achieved in the nonunified AHCs, as described in the positive example concerning data access in [Table 1], where outside university researchers were made noncompensated employees of a health system to overcome data access barriers.
Resource Availability expresses the degree to which financial, technical, or human resources could be made available to create and sustain data and analytic systems. As expected, wealthier AHCs could more readily invest in data infrastructures and hire and support analytic teams. Independent of institutional wealth, most informants supported investment in people over investment in technology.
Example: “We're right in the middle of this debate as we look to invest in the next phase—should we hire people or should we acquire tools? Most of us agree that if push came to shove, the person is probably the most valuable asset…”[J]
Talented individuals with particular skill sets played key roles over long periods of time, with acknowledgment of the precarious nature of relying on only a few individuals.
Example: “They're saying...this is all great...but what if you get hit by a bus? ...There is no one who could take this over. I'm blessed to have a couple of people who have been with me for a long time. But if they left, I'd be in deep weeds, too. So, I would say my strategy is a very risky one….” [J]
Even at wealthy institutions, monetary resources are an ongoing concern. Funding from grants was unreliable and informants stressed institutional investment as an important component for establishing infrastructure.
Example: “…If you can get CTSA money, great. If not, it's got to come from some other place. [Maybe] you can generate enough revenue off of individual grants. But that's sort of living month to month off of soft money hoping that some researcher is going to come to you with a research request that's funded so that you can pay your data analyst. That's a very tricky set up to make financially stable.”[C]
In [Table 1], the quote describing a one-time opportunity for funding in the positive example column shows how serendipity might come into play with resources. In general, all AHCs were attentive to planning for future resource needs.
Data Strategy describes the planning for EHRs, data platforms, sourcing, storage, and management. Informants indicated that strategy was largely absent in initial efforts by AHCs to harvest data for research and QI. Systems grew up organically and data access was often problematic. The distinction between data access and data usage was frequently mentioned.
Example: “… self-service tools, I would say that's a little bit of a misnomer...We've had a lot of examples where people have gotten the wrong answers and written those in the grant applications or …reports going to surgical outcomes reporting programs. So that flows into the data stewardship thing. I'd much rather have an identified person with a tool that they know how to use; …we can get it to them in a couple minutes using the tools that we have. Trying to teach them all how to use that tool would probably be painful and I would worry about it.” [O]
As described in the positive example under Data Strategy in [Table 1], as AHCs mature, data access is formalized, data dictionaries and metadata are developed, and systematized data governance emerges. Several informants who were starting from the beginning at new AHCs mentioned that establishing clear data governance, although never an easy process, would be a key place to begin.
Example 1: “By creating a data governance group, that really helped standardize and make our data more secure.”[Q]
Example 2: “There are two meanings of governance in informatics: type one is “who has the rights to use the data,” and type 2 is “what are our standards for calling gender?” And we don't do any of type 2 governance.” [L]
Response to Change expresses the ability of the AHCs to adapt to alterations in the systems and circumstances affecting EHR data. Changes in EHR systems, local environments, and regulations covering health information technology occurred both predictably and unexpectedly. Legacy EHR systems are abandoned as they become too expensive to support, and solutions are needed to access data from the old systems. Health systems merge or are bought or sold, each time affecting the AHC's ability to use its data. This can bring opportunity or calamity.
Example 1: “As there's been so many changes, we're trying to take advantage and look at the bright side of the really big, large-scale organizational leadership changes, looking for an opportunity to build more bridges between our health services research partners and– the operations of the delivery system.”[R]
Example 2: “...about 5 years of cajoling the institutions that this was something we needed to do...And survived a couple of significant changes in leadership at the C-suites at both of the hospitals, which basically set us back to zero in establishing personal relationships again.”[I]
Leadership Support expresses leadership's advocacy for EHR-related research and QI. Some health system chief executive officers and medical school deans were enthusiastic proponents of using EHR data, lending considerable support to chief medical information officers and informatics researchers. In other AHCs, leadership was absent or not helpful. Types of support included financial, strategic, and political.
Example 1: “[CEO] backed it, which provided resources. He wanted informatics for operations and QI, but also for scholarly activity.” [E]
Example 2: “…If the political will is there, it's very easy to set up data sharing agreements so that the academic researchers can access the clinical data…The places that don't do it or have barriers in place, it is much more of a political willpower issue than it is a technical or regulatory piece.”[C]
Degree of Mission Alignment reflects the degree of overlap of strategies and interests within the AHCs or with key outside organizations that might facilitate or hinder work. Mission alignment presupposes that institutions have clearly expressed, shared missions; this is not always the case, but when present, a shared mission can be used to guide decision making.
Example 1 [in reference to establishing and hiring for the role of Chief Research Informatics Officer]: “…And that's so important to our mission that we needed an individual whose job it is to support research, to lobby, to offer resources, to answer questions more than I have expertise or time to answer. That really was a wise decision on the part of our dean.” [M]
Example 2: “Part of it is trying to get people to understand what the shared mission is and trying to find the alignment among different groups. Sometimes that means rejecting certain research projects.”[C]
Level II Results
As described under Methods, the immersion/crystallization process served to review Level I themes and identify higher level patterns from the Level I findings. During this process, the paradigm of the learning health system surfaced as a construct through which many of the Level I themes could be made actionable for those working with EHR data in AHCs. Originally conceptualized by the Institute of Medicine,[16] the learning health system has since been summarized as “an organizational architecture that facilitates formation of communities of patients, families, frontline clinicians, researchers, and health system leaders who collaborate to produce and use big data; large electronic health and health care data sets (big data); QI for each patient at the point of care brought about by the integration of relevant new knowledge generated through research; and observational research and clinical trials done in routine clinical care settings.”[17] By contextualizing the Level I themes in the learning health system paradigm, the Level I analysis results could be made useful to a broad audience—connecting the practical data from transcripts to the aspirational vision.
Three products emerged during Level II analysis: a theoretical conceptual model ([Fig. 1]) to link the local determinants identified in Level I to the construct of learning health system, a group of overarching key concepts, and a set of practical questions, linked to the key concepts, to guide those working in AHCs in assessing their use of EHR data for research and QI.
The first product, a graphic theoretical conceptual model ([Fig. 1]), represents the six local determinants as points on a hexagon, with distance from center to perimeter representing position on a hypothetical five-point maturity scale (e.g., Unaware, Beginner, Intermediate, Advanced, Mature). This theoretical model allows both for variation in maturity among the six determinants and for serial assessments over time, an example of which is described in the caption of [Fig. 1].
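To make the serial-assessment idea concrete, the model can be thought of as six scores on the five-point scale recorded at successive time points, with per-determinant change tracked between assessments. The minimal Python sketch below is illustrative only and is not part of the study; the determinant names and scale labels come from the text, while all scores are invented for the example.

```python
# Five-point maturity scale and six local determinants from the Level II model.
SCALE = {1: "Unaware", 2: "Beginner", 3: "Intermediate", 4: "Advanced", 5: "Mature"}

DETERMINANTS = (
    "Institutional Relationships",
    "Resource Availability",
    "Data Strategy",
    "Response to Change",
    "Leadership Support",
    "Degree of Mission Alignment",
)

def compare_assessments(before: dict, after: dict) -> dict:
    """Return the per-determinant change between two serial assessments."""
    return {d: after[d] - before[d] for d in DETERMINANTS}

# Invented scores for one hypothetical AHC at two assessment points.
t0 = dict(zip(DETERMINANTS, (2, 3, 1, 2, 3, 2)))
t1 = dict(zip(DETERMINANTS, (3, 3, 3, 2, 4, 3)))

delta = compare_assessments(t0, t1)
# delta["Data Strategy"] == 2: progressed from Unaware to Intermediate,
# while delta["Response to Change"] == 0: no change between assessments.
```

Plotting each assessment as a hexagonal radar chart, as [Fig. 1] does graphically, would show the profile expanding toward the perimeter as determinants mature.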
The second product was the collection of seven over-arching key concepts (middle column, [Table 2]), generated based on reanalysis of those transcript quotes that featured two or more local determinants within the same or adjacent sentences, thus describing synergistic or antagonistic interactions among the local determinants. For example, when asked about challenges, an interviewee stated, “Right now, our biggest challenge is that the medical school and the hospital have real trouble on agreeing where the indirect costs go and who owns the project,” [J] relaying information about both Institutional Relationships and Resource Availability.
Abbreviation: IRB, Institutional Review Board.
The third product of Level II analysis was a set of practical questions directly linked to the key concepts (left-hand column, [Table 2]). The questions, which could be used ultimately to guide AHC leaders in assessing their use of EHR data for research and QI, were prompted both by the Level II action agenda and by interviewee responses to an item added early on in the interview guide—“Knowing what you know now, if you were starting from scratch, what would be most important?” For reference, [Table 2] also displays the local determinants (right-hand column) associated with key concepts and related self-assessment questions. Illustrative quotes from the interviews that are connected to the self-assessment questions can be seen in [Table 3].
Abbreviations: CFO, Chief Financial Officer; QI, quality improvement.
Level III Results
Six informants who had expressed an interest in learning about study findings reviewed and commented on a preliminary draft of results that included the results of Level I and Level II analyses. Generally, they endorsed the results and found the questions clear and valuable. Two examples were: “It makes sense. I like the models that you've created and how you organized things”; [I] “I thought it did a great job of combining quantitative and qualitative and defining themes that I think would be very useful to give bigger picture of issues related to learning health systems and research.” [O] However, feedback also provided insights into the limitations of the findings. One informant expressed skepticism about the practicality of the learning health system model when applied to EHR data: “…how do you get people to do this? I wasn't quite sure what kind of incentives need to come out of this in order for it to actually work.”[G] Another informant pointed out that learning health systems, as portrayed in our preliminary results, consisted of much more than learning to harvest EHR data and make it accessible for analysis: “(the) learning health system is much bigger than the infrastructure components you chose to look at. It's the action arm, the collaboration, the optimization, in addition to the analytics and the data and the databases…—you really need to be very careful that these are the infrastructure needs for only a piece of the puzzle…If someone solved everything that you have here, they would still not have a learning health system.” [I] Several informants reflected on the broad generality of some of the conclusions, citing how many of the determinants could readily be applied to other areas of social endeavor: “It seems like these dimensions are not unique to AHCs nor to the goals and objectives of either research and QI or learning health systems.”[I]
Discussion
The principal findings of this multilevel qualitative research study were (1) identification of local determinants for understanding how AHCs make use of EHR data in research and QI; (2) a conceptual model, based on the local determinants and contextualized in the learning health system paradigm, to visualize and communicate about AHCs' maturity in using EHR data; and (3) assessment questions, based on over-arching key concepts derived from the analysis, to guide those working in AHCs in using EHR data to support research and QI. These findings, based on the analysis of interviews from the front lines, are of practical utility and of potential guiding importance to the community of clinical informaticians whose work with EHR data now undergirds much health learning.
Informaticians work in multiple roles in their AHCs, influencing clinical care, research and QI, and hospital system strategic planning. Each role's decision-making affects the secondary use of EHR data for health learning, albeit in potentially different ways. Serving in the various roles, informaticians must remain cognizant of their being a part of something “larger”—a local AHC learning health system. We here discuss the local “learning health system” and are careful to distinguish it from the national learning health system envisioned by Friedman et al.[18] [19] As a part of the local learning health system, informaticians acting in operational capacities make EHR modifications to facilitate clinical care. These modifications can, in turn, enable, impair, or leave unaffected research and QI efforts at the learning health system level. Conversely, to facilitate research or QI for the learning health system, informaticians may make changes to the EHR that require additional effort on the part of clinicians (e.g., extra mouse clicks on drop-down menus to furnish discrete data elements) or oversee data extraction for learning health system purposes that represent opportunity costs for information technology analysts. Such trade-offs within the AHC in the service of the learning health system are likely, if not inevitable.
Clinical informaticians in leadership, such as chief medical information and chief research information officers, must guide a variety of informatics-related decisions for the AHC and are often responsible for considering and effectively communicating about the downstream consequences of these decisions on clinical care, improvement, and research. These informatics leaders are not alone in the deliberative process and must work with counterparts in health center business operations and information services. They also advise noninformaticians such as medical school deans, health system CEOs (Chief Executive Officers), and others within the AHC who set policies related to the use of EHR data and who sometimes must choose between very costly competing priorities. As a study interviewee pointed out, the expense of a new EHR system for an AHC may be equivalent to the cost of a tower for the health center's hospital ([Table 1]).
Given these complicated roles, sometimes bestowed on a single individual, clinical informaticians can draw on their own experiences to add weight to their input. However, these individual experiences likely are variable in frequency, duration, and depth, threatening the authoritativeness of an individual informatician's input. The present study results, based on the deep experience of others in the AHC informatics arena, offer practical guidance for clinical informaticians to supplement their own experiences and aid in their decision-making processes. The six determinants of EHR use for research and QI identified in Level I analysis in our study—Institutional Relationships, Resource Availability, Data Strategy, Response to Change, Leadership Support, and Degree of Mission Alignment—along with the key concepts and assessment questions identified in Level II analysis, provide clinical informaticians with a new set of reference points to understand how their work fits into the learning health system.
As was pointed out by a respondent in our Level III analysis, the identified determinants of secondary use of EHR data could be applied readily to other areas of social endeavor, not being unique to the AHC setting, research, or QI. However, it is important to underscore additional distinctions derived from the Level I analysis of interviewee transcripts: the determinants operate differently according to the local context, no single determinant predominates over others, and determinants can exert either positive or negative influence (illustrated in the many examples in the Results text and in [Table 1]). When viewed in terms of leveraging the institution’s EHR data, the determinants can inform consensus-building at AHCs. Level II analysis provides additional tools for this consensus-building process, with a conceptual model of how the local determinants may change over time ([Fig. 1]). [Fig. 1] represents a hypothetical scenario demonstrating the potential value of the maturation conceptual model, with an initial analysis of maturity done at one point in time followed by another analysis conducted at a later point in time. See [Fig. 1] for details of the scenario and explanation of conceptual model use. Also of practical use is a list of key concepts and a set of assessment questions ([Table 2]) to guide informatics professionals in consensus-building with noninformatics professionals and at the same time help them gauge their AHC's progress toward a learning health system.
Several researchers have described the use of informatics for research at AHCs.[20] [21] [22] Our study findings are most consistent with development of a maturity model, as opposed to a deployment model, as findings in the present work are insufficiently granular to be prescriptive, as would be required for a deployment model.[22] A distinguishing feature between maturity and deployment is that maturity models measure organizational capacity to deliver a service, considering multiple factors including culture, policy, and organization, whereas deployment indices measure the degree to which an institution has implemented a technology related to delivering a service.[22] Consistent with this insight, [Fig. 1] developed in this analysis deliberately models maturity without explicitly specifying steps.
In terms of aiding in assessments, the tools developed in this study supplement existing tools for assessing the use of EHR data for research. Examples of known models include the Healthcare Information and Management Systems Society tools for EHR adoption and progress[23] and the Educause maturity model developed for higher education information technology,[24] as described in Knosp et al's study of maturity of research IT in academic medicine.[22] In fact, although done through different methodologies, the present study findings are aligned with and complement many results from Knosp et al's work. For example, both studies cite leadership, governance and policies, and mission alignment as maturity factors, and other findings in the present study (e.g., Institutional Relationships) are suggested by Knosp et al's “supportive culture.” In addition, analysis of the informant transcripts in the present study yielded Resource Availability and Response to Change as key factors in assessment. Importantly, we argue that the present work offers a more concise, practical, and applied developmental assessment resource, with a maturity conceptual model, key concepts, and assessment questions that are available to clinical informatician leaders for day-to-day guidance. Both studies point to the need for more rigorous future research to create validated tools for broad application. Underlying many decisions is a shared value proposition for health learning that integrates many facets of the AHC agenda. In terms of resources currently available for clinical informaticians about local learning health systems, the 300+ page IOM Workshop Series Summary[25] clearly overlaps with the present work and is considerably more expansive than the present study. A major virtue of that document is its focus on involving patients in a “shared learning environment,” a perspective lacking in the present study. That workshop report does not, however, aspire to be practical.
The quotes found in the Results and Tables from the present study's frontlines informants offer practical guidance for individuals on the forefront of learning health system development.
With respect to learning health systems, as observed by a Level III analysis informant, our results address only some aspects of a learning health system, namely those involved in harvesting EHR data and making it available for secondary uses and users. Regarding learning health systems more broadly, while disease-specific EHR-based learning health systems such as ImproveCareNow[26] and networks funded by the Patient-Centered Outcomes Research Institute[27] have begun to appear, there is as yet no national learning health system as envisioned by Friedman et al.[18]
For the AHC and the goal of a local learning health system, the views of Grumbach et al[28] are most closely aligned with the present work. Alluding to the traditional AHC missions of research, education, and patient care, we concur with those authors that "AHCs should replace the concept of a tripartite mission with a commitment to a single mission: the improvement of health and health care through advancing, applying, and disseminating knowledge." In Level III feedback, some of our informants, while believing in the importance and usefulness of EHR data for research and QI, also expressed skepticism about the learning health system concept, doubting that incentives for action exist. In fact, multiple incentives lie in the overlap of AHCs' clinical and academic missions with their financial survival. EHR data can inform improvements in care that lead to better patient health. EHR data used to define patient populations for clinical trials and for observational comparative effectiveness research can yield both new knowledge for society and increased research support for AHCs. Furthermore, as AHCs increasingly enter alternative payment model arrangements in accountable care organizations, strong business incentives for local learning health systems will arise. In contrast to the skepticism expressed by a few informants, we concur with Grumbach et al[28] that the learning health system construct provides a practical approach for reframing AHCs' multiple goals within the newer concepts of value-based care.
This research has limitations characteristic of qualitative studies. Our sampling of informants was not random. The above-cited principles and techniques of qualitative analysis will be challenging to those unfamiliar with this line of inquiry. However, the topic of this investigation is insufficiently defined for quantitative approaches and requires the thick description and interpretation that only qualitative research can provide. Nevertheless, further work is needed to better define and validate measures of learning health system maturity.
Conclusion
We sought to identify from individuals working on the front lines in AHCs how they have established infrastructures to use EHR data to support research and QI, and thereby “learn,” in the broadest sense of the word, from the data collected in day-to-day patient care. We discovered that local conditions are paramount in modifying the determinants of Institutional Relationships, Resource Availability, Response to Change, Data Strategy, Leadership Support, and Degree of Mission Alignment. We have offered several types of practical guidance to those working to help AHCs become local learning health systems. The AHC learning health systems that have been growing organically according to local contexts must now develop and mature more deliberately. Only then can the overarching learning health system envisioned by the National Academy of Medicine be fully realized.
Clinical Relevance Statement
Federal incentives have enabled widespread adoption of EHRs in the United States health care system, but challenges remain in implementing the processes and infrastructures required to best leverage these electronic data for health learning. Clinical informaticians working in AHCs are part of local learning health systems, and the decisions that they make to modify EHRs to improve care provided by clinicians or to support research or QI often represent trade-offs between different aspects of the tripartite AHC mission: research, education, and clinical care. Based on this investigation, clinical informaticians can be guided by the collective knowledge of peers who have worked in 24 AHCs over the past 30 years overseeing organizational strategy for improving health care learning through secondary uses of EHR data.
Conflict of Interest
D.F.F. reports other funding from Phrase Health, Inc. outside the submitted work.
Acknowledgments
The research was conducted during a sabbatical leave funded by the University of Vermont Health Network Medical Group and the Robert Larner College of Medicine Pediatrics Department for Dr Wasserman, with additional in-kind and tangible supports from the Children’s Hospital of Philadelphia. The authors acknowledge 18 anonymous individuals who generously agreed to be interviewed for this research, with special thanks to the six who provided additional feedback about the preliminary study results. The authors also wish to acknowledge the work of our transcriptionist, April Henderson, and extend special gratitude to several anonymous members of the Clinical Informatics Section of the Children’s Hospital of Philadelphia for advice on the manuscript. Portions of this work were presented in Toronto on May 6, 2018 at the Pediatric Academic Societies Meetings.
Protection of Human and Animal Subjects
The University of Vermont and Children’s Hospital of Philadelphia Institutional Review Boards approved this study.
References
- 1 Blumenthal D. Launching HITECH. N Engl J Med 2010; 362 (05) 382-385
- 2 Grossman C, Goolsby WA, Olsen L, McGinnis JM. Clinical Data as the Basic Staple of Health Learning: Creating and Protecting a Public Good. Washington, DC: Institute of Medicine; 2011
- 3 Wartman SA, Zhou Y, Knettel AJ. Health reform and academic health centers: commentary on an evolving paradigm. Acad Med 2015; 90 (12) 1587-1590
- 4 Roper WL, Newton WP. The role of academic health centers in improving health. Ann Fam Med 2006; 4 (Suppl. 01) S55-S60
- 5 Washington V, DeSalvo K, Mostashari F, Blumenthal D. The HITECH era and the path forward. N Engl J Med 2017; 377 (10) 904-906
- 6 Office of the National Coordinator for Health Information Technology. Health IT Playbook. Available at: https://www.healthit.gov/playbook/ . Accessed March 18, 2019
- 7 O'Brien BC, Harris IB, Beckman TJ, Reed DA, Cook DA. Standards for reporting qualitative research: a synthesis of recommendations. Acad Med 2014; 89 (09) 1245-1251
- 8 Suri H. Purposeful sampling in qualitative research synthesis. Qual Res J 2011; 11 (02) 63-75
- 9 Fusch P, Ness L. Are we there yet? Data saturation in qualitative research. Qual Rep 2015; 20 (09) 1408-1416
- 10 Malterud K. Qualitative research: standards, challenges, and guidelines. Lancet 2001; 358 (9280): 483-488
- 11 Berg BL, Lune H. Qualitative Research Methods for the Social Sciences. Boston, MA: Pearson; 2004
- 12 Tufford L, Newman P. Bracketing in qualitative research. Qual Soc Work 2012; 11 (01) 80-96
- 13 Ash JS, Smith AC, Stavri PZ. Performing subjectivist studies in the qualitative traditions responsive to users. In: Evaluation Methods in Biomedical Informatics. New York, NY: Springer; 2006: 267-300
- 14 Borkan J. Immersion/Crystallization. In: BF Crabtree, WL Miller, eds. Doing Qualitative Research. 2nd ed. Thousand Oaks, CA: Sage Publications; 1999
- 15 Angen MJ. Evaluating interpretive inquiry: reviewing the validity debate and opening the dialogue. Qual Health Res 2000; 10 (03) 378-395
- 16 Smith MD, ed.; Institute of Medicine (US) Committee on the Learning Health Care System in America. Best Care at Lower Cost: The Path to Continuously Learning Health Care in America. Washington, DC: National Academies Press; 2012
- 17 Forrest CB, Margolis P, Seid M, Colletti RB. PEDSnet: how a prototype pediatric learning health system is being expanded into a national network. Health Aff (Millwood) 2014; 33 (07) 1171-1177
- 18 Friedman CP, Wong AK, Blumenthal D. Achieving a nationwide learning health system. Sci Transl Med 2010; 2 (57) 57cm29
- 19 Friedman C, Rubin J, Brown J, et al. Toward a science of learning systems: a research agenda for the high-functioning Learning Health System. J Am Med Inform Assoc 2015; 22 (01) 43-50
- 20 DiLaura R, Turisco F, McGrew C, Reel S, Glaser J, Crowley Jr WF. Use of informatics and information technologies in the clinical research enterprise within US academic medical centers: progress and challenges from 2005 to 2007. J Investig Med 2008; 56 (05) 770-779
- 21 Murphy SN, Dubey A, Embi PJ, et al. Current state of information technologies for the clinical research enterprise across academic medical centers. Clin Transl Sci 2012; 5 (03) 281-284
- 22 Knosp BM, Barnett WK, Anderson NR, Embi PJ. Research IT maturity models for academic health centers: early development and initial evaluation. J Clin Transl Sci 2018; 2 (05) 289-294
- 23 Healthcare Information and Management Systems Society (HIMSS). Electronic Medical Record Adoption Model (EMRAM); 2018. Available at: https://www.himssanalytics.org/sites/himssanalytics/files/North_America_EMRAM_Information_2018.pdf . Accessed March 15, 2019
- 24 Grajek S. The digitization of higher education: charting the course. EDUCAUSE Review 2016. Available at: https://er.educause.edu/articles/2016/12/the-digitization-of-higher-education-charting-the-course . Accessed March 22, 2019
- 25 McGinnis JM, Powers B, Grossmann C. Digital Infrastructure for the Learning Health System: the Foundation for Continuous Improvement in Health and Health Care: Workshop Series Summary. Washington, DC: National Academies Press; 2011
- 26 Egberg MD, Kappelman MD, Gulati AS. Improving care in pediatric inflammatory bowel disease. Gastroenterol Clin North Am 2018; 47 (04) 909-919
- 27 Fleurence RL, Curtis LH, Califf RM, Platt R, Selby JV, Brown JS. Launching PCORnet, a national patient-centered clinical research network. J Am Med Inform Assoc 2014; 21 (04) 578-582
- 28 Grumbach K, Lucey CR, Johnston SC. Transforming from centers of learning to learning health systems: the challenge for academic health centers. JAMA 2014; 311 (11) 1109-1110