DOI: 10.1055/s-0042-1748145
Generating and Reporting Electronic Clinical Quality Measures from Electronic Health Records: Strategies from EvidenceNOW Cooperatives
Abstract
Background Electronic clinical quality measures (eCQMs) from electronic health records (EHRs) are a key component of quality improvement (QI) initiatives in small-to-medium size primary care practices, but using eCQMs for QI can be challenging. Organizational strategies are needed to effectively operationalize eCQMs for QI in these practice settings.
Objective This study aimed to characterize strategies that seven regional cooperatives participating in the EvidenceNOW initiative developed to generate and report EHR-based eCQMs for QI in small-to-medium size practices.
Methods A qualitative study comprised of 17 interviews with representatives from all seven EvidenceNOW cooperatives was conducted. Interviewees included administrators with both strategic and cooperative-level operational responsibilities and external practice facilitators with hands-on experience helping practices use EHRs and eCQMs. A subteam conducted 1-hour semistructured telephone interviews with administrators and practice facilitators, then analyzed interview transcripts using immersion crystallization. The analysis and a conceptual model were vetted and approved by the larger group of coauthors.
Results Cooperative strategies consisted of efforts in four key domains. First, cooperative adaptation shaped overall strategies for calculating eCQMs, whether from EHRs, a centralized source, or a “hybrid strategy” of the two. Second, eCQM generation described how EHR data were extracted, validated, and reported for calculating eCQMs. Third, practice facilitation characterized how facilitators with backgrounds in health information technology (IT) delivered services and solutions for data capture, data quality, and practice support. Fourth, performance reporting strategies and tools informed QI efforts and how cooperatives could alter their approaches to eCQMs.
Conclusion Cooperatives ultimately generated and reported eCQMs using hybrid strategies because they determined neither EHRs alone nor centralized sources alone could operationalize eCQMs for QI. This required cooperatives to devise solutions and utilize resources that often are unavailable to typical small-to-medium-sized practices. The experiences from EvidenceNOW cooperatives provide insights into how organizations can plan for challenges and operationalize EHR-based eCQMs.
Keywords
quality improvement - primary health care - electronic health records - quality assurance - health care
Background and Significance
A decades-long goal in the United States has been to link reimbursement for primary care services to the measurable quality of those services.[1] Central to this goal has been the development and implementation of electronic clinical quality measures (eCQMs) by which primary care providers, when using electronic health records (EHRs), could effectively and accurately report on the quality of the care they provide.[2] [3] EHRs are mechanisms by which eCQMs can be generated and reported to measure clinical performance.[4] Ideally, EHRs capture the key clinical data necessary to generate eCQMs through routine clinical care for accurate and reliable performance results.[5] [6] In practice, however, generating and reporting eCQMs can be challenging due to factors such as variable data quality and limits to EHR functionality.[7]
In the primary care domain, several studies found that substantial proportions of EHR-based eCQMs had incomplete or incorrect results[8] [9] [10]; and Balasubramanian et al surveyed 1,181 medical practices and found that almost 20% reported being unable to create eCQM-based quality reports.[11] Cohen et al reported eCQM-related challenges with EHRs that included a lack of functionality for developing performance reports that could meet user needs, discordance between clinical guidelines and measures available in reports, and delays in developing regional data infrastructure to support reporting.[12] In addition to the technical challenges, generating accurate eCQMs from EHRs may require significant behavioral changes on the part of clinicians and investments in people, technology, and processes.[13]
Navigating these changes may be especially difficult for small-to-medium-sized primary care practices, as they routinely have limited access to formal health information technology (IT) and to health IT support staff within practices, and are late adopters of EHRs.[14] Strategies and tools tailored to the unique environment of small- and medium-sized primary care practices are necessary to achieve successful and sustainable quality improvement (QI) in primary care.
In an effort to utilize eCQMs to promote QI strategies, build capacity in practices for delivering evidence-based care, and improve cardiovascular disease (CVD)-related patient outcomes in small- and medium-sized primary care practices, the Agency for Healthcare Research and Quality (AHRQ) funded the EvidenceNOW initiative.[15] The initiative focused on improving the “ABCS” from the Million Hearts[16] initiative, that is, aspirin prescribing (A); blood pressure control (B); cholesterol management (C); and smoking cessation (S). EvidenceNOW was a real-world experiment in primary care QI that included seven regional “cooperatives”; each enrolled approximately 250 small-to-medium-sized practices and aimed to use EHRs, as well as support from external practice facilitators (PFs), to generate, aggregate, and transmit eCQMs on a quarterly basis.
Objective
Our objective was to identify strategies that could help future QI initiatives generate and report eCQMs using EHRs in primary care practices. These strategies came directly from the seven cooperatives engaged in EvidenceNOW who were charged with developing methods for measuring the ABCS eCQMs from the primary care practices.
Methods
We conducted the study over a 10-month period in 2018 in the following three stages: (1) identify concepts for eCQMs and EHRs that informed the design of semistructured interview guides, (2) conduct semistructured telephone interviews with administrators and PFs who worked within EvidenceNOW cooperatives, and (3) analyze interview transcripts using an immersion-crystallization method to qualitatively determine key strategies for generating and reporting eCQMs for QI. Northwestern University's Institutional Review Board approved this study.
Developing a Working Framework
The primary research team comprised health informaticians (J.E.R. and L.V.R.) and a health services graduate student (A.R.). To promote reflexivity and protect against bias, the three gained a joint understanding of the literature in eCQMs and EHRs, determined key concepts, then arranged those key concepts into a working framework that was vetted and approved by the larger group of coauthors. From that effort the team developed two semistructured interview guides: (1) one for cooperative leaders and (2) one for PFs. Each guide included open-ended questions and probes ([Supplementary Appendix A], available in the online version). We iteratively revised the guides based on feedback from the coauthors and piloted the questions with an external researcher from one cooperative.
Recruiting and Conducting Semistructured Interviews
To gain a holistic picture of implementation approaches, we identified roles across cooperatives that could describe the macro-level strategic plans, as well as the micro-level interactions with practices and providers. We recruited a purposive sample from each cooperative of at least one leader with both health IT strategic and cooperative-level operational responsibilities (macro view), and one PF with hands-on experience helping practices enrolled in EvidenceNOW use EHRs and eCQMs (micro view). As our focus was on organizational-level strategies, we did not interview practicing clinicians from practices participating in EvidenceNOW. The cooperatives' principal investigators and research committee members recommended participants from their teams, after which we used snowball sampling to add other relevant participants based on the sample criteria.
We conducted interviews with cooperative leaders and/or senior researchers and PFs between April and July 2018. Interviews included at least a lead interviewer and secondary interviewer who provided follow-up or clarification questions as needed. All interviewees provided verbal consent. The research team debriefed after all interviews but one. Any notes from interviews and debriefs were stored for analysis. Interview recordings were transcribed by a third-party organization, and we imported transcripts and notes into qualitative software to conduct the analysis (Dedoose v8, SocioCultural Research Consultants, LLC, Manhattan Beach, CA).
Analysis
The primary research team used an immersion-crystallization method for identifying strategies that EvidenceNOW cooperatives employed for generating and reporting eCQMs from EHRs for QI in CVD preventive care.[17] The method employed inductive analyses of primary data (“immersion”), then iteratively reflected on the analyses and formed “themes and categories” based on patterns found in the immersion process (“crystallization”).[18] The primary team members independently coded each interview transcript, compared codes, resolved coding disagreements by referring to transcripts and notes, and iteratively updated a codebook and a conceptual framework as new themes emerged. The team conducted analyses until it reached thematic saturation and internally triangulated its findings by vetting with the larger group of coauthors. The coauthors critiqued and dialoged with the primary team throughout the analysis phase and helped to finalize an overall conceptual model.
Results
We invited 22 individuals to participate and conducted 17 audio-recorded telephone interviews (77% participation rate). Of the 17 interviewees, 10 were cooperative leaders and/or senior researchers and 7 were PFs with EHR technical experience; the interviews averaged 53.7 minutes (range: 37.9–67.3 minutes).
We determined that cooperatives' approaches for generating and reporting eCQMs hinged on four fundamental and interrelated strategies: (1) cooperative adaptation that directed resources and staffing; (2) eCQM generation by means of extraction, reporting, and validation; (3) practice facilitation via PFs who were skilled in EHR reporting and practice engagement; and (4) performance reporting by which eCQM performance was provided back to practices. We provide descriptions for each of the domains in [Table 1] along with supporting interviewee quotes (see [Supplementary Appendix B], available in the online version, for a complete glossary).
[Table 1] Abbreviations: eCQM, electronic clinical quality measures; EHR, electronic health record; HIT, health information technology; IT, information technology; PFs, practice facilitators.
Cooperative Adaptation
The seven EvidenceNOW cooperatives initially intended to generate eCQMs from EHRs based on one of three strategies. Cooperatives planned either to use EHRs as the primary means of generating and reporting eCQMs (eCQM-level reporting, n = 3); to use EHRs for patient data collection and then relay those data to third parties to generate and report the eCQMs (patient-level reporting, n = 2); or to use a hybrid of those approaches (hybrid reporting, n = 2). Interviewees described that their EHR vendors had fewer “off-the-shelf” solutions for eCQM reporting than they originally anticipated. Therefore, cooperative teams iteratively revised their overall strategies. By the end of the project period, all cooperatives had adopted hybrid strategies that combined native EHR functionalities with some form of third-party services and solutions to generate and report eCQMs. Integral to their evolution, cooperatives learned key lessons around staffing, resources in reserve, and adjusting research objectives.
Electronic Health Records to Generate and Report Electronic Clinical Quality Measures
Three cooperatives planned for their practices to use existing certified EHRs for generating eCQMs. Two of these three cooperatives planned to have practice staff generate and report eCQMs to the cooperative, whereas the third delegated those tasks to PFs. A participant explained that this approach could be “better aligned … with the current pay-for-performance and reporting strategies … that exist.” The cooperatives utilized technical infrastructures that supported over 20 different EHR vendors, each using as many as 12 staff members and reportedly contracting with “many dozen more” third-party vendors. To manage eCQMs from EHRs across practices, the cooperatives employed electronic resources and tools, including online repositories to store source code, and PFs used screen-sharing software to provide practice-level technical support. However, cooperatives that initially used the EHR-centric approach experienced difficulties transmitting eCQMs beyond their EHRs and so adapted by offering data aggregation and registry services from third parties.
Third Parties to Generate and Report Electronic Clinical Quality Measures
In contrast, two cooperatives planned to use EHRs for data collection and then have those data sent to third parties who would generate and report the eCQMs. As examples, one cooperative used an open-source tool called popHealth, whereas another sought to collect EHR data via a statewide health information exchange that would aggregate the data for calculating relevant eCQMs. The latter cooperative planned (and was ultimately able) to use aggregate data to develop a patient-specific 10-year risk calculator for predicting atherosclerotic CVD (ASCVD); the generated eCQM results and risk scores were returned to practices via a centralized reporting system.
Both cooperatives employed joint academic, public, and private management structures to generate eCQMs. The organizational structures were typically built from preexisting relationships established in supporting health IT and QI efforts, and included myriad organization types (e.g., universities, regional extension centers [RECs], health care networks, and state agencies). Staff and contractors included data experts (e.g., a program analyst and an SAS programmer) who could “clean, normalize, and transform” EHR data. However, cooperatives that initially chose this approach encountered challenges which included data quality issues that prevented eCQM calculations. Interviewees from these cooperatives also noted mistrust from practices that may have been uncomfortable sharing patient-level data. These two cooperatives adapted by allowing some of their practices to generate eCQMs directly from EHRs rather than using the third party.
Hybrid Approaches to Generating and Reporting Electronic Clinical Quality Measures
The two remaining cooperatives planned to use hybrid approaches by calculating eCQMs from within and outside of EHRs. One intended for its practices to initially generate eCQMs from their EHRs but over time rely on a statewide health information exchange (HIE) to generate eCQMs based on practices' aggregated data. The other cooperative's strategy was for one of its two participating networks to rely on EHRs for eCQMs, whereas the other was to have patient-level data automatically transmitted to a state-wide data warehouse in order to calculate eCQMs and provide the cooperative with aggregated data. Both cooperatives were led by academic researchers and partnered with various entities to obtain EHR data for eCQMs including QI organizations with long histories of supporting health IT in primary care. One cooperative that applied a hybrid strategy had to adapt after one of its two partners went out of business.
Evolution to Hybrid Approaches for EHRs and Electronic Clinical Quality Measures
Regardless of their initial starting points, all seven cooperatives evolved to use a hybrid approach that combined EHR functions with third-party services. This meant that cooperatives had to invent their own solutions, which required additional resources, staff, and ingenuity. One successful organizational strategy was to build core teams with EHR and eCQM implementation expertise that were highly adaptable and could expend effort to balance project requirements with practices' goals. Key to success was being both responsive to practices' stated needs around eCQMs and balancing that with what EHRs could deliver. Effective responsiveness also meant that cooperatives directed both PFs and health IT leads to engage in in-person and online conversations to develop common terminology and understanding of needs and technical capacity. One interviewee noted that cultural differences between “database people, EHR people, and clinicians” required time to address. One solution was for cooperative representatives to routinely meet to discuss challenges and use the group's technical and clinical expertise to collectively problem solve. Another successful organizational strategy was to forge partnerships with third-party vendors who brought additional EHR, data, and data aggregation expertise. Additionally, third-party vendors played key roles in negotiations with EHR vendors on how to generate and report eCQMs.
Participants noted that further coordination and standardization at state and federal levels are needed because measure definitions around a similar topic (e.g., from U.S. Health Resources and Services Administration [HRSA] or the Centers for Medicare and Medicaid Services [CMS]) may have different specifications and may not correctly compare quality performance despite using certified EHRs. Participants suggested that organizations could benefit from additional guidance from state-level public health departments (or a hypothetical “department of information resources”) and coordination of eCQM efforts at the federal level from agencies like CMS and the National Committee for Quality Assurance (NCQA).
Electronic Clinical Quality Measure Generation
This section describes the strategies that underlie the interrelated processes for transforming EHR data into eCQMs via data extraction, validation, and reporting.
Electronic Clinical Quality Measure Extraction
Cooperatives interacted with EHRs of various capabilities for extracting eCQMs. One strategy was to build and use libraries of database query logic that could be applied (or adapted) to different EHRs to perform eCQM extraction. This enabled four cooperatives to centralize how and when they pulled data from EHRs, even when practice-level customization was necessary. Cooperatives that partnered with a registry or HIE received tailored eCQM results, including results by user-defined time periods. Another strategy was asking EHR vendors to provide stock eCQM reports, which enabled cooperatives to better gauge the capabilities of EHRs and better understand the extent to which eCQM reports could be modified. Interviewees noted, however, that some EHR vendors required the purchase of additional modules in order to generate reports. Some interviewees explained that they were unable to extract data for eCQMs despite their efforts, and so addressed that by falling back on manual chart reviews. One interviewee described having to reach out to the Office of the National Coordinator to engage an EHR vendor that inappropriately requested additional payments to produce eCQM results.
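As an illustration of the query-library strategy (not the cooperatives' actual code), the sketch below shows one way vendor-keyed query logic for the blood pressure control (“B”) measure might be organized. All table and column names are hypothetical, and SQLite stands in for a practice's reporting database only for simplicity.

```python
# Minimal sketch of a vendor-keyed query library for one eCQM (blood pressure
# control). Table and column names are hypothetical; real EHR schemas differ
# by vendor and site, and SQLite is used here only as a stand-in database.
import sqlite3

BP_CONTROL_QUERIES = {
    "vendor_a": """
        SELECT COUNT(*) AS denominator,
               SUM(CASE WHEN systolic < 140 AND diastolic < 90 THEN 1 ELSE 0 END) AS numerator
        FROM latest_bp
        WHERE has_hypertension = 1 AND age BETWEEN 18 AND 85
    """,
    "vendor_b": """
        SELECT COUNT(*) AS denominator,
               SUM(CASE WHEN last_sbp < 140 AND last_dbp < 90 THEN 1 ELSE 0 END) AS numerator
        FROM patient_summary
        WHERE htn_flag = 'Y' AND age_years BETWEEN 18 AND 85
    """,
}

def extract_bp_control(db_path: str, vendor: str) -> dict:
    """Run the vendor-appropriate query and return numerator/denominator counts."""
    with sqlite3.connect(db_path) as conn:
        denominator, numerator = conn.execute(BP_CONTROL_QUERIES[vendor]).fetchone()
    return {"numerator": numerator or 0, "denominator": denominator or 0}
```

Keeping the vendor-specific logic in one shared library is what allows extraction runs to be centralized and repeated, even when individual practices need customized queries.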
Electronic Clinical Quality Measure Validation
Cooperatives took steps to validate EHR data for calculating eCQMs. One strategy was to develop templates against which they tested subsets of ABCS data. Those tests flagged data quality issues such as unrealistic totals or numerators larger than denominators. Cooperatives also invited clinicians to quality check EHR data and eCQM results because they found that clinicians were oftentimes able to intuit data quality errors that technical teams missed. In addition to raising potential data quality issues, interviewees noted that involving clinicians had the added benefit of gaining their buy-in for using eCQMs and for QI more broadly.
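A minimal sketch of the kind of automated check described above follows; the specific rules, thresholds, and field names are illustrative rather than the cooperatives' actual validation templates.

```python
# Minimal sketch of automated eCQM sanity checks (e.g., numerator not larger
# than denominator, totals within a plausible range). Rules are illustrative.

def validate_ecqm_result(result: dict, panel_size: int) -> list[str]:
    """Return data quality flags for one measure result."""
    flags = []
    num, den = result["numerator"], result["denominator"]
    if den == 0:
        flags.append("empty denominator: no eligible patients found")
    if num > den:
        flags.append("numerator larger than denominator")
    if den > panel_size:
        flags.append("denominator exceeds reported patient panel size")
    return flags

# Example: more patients met the measure than were eligible, which should be flagged.
print(validate_ecqm_result({"numerator": 412, "denominator": 380}, panel_size=2500))
# ['numerator larger than denominator']
```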
Electronic Clinical Quality Measure Reporting
Cooperatives developed solutions that enabled practices to send EHR data for calculating eCQMs. Some cooperatives sent periodic reminders to practices when eCQMs were due. For example, one cooperative e-mailed quarterly reminders to meet EvidenceNOW's quarterly reporting requirement. Interviewees advised that other organizations develop reliable and accurate processes and documentation for mapping eCQMs to multiple EHR data structures. Another key lesson was to partner with providers and leverage their expertise and familiarity with their own patient data to monitor quality while also carrying out internal data quality checks by using what one participant termed, “[a] watchdog approach.”
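One way such mapping documentation can be kept reliable is to make it machine-readable; the sketch below shows this idea under stated assumptions, with measure elements and EHR field names that are hypothetical rather than drawn from the cooperatives' systems.

```python
# Minimal sketch of a machine-readable map from eCQM data elements to
# vendor-specific EHR fields. Element and field names are hypothetical.
ECQM_FIELD_MAP = {
    "tobacco_screening": {  # smoking cessation ("S") data element
        "vendor_a": {"table": "social_history", "field": "smoking_status"},
        "vendor_b": {"table": "risk_factors", "field": "tobacco_use_code"},
    },
    "aspirin_therapy": {    # aspirin prescribing ("A") data element
        "vendor_a": {"table": "medications", "field": "aspirin_active"},
        "vendor_b": {"table": "med_list", "field": "asa_flag"},
    },
}

def resolve_field(element: str, vendor: str) -> str:
    """Return the 'table.field' location of a measure element for one EHR."""
    entry = ECQM_FIELD_MAP[element][vendor]
    return f"{entry['table']}.{entry['field']}"

print(resolve_field("tobacco_screening", "vendor_b"))  # risk_factors.tobacco_use_code
```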
Practice Facilitation
Interviewees stated that PFs played key roles for generating and reporting eCQMs by supporting bidirectional communication between practices and technical teams. PFs notified cooperatives of potential risks to eCQM reporting such as confusing user interfaces that could negatively impact downstream data quality. In some instances, PFs provided “boots on the ground” to hand deliver eCQM performance reports or technical updates. Communication strategies extended beyond e-mail to include screensharing and interactive webinars, so that they could remotely engage practices and address issues such as EHR navigation and/or eCQM calculation in real time.
Creating valid eCQMs relied on data being reliably entered into the EHR and on correct eCQM report configuration (if supported by the EHR). Cooperatives used different models for offering technical assistance to optimize use of the EHRs for eCQM reporting: (1) separating PFs focused on practice QI (“coaches”) from PFs focused on EHRs (“health IT facilitators”) and (2) combining the two roles into one. Cooperatives that separated the roles explained that health IT facilitators provided a unique skill set that better addressed EHR- and eCQM-specific issues and clinician questions about both topics. They advocated for keeping the roles separate, rather than combined, because of the difficulty of finding staff with the uncommon combination of technical knowledge and the ability to effectively engage clinicians and practice staff.
Performance Reporting
All seven cooperatives used feedback strategies to report eCQM performance back to practices. A common approach was framing eCQM results in terms of benchmarks by which comparisons and trends were displayed. Benchmarking enabled clinicians to compare their performance against others at a regional or national level. Some cooperatives developed eCQM dashboards that could include performance tracking at multiple levels: clinician patient panels that clinicians could annotate, aggregated eCQM measures within practices' surrounding neighborhoods, and system-level performance across an entire state. One cooperative used EHR data to develop a patient-level ASCVD risk stratification tool for clinicians. A third approach was for PFs to ensure clinicians and practices were actively using their eCQM results. This included PFs sitting down with physicians and staff to ensure that they had an active login and password to access results, and conducting in-person and virtual touchpoints to review eCQM results. [Fig. 1] illustrates examples of how eCQM-based performance feedback was delivered to practices.
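As a simple illustration of benchmark-style feedback (not a cooperative's actual dashboard logic), the sketch below turns numerator/denominator counts into a performance rate and states the gap to a benchmark; the practice name, counts, and the 70% benchmark are placeholders.

```python
# Minimal sketch of benchmark-style performance feedback for one practice.
# The practice, counts, and 70% benchmark are placeholders, not EvidenceNOW values.

def performance_rate(numerator: int, denominator: int) -> float:
    """Percentage of eligible patients meeting the measure."""
    return 100.0 * numerator / denominator if denominator else 0.0

def feedback_line(practice: str, num: int, den: int, benchmark: float) -> str:
    """Compose a one-line comparison of a practice's rate against a benchmark."""
    rate = performance_rate(num, den)
    gap = rate - benchmark
    direction = "above" if gap >= 0 else "below"
    return (f"{practice}: blood pressure control {rate:.1f}% "
            f"({abs(gap):.1f} points {direction} the regional benchmark of {benchmark:.0f}%)")

print(feedback_line("Practice 12", num=265, den=380, benchmark=70.0))
# Practice 12: blood pressure control 69.7% (0.3 points below the regional benchmark of 70%)
```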
Based on these results and the definitions provided in [Table 1], we offer a conceptual model of key strategic areas in supporting the use of eCQMs. Our model graphically represents the interrelated strategies that EvidenceNOW cooperatives employed to generate and report eCQMs ([Fig. 2]).
This model of key strategic areas may inform stakeholders who intend to undertake future eCQM-related projects. We further address the model in the “Discussion” section below.
Discussion
EvidenceNOW cooperatives developed a variety of strategies for their eCQM efforts. The strategies addressed sociotechnical issues, that is, complex interactions between technology and organizational factors such as health IT staffing.[19] [20] Although EHRs were a necessary component, they alone were insufficient for carrying out eCQM efforts, which also required consideration of the personal interactions among clinicians, PFs, and the EHR. We have compiled a list of recommendations in [Table 2] for generating and reporting eCQMs based on the EvidenceNOW experience.
[Table 2] Abbreviations: eCQM, electronic clinical quality measures; EHR, electronic health record; QI, quality improvement.
Cooperative Adaptation
We found that the strategies most EvidenceNOW cooperatives began with to generate and report eCQMs had to evolve, as they learned the capabilities (and limitations) of their EHRs and other health IT. Their initial strategies were either (1) an EHR-centric strategy whereby EHRs would store patient data and calculate eCQMs; (2) a third-party strategy whereby EHRs would transmit patient data to an external entity that would calculate eCQMs; or (3) a hybrid strategy of the two. Cooperatives ultimately employed hybrid strategies because they determined neither EHRs alone nor third parties alone could calculate eCQMs for QI in the participating practices, and their solutions required resources that often are unavailable to typical practices. Key to carrying out the work was to employ core staff with expertise in EHRs and/or eCQMs and promote cultures that valued flexibility, resilience, and communication. These “soft skills” could enable cooperatives to maintain effective formal working relationships with contractors or informal relationships with project champions within practice sites.
Electronic Clinical Quality Measure Generation
Similar to findings from Cohen et al,[12] interviewees described encountering a variety of challenges, including difficulties generating eCQM reports from EHRs for patient panels within user-specified time periods and the fact that many practices lack the expertise or staff to generate eCQMs. EvidenceNOW was designed to understand the effect of practice facilitation on improving guideline-recommended CVD prevention and treatment, providing a unique opportunity to demonstrate how practice facilitation can support practices in generating reports and using them to support QI. Yet, study participants also described types of challenges not previously reported, including the need to develop practice-specific technical solutions for issues with eCQM generation, inability to connect some EHRs to data aggregators or warehouses, and difficulty engaging clinicians to utilize eCQMs due to limited time or lack of perceived value of this activity.
These challenges in using EHRs to generate eCQMs are notable because national standards exist for the representation of clinical data to allow eCQM calculations. The eCQM standards require time and effort that can make it difficult to generate a new eCQM in response to newly available risk scores or EHR-derived reports. Despite those standards, participants reported that the basic functionalities offered by some certified EHR vendors could not deliver data for calculating eCQMs. Furthermore, some vendor-based solutions reportedly required licensing as additional components at additional costs. As Green et al[21] discussed in the context of “meaningful use,” particularly in low-resource settings, secondary fees for EHR functionalities act as barriers to effective use. Cooperatives devised several solutions to circumvent local limitations that EHRs had with eCQMs. For example, some cooperatives implemented workarounds, such as custom data extraction scripts or manual chart reviews, while others reserved budget to contract with EHR vendors to customize eCQM reports. Another approach that a cooperative employed was to trace from an eCQM calculation to individual patient records as part of the data processing phases of eCQM validation and reporting. This allowed providers who doubted the validity of the eCQM calculation to see what was recorded in the EHR for a subset of patients and allowed practices to create targeted patient intervention lists.
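A minimal sketch of that tracing step is shown below, assuming patient-level rows that carry the measure's eligibility (denominator) and numerator flags; the field names are hypothetical. The same rows can be used to build the targeted patient intervention lists described above.

```python
# Minimal sketch of tracing an eCQM result back to patient-level records to
# produce a targeted intervention list. Field names are hypothetical.
from dataclasses import dataclass

@dataclass
class PatientRow:
    patient_id: str
    in_denominator: bool  # eligible for the measure
    in_numerator: bool    # met the measure (e.g., blood pressure controlled)

def intervention_list(rows: list[PatientRow]) -> list[str]:
    """Patients who are eligible for the measure but did not meet it."""
    return [r.patient_id for r in rows if r.in_denominator and not r.in_numerator]

rows = [
    PatientRow("p001", in_denominator=True, in_numerator=True),
    PatientRow("p002", in_denominator=True, in_numerator=False),   # needs outreach
    PatientRow("p003", in_denominator=False, in_numerator=False),  # not eligible
]
print(intervention_list(rows))  # ['p002']
```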
Performance Reporting
Cooperatives made efforts to deliver eCQM performance reports to engage clinicians and promote QI efforts. Key among these efforts were cooperative-developed electronic dashboards that enabled clinicians to compare themselves and their practice to others based on eCQM results. Cooperatives developed dashboards based on regional calculations and identified national benchmarks. Providing benchmark dashboards was a sensible approach for QI given that previous research has empirically shown performance improvements when primary care providers have access to such tools.[22] [23] Based on our findings, we believe that access to repositories of benchmark data could be of value to promote the use of eCQMs for QI. This is provided to some degree by CMS through the Merit-based Incentive Payment System (MIPS)[24]; however, we believe that there remains a need for additional regional and national benchmark resources.
Practice Facilitation
A final key takeaway from our research is that practices using EHRs for eCQMs require a substantial amount of hands-on support. As the cooperatives' strategies indicated, implementing eCQMs for QI may require solutions external to EHR-embedded functionalities. For example, cooperatives noted that PFs created manual workarounds to identify individual patients even when the EHR did not have this embedded functionality. This allowed the high-level concept of the eCQM calculation to be tailored to the practice and provider by demonstrating which of their patients needed an intervention. This need for hands-on support is consistent with other findings reported in the literature for small-to-medium-sized practices.[11] [14] [21] [22] It argues for developing strategies for maintaining support, like that offered by the PFs in this study, to facilitate adoption of innovations in practice, monitoring, and reporting, particularly for small-to-medium-sized practices. Sustainability of support outside of grant-funded programs, like EvidenceNOW, will be a critical component of this success.
Participants noted that cooperatives took technology-forward steps, such as screen sharing, to maintain interpersonal connectedness while bridging geographical distances. These seemingly “light” information tools may not be the domain of informatics research, but they can be important tools for engaging practice staff, identifying gaps and needs, and informing organizations of the challenges that practicing clinicians face with EHRs and eCQMs.
Conceptual Framework
To address the challenges of generating and reporting eCQMs from EHRs, EvidenceNOW cooperatives adapted their organizational structures, leveraged EHRs and EHR data, and engaged with participating practices in varying ways. Toward this end, we generated a conceptual framework ([Fig. 2]) to illustrate the interconnected domains that were generally applied across all seven cooperatives, even though they strategically started in different places. Overall, cooperatives had to adapt by modifying and executing interrelated processes having to do with practice facilitation, performance reporting, and the subprocesses that went into eCQM generation. We believe that the cooperatives' solutions provide valuable insights for small- and medium-sized practices that may be expected to generate and report eCQMs for QI purposes.
Limitations
Our study has limitations to note. First, the EvidenceNOW practices were part of a funded study, which in turn afforded special attention from PFs, as well as supplemental funding to offset costs related to quality measure reporting. These results may not be generalizable to other practices that do not have comparable support in place. Second, many practices had past relationships with their EvidenceNOW cooperative partners, some of whom previously served as RECs funded by the Office of the National Coordinator for Health Information Technology to support meaningful use of EHRs. As relationship building and maintenance can be an important strategy for success, facilitators engaging future practices may require additional time before the described strategies are successful. Third, limited time and resources prevented us from empirically determining any effects that resulted from any eCQM strategy. Finally, these results are based on interviews with administrators and facilitators from each cooperative within EvidenceNOW. We recognize these results may not be generalizable outside of this study setting, and therefore caution in interpretation is needed. Furthermore, our evaluation of “success” did not involve interviews with practice or provider staff.
Conclusion
In a large national sample of small-to-medium-sized primary care practices from the EvidenceNOW consortium, we identified the following four key process domains when developing strategies to operationalize eCQMs via EHRs: (1) cooperative adaptation, (2) eCQM generation capacity, (3) performance reporting requirements, and (4) practice facilitation capacity. Although the seven cooperatives differed at the start of the project across the four domains, by the end of the project period, they coalesced around hybrid, practical reporting solutions. These strategies accommodated limitations in native EHR functionality and delivered quality measures back to practices via hands-on, high-touch methods through on-site PFs. Our findings support the need for further improvements in EHR quality reporting capacity and on-site support staff or external practice facilitation to enable widespread uptake of quality measurement within primary care practices.
Clinical Relevance Statement
Primary care practices are being required to capture and report results from electronic clinical quality measures (eCQMs) to demonstrate the quality of their patient care. The literature contains few real-world examples of how primary care practices systematically plan, execute, and evaluate how they put eCQMs into practice via electronic health records (EHRs). This effort provides lessons learned as to the challenges and facilitators primary care practices can face when operationalizing eCQMs in EHRs.
Multiple Choice Questions
1. Under which of the following did a cooperative use a strategy of building libraries of database query logic for different EHRs?
a. eCQM generation
b. eCQM extraction
c. eCQM validation
d. eCQM reporting
Correct Answer: The correct answer is option b. eCQM extraction encompassed the strategies that cooperatives used to pull eCQM data from a variety of EHRs. Extraction was difficult because EHRs had different technical capabilities and data structures.
2. What is a strategy that practice facilitators used to promote eCQM performance at practice sites?
a. Meetings
b. Online videos
c. Dashboards
d. Newsletters
Correct Answer: The correct answer is option c. Cooperatives built dashboards based on individual eCQMs for the ABCS that reported performance over time compared with other practices or providers.
Conflict of Interest
None declared.
Acknowledgements
We would like to acknowledge the valuable contributions from Milton Garrett III and Pauline Kenly for this study.
Protection of Human and Animal Subjects
The Northwestern University Institutional Review Board approved this study.
References
- 1 Chatterjee P, Joynt KE. Do cardiology quality measures actually improve patient outcomes?. J Am Heart Assoc 2014; 3 (01) e000404
- 2 D'Amore JD, Li C, McCrary L. et al. Using clinical data standards to measure quality: a new approach. Appl Clin Inform 2018; 9 (02) 422-431
- 3 McClure RC, Macumber CL, Skapik JL, Smith AM. Igniting harmonized digital clinical quality measurement through terminology, CQL, and FHIR. Appl Clin Inform 2020; 11 (01) 23-33
- 4 eCQMs. About eCQI. Accessed April 1, 2019 at: https://ecqi.healthit.gov/ecqms
- 5 Marcotte L, Seidman J, Trudel K. et al. Achieving meaningful use of health information technology: a guide for physicians to the EHR incentive programs. Arch Intern Med 2012; 172 (09) 731-736
- 6 Payne TH, Corley S, Cullen TA. et al. Report of the AMIA EHR-2020 task force on the status and future direction of EHRs. J Am Med Inform Assoc 2015; 22 (05) 1102-1110
- 7 Johnson SG, Speedie S, Simon G, Kumar V, Westra BL. Quantifying the effect of data quality on the validity of an eMeasure. Appl Clin Inform 2017; 8 (04) 1012-1021
- 8 Heisey-Grove DM, Wall HK, Wright JS. Electronic clinical quality measure reporting challenges: findings from the Medicare EHR Incentive Program's Controlling High Blood Pressure Measure. J Am Med Inform Assoc 2018; 25 (02) 127-134
- 9 Colin NV, Cholan RA, Sachdeva B, Nealy BE, Parchman ML, Dorr DA. Understanding the impact of variations in measurement period reporting for electronic clinical quality measures. EGEMS (Wash DC) 2018; 6 (01) 17
- 10 Cholan RA, Weiskopf NG, Rhoton DL. et al. Specifications of clinical quality measures and value set vocabularies shift over time: a study of change through implementation differences. AMIA Annu Symp Proc 2018; 2017: 575-584
- 11 Balasubramanian BA, Marino M, Cohen DJ. et al. Use of quality improvement strategies among small to medium-size US primary care practices. Ann Fam Med 2018; 16 (Suppl. 01) S35-S43
- 12 Cohen DJ, Dorr DA, Knierim K. et al. Primary care practices' abilities and challenges in using electronic health record data for quality improvement. Health Aff (Millwood) 2018; 37 (04) 635-643
- 13 Kukhareva P, Weir CR, Staes C, Borbolla D, Slager S, Kawamoto K. Integration of clinical decision support and electronic clinical quality measurement: domain expert insights and implications for future direction. AMIA Annu Symp Proc 2018; 2018: 700-709
- 14 Goetz Goldberg D, Kuzel AJ, Feng LB, DeShazo JP, Love LE. EHRs in primary care practices: benefits, challenges, and successful strategies. Am J Manag Care 2012; 18 (02) e48-e54
- 15 Meyers D, Miller T, Genevro J. et al. EvidenceNOW: balancing primary care implementation and implementation research. Ann Fam Med 2018; 16 (Suppl. 01) S5-S11
- 16 Million hearts 2007 priorities. Accessed April 1, 2019 at: https://millionhearts.hhs.gov/
- 17 Borkan JM. Immersion/crystallization. In: Crabtree BF, Miller WL. eds. Doing Qualitative Research. Newbury Park, CA: Sage Publications; 1999
- 18 Borkan JM. Immersion–crystallization: a valuable analytic tool for healthcare research. Fam Pract. 2021;
- 19 Sittig DF, Singh H. A new sociotechnical model for studying health information technology in complex adaptive healthcare systems. Qual Saf Health Care 2010; 19 (Suppl. 03) i68-i74
- 20 Singh H, Sittig DF. Measuring and improving patient safety through health information technology: the Health IT Safety Framework. BMJ Qual Saf 2016; 25 (04) 226-232
- 21 Green LA, Potworowski G, Day A. et al. Sustaining “meaningful use” of health information technology in low-resource practices. Ann Fam Med 2015; 13 (01) 17-22
- 22 Kiefe CI, Allison JJ, Williams OD, Person SD, Weaver MT, Weissman NW. Improving quality improvement using achievable benchmarks for physician feedback: a randomized controlled trial. JAMA 2001; 285 (22) 2871-2879
- 23 Hermans MP, Elisaf M, Michel G. et al; OPTIMISE International Steering Committee. Benchmarking is associated with improved quality of care in type 2 diabetes: the OPTIMISE randomized, controlled trial. Diabetes Care 2013; 36 (11) 3388-3395
- 24 Resource library. Accessed March 28, 2022 at: https://qpp.cms.gov/resources/resource-library
Publication History
Article published online:
04 May 2022
© 2022. Thieme. All rights reserved.
Georg Thieme Verlag KG
Rüdigerstraße 14, 70469 Stuttgart, Germany