Appl Clin Inform 2022; 13(05): 1100-1107
DOI: 10.1055/a-1950-9032
Research Article

Decision Support to Improve Critical Care Services Documentation in an Academic Emergency Department

Robert W. Turer
1   Department of Emergency Medicine and Clinical Informatics Center, UT Southwestern Medical Center, Dallas, Texas, United States
,
John C. Champion
2   Department of Emergency Medicine, Vanderbilt University Medical Center, Nashville, Tennessee, United States
,
Brian S. Rothman
3   Department of Anesthesiology, Vanderbilt University Medical Center, Nashville, Tennessee, United States
4   Department of Biomedical Informatics, Vanderbilt University Medical Center, Nashville, Tennessee, United States
,
Heather S. Dunn
5   Department of Finance, Vanderbilt University Medical Center, Nashville, Tennessee, United States
,
Kenneth M. Jenkins
6   Department of Compliance, Vanderbilt University Medical Center, Nashville, Tennessee, United States
,
Olayinka Everham
7   Health IT, Vanderbilt University Medical Center, Nashville, Tennessee, United States
,
Tyler W. Barrett
2   Department of Emergency Medicine, Vanderbilt University Medical Center, Nashville, Tennessee, United States
,
Ian D. Jones
2   Department of Emergency Medicine, Vanderbilt University Medical Center, Nashville, Tennessee, United States
4   Department of Biomedical Informatics, Vanderbilt University Medical Center, Nashville, Tennessee, United States
7   Health IT, Vanderbilt University Medical Center, Nashville, Tennessee, United States
,
Michael J. Ward
2   Department of Emergency Medicine, Vanderbilt University Medical Center, Nashville, Tennessee, United States
4   Department of Biomedical Informatics, Vanderbilt University Medical Center, Nashville, Tennessee, United States
8   Geriatric Research, Education, and Clinical Center, Tennessee Valley Healthcare System (Veterans Affairs), Nashville, Tennessee, United States
,
Nathaniel M. Miller
2   Department of Emergency Medicine, Vanderbilt University Medical Center, Nashville, Tennessee, United States

Abstract

Objectives Critical care services (CCS) documentation affects billing, operations, and research. No studies exist on documentation decision support (DDS) for CCS in the emergency department (ED). We describe the design, implementation, and evaluation of a DDS tool built to improve CCS documentation at an academic ED.

Methods This quality improvement study reports the prospective design, implementation, and evaluation of a novel DDS tool for CCS documentation at an academic ED. CCS-associated ED diagnoses triggered a message to appear within the physician note attestation workflow for any patient seen in the adult ED. The alert raised awareness of CCS-associated diagnoses without recommending specific documentation practices. The message disappeared from the note automatically once signed. We measured Current Procedural Terminology (CPT) codes 99291 or 99292 (representing CCS rendered) for 8 months before and after deployment to identify CCS documentation rates. We performed state-space Bayesian time-series analysis to evaluate the causal effect of our intervention on CCS documentation capture. We used monthly ED volume and monthly admission rates as covariate time series for model generation.

Results The study included 92,350 ED patients with an observed mean CCS proportion of 3.9% before the intervention and 5.8% afterward. The counterfactual model predicted an average response of 3.9% [95% CI 3.5–4.3%]. The estimated absolute causal effect of the intervention was 2.0% [95% CI 1.5–2.4%] (p = 0.001).

Conclusion A DDS tool measurably increased ED CCS documentation. Attention to user workflows and collaboration with compliance and billing teams avoided alert fatigue and ensured compliance.



Background and Significance

Problem Description

Critical care services (CCS) represent an evaluation and management concept defined by the Centers for Medicare and Medicaid Services as “direct delivery of medical care by a physician(s) for a critically ill or critically injured patient.” More specifically, they define critical illness as illness that impairs one or more vital organ systems such that there is a “high probability of imminent or life threatening deterioration in the patient's condition” and that involves “high complexity decision making to assess, manipulate, and support vital system function(s) to treat single or multiple vital system organ failure and/or to prevent further life threatening deterioration of the patient's condition.”[1] The encoding of CCS in emergency departments (EDs) usually involves professional coders identifying CCS documented within clinical notes, resulting in CPT professional billing codes 99291 or 99292 (different combinations of these two codes reflect different durations of CCS provided). However, unlike other professional billing codes that can be abstracted by professional coders with minimal instruction from clinicians, CCS coding requires clinicians to explicitly declare in their documentation that CCS were performed. Further complicating the process of CCS billing is the subjective nature of determining which services meet criteria for CCS. These factors, combined with the fact that clinicians must remember to document CCS, may explain systematic under-documentation at certain sites.

Accurate capture of these codes and avoidance of over- or under-reporting have billing, operational, and research implications, which are evaluated by national benchmarks and government audits.[2] From a billing standpoint, CCS codes are reimbursed at a higher rate than other emergency medicine professional billing procedure codes, so under-capture of CCS represents a source of lost revenue for health systems. Operationally and for research, CCS coding can serve as a surrogate for population-level acuity, which can inform staffing decisions and studies involving measures of ED acuity. As volumes and acuity rise in EDs across the United States, the need for accurate documentation of CCS will likely follow.[3]



Available Knowledge and Knowledge Gaps

Appropriately implemented clinical decision support has been shown to positively impact clinical care in the ED.[4] There is little in the literature about the application of similar techniques to support ED documentation. Early work suggests that documentation decision support (DDS) tools promote a more accurate capture of documentation complexity.[5] To our knowledge, there are no published studies on the use of computerized DDS for improved capture of CCS documentation and encoding. Recognizing CCS through such tools may improve documentation of CCS. Prior studies have identified procedural and diagnostic characteristics of patients who frequently receive critical care that might facilitate such efforts.[6]

An ever-important consideration when designing decision support tools is the burden that such tools can place on clinicians when improperly designed.[7] [8] [9] User-centered design techniques applied from the design stage can mitigate some of this burden.[10] [11]



Rationale

The study was motivated by the discovery of a substantial difference between CCS capture rates in our ED compared with national benchmarks. Specifically, the Society for Academic Emergency Medicine's Academy of Administrators in Academic Emergency Medicine (AAAEM) Benchmark Survey reports the proportion of CCS documentation across primary academic EDs.[12] In 2021, the mean CCS percentage was 6.34%, and in 2020, the mean CCS proportion was 5.68% compared with our pre-intervention 2020 baseline mean monthly proportion of 3.9%. Given our institution's status as a level 1 trauma, burn, comprehensive stroke, and transplant center, we anticipated ED CCS rates to be in line with other quaternary centers. Institutional review of clinical documentation confirmed under-documentation of CCS by clinicians. Given the resource allocation, billing, and clinician staffing implications of inadequate CCS capture, we were tasked with identifying and correcting root causes of the discrepancy to improve the capture of CCS documentation for patients whose management justified such documentation. Representatives from ED operations, informatics, revenue cycle, and compliance collaborated as a team to intervene and meet the clinical, financial, and compliance requirements set forth by the department and institution.



Objectives

Our objective was to create a point-of-documentation electronic health record (EHR)-based DDS tool to improve CCS documentation capture. In this manuscript, we present our experience designing, implementing, and evaluating such a tool for use in clinical practice. We hypothesized that deployment of a DDS tool would increase the proportion of ED visits with documented CCS. We followed SQUIRE guidelines for reporting quality improvement studies as closely as possible in the crafting of this manuscript.[13]



Methods

Context

This study evaluates a quality improvement project at an academic medical center in the Southeast United States that provides quaternary care for our city, state, and surrounding states.



Intervention

Herein we describe the design, implementation, and evaluation of our Epic (Epic Systems Corporation, Verona, Wisconsin, United States) EHR-based DDS tool with a focus on decision support design and computer-assisted documentation compliance.

Decision Support Tool Design

We began with an informal workflow analysis that included ED observations, review of baseline CCS documentation practices, and informal interviews with ED faculty. In our ED, patient care is supervised by attending physicians, and documentation reflecting the care delivered is mostly performed by physicians-in-training (residents), nurse practitioners (NPs), or physician assistants (PAs). NPs and PAs may legally document CCS independently or in combination with an attending physician. In contrast, residents cannot document CCS: critical care provided by the resident, and time the attending physician spends teaching, are considered separately from CCS provided by the attending physician and may not be included in the attending's total critical care time.[14] [15]

Our observations and EHR review suggested that critical care in the ED was overwhelmingly provided by teaching teams consisting of attending and resident physicians. On these teams, completed resident notes are sent to the attending physician for “attestation,” wherein the attending attests their involvement in the care of the patient and documents necessary clarifications about rendered care. On teaching teams, CCS is documented by attending physicians as part of attestation.

Through interviewing the ED faculty, we learned that our traditional attestation workflow was poorly designed for capturing CCS documentation. Specifically, we discovered that the Epic SmartForm intended to capture CCS documentation was embedded within the residents' Epic NoteWriter documentation template ([Fig. 1]). Since residents do not document CCS, this form was mostly overlooked except in rare cases where attendings wrote notes by themselves without a resident. Instead of entering the chart and opening NoteWriter to update the SmartForm, most attending physicians reported using Epic's In Basket interface for attestation and, by proxy, CCS documentation. The In Basket has a less feature-rich text editor than Epic's NoteWriter module and does not allow for the inclusion of Epic SmartBlocks or SmartForms, which would otherwise be ideal tools for documenting CCS. Despite its limitations, this workflow was overwhelmingly preferred by attending physicians because it facilitated note viewing and attestation without opening the chart, avoiding the time-consuming context switch required by other workflows.

Fig. 1 Traditional Epic SmartForm-based documentation workflow. This was ineffective because it was used by the wrong users (resident physicians) at the wrong time in workflow. Attending physicians attesting notes did not have access to this tool within that workflow.

Given the findings from our workflow analysis, we applied the five rights of clinical decision support to design our DDS tool to improve CCS documentation capture.[11] The five rights are the right information, to the right person, in the right intervention format, through the right channel, at the right time in workflow; the framework serves as a simple design scaffold for operational clinical decision support. Our workflow analysis identified the attending physician as the right person and note attestation as the right time in workflow. Potential non-EHR channels included asynchronous emails from the billing or coding team, In Basket messages from clinical coders, and monitored work queues that identified documentation improvement opportunities. These proposals were abandoned due to alert fatigue concerns and an operationally unacceptable latency between initial care and documentation. Therefore, we chose the EHR-based attestation activity as the right channel. Determining the right information and the right format were the next two design challenges.

Documentation and billing decision support must clearly avoid any practice that could be construed as either fraud or practicing medicine without a license (by billers, coders, or the EHR itself). Therefore, the decision to document CCS must only come from the supervising physicians themselves. Compliant documentation recommendations may raise awareness of clinical characteristics often associated with CCS but must not explicitly instruct the clinician to document in a particular way. With these guiding principles, the right information required computable clinical markers associated with CCS and compliant decision support messaging.

We considered several critical care markers to trigger our intervention workflow, including vital sign abnormalities, ED diagnoses, specific laboratory abnormalities, procedure documentation, and medication administration. All cases required validation by the documenting physician, so false positives were less concerning than they might be in traditional clinical decision support workflows. Even so, alert fatigue remained a significant concern, and occasional false negatives would be acceptable. After reviewing these characteristics, we chose ED diagnoses documented by a treating clinician as an ideal trigger for our study to balance false positives against false negatives.



Knowledge Management

To build our list of diagnoses, we used a publicly available list of commonly documented critical care diagnoses (e.g., aortic dissection, anaphylaxis, and diabetic ketoacidosis)[16] and translated the listed clinical diagnoses into SNOMED-CT terminology concept hierarchies (SNOMED International, London, UK). Most diagnoses were sufficiently described by a single SNOMED-CT concept hierarchy while a few required two concepts to fully capture their scope. The diagnosis list and associated SNOMED-CT concepts are included in the [Supplementary Material] (available in the online version). These SNOMED-CT concept hierarchies were used to create Epic Groupers that facilitated maintainable knowledge management and drove DDS workflows.
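The grouper-based trigger logic can be illustrated with a small sketch. The diagnosis names and SNOMED-CT concept identifiers below are illustrative examples only; the production build used Epic Groupers derived from the full concept hierarchies, which expand each concept to its descendants, rather than the flat code sets shown here.

```python
# Illustrative sketch of the diagnosis-grouper trigger. The SNOMED-CT codes
# below are example concepts; the production tool used Epic Groupers that
# expand each concept hierarchy to all descendant concepts.
CCS_DX_GROUPERS = {
    "aortic dissection": {"308546005"},
    "anaphylaxis": {"39579001"},
    "diabetic ketoacidosis": {"420422005"},
}

# Flatten the groupers into one trigger set for fast membership checks.
CCS_TRIGGER_CODES = set().union(*CCS_DX_GROUPERS.values())

def has_ccs_associated_dx(filed_dx_codes):
    """Return True if any filed ED diagnosis falls in a CCS-associated grouper."""
    return bool(CCS_TRIGGER_CODES & set(filed_dx_codes))
```

In the production workflow, the equivalent check was performed by an Epic Clinical Engine Rule evaluating filed diagnoses against the Groupers, so the logic lived in EHR configuration rather than code.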



Compliance

With our triggering events established, we then addressed compliance concerns. DDS tools must not encourage clinicians, by suggestion or in fact, to document encounter details unsupported by the clinical care provided. Encouraging the documentation of any note element (e.g., history, physical examination, or testing) that was not conducted during the encounter is fraudulent. Our physician-proposed decision support language was revised in partnership with our office of compliance and clinical documentation integrity teams to arrive at a clinician reminder that did not recommend any particular documentation practice. Instead, the language raised awareness when ED diagnoses associated with CCS were present. The approved phrasing was: “This patient may have filed diagnoses associated with critical care.”



Technical Considerations

With the right information established, we explored the right format to present our message to clinicians. Epic features a “disappearing text” toolset that can be conditionally displayed in the limited attestation workflow and configured to disappear from the legal medical record if clinicians do not interact with the DDS tool.[17] We applied this toolset to present the compliant message as a “vanishing tip” that informed users CCS-associated diagnoses were present and always disappeared after signature. An adjacent interactive tool, implemented using an Epic SmartList, enabled real-time CCS documentation during attestation and disappeared if the clinician did not use it. These tools were incorporated into the default attestation documentation template, and the “vanishing tip” was configured to appear conditionally when CCS-associated diagnoses were present using Epic's Clinical Engine Rule framework and the Groupers discussed above. Having defined the right format, we proceeded to development, testing, and deployment.



Development, Testing, and Deployment

We first implemented the DDS tool and rigorously tested it in a non-production environment. Five of the authors (R.W.T., J.C.C., T.W.B., I.D.J., and N.M.M.) were clinically active faculty physicians and provided feedback on the tool's design and functionality in the test environment. The tool was demonstrated to our ED, compliance, finance, and coding leadership for approval. To educate the ED faculty, an instructional worksheet was distributed, then a tutorial on the use and value of the tool was presented at a faculty meeting. The tool was deployed as the default attestation for all ED attending physicians on April 1, 2021.



Study of the Intervention

The intervention's effect on CCS documentation was evaluated using a retrospective quasi-experimental design featuring a Bayesian time-series analysis.



Measures

To analyze the tool's performance in improving critical care documentation accuracy, we measured the percentage of monthly ED encounters with CCS rendered for 8 months before and after implementation of the DDS tool. We quantified CCS by whether CPT codes 99291 or 99292 were included in professional billing for the encounter, which allowed us to capture the impact of the intervention on clinician practice as well as coding and billing workflows. Due to the selected intervention, process measures such as the number of faculty who used or did not use the tool could not be collected. More specifically, no audit trail was available for use of the tool, and formal qualitative assessment was not practical given the unfunded status of the quality improvement initiative. We monitored admission rates throughout the study as a marker for population-level severity of illness, which served as a balancing measure.
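The outcome measure reduces to counting, per month, encounters whose professional billing includes either CCS CPT code. A minimal sketch, assuming billing data are available as (month, CPT codes) pairs (this data shape is hypothetical, not the study's actual extract):

```python
from collections import defaultdict

CCS_CPT = {"99291", "99292"}  # CPT codes indicating CCS rendered

def monthly_ccs_proportion(encounters):
    """Compute the monthly proportion of ED encounters with CCS rendered.

    encounters: iterable of (month, cpt_codes) pairs, where month is a
    'YYYY-MM' string and cpt_codes lists the encounter's professional
    billing codes. An encounter counts as CCS if it carries 99291 or 99292.
    """
    totals, ccs = defaultdict(int), defaultdict(int)
    for month, codes in encounters:
        totals[month] += 1
        if CCS_CPT & set(codes):
            ccs[month] += 1
    return {m: ccs[m] / totals[m] for m in totals}
```

For example, a month with three encounters, two of which billed 99291 or 99292, yields a proportion of 2/3 for that month.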



Analysis

To evaluate for a causal effect of the DDS tool on CCS capture rates, we performed a state-space Bayesian time-series analysis using pre-DDS-deployment data as a Bayesian prior to generate a counterfactual time series predicting what would have occurred had the intervention never taken place. We used the techniques described by Brodersen et al and implemented in the CausalImpact R library.[18] This approach requires control time series, parallel covariate observations not directly impacted by the intervention, when deriving the regression model; these allow the counterfactual model to account for other unrelated confounders. We used total monthly volume and monthly admission percentages as control time series. By subtracting the predicted counterfactual response from the observed response, a semiparametric Bayesian posterior distribution for the causal effect of the intervention was generated. This analysis yielded counterfactual versus observed estimated CCS capture rates and the estimated causal impact over time with 95% credible intervals (95% CI). We report the probability of the observed causal effect using the Bayesian one-sided tail-area probability. Analysis was performed using R version 4.1.2. We set α = 0.05 for significance testing.
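The published analysis used the CausalImpact R package. To illustrate only the core counterfactual-subtraction logic, the sketch below fits an ordinary least-squares model of the outcome on the covariates over the pre-intervention months and subtracts its post-intervention predictions from the observed series. This is a deliberately simplified frequentist analogue: it omits the local-trend state-space component, the Bayesian priors, and the credible intervals that CausalImpact provides, and the function and variable names are illustrative.

```python
import numpy as np

def counterfactual_effect(y, covariates, n_pre):
    """Estimate an intervention effect by counterfactual subtraction.

    y          : observed outcome series (e.g., monthly CCS proportion)
    covariates : control series (e.g., volume, admission rate) assumed
                 unaffected by the intervention
    n_pre      : number of pre-intervention months

    Returns the counterfactual prediction for the full series and the
    mean post-intervention effect (observed minus predicted).
    """
    X = np.column_stack([np.ones(len(y)), covariates])
    # Fit the regression on the pre-intervention period only.
    beta, *_ = np.linalg.lstsq(X[:n_pre], y[:n_pre], rcond=None)
    counterfactual = X @ beta
    effect = y[n_pre:] - counterfactual[n_pre:]
    return counterfactual, effect.mean()
```

A structural time-series model such as CausalImpact's additionally propagates uncertainty from the pre-period fit into the counterfactual, which is what yields the 95% credible intervals reported in the Results.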

Ethical Considerations and Protection of Human and Animal Subjects

The intervention described was performed as part of routine clinical care. The associated retrospective study was reviewed by the Vanderbilt University Medical Center Institutional Review Board and determined not to represent research.



Results

Our design process resulted in the creation of a real-time DDS tool for CCS documentation. A screenshot of the tool within the attestation workflow is shown in [Fig. 2] and a video of the CCS documentation workflow is included in the [Supplementary Material] (available in the online version).

Fig. 2 Documentation decision support tool embedded within attending physician attestation workflow. The blue text containing “This patient may have filed diagnoses associated with critical care” automatically disappears upon signing and cannot be modified. The yellow SmartList containing “Critical care services provided” facilitates documentation of critical care services when used but disappears otherwise.

During the pre-intervention period (August 2020–March 2021), the median monthly ED volume was 5,198 (interquartile range [IQR] 5,019–5,394) and the median monthly admission percentage was 41.2% (IQR 40.2–42.2%). After intervention (April 2021 to December 2021), the median monthly ED volume was 5,754 (IQR 5,473–5,822) and the median monthly admission percentage was 39.0% (IQR 38.7–39.1%).

The mean proportion of observed CCS was 3.9% before and 5.8% after DDS deployment. In the absence of an intervention, the counterfactual model predicted an average response of 3.9% [95% CI 3.5–4.3%]. The absolute causal effect of the intervention, estimated by subtracting the counterfactual predicted value from the observed response, was 2.0% [95% CI 1.5–2.4%]. The probability of obtaining this effect based on the observed data using the Bayesian one-sided tail-area probability was p = 0.001, suggesting the observed effect was unlikely to be due to random fluctuations. Plots of the observed versus counterfactual models and estimated causal effect over time are shown in [Fig. 3A] and [B], respectively.

Fig. 3 (A) The observed proportion of critical care services are shown as a solid line, while the counterfactual predicted proportion of critical care services are shown as a dotted line. Surrounding blue area represents the 95% credible interval. The observed critical care services percentage exceeds the 95% credible interval of the counterfactual model. (B) A causal effect estimate of the increase in critical care services provided is generated by subtracting the predicted from observed proportion of critical care services. We note evidence of a substantial causal effect after the DDS tool was deployed on April 1, 2021.


Discussion

Summary

Our study demonstrates a quality improvement effort at a single academic ED to enhance CCS documentation accuracy using a real-time, minimally interruptive EHR-based DDS tool. Our time series analysis suggests a statistically significant 51% increase in critical care documentation capture after deployment of our tool. Our post-intervention CCS capture proportion approaches expectations from AAAEM national benchmarks.
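The relative increase cited above follows directly from the reported estimates (values rounded as published):

```python
absolute_effect = 2.0   # estimated causal effect, in percentage points
counterfactual = 3.9    # counterfactual mean CCS proportion, in percent

# Relative increase over the counterfactual baseline: 2.0 / 3.9 ≈ 0.51.
relative_increase = absolute_effect / counterfactual
print(f"{relative_increase:.0%}")  # prints "51%"
```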



Interpretation

The contextual and conditional design was intended to minimize interruptions and avoid alert fatigue. Our embedded “vanishing tip” within the documentation workflow was minimally invasive to attending physicians and avoided hard-stops that could be perceived as guidance. The design also avoided requiring clinicians to respond to EHR queries that might appear to promote CCS documentation through a forcing function. Clinician-documented diagnoses were the EHR data elements that drove the decision support tool, minimizing alert fatigue and false positive alerts. Greater sensitivity might be achieved by adding vital sign abnormalities, laboratory abnormalities, procedure documentation, or medication administration as triggers, though these may increase false positive alerts and the resulting alert fatigue.

Our use of native EHR tools and fundamental design principles applied to common workflows will translate easily to other sites using Epic as well as those using other EHR vendor systems. We must emphasize the importance of including local compliance officers, coding and billing representatives, and health system leadership to ensure similar tools are adherent to local, regional, and national policy. This helps to properly design DDS tools that avoid explicitly or implicitly guiding documentation that would lead to potentially fraudulent documentation practices.

We are unaware of unintended consequences of our deployment. Specifically, since native EHR tools were used and approved through routine governance structures, costs were minimal. While no formal assessment of throughput was performed after deployment, our team received multiple compliments on the new workflow and did not receive any complaints (which are common with EHR workflow changes). We were unable to measure a change in documentation queries or delinquencies, though the increased CCS proportion more likely suggests a decrease in coder documentation queries for cases potentially qualifying for CCS.



Limitations

Limitations of our study include its single-site nature, the absence of formal usability testing, and the absence of a quantitative evaluation of how frequently the tool was displayed and ignored. The possibility exists that there were other changes in our patient population or the care provided during the study periods. Furthermore, there may be inter-attending variability not captured without the use of a mixed-effects model, which would be a useful measurement for future projects. We did not explicitly evaluate for changes in the appropriateness of critical care documentation before and after deployment of the DDS tool, though all charts are reviewed by professional coders who will withhold CCS procedure codes, or clarify with a physician documentation query, if there are concerns about inappropriately claimed CCS codes. The outcomes evaluated in this study are those submitted by these professional coders, thus likely accounting for this limitation.

The Bayesian analysis is somewhat limited by the month-level granularity of the data available to us compared with daily or weekly data. The availability of only two additional time series for counterfactual generation represents a further limitation. In future studies using similar techniques to evaluate clinical decision support interventions, prospective collection of finer-grained primary data and additional covariate time series would address these concerns.

Volumes were slightly higher after the intervention, but admission rates were similar during both study periods, suggesting that the acuity and management of the populations were likely similar. Furthermore, we formally accounted for volume and admission rates in our time series analysis, which did not eliminate the observed causal effect of the DDS tool on CCS capture. We believe the demonstrated outcomes and lessons learned outweigh the limitations and that this intervention serves as a model for designing minimally invasive DDS tools.



Conclusion

In summary, we designed and implemented a real-time, minimally interruptive, EHR-embedded DDS tool that successfully increased CCS documentation to expected benchmark rates at an academic ED. We highlight the importance of a collaborative design process including clinical, compliance, and coding teams to ensure effective workflows and regulatory compliance.



Clinical Relevance Statement

Accurate documentation and clinical coding of CCS are essential for resource allocation, billing, and clinician staffing in EDs across the country. The techniques described are generalizable to many health systems and are applicable to other point-of-documentation clinical and educational applications.



Multiple Choice Questions

  1. All the following clinician types are able to document critical care services except:

    a. Physicians-in-training (residents).

    b. Attending physicians.

    c. Physician assistants.

    d. Nurse practitioners.

    Correct Answer: The correct answer is option a, Physicians-in-training (residents). While residents often provide care considered to be critical care services, these services are documented and billed as part of the trainee's education. Critical care services in this case should be documented by the supervising attending physician.

  2. All the following are part of the five rights of clinical decision support except:

    a. The right information.

    b. The right person.

    c. The right dose.

    d. The right channel.

    e. The right time in workflow.

    Correct Answer: The correct answer is option c, The right dose. This is part of the five rights of medication administration, upon which the five rights of clinical decision support were built.



Conflict of Interest

None declared.



Author Contributions

R.W.T. initially designed and implemented the DDS tool, performed the analysis, and led writing of the manuscript. J.C.C. conceived the potential triggers for critical care, curated the list of diagnoses, and participated in the design, implementation, analysis, and writing of the manuscript. B.S.R. led the team from a revenue cycle perspective, ensuring that the intervention worked with billing/coding workflows, provided informatics supervision, and contributed substantially to the manuscript. H.S.D. and K.M.J. represented our finance and compliance teams, respectively, and both contributed to the design, implementation, testing, and manuscript authorship. O.E. implemented and tested the tool and created the images used in this manuscript. I.D.J. leads the informatics team at VUMC and supervised the design and implementation of the tool as well as implementation of the SNOMED codes. M.J.W. reviewed and contributed to the manuscript, study design, and statistical review. N.M.M. conceived the project, facilitated mapping of diagnostic concepts to SNOMED codes, participated in design, implementation, and analysis, and contributed substantially to the manuscript.


Supplementary Material


Address for correspondence

Robert W. Turer, MD
5323 Harry Hines Boulevard, Suite E4.300, Dallas, TX 75390
United States   

Publication History

Received: 18 May 2022

Accepted: 20 September 2022

Accepted Manuscript online:
26 September 2022

Article published online:
16 November 2022

© 2022. Thieme. All rights reserved.

Georg Thieme Verlag KG
Rüdigerstraße 14, 70469 Stuttgart, Germany

