Background and Significance
Computerized clinical decision support is intended to provide information at the time decisions are made, promoting patient care that is safe, complete, and supported by guidelines and evidence.[1][2] Decision support has been shown to improve some health care processes;[3][4] however, clinicians now encounter multiple alerts each day,[5] many of them intrusive or of limited clinical value.[6] Clinicians can receive so many interruptive alerts that they develop “alert fatigue” and become desensitized to them.[7][8][9][10] The problem can be particularly severe with alerts generated for medication orders.[11] Clinicians with alert fatigue override medication alerts at a high rate,[11] potentially ignoring clinically important alerts in the process.[12][13] Negative patient outcomes can result.[14][15][16]
While it seems intuitive that the fewer alerts there are, the more effective they will be, there is limited evidence for this. Our hospital's medication safety committee obtained data showing that medication alerts were infrequently accepted at our institution.[17] They also learned that providers at other institutions encountered many fewer medication alerts.[18][19] The committee subsequently requested a reduction in the alert burden in our order entry system. While drug-drug interaction (DDI) and drug-duplicate medication alerts both appeared to be firing excessively, DDI alerts appeared to be accepted at an especially low rate. The group therefore recommended first increasing the threshold for generating DDI alerts from “intermediate” to “severe,” that is, eliminating intermediate-severity DDI alerts. We were not aware of prior research looking at the effects of turning off an entire class of alerts. We therefore designed a study examining the rates of medication alert acceptance following this change.
Methods
Setting and Caregivers
Johns Hopkins Bayview Medical Center (JHBMC) is a 400-bed academic medical center in Baltimore, Maryland. Providers in this study included residents and fellows who rotate to both JHBMC and Johns Hopkins Hospital (JHH), attending physicians (including both teaching attendings and attending hospitalists), and advanced practice providers (APPs; including nurse practitioners, nurse midwives, nurse anesthetists, and physician assistants) from all adult departments. Nurses, pharmacists, respiratory therapists, and medical students are able to place orders, with later cosignature by an authorized prescriber, and were therefore included in the study as well.
Baseline Order Entry Configuration
Between 2003 and 2004, JHBMC implemented a commercial electronic medical record system (EMR; Meditech Corporation, Westwood, Massachusetts, United States). The EMR's features included computerized provider order entry, provider and nurse documentation, and results reporting for laboratory and imaging tests. When medication orders were placed using the EMR, they could generate drug-duplicate, drug-allergy, adverse reaction, DDI, and drug-dose alerts. The alerts “popped up” in a new window when the provider attempted to sign all orders from an ordering session, interrupting the workflow, and required a response before the provider could continue. When more than one drug-duplicate, drug-allergy, adverse reaction, or DDI alert was generated for a particular medication, all appeared in the same interruptive alert in identical font and color, as shown in [Fig. 1]. Providers could disregard the alerts by clicking an “Override” button at the bottom of the window, which closed the window and allowed the new order to be placed, or accept the alerts by clicking an “Erase Order” button. This workflow is shown in [Fig. 2].
Fig. 1 Typical medication alert (used with permission from Meditech Corporation).
Fig. 2 Workflow for responding to medication alerts.
Drug-duplicate medication alerts were generated when a medication was ordered that the patient was already prescribed or that had been administered in the last 24 hours; no exceptions were made for medications commonly reordered more than once a day. Drug-duplicate medication alerts were only generated for the same medication, not for other medications from the same class. Drug-allergy or adverse reaction alerts were generated when a medication was ordered for a patient who had an allergy or adverse reaction (e.g., nausea and headache) documented in the EMR for that medication or medication class.
Medication alerts utilized a drug information database licensed from First DataBank (San Francisco, California, United States). First DataBank is one of the main suppliers of drug information databases in the United States, which can be integrated with EMRs to inform decisions when ordering, verifying, and documenting administration of medications. First DataBank classifies potential DDIs as “contraindicated,” “severe,” “intermediate,” and “mild.” DDIs classified as “contraindicated” by First DataBank were grouped with those in the “severe” category in our EMR, resulting in three functional categories: mild, intermediate, and severe. Prior to the intervention, JHBMC's version of the EMR was configured so that providers were alerted to potential “severe” (including “contraindicated”) and “intermediate” DDIs but not those classified as “mild.” No other customizations had been made to the database. DDI alerts showed the category of the ordered medication followed by the category of the medication for which there was a potential DDI. A “Details” button could be selected to learn specifically which medications were involved and the severity and nature of the potential DDIs identified. The system did not prompt users to discontinue the interacting medications the patient was already taking, nor did it track any subsequent changes to them. The DDI alerts covered only other medications, not food or laboratory results.
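For illustration only, the sketch below captures the duplicate-window and DDI severity-threshold rules described above. The data structures, names, and 24-hour window check are simplified assumptions and do not reproduce the Meditech or First DataBank implementations.

```python
# Simplified, hypothetical sketch of the baseline alert-generation rules
# described above. Names and structures are illustrative, not the vendor's.

from datetime import datetime, timedelta
from enum import IntEnum

class DDISeverity(IntEnum):
    MILD = 1
    INTERMEDIATE = 2
    SEVERE = 3  # "contraindicated" was grouped with "severe" in the EMR

# First DataBank category -> functional category used at JHBMC
FDB_TO_FUNCTIONAL = {
    "contraindicated": DDISeverity.SEVERE,
    "severe": DDISeverity.SEVERE,
    "intermediate": DDISeverity.INTERMEDIATE,
    "mild": DDISeverity.MILD,
}

def ddi_alert_fires(fdb_category: str, threshold: DDISeverity) -> bool:
    """Baseline threshold was INTERMEDIATE; the intervention raised it to SEVERE."""
    return FDB_TO_FUNCTIONAL[fdb_category] >= threshold

def duplicate_alert_fires(new_med: str, active_meds: set[str],
                          administrations: list[tuple[str, datetime]],
                          now: datetime) -> bool:
    """Fires if the same medication is already prescribed or was given in the last 24 hours."""
    given_recently = any(
        med == new_med and now - given_at <= timedelta(hours=24)
        for med, given_at in administrations
    )
    return new_med in active_meds or given_recently
```

Under this sketch, the intervention described below corresponds simply to changing the threshold passed to `ddi_alert_fires` from `INTERMEDIATE` to `SEVERE`.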
Drug-dose alerts appeared on a separate screen and are not addressed in this manuscript.
Intervention
As a quality-improvement initiative, JHBMC's Medication Error Reduction Improvement Team (MERIT) reviewed the medications generating alerts and proposed that the system be changed so that providers would see only “severe” DDI alerts and would no longer be presented with “intermediate” ones. This change was intended to eliminate approximately 75% of all DDI alerts, with a primary objective of increasing the acceptance rate for the remaining DDI alerts and a secondary objective of increasing the acceptance rate for other types of medication alerts. The change was approved by the Pharmacy and Therapeutics Committee and implemented on January 22, 2014. Providers were informed about the change through an e-mail announcement.
Data Collection
We conducted a retrospective pre–post audit of all medications ordered, and all medication alerts generated and displayed to providers, for all hospitalized adults who had medications ordered during the 5 months before (August 22, 2013–January 21, 2014) and the 5 months after (January 23, 2014–June 22, 2014) the change was made. It was not possible to determine whether providers who accepted an alert, by erasing an order, subsequently reordered the medication. We collected descriptive data about patients and providers using administrative databases.
There was wide variation in the number of alerts generated per medication order. For example, if warfarin was ordered for a patient who was already on warfarin, a drug-duplicate alert would appear. However, if warfarin was ordered for a patient who was already on warfarin, was also allergic to warfarin, and was taking one or more medications that interact with warfarin, then drug-duplicate, drug-allergy, and DDI alerts would all appear on the same screen, as shown in [Fig. 1]. If the user in the second case responded by clicking on “Erase Order,” accepting the alerts, it would be impossible to ascertain whether they were responding primarily to the drug-allergy, drug-duplicate, or DDI alert. Therefore, to ensure that we could accurately attribute responses to each alert type, we separately analyzed medication orders that generated only a single alert. For each single alert, we obtained patient age, gender, hospital unit, event date, ordered medication, ordering caregiver, alert type, and the caregiver's response to the alert (“Override” or “Erase Order” [i.e., accept the alert]). There were very few orders placed by pharmacists, respiratory therapists, and medical students, and these were therefore grouped in an “Other Caregiver” category. At JHBMC, fellows can moonlight as attending physicians, and they were therefore categorized with attendings. Hospital units were grouped according to acuity of care, depending on whether they served intensive care unit (ICU) or more stable “floor” patients.
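As a minimal illustration of this restriction to single-alert orders, the following sketch groups alerts by order and keeps only orders with exactly one alert; the column names (order_id, alert_type, response) are hypothetical and do not reflect the actual data extract.

```python
# Hypothetical sketch: keep only alerts from orders that generated exactly one
# alert, so each response can be attributed to a single alert type.

import pandas as pd

def single_alert_orders(alerts: pd.DataFrame) -> pd.DataFrame:
    alerts_per_order = alerts.groupby("order_id")["alert_type"].transform("size")
    single = alerts.loc[alerts_per_order == 1].copy()
    # Dichotomize the response: "Erase Order" means the alert was accepted.
    single["accepted"] = single["response"].eq("Erase Order")
    return single
```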
Medications that were available in more than one form included the route of administration in their name, and from this they were categorized as parenteral or nonparenteral. For descriptive purposes, all nonparenteral forms of a given medication were grouped together as one medication; for example, sustained-release morphine and morphine elixir were classified as a single medication, “nonparenteral morphine.” Similarly, parenteral forms of the medications were grouped together. Medications were further categorized according to whether or not they were on the Institute for Safe Medication Practices' list of high-alert medications (ISMP list).[20]
Analysis
Descriptive statistics were used for the medication orders generating any number of alerts. Alerts were dichotomously categorized according to whether they were overridden or accepted. For the orders generating only single alerts, Student's t-test or Wilcoxon's rank-sum test was used to compare means or medians of continuous variables between the two groups, and chi-square tests were used to compare the distributions of categorical variables. Multivariable Poisson regression was subsequently performed to calculate rate ratios for overriding versus accepting alerts, after adjusting for patient age, caregiver type, parenteral versus nonparenteral medication, and whether or not the medication was on the ISMP list of high-alert medications, the factors found to be significantly predictive of failure to accept alerts in our prior study.[17] Rate ratios were used to calculate the relative percent changes in alert acceptance. All analyses were performed using Stata/SE version 13 (StataCorp, College Station, Texas, United States).
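All analyses were performed in Stata/SE 13; purely as an illustrative sketch (not the code we ran), an approximate equivalent of the adjusted model is shown below in Python/statsmodels, with hypothetical variable names and alert acceptance as the modeled outcome.

```python
# Illustrative sketch of the adjusted analysis: a Poisson model whose
# exponentiated coefficients are rate ratios for accepting an alert,
# adjusted for the covariates named in the text. Variable names are hypothetical.

import numpy as np
import statsmodels.api as sm
import statsmodels.formula.api as smf

def adjusted_rate_ratios(df):
    model = smf.glm(
        "accepted ~ postintervention + patient_age + C(caregiver_type)"
        " + parenteral + ismp_high_alert",
        data=df,
        family=sm.families.Poisson(),
    ).fit(cov_type="HC0")  # robust variance, since the outcome is binary
    rate_ratios = np.exp(model.params)
    conf_int = np.exp(model.conf_int())
    return rate_ratios, conf_int

# Relative percent change in acceptance = (rate ratio - 1) x 100.
```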
Results
There were similar numbers of medication orders placed during the 5 months before and after the intervention ([Table 1]). Medication orders generated from 1 to 18 alerts before the intervention and from 1 to 21 alerts after the intervention. There was a 37.5% relative decrease in the percentage of medication orders generating alerts after the intervention, though only a 9.6% absolute decrease. There was a 39.6% relative increase in the percentage of orders erased (i.e., all alerts accepted) in response to alerts after the intervention, but only a 2.1% absolute increase ([Table 1]).
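As a worked example of how the relative and absolute changes in [Table 1] are related, using the row for orders that generated one or more alerts:

```latex
% Worked example for the "one or more alerts" row of Table 1
\[
\text{pre} = \frac{61{,}923}{241{,}915} = 25.6\%, \qquad
\text{post} = \frac{39{,}254}{245{,}757} = 16.0\%
\]
\[
\text{absolute change} = 16.0\% - 25.6\% = -9.6\ \text{percentage points}, \qquad
\text{relative change} = \frac{16.0 - 25.6}{25.6} \times 100\% = -37.5\%
\]
```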
Table 1
Number of orders placed, number for which alerts were generated, number erased (alert[s] accepted), and relative and absolute changes, pre- and postintervention
| | Preintervention, n (calculation and %) | Postintervention, n (calculation and %) | Relative change (95% CI in %) | Absolute change (95% CI in %) |
|---|---|---|---|---|
| All medication orders | 241,915 | 245,757 | – | – |
| Medication orders that generated one or more alerts | 61,923 (61,923/241,915 = 25.6%) | 39,254 (39,254/245,757 = 16%) | −37.5 (−38.4 to −36.8) | −9.6 (−9.9 to −9.4) |
| Medication orders erased in response to one or more alerts (i.e., alerts accepted) | 3,249 (3,249/61,923 = 5.3%) | 2,884 (2,884/39,254 = 7.4%) | 39.6 (33.2 to 47.2) | 2.1 (1.8 to 2.4) |
| Medication orders that generated single alerts | 40,139 (40,139/241,915 = 16.6%) | 30,158 (30,158/245,757 = 12.3%) | −25.9 (−27.1 to −25.9) | −4.3 (−4.5 to −4.1) |
| Medication orders erased in response to single alerts (i.e., alert accepted) | 2,336 (2,336/40,139 = 5.8%) | 2,424 (2,424/30,158 = 8%) | 37.9 (32.5 to 43.9) | 2.2 (1.8 to 2.6) |
There was a 25.9% relative (4.3% absolute) decrease in the percentage of medication orders generating a single alert after the intervention, and a 37.9% relative (2.2% absolute) increase in the percentage of single-alert orders erased (i.e., alert accepted) after the intervention ([Table 1]). Data regarding the patients, providers, and medications associated with the orders generating single alerts are shown in [Table 2]. There was a small but significant difference in the age of patients before and after the intervention. There were no other significant differences among patients, caregivers, or medications.
Table 2
Patient, provider, and medication features for medication orders generating single alerts, pre- and postintervention
| | Preintervention | Postintervention | p-Value |
|---|---|---|---|
| Patient characteristics | n = 7,145 | n = 6,356 | |
| Mean age (SD) | 55.0 (20.0) | 56.0 (19.8) | 0.0036 |
| Male (%) | 45.9 | 45.3 | 0.48 |
| Median LOS (IQR, max) | 3.0 (2–6, 106.0) | 3.0 (2–6, 98.0) | 0.52 |
| ICU patients (%) | 6.8 | 7.3 | 0.26 |
| Caregiver types | n = 986 | n = 924 | 0.83 |
| Attendings[a] (%) | 23 | 23[c] | |
| Residents (%) | 31 | 33 | |
| APPs (%) | 13 | 13 | |
| Nurses (%) | 32 | 29 | |
| Other caregivers[b] (%) | 2 | 3 | |
| Medication characteristics | n = 1,015 | n = 1,006 | |
| Parenteral (%) | 24.7 | 25.0 | 0.91 |
| ISMP high-alert medications (%) | 16.7 | 18.8 | 0.18 |
Abbreviations: APP, advanced practice provider; ICU, intensive care unit; IQR, interquartile range; ISMP, Institute for Safe Medication Practices; LOS, length of stay; SD, standard deviation.
a Attending category includes teaching attendings, hospitalists and fellows.
b Other Caregivers include medical students, respiratory therapists, and pharmacists.
c Total does not equal 100% due to rounding.
DDI alerts accounted for 47.9% of single alerts before the intervention and 14.8% of single alerts after the intervention ([Table 3]), a 69.1% relative (33.1% absolute) decrease. For our primary outcome, there was a statistically significant 85.7% relative increase in acceptance of DDI alerts after the intervention, though only a 1.8% absolute increase. Drug-allergy alerts accounted for 5.7% of single alerts before the intervention and 7.3% afterwards, a statistically significant 28.1% relative (1.6% absolute) increase, and there was a 16.4% relative (3.5% absolute) decrease in drug-allergy alert acceptance after the intervention. Relative adverse reaction alert acceptance increased by 21.7% (4.8% absolute); however, this change was not statistically significant, and there was no significant change in drug-duplicate alert acceptance before and after the intervention. When a regression analysis was performed, the findings were essentially the same ([Table 4]).
Table 3
Number of single alerts and percent accepted pre- and postintervention, and percent change in alert acceptance
| Alert types | Preintervention (n = 40,139): no. of alerts (%) | Preintervention: no. accepted (%) | Postintervention (n = 30,158): no. of alerts (%) | Postintervention: no. accepted (%) | Relative change in alert acceptance (95% CI in %) | Absolute change in alert acceptance (95% CI in %) |
|---|---|---|---|---|---|---|
| Adverse reaction | 104 (0.3) | 23 (22.1) | 208[a] (0.7) | 56 (26.9) | 21.7 (−25.1 to 97.8) | +4.8 (−5.4 to 15.0) |
| Drug allergy | 2,274 (5.7) | 486 (21.4) | 2,199[a] (7.3) | 394 (17.9) | −16.4 (−26.6 to 4.3) | −3.5 (−5.8 to −1.2) |
| Drug duplicate | 18,544 (46.2) | 1,429 (7.7) | 23,290[a] (77.2) | 1,799 (7.7) | 0 (−6.5 to 7.4) | 0 (−0.5 to 0.5) |
| Drug-drug interaction | 19,217 (47.9) | 398 (2.1) | 4,461 (14.8) | 175 (3.9) | 85.7 (58.6 to 126.2) | 1.8 (1.3 to 2.3) |
a The numbers of single adverse reaction, drug-allergy, and drug-duplicate alerts appear increased in the postintervention phase because, preintervention, many of them would have been grouped with intermediate-severity DDI alerts and therefore would not have been included in the data restricted to single alerts.
Table 4
Adjusted[a] percentage change in alert acceptance pre- and postintervention
| Alert types | Adjusted relative change in alert acceptance (95% CI in %) | Adjusted absolute change in alert acceptance (95% CI in %) |
|---|---|---|
| Adverse reaction | −0.5 (−40.3 to 65.5) | 0.28 (−10.1 to 10.7) |
| Drug allergy | −18.9 (−29.0 to −7.3) | −4.2 (−6.4 to −1.9) |
| Drug duplicate | 0 (−6.8 to 7.1) | 0.03 (−0.5 to 0.5) |
| Drug-drug interaction | 95.9 (63.8 to 134.2) | 2.0 (1.4 to 2.4) |
Abbreviations: CI, confidence interval; ISMP, Institute for Safe Medication Practices.
a Adjusted for patient age, caregiver type, parenteral versus nonparenteral medication, and whether or not the medication was on the ISMP list of high-alert medications.
Discussion
The elimination of intermediate DDI alerts resulted in a moderate decrease in the number of orders generating medication alerts and a statistically significant increase in medication alert acceptance. However, overall alert acceptance remained extremely low. DDI alerts decreased by a relative 69.1%, and the increase in overall alert acceptance was almost exclusively due to a nearly 96% relative increase in DDI alert acceptance. However, the acceptance rate for DDI alerts remained extremely low, lower than the rate for other types of medication alerts, even though the postintervention phase only included severe DDI alerts. Additionally, while we had hoped that significantly decreasing the DDI alert burden would increase attention to other alerts, drug-duplicate alert acceptance remained unchanged, and drug-allergy alert acceptance unexpectedly decreased a small amount.
To the best of our knowledge, this is the first study to describe the findings associated with turning off an entire class of alerts. Many have suggested that alert fatigue decreases the effectiveness of clinical decision support in providers' order entry systems, and have called for greater specificity for medication alerts, particularly interruptive ones.[6][13][21][22][23][24][25] Several studies have looked at the theoretical effects of decreasing alerts,[26][27][28] yet only a few have looked at the effect of actually decreasing the alert burden in a computerized provider order entry (CPOE) system, as we did, and the findings have been inconsistent.[29][30][31][32][33] One did not report alert acceptance rates,[29] and two others found no change in the percent of alerts that were accepted.[30][31] Others have found no association between alert burden and acceptance rates.[32][33]
Medication alerts should not be turned off casually.[21][24] Decision-support experts recommend that institutions customize alert systems to eliminate clinically irrelevant alerts, in the hope of decreasing alert fatigue and increasing the attention paid to more significant alerts.[34][35][36] However, selectively analyzing alerts is labor intensive and may not be feasible for smaller institutions.[35] Configuring the system to enable users to flag inappropriate alerts might help target those worth removing.[37] However, providers differ in their opinions of the value of alerts,[38][39] their perceptions of which are important,[26] and their rates of response to them.[40] In one study, there was no correlation between designated alert severity and the number of providers who thought an alert could be safely turned off.[26] Alerts might also prompt heightened monitoring for potential adverse events, even when they are overridden. Additionally, medication alert systems generally do not consider patient context and other nondrug contributions to medication-related adverse events; one study showed that restricting alerts to obviously QT-prolonging drugs would not improve the positive predictive value of the remaining alerts and would identify less than half of patients at risk for torsades de pointes.[27] Institutions might also be concerned about the legal ramifications of turning alerts off.[41] Broadly accepted recommendations are needed about which alerts are critical, rather than expecting every institution to independently assess the importance of every alert.[21][42][43][44]
The medication alert acceptance rate in this study, even after our intervention, was lower than the 11.5 to 26.7% rates recently reported for inpatients.[45][46] Raising the acceptance rate in our system to more typical levels would most likely require changing the way medication alerts appear and behave, in addition to decreasing the alert burden. There is wide variation between EMRs in the appearance and capabilities of their decision support functionality.[21][43][47][48] Ideally, DDI alerts are displayed at the point when a medication order is first entered, not when attempting to sign all orders in a session.[21][49] Some studies have reported increased alert acceptance when critical alerts are displayed more intrusively than less important ones.[13][50][51] DDI alerts should include clear identification of the interacting drug pair, the potential consequence, its seriousness, and the recommended action, with easy access to the mechanism of interaction, patient contextual information, and available evidence regarding the interaction.[21] However, many EMRs, like ours at the time of this study, do not incorporate these recommendations. Fortunately, our institution moved to an EMR that satisfies many of them a year and a half after this study took place.
Limitations
Several limitations of our study should be considered. First, we were unable to distinguish “intermediate” from “severe” DDI alerts in the preintervention group, and therefore compared responses to both combined with responses to only the remaining “severe” DDI alerts in the postintervention group. Intermediate DDI alerts are probably accepted at different rates than severe DDI alerts; for example, an intermediate DDI alert might be more likely than a severe DDI alert to be justifiably overridden while still prompting increased surveillance for adverse events. It is possible that, had we compared only “severe” DDI alerts, we would have found no difference in that portion of the analysis. Second, we did not account for possible clustering around patients for whom multiple orders were placed or around providers with a higher alert burden, for whom alert acceptance might have been lower. Third, we used a pre–post study design that might not account for other changes in the local environment, or changes in the users themselves (particularly residents), that occurred between the pre- and postintervention periods. Fourth, the DDI alert thresholds were defined by a proprietary commercial algorithm, and we did not assess their validity or the appropriateness of the DDI alerts left in place; it is possible that we eliminated some clinically important DDI alerts and left in place certain low-value ones. Fifth, we did not measure patient outcomes, such as adverse drug events, which are more important than acceptance rates when determining alert effectiveness.[21] Finally, the study was conducted at a single medical center, and findings might differ in another setting or with another EMR.
Conclusion
Decreasing alert burden and increasing the significance of DDI alerts presented to providers resulted in a statistically significant increase in the acceptance rate for medication alerts overall, and for DDIs in particular. However, overall alert acceptance, and acceptance of DDI alerts, remained dismally low, and the rate of alert acceptance was unchanged for drug-duplicate and decreased for drug-allergy alerts. Further study is needed to determine if alert acceptance would have been increased by modifying alert appearance or behavior, or by provider education, and to determine if interventions like ours will have an effect on alert fatigue and patient outcomes.
Clinical Relevance Statement
Clinical decision support can guide caregivers in making decisions when placing orders; however, caregivers may be exposed to so many decision support alerts that they experience alert fatigue, hindering the alerts' effectiveness. It is assumed, but not known for certain, that caregivers exposed to fewer alerts will be more likely to respond as desired to the remaining alerts. This study found that eliminating a class of DDI alerts was associated with a statistically significant increase in alert acceptance; however, overall alert acceptance remained very low.