Keywords
Clinical decision support - pediatrics - clinical practice guideline - use - administration and maintenance of clinical information systems - ambulatory care - primary care
Introduction
Clinical Decision Support (CDS) that incorporates evidence-based recommendations can
lead to improved delivery of evidence-based clinical care [[1]]. This is especially true when CDS is implemented in an electronic health record
(EHR) and carefully integrated into clinical workflow [[2]–[4]]. However, when evidence and recommendations change, there is no direct pathway
for those changes to be incorporated into CDS.
Our institution developed and implemented CDS based on the 2009 American Academy of
Pediatrics (AAP) policy statement for determining palivizumab eligibility [[5], [6]]. Palivizumab is a monoclonal antibody that provides short-lasting passive immunity
to respiratory syncytial virus (RSV) [[5]]. It is effective in decreasing hospitalizations related to RSV in select populations
[[5]]. Palivizumab, with reported costs between $588 and $1,552 per vial, is one of the
most expensive medications prescribed by primary care pediatricians [[7], [8]]. In addition to being expensive, palivizumab must be administered monthly throughout the RSV season to be effective [[5], [9]], an adherence schedule that can be difficult to maintain [[6]]. Insurance payer approval decisions typically adhere closely to the most current
recommendations for palivizumab eligibility.
As part of the pre-RSV season preparations each year, members of the CDS team search
for updates to the eligibility guidelines from the AAP. In 2014, just two years after
the introduction of the CDS, the AAP revised its policy statement for the upcoming
RSV season [[9]]. It was important for patient care to promptly update the CDS in response to the
updated guidelines, which were released only 3 months before the start of the 2014–15
RSV season. The original CDS took months to develop, test, and implement into patient
care. The CDS used in our institution for providing recommendations surrounding palivizumab
is hosted outside the EHR but linked into the EHR using a web-services based approach
[[6]]. When a patient’s chart is opened, a message is sent to the rules engine. The rules
engine determines if any CDS modules are relevant based on patient data from the EHR.
The relevant modules are rendered within the visit navigator and appear to the clinician as if they were native to the EHR.
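The round trip can be illustrated with a short sketch. The Java class and endpoint below are hypothetical stand-ins for the vendor-specific web-services integration we actually use; the sketch only shows the general pattern of posting patient context when a chart is opened and receiving the relevant CDS modules for rendering.

```java
// Illustrative sketch only: class and endpoint names are hypothetical, not our
// production integration. A chart-open event posts patient context to the
// external rules engine, and the relevant CDS modules come back for rendering
// in the visit navigator.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ChartOpenCdsClient {

    // Hypothetical endpoint for the external rules engine.
    private static final URI RULES_ENGINE = URI.create("https://cds.example.org/evaluate");

    public static String requestRelevantModules(String patientContextJson) throws Exception {
        // Send the patient data extracted from the EHR to the external rules engine.
        HttpRequest request = HttpRequest.newBuilder(RULES_ENGINE)
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(patientContextJson))
                .build();

        // The response lists the CDS modules (e.g., palivizumab eligibility) that
        // apply to this patient; the EHR renders them inside the visit navigator.
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        return response.body();
    }
}
```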
To provide the most up-to-date, evidence-based care throughout our health system, we needed to adapt the existing CDS to address these changes [[5], [9]]. Without these updates, the CDS that we had so rigorously tested during the initial
implementation would have delivered erroneous recommendations for numerous patients.
We were concerned that erroneous recommendations would result in less efficient care
and general distrust in CDS.
During the initial CDS implementation we utilized a systematic process for the creation
of our CDS, called the “GLIDES” (Guidelines Into DEcision Support) method [[10]]. The GLIDES method was developed by a consortium of guideline developers and implementers
with a focus on identifying best practices for converting guideline recommendations
into implementable decision support [[11], [12]]. In short, the GLIDES process begins with formalization of recommendations from
source guidelines using an XML-based schema, the Guideline Elements Model (GEM) [[11], [12]]. During formalization, the source of the recommendation (i.e., page number and
paragraph location in the palivizumab policy statement) is included for each recommendation
[[13]]. The GEM-encoded recommendations were then translated line by line into executable rules, which in our institution were written in Drools (a Java-based rules engine) [[14]]. Because all GEM annotations were retained in this translation, there was a direct link from the CDS source code back to the narrative text in the guideline. We anticipated that these links between the CDS and the narrative text would simplify updating the CDS in response to new recommendations.
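A minimal sketch of this pattern is shown below, in plain Java rather than the production Drools DRL: the retained GEM annotation (guideline location and decision variables) sits as a comment directly above the executable logic, so a text search for the narrative wording lands on the rule that needs editing. Method names, types, and the simplified thresholds (condensed from row 1 of Table 1) are illustrative assumptions, not our production code.

```java
// Sketch of the GEM-annotation pattern, not production code. Each rule carries
// the retained annotation linking it back to the narrative guideline text.
public class PalivizumabRules {

    // GEM source: 2009 AAP policy statement (page/paragraph retained in the real
    // rule) -- 29-31 6/7 weeks' gestation AND < 6 months of age at the start of
    // the RSV season. Decision variables: gestational age, chronologic age.
    public static boolean eligibleByGestationalAge2009(double gestationalAgeWeeks,
                                                       int ageMonthsAtSeasonStart) {
        return gestationalAgeWeeks >= 29.0
                && gestationalAgeWeeks < 32.0
                && ageMonthsAtSeasonStart < 6;
    }

    // GEM source: 2014 AAP policy statement -- >= 29 weeks' gestation is NOT
    // recommended unless comorbidities such as CLD or significant CHD are present.
    // (Simplified: the < 29-week branch omits the age conditions of the full rule.)
    public static boolean eligibleByGestationalAge2014(double gestationalAgeWeeks,
                                                       boolean hasChronicLungDisease,
                                                       boolean hasSignificantCHD) {
        if (gestationalAgeWeeks < 29.0) {
            return true;
        }
        return hasChronicLungDisease || hasSignificantCHD;
    }
}
```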
The effort required to update CDS in response to changes in evidence has not been
described in the literature. In this study, we describe our process for updating CDS
in response to changes in the palivizumab policy statement and the downstream implications
of tightly linking CDS to the original recommendations. We also evaluate the extent
that patient eligibility within our practices would change in response to the updated
recommendations.
Methods
We obtained the 2014 AAP policy statement from the AAP’s website [[9]]. We compared it side-by-side with the parsed guideline from the 2009 AAP policy statement and identified all areas of divergence between the documents (►[Table 1]). Following this comparison, we worked with a subject matter expert to identify
the clinically relevant differences and to confirm that our understanding of the new
recommendations matched the intent of the recommendations. Each of these clinically
relevant differences was associated with actions and decision variables (e.g., gestational
age) encoded during the initial CDS development efforts. Drawing upon the link between
the decision variables from the text and the CDS rules engine, we located all sections
of the CDS rules that required updating.
Table 1
Comparison of the AAP 2009 and 2014 Policy Statements on palivizumab eligibility
1 – Eligibility: Gestational Age
2009: 29–31 6/7 weeks gestation AND <6 months of age at the start of the season; eligible for 5 doses
2014: ≥29 weeks gestation: not recommended UNLESS other comorbidities are present, such as CLD or significant CHD

2 – Eligibility: Other Risk Factors
2009: 32–34 6/7 weeks gestation AND <3 months of age at the start of the season AND one or more of the following risk factors
2014: ≥32 weeks gestation: NOT recommended UNLESS other comorbidities are present, such as significant CHD

3 – Chronic Lung Disease (CLD): Improved Specificity
2009: Infants with CLD: infants and children <24 months of age who require medical therapy within the 6 months prior to the start of the season; eligible during the first 2 years of life. Medical therapy includes:
- Chronic corticosteroids
- Diuretics
- Supplemental oxygen
- Bronchodilators
2014: Preterm infants with CLD: <32 weeks gestation AND requiring >21% oxygen after 28 days of life. Can be considered for a second season IF medical support is required during the 6 months prior to the start of the season. Medical support defined as:
- Chronic corticosteroids
- Diuretics
- Supplemental oxygen
- Bronchodilators [[28]]

4 – Congenital Heart Disease (CHD): Improved Specificity
2009: ≤24 months of age at the start of the season:
- Acyanotic or cyanotic heart disease
- Receiving medications for congestive heart failure
- Pulmonary hypertension (moderate to severe)
- Consider a post-op dose after bypass for eligible patients
2014: ≤12 months of age at the start of the season:
- Acyanotic heart disease receiving medications for congestive heart failure and requiring corrective surgery
- Pulmonary hypertension (moderate to severe)
- Cyanotic heart disease, in consultation with a cardiologist
- Consider a post-op dose after bypass or ECMO for eligible patients
- Consider in patients <2 years of age undergoing cardiac transplantation during the RSV season

5 – Cystic Fibrosis
2009: No clear recommendation for use
2014: Routine use in cystic fibrosis patients is NOT recommended. In the 1st year of life: indicated only if there is evidence of CLD and/or nutritional compromise. In the 2nd year of life: may be considered if severe lung disease or weight-for-length <10th percentile

6 – Immune Compromised
2009: No clear recommendations for use
2014: Recommended for patients <24 months of age who are profoundly immunocompromised during the RSV season

7 – Continuation of Prophylaxis after RSV Hospitalization
2009: Continue palivizumab prophylaxis for the remainder of the season if there is a breakthrough RSV hospitalization
2014: Discontinue palivizumab after an RSV infection leading to hospitalization
One physician-programmer made the necessary changes to the palivizumab CDS rules.
The palivizumab CDS was one component of a more comprehensive intervention called
the “Preemie Assistant” [[6]]. Nurses responsible for coordinating palivizumab administration efforts were the
primary targets for the CDS. Information about eligible children was displayed in
a patient list in the electronic health record. Nurses reviewed this list throughout
the RSV season and could access additional tools to support their workflow (►[Figure 1]). In addition to palivizumab eligibility CDS for both premature and full-term infants,
this intervention was designed to improve the quality of primary care delivered to
premature infants in the domains of growth assessment, nutrition recommendations,
developmental screening, blood pressure monitoring, and retinopathy of prematurity
follow-up. The CDS for this project used a web-service approach, which allowed us
to encode rules in Drools outside the EHR. We used JavaScript to deliver interactive
CDS content to the clinicians. Version control of the CDS source code was maintained
using an institutional GitHub repository [[15]]. This allowed for rapid comparison of the newly updated code with the original
code, as well as providing a detailed timestamp record that could be used to estimate
the effort required to make these edits. The revised CDS was thoroughly evaluated
by our team using test patient data to ensure the updated CDS correctly implemented
the new recommendations.
Fig. 1 Screenshot of the palivizumab CDS tool in the context of the “Preemie Assistant.”
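As an illustration of the kind of check this testing involved, the hypothetical JUnit 5 test below exercises the PalivizumabRules sketch shown earlier; it is not our actual test harness. It verifies that an otherwise healthy 30-week infant who met the 2009 gestational-age criterion is no longer flagged under the 2014 criterion (Table 1, row 1).

```java
// Hypothetical regression check against test patient data, using the
// PalivizumabRules sketch shown earlier. Framework and names are illustrative.
import static org.junit.jupiter.api.Assertions.assertFalse;
import static org.junit.jupiter.api.Assertions.assertTrue;
import org.junit.jupiter.api.Test;

class PalivizumabRuleUpdateTest {

    @Test
    void healthyThirtyWeekInfantEligibleIn2009ButNotIn2014() {
        double gestationalAgeWeeks = 30.0;
        int ageMonthsAtSeasonStart = 4;

        // 2009: 29-31 6/7 weeks' gestation and < 6 months at season start -> eligible.
        assertTrue(PalivizumabRules.eligibleByGestationalAge2009(
                gestationalAgeWeeks, ageMonthsAtSeasonStart));

        // 2014: >= 29 weeks' gestation not recommended without CLD or significant CHD.
        boolean hasChronicLungDisease = false;
        boolean hasSignificantCHD = false;
        assertFalse(PalivizumabRules.eligibleByGestationalAge2014(
                gestationalAgeWeeks, hasChronicLungDisease, hasSignificantCHD));
    }
}
```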
We simultaneously updated our reporting data queries to match the new recommendations.
The data queries are used at the start of the RSV season to identify eligible patients
and to support the nursing staff responsible for ordering palivizumab. The data queries
use standard SQL and do not include direct links to the evidence. These SQL queries
helped ensure that children who had previously established care and might not be due
for a routine visit near the start of the RSV season were not missed. In contrast,
the Preemie Assistant CDS included reminders that informed clinicians of eligible
children during office visits (typically at their first visit in the office) and supported
both identification and proper dose calculation of palivizumab [[6]].
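For illustration, a reporting query of this kind might be executed as in the JDBC sketch below; the table and column names and the simplified criterion are hypothetical assumptions and do not reflect our production schema or the full 2014 logic.

```java
// Hedged sketch of the reporting-query step: a plain JDBC call running one
// eligibility query at the start of the RSV season. Schema names and the
// simplified WHERE clause are hypothetical; the production SQL implements the
// full 2014 criteria and, as noted above, carries no links to the guideline text.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.util.ArrayList;
import java.util.List;

public class EligibilityReport {

    public static List<String> infantsUnder29Weeks(String jdbcUrl,
                                                   java.sql.Date earliestEligibleBirthDate)
            throws Exception {
        // earliestEligibleBirthDate stands in for the age cutoff derived from the
        // recommendations (a birth date late enough that the infant is still within
        // the eligible age range at the start of the season).
        String sql = "SELECT patient_id FROM patients "
                   + "WHERE gestational_age_weeks < 29 "
                   + "AND birth_date >= ?";

        List<String> patientIds = new ArrayList<>();
        try (Connection conn = DriverManager.getConnection(jdbcUrl);
             PreparedStatement stmt = conn.prepareStatement(sql)) {
            stmt.setDate(1, earliestEligibleBirthDate);
            try (ResultSet rs = stmt.executeQuery()) {
                while (rs.next()) {
                    patientIds.add(rs.getString("patient_id"));
                }
            }
        }
        return patientIds;
    }
}
```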
We documented the time required to complete each phase of the CDS update (document
analysis, rule authoring, testing, and updating reports). Time on task was estimated
from version control access logs and personal recall. All members of the update team confirmed that the reported times were representative of the time they spent on this task.
To quantify the extent of the impact these changes would have, we used the data reporting
queries to develop lists of eligible patients based on both the 2009 and the 2014
recommendations. We determined the number of patients eligible for palivizumab using
the 2009 criteria, the 2014 criteria, and both criteria sets. For patients who met the
2009 criteria but not the 2014 criteria, we further identified which changed recommendation
resulted in the patient being no longer eligible for palivizumab. In addition to distributing
lists of patients meeting the 2014 criteria prior to the RSV season, we also distributed
the list of patients meeting old palivizumab criteria but not meeting new palivizumab
criteria (clearly labeled as “not eligible”). This was to help educate clinicians
within our health system regarding the updated recommendations and to provide additional
time to clinicians who wished to seek insurance approval of palivizumab for patients no longer meeting the current eligibility recommendations.
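Conceptually, this comparison is a cross-tabulation of the two eligibility lists. The sketch below illustrates the idea with simple set operations in Java; the identifiers and types are hypothetical, and the actual comparison was performed with the SQL reporting queries.

```java
// Illustrative sketch of the cross-tabulation behind Tables 3 and 4: given the
// patient lists produced by the 2009 and the 2014 reporting queries, compute who
// is eligible under both criteria sets and who is eligible under only one.
import java.util.HashSet;
import java.util.Set;

public class EligibilityComparison {

    public static void summarize(Set<String> eligible2009, Set<String> eligible2014) {
        Set<String> onlyOld = new HashSet<>(eligible2009);
        onlyOld.removeAll(eligible2014);   // met 2009 criteria but no longer eligible in 2014

        Set<String> onlyNew = new HashSet<>(eligible2014);
        onlyNew.removeAll(eligible2009);   // newly eligible under the 2014 criteria

        Set<String> total = new HashSet<>(eligible2009);
        total.addAll(eligible2014);        // every patient eligible under either criteria set

        System.out.printf("Meet 2009 criteria: %d%n", eligible2009.size());
        System.out.printf("Meet 2014 criteria: %d%n", eligible2014.size());
        System.out.printf("Meet only 2009 criteria: %d%n", onlyOld.size());
        System.out.printf("Meet only 2014 criteria: %d%n", onlyNew.size());
        System.out.printf("Total unique patients: %d%n", total.size());
    }
}
```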
Results
Of the 15 recommendations in the 2009 AAP policy statement, 7 had clinically relevant
changes identified during side-by-side comparison. These differences relate to baseline
eligibility age cutoffs, changes in recommendations for individual disease states,
and the management of patients receiving palivizumab who suffer an RSV-related hospitalization
despite prophylaxis. The detailed comparison of the two policy statements (►[Table 1]) was facilitated by the similar structure of the two AAP policy statements.
The total time required to respond to the new policy statement recommendations was
11 person-hours shared between a physician-programmer, a clinical informaticist, a
subject matter expert, and a data analyst. This represents effort beyond that usually required to prepare for an RSV season in which the guidelines had not changed. A breakdown of time spent on each updating activity
as well as the team members involved is included as ►[Table 2]. Most importantly, the system update, including testing and rule validation, was
completed in time for the 2014–15 RSV season.
Table 2
Breakdown of Steps and Effort for Updating CDS
Activity | Time Spent | Team Members Involved
Extracting recommendations from the 2014 AAP policy statement and associating these with their 2009 predecessor | 2 hours | Clinical Informatician
Updating decision variables from the 2009 AAP policy statement with 2014 definitions | 1 hour | Clinical Informatician
Validation for extraction activities and updating of definitions | 2 hours | Subject Matter Expert
Updating decision rules within the CDS | 1 hour | Physician-programmer
Testing of CDS changes and confirming correctness of recommendations | 2 hours | Clinical Informatician, Subject Matter Expert
Updating and running data queries to generate eligible patient lists | 3 hours | Data Analyst
Total | 11 hours
Encoding these differences required adding/changing 24 lines and removing 37 lines
of executable code (as a reference, this file contained 1,356 total lines of executable
code after the update). Only a single file within the CDS intervention required any
changes. The SQL queries used for data reporting also had to be modified; unlike the CDS rules, the SQL code had not been developed with links to the source policy statement. Updating the SQL queries required 10 edits (e.g., changed descriptive text or a date within a line), 18 insertions, and 13 deletions.
Simultaneously running the 2009 and 2014 data queries allowed our team to identify
patients eligible for palivizumab under the 2009 criteria, the 2014 criteria, and
both criteria (►[Table 3]). The breakdown of eligible patients using the 2009 criteria that would no longer
be eligible in 2014 due to changes in these criteria is given in ►[Table 4]. Of note, of the 352 patients who would have met palivizumab eligibility criteria
in 2009, 85 patients (24.1%) were no longer eligible based on the 2014 criteria. Conversely,
only 2 patients (0.74%) who met the 2014 palivizumab eligibility criteria would not
have been eligible under the 2009 eligibility criteria.
Table 3
Patients meeting the 2009 and/or 2014 palivizumab eligibility criteria[*]
Meet 2009 Criteria | 352 (99.4%)
Meet 2014 Criteria | 269 (76%)
Meet Only 2009 Criteria | 85 (24.1%)
Meet Only 2014 Criteria | 2 (0.74%)
Total | 354
* Only includes patients born before the start of RSV season
Table 4
Reasons why patients previously eligible in 2009 would be ineligible in 2014[*]
Cardiac disease no longer meeting eligibility criteria | 43 (50.6%)
CLD no longer meeting eligibility criteria | 21 (24.7%)
Under 6 months old but over 29 weeks gestation | 11 (12.9%)
Presumed CLD with medications but no longer meeting eligibility criteria | 7 (8.2%)
Cardiac disease AND chronic lung disease no longer meeting eligibility criteria | 3 (3.5%)
Total | 85
* Only includes patients born before the start of RSV season
Discussion
Because our approach to the initial development of the Preemie Assistant was facilitated by use of the GEM, reconciling and updating the rules to match the new guideline required only 11 person-hours of effort. The inclusion of narrative
text as comments within the CDS made finding the correct areas to adapt as simple
as searching for the relevant narrative text. This allowed our team to spend the majority
of the effort on identifying the clinically relevant changes within the source guidelines
and converting the new recommendations into executable rules. Developers of CDS may
wish to consider adopting this practice as it made updating the CDS more manageable.
More material was deleted than added, which makes sense given that the updated recommendations
for palivizumab were more restrictive than the earlier recommendations. Although there
are no published estimates regarding the impact of these guidelines on palivizumab
utilization in outpatient settings, among inpatients, recent literature reported an
approximately 50% reduction in palivizumab use among potentially eligible patients [[16]]. Without performing our side-by-side analysis to identify differences, it is possible
that recommendations from the earlier guideline would have remained in the updated
CDS if we were only focused on updating the recommendations addressed in the new guideline.
For example, the comparison made clear that the 2014 recommendation on CLD is framed in terms of a second RSV season. Without our side-by-side analysis we might not have caught this
subtle change and continued to use an age-based, as opposed to season-based, cut-off
for this population.
Keeping computer programs updated is a well-known problem for any software developer,
and has been recognized as a key concern for CDS development [[17]–[19]]. The software lifecycle has been well described numerous times and portions of
this are directly applicable to CDS maintenance [[20]–[22]]. Published literature by teams such as the Clinical Decision Support Consortium
have been helpful in outlining the issues and processes of decision support maintenance,
and have proposed alternatives to local maintenance of CDS [[18], [19], [23], [24]]. However, to date no publications have quantified the effort actually required to update clinical decision support in response to changes in recommendations. Sittig et al. noted that 12 days to update CDS was “state of the art in knowledge management” [[24]], but did not address the size of the team, the number of hours, the extent of the changes, or the reason for the changes. Wright et al. noted that updates often occur only every several years [[25]]. Additionally, Hulse et al. described a longitudinal approach to content and software management that addresses several key issues with CDS maintenance, including versioning, but does not address how to incorporate a changed evidence base when updating CDS [[26]]. Grandi et al. also discuss versioning of guidelines, but focus primarily on multiple
versions of guidelines during the authoring and revision phase, rather than on how
to best account for revised guidelines during CDS updating [[27]]. One other factor to consider is that CDS developed using grant-based or time-limited
funding has the potential to become stranded if there is no plan or funding for maintenance.
By using the links between the CDS and the 2009 policy statement we leveraged a method
to efficiently identify all CDS rules which required changes to account for the 2014
update. Without this link it would have required significantly more time to identify
the sections within the CDS requiring changes, and in all likelihood the entirety
of the source code would have needed to be reviewed. We did not re-GEM cut the entire
2014 policy statement (the approach used to extract the recommendations from the 2009
AAP policy statement), so if there is another update it is unclear how this will affect
our effort in developing a second revision. Instead, we used our prior GEM cutting
work to vastly speed up the “reconciliation” of rules to the new guideline. Additionally,
as EHRs become more supportive of clinical terminology standards and standards to
access remote knowledge services, it will be helpful for CDS developers and updaters
to proactively use these standards. This may help with dissemination and maintenance
of CDS in the long term. However, even with these standards there may remain situations
where changes in guidelines will require unanticipated changes in the contents or
structure of data consumed or produced by remote knowledge services.
Two months after the start of the 2014–2015 RSV season, the AAP published an erratum to the 2014 policy statement [[28]]. This erratum was published only in the journal Pediatrics and was not indexed in PubMed; therefore, it took some time to come to our project team’s attention. Despite this, the erratum resulted in only a single change to the eligibility criteria, which was easily identified in the CDS code and SQL queries and therefore was not included in our estimates of person-hours for this project. However, it is important
to note that when developing CDS from evidence-based sources it is necessary to continuously
monitor for changes to the evidence base and to published recommendations.
Running both the old and the new reporting queries provided a clear picture of the effect the eligibility criteria changes would have within our network at the start of the 2014
RSV season. It also allowed our team to check for validity and to ensure that the
reporting rules and the CDS were once again aligned. Other studies have investigated
the patient care impacts and financial impacts of this change and noted a cost savings
with no significant impact on RSV hospitalization rates [[16], [29]]. The controversy surrounding these updated recommendations is also acknowledged
[[30]].
Limitations
We have described our experience successfully maintaining CDS through a significant
change in clinical guidelines for only one specific clinical problem. Additionally,
while we report the time required for updating this system we were unable to identify
a baseline time for CDS updating for comparison. There are also limitations surrounding
generalizability of our approach. Many organizations are not yet using, or not yet able to use, a web-services-based approach for CDS rules. Organizations dependent on EHR-provided tools might not be able to use the link between the evidence and the CDS to identify areas requiring change. Additionally, organizations using sharable CDS may face additional barriers to updating, including maintaining multiple versions simultaneously, addressing differences in workflows, and performing more extensive testing.
Our strategy for using GEM to maintain a tight linkage between our CDS and the original
guidelines may be less successful for CDS maintenance in other clinical domains. Notably, the same professional organization authored all versions of the palivizumab
policy statement. Consequently, the two versions of the document involved in our project
had a similar structure. In situations where the published guideline undergoes more drastic structural changes, as may occur if different professional organizations become involved in authoring a guideline, our approach to CDS maintenance may be
less efficient. Fortunately, the structure of published guidelines has become increasingly
standardized through checklists and other tools that have recently emerged for guideline
authors [[10]–[12], [31]–[33]]. Consequently, drastic changes in guideline publications between versions should
become less common. This trend should allow our approach, which tightly links executable
CDS to narrative guideline publications, to be an increasingly successful strategy for maintaining CDS as published guidelines are revised.
Conclusion
Tightly linking executable CDS rules to narrative palivizumab recommendations using
the GEM facilitated timely maintenance of the CDS when the published recommendations
changed. This strategy of tightly linking executable CDS to narrative guidelines may
help knowledge engineers efficiently maintain CDS in the face of incremental changes
to published guidelines in many clinical domains. Developing strategies to handle
more significant structural changes in published guidelines remains an important area for future work.
Clinical Relevance Statement
When clinical recommendations change, any CDS based upon those recommendations needs to be updated. Updating software can be a time-consuming process, and the effort required to update CDS in response to a change in evidence has not previously been described. Using a systematic method to develop
CDS from published recommendations facilitated our efforts to update the CDS when
the published recommendations changed.
Abbreviations
AAP – American Academy of Pediatrics
RSV – Respiratory Syncytial Virus
CDS – Clinical Decision Support
EHR – Electronic Health Record
GEM – Guideline Elements Model