DOI: 10.4338/ACI-2016-10-RA-0173
Rapid Adjustment of Clinical Decision Support in Response to Updated Recommendations for Palivizumab Eligibility
Publication History
Received: 20 October 2016
Accepted: 12 March 2017
Publication Date: 21 December 2017 (online)
- Summary
- Introduction
- Methods
- Results
- Discussion
- Limitations
- Conclusion
- Clinical Relevance Statement
- Abbreviations
- References
Summary
Background: Palivizumab is effective at reducing hospitalizations due to respiratory syncytial virus among high-risk children, but is indicated for a small population. Identification of patients eligible to receive palivizumab is labor-intensive and error-prone. To support patient identification, in 2012 we developed Clinical Decision Support (CDS) based on published recommendations. This CDS was developed using a systematic process, which directly linked computer code to a recommendation’s narrative text. In 2014, updated recommendations were published, which changed several key criteria used to determine eligible patients.
Objective: Assess the effort required to update CDS in response to new palivizumab recommendations and identify factors that impacted these efforts.
Methods: We reviewed the updated American Academy of Pediatrics (AAP) policy statement from Aug 2014 and identified areas of divergence from the prior publication. We modified the CDS to account for each difference. We recorded time spent on each activity to approximate the total effort required to update the CDS.
Results: Of the 15 recommendations in the initial policy statement, 7 required updating. The CDS update was completed in 11 person-hours. Comparison of old and new recommendations was facilitated by the AAP policy statement structure and required 3 hours. Validation of the revised logic required 2 hours by a clinical domain expert. An informaticist required 3 hours to update and test the CDS. This included adding 24 lines and deleting 37 lines of code. Updating relevant data queries took an additional 3 hours and involved 10 edits.
Conclusion: We quickly adapted CDS in response to changes in recommendations for palivizumab administration. The consistent AAP policy statement structure and the link we developed between these statements and the CDS rules facilitated our efforts. We recommend that CDS implementers establish linkages between published narrative recommendations and their executable rules to facilitate maintenance efforts.
Citation: Michel J, Utidjian LH, Karavite D, Hogan A, Ramos MJ, Miller J, Shiffman RN, Grundmeier RW. Rapid adjustment of clinical decision support in response to updated recommendations for palivizumab eligibility. Appl Clin Inform 2017; 8: 581–592 https://doi.org/10.4338/ACI-2016-10-RA-0173
Keywords
Clinical decision support - pediatrics - clinical practice guideline - use - administration and maintenance of clinical information systems - ambulatory care - primary care

Introduction
Clinical Decision Support (CDS) that incorporates evidence-based recommendations can lead to improved delivery of evidence-based clinical care [[1]]. This is especially true when CDS is implemented in an electronic health record (EHR) and carefully integrated into clinical workflow [[2]–[4]]. However, when evidence and recommendations change, there is no direct pathway for those changes to be incorporated into CDS.
Our institution developed and implemented CDS based on the 2009 American Academy of Pediatrics (AAP) policy statement for determining palivizumab eligibility [[5], [6]]. Palivizumab is a monoclonal antibody that provides short-lasting passive immunity to respiratory syncytial virus (RSV) and is effective at decreasing RSV-related hospitalizations in select populations [[5]]. With reported costs between $588 and $1,552 per vial, palivizumab is one of the most expensive medications prescribed by primary care pediatricians [[7], [8]]. In addition to being expensive, palivizumab must be administered monthly throughout the RSV season to be effective [[5], [9]], a schedule that is difficult to adhere to [[6]]. Insurance payer approval decisions typically adhere closely to the most current recommendations for palivizumab eligibility.
As part of the pre-RSV season preparations each year, members of the CDS team search for updates to the eligibility guidelines from the AAP. In 2014, just two years after the introduction of the CDS, the AAP revised its policy statement for the upcoming RSV season [[9]]. It was important for patient care to promptly update the CDS in response to the updated guidelines, which were released only 3 months before the start of the 2014–15 RSV season. The original CDS took months to develop, test, and implement into patient care. The CDS used in our institution for providing recommendations surrounding palivizumab is hosted outside the EHR but linked into the EHR using a web-services based approach [[6]]. When a patient’s chart is opened, a message is sent to the rules engine. The rules engine determines if any CDS modules are relevant based on patient data from the EHR. The relevant modules are rendered within the visit navigator and appear to the casual clinician as if they were native to the EHR.
In order to provide the most up to date and evidence-based care throughout our health system, we needed to adapt the current CDS to address these changes [[5], [9]]. Without these updates, the CDS that we so rigorously tested during the initial implementation would have delivered erroneous recommendations for numerous patients. We were concerned that erroneous recommendations would result in less efficient care and general distrust in CDS.
During the initial CDS implementation we used a systematic process for creating our CDS, the “GLIDES” (Guidelines Into DEcision Support) method [[10]]. The GLIDES method was developed by a consortium of guideline developers and implementers with a focus on identifying best practices for converting guideline recommendations into implementable decision support [[11], [12]]. In short, the GLIDES process begins with formalization of recommendations from source guidelines using an XML-based schema, the Guideline Elements Model (GEM) [[11], [12]]. During formalization, the source of each recommendation (i.e., page number and paragraph location in the palivizumab policy statement) is recorded [[13]]. The GEM-encoded recommendations were then translated line by line into executable rules, which in our institution were written in Drools (a Java-based rules engine) [[14]]. Because all GEM annotations were retained in this translation, there was a direct link from the CDS source code back to the narrative text in the guideline. We expected that these links between the CDS and the narrative text would simplify updating in response to the new recommendations.
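To make this linkage concrete, the following is a minimal sketch of how a single eligibility criterion might be expressed in Drools, with the GEM-derived source annotation kept as a comment above the rule. The fact types, field names, and thresholds are illustrative assumptions, not our production rule set.

```
// Minimal illustrative sketch (not the production rule set): one eligibility
// criterion expressed in Drools DRL, with the GEM-derived annotation retained
// as a comment so the executable logic can be traced back to the narrative text.

// Hypothetical fact types declared here so the sketch is self-contained.
declare Patient
    patientId              : String
    gestationalAgeWeeks    : int
    ageMonthsAtSeasonStart : int
end

declare Recommendation
    patientId : String
    text      : String
end

// GEM source: AAP palivizumab policy statement, gestational-age eligibility
// (page number and paragraph location of the narrative text retained here)
rule "Palivizumab eligibility - gestational age"
when
    $p : Patient( gestationalAgeWeeks < 29, ageMonthsAtSeasonStart < 12 )
then
    insert( new Recommendation( $p.getPatientId(),
            "Consider palivizumab prophylaxis this RSV season" ) );
end
```

Because the annotation travels with the rule, a change to the narrative recommendation can be located in the code simply by searching for the corresponding text.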
The effort required to update CDS in response to changes in evidence has not been described in the literature. In this study, we describe our process for updating CDS in response to changes in the palivizumab policy statement and the downstream implications of tightly linking CDS to the original recommendations. We also evaluate the extent to which patient eligibility within our practices would change under the updated recommendations.
Methods
We obtained the 2014 AAP policy statement from the AAP’s website [[9]]. We compared it side-by-side with the parsed guideline from the 2009 AAP policy statement and identified all areas of divergence between the documents ([Table 1]). Following this comparison, we worked with a subject matter expert to identify the clinically relevant differences and to confirm that our understanding of the new recommendations matched their intent. Each of these clinically relevant differences was associated with actions and decision variables (e.g., gestational age) encoded during the initial CDS development effort. Drawing on the link between the decision variables from the text and the CDS rules engine, we located all sections of the CDS rules that required updating.
Table 1 Comparison of the 2009 and 2014 AAP policy statements

| Recommendation | 2009 AAP Policy Statement | 2014 AAP Policy Statement |
|---|---|---|
| 1 – Eligibility Gestational Age | 29 – 31 6/7 weeks gestation | ≥ 29 weeks gestation |
| 2 – Eligibility Other Risk Factors | 32 – 34 6/7 weeks gestation | ≥ 32 weeks gestation |
| 3 – Chronic Lung Disease (CLD) Improved Specificity | Infants with CLD | Preterm infants with CLD |
| 4 – Congenital Heart Disease (CHD) | ≤ 24 months at the start of the season | ≤ 12 months at the start of the season |
| 5 – Cystic Fibrosis | No clear recommendation for use | Routine use in cystic fibrosis patients is NOT recommended |
| 6 – Immune compromised | No clear recommendations for use | Recommended for use in patients < 24 months of age if profoundly immunocompromised during the RSV season |
| 7 – Continuance of RSV | Continue palivizumab prophylaxis if there is breakthrough RSV hospitalization for the remainder of the season | Discontinue palivizumab after RSV infection leading to hospitalization |
One physician-programmer made the necessary changes to the palivizumab CDS rules. The palivizumab CDS was one component of a more comprehensive intervention called the “Preemie Assistant” [[6]]. Nurses responsible for coordinating palivizumab administration were the primary targets of the CDS. Information about eligible children was displayed in a patient list in the electronic health record. Nurses reviewed this list throughout the RSV season and could access additional tools to support their workflow (►[Figure 1]). In addition to palivizumab eligibility CDS for both premature and full-term infants, the intervention was designed to improve the quality of primary care delivered to premature infants in the domains of growth assessment, nutrition recommendations, developmental screening, blood pressure monitoring, and retinopathy of prematurity follow-up. The CDS for this project used a web-service approach, which allowed us to encode rules in Drools outside the EHR. We used JavaScript to deliver interactive CDS content to clinicians. Version control of the CDS source code was maintained in an institutional GitHub repository [[15]]. This allowed rapid comparison of the updated code to the original code and provided a detailed timestamp record that could be used to estimate the effort required to make these edits. The revised CDS was thoroughly evaluated by our team using test patient data to ensure it correctly implemented the new recommendations.
We simultaneously updated our reporting data queries to match the new recommendations. The data queries are used at the start of the RSV season to identify eligible patients and to support the nursing staff responsible for ordering palivizumab. The data queries use standard SQL and do not include direct links to the evidence. These SQL queries helped ensure that children who had previously established care and might not be due for a routine visit near the start of the RSV season were not missed. In contrast, the Preemie Assistant CDS included reminders that informed clinicians of eligible children during office visits (typically at their first visit in the office) and supported both identification and proper dose calculation of palivizumab [[6]].
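As a rough illustration of the kind of change these reporting queries required, the sketch below shows a hypothetical eligibility filter; the table and column names, the schema, and the date literal are illustrative assumptions and do not reflect our actual data model or production SQL.

```sql
-- Illustrative sketch only: a reporting query of the kind described above,
-- using a hypothetical schema (names do not reflect our actual data model).
SELECT p.patient_id,
       p.birth_date,
       p.gestational_age_weeks
FROM   patient p
WHERE  p.gestational_age_weeks < 29          -- gestational-age cutoff per the 2014 policy statement
  AND  p.birth_date >= DATE '2013-11-01';    -- illustrative proxy for age at the start of the RSV season
-- Under the 2009 criteria the gestational-age filter was broader (e.g., infants of
-- 32-34 6/7 weeks gestation with additional risk factors were also eligible), so
-- predicates like the one above were among the lines edited for the update.
```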
We documented the time required to complete each phase of the CDS update (document analysis, rule authoring, testing, and updating reports). Time on task was estimated from version control access logs and personal recall. All members of the update team confirmed that the reported times were representative of the time spent on this task.
To quantify the extent of the impact these changes would have, we used the data reporting queries to develop lists of eligible patients based on both the 2009 and the 2014 recommendations. We determined the number of patients eligible for palivizumab using the 2009 criteria, the 2014 criteria, and both criteria sets. For patients who met the 2009 criteria but not the 2014 criteria, we further identified which changed recommendation made the patient no longer eligible for palivizumab. In addition to distributing lists of patients meeting the 2014 criteria prior to the RSV season, we also distributed the list of patients meeting the old palivizumab criteria but not the new criteria (clearly labeled as “not eligible”). This was intended to educate clinicians within our health system about the updated recommendations and to give additional lead time to clinicians seeking insurance approval of palivizumab for patients who no longer met the current eligibility recommendations.
Results
Of the 15 recommendations in the 2009 AAP policy statement, 7 had clinically relevant changes identified during the side-by-side comparison. These differences relate to baseline eligibility age cutoffs, changes in recommendations for individual disease states, and the management of patients receiving palivizumab who suffer an RSV-related hospitalization despite prophylaxis. The detailed comparison of the two policy statements (►[Table 1]) was facilitated by the AAP policy statement structure.
The total time required to respond to the new policy statement recommendations was 11 person-hours shared between a physician-programmer, a clinical informaticist, a subject matter expert, and a data analyst. This represents effort beyond what is usually required to prepare for an RSV season in which the guidelines have not changed. A breakdown of the time spent on each updating activity and the team members involved is given in ►[Table 2]. Most importantly, the system update, including testing and rule validation, was completed in time for the 2014–15 RSV season.
Table 2 Time spent on each updating activity and team members involved

| Activity | Time Spent | Team Members Involved |
|---|---|---|
| Extracting recommendations from the 2014 AAP policy statement and associating these with their 2009 predecessors | 2 hours | Clinical Informatician |
| Updating decision variables from the 2009 AAP policy statement with 2014 definitions | 1 hour | Clinical Informatician |
| Validation of extraction activities and updating of definitions | 2 hours | Subject Matter Expert |
| Updating decision rules within the CDS | 1 hour | Physician-programmer |
| Testing of CDS changes and confirming correctness of recommendations | 2 hours | Clinical Informatician, Subject Matter Expert |
| Updating and running data queries to generate eligible patient lists | 3 hours | Data Analyst |
| Total | 11 hours | |
Encoding these differences required adding or changing 24 lines and removing 37 lines of executable code (for reference, the file contained 1,356 total lines of executable code after the update). Only a single file within the CDS intervention required any changes. The SQL queries used for data reporting also had to be changed; unlike the CDS rules, the SQL code had not been developed with links to the source policy statement. Updating the SQL queries required 10 edits (e.g., a changed descriptive text or date within a line), 18 insertions, and 13 deletions.
Simultaneously running the 2009 and 2014 data queries allowed our team to identify patients eligible for palivizumab under the 2009 criteria, the 2014 criteria, and both criteria sets (►[Table 3]). The breakdown of patients eligible under the 2009 criteria who would no longer be eligible under the 2014 criteria is given in ►[Table 4]. Of note, of the 352 patients who would have met palivizumab eligibility criteria in 2009, 85 patients (24.1%) were no longer eligible based on the 2014 criteria. Conversely, only 2 patients (0.74%) who met the 2014 palivizumab eligibility criteria would not have been eligible under the 2009 criteria.
Table 3 Patients eligible for palivizumab under the 2009 and 2014 criteria

| Eligibility | Patients, n (%) |
|---|---|
| Meet 2009 Criteria | 352 (99.4%) |
| Meet 2014 Criteria | 269 (76%) |
| Meet Only 2009 Criteria | 85 (24.1%) |
| Meet Only 2014 Criteria | 2 (0.74%) |
| Total | 354 |
Table 4 Reasons patients meeting the 2009 criteria were no longer eligible under the 2014 criteria

| Reason | Patients, n (%) |
|---|---|
| Cardiac disease no longer meeting eligibility criteria | 43 (50.6%) |
| CLD no longer meeting eligibility criteria | 21 (24.7%) |
| Under 6 months old but over 29 weeks gestation | 11 (12.9%) |
| Presumed CLD with medications but no longer meeting eligibility criteria | 7 (8.2%) |
| Cardiac disease AND chronic lung disease no longer meeting eligibility criteria | 3 (3.5%) |
| Total | 85 |
Discussion
Due to our approach in the initial development of the Preemie Assistant, which was facilitated by use of the GEM, the reconciliation and update of rules to match the new guideline required only 11 person-hours of effort. The inclusion of narrative text as comments within the CDS made finding the correct areas to adapt as simple as searching for the relevant narrative text. This allowed our team to spend the majority of the effort on identifying the clinically relevant changes within the source guidelines and converting the new recommendations into executable rules. Developers of CDS may wish to consider adopting this practice as it made updating the CDS more manageable.
More material was deleted than added, which makes sense given that the updated recommendations for palivizumab were more restrictive than the earlier recommendations. Although there are no published estimates of the impact of these guidelines on palivizumab utilization in outpatient settings, recent literature reported an approximately 50% reduction in palivizumab use among potentially eligible inpatients [[16]]. Had we focused only on the recommendations explicitly addressed in the new guideline rather than performing a side-by-side analysis, recommendations from the earlier guideline could have remained in the updated CDS. For example, the 2014 recommendation on CLD clearly indicated a second RSV season. Without our side-by-side analysis we might not have caught this subtle change and could have continued to use an age-based, as opposed to season-based, cut-off for this population.
Keeping computer programs updated is a well-known problem for any software developer and has been recognized as a key concern for CDS development [[17]–[19]]. The software lifecycle has been described many times, and portions of it are directly applicable to CDS maintenance [[20]–[22]]. Published literature by teams such as the Clinical Decision Support Consortium has been helpful in outlining the issues and processes of decision support maintenance and has proposed alternatives to local maintenance of CDS [[18], [19], [23], [24]]. However, to date no publications have quantified the effort actually required to update clinical decision support in response to changes in recommendations. Sittig et al. noted that 12 days to update CDS was “state of the art in knowledge management” [[24]], but did not address the size of the team, the number of hours, the extent of the changes, or the reason for the changes. Wright et al. noted that updates often occur only every several years [[25]]. Additionally, Hulse et al. described a longitudinal approach to content and software management that addresses several key issues with CDS maintenance, including versioning, but does not address how to incorporate a changed evidence base when updating CDS [[26]]. Grandi et al. also discuss versioning of guidelines, but focus primarily on multiple versions of guidelines during the authoring and revision phase, rather than on how best to account for revised guidelines during CDS updating [[27]]. One other factor to consider is that CDS developed with grant-based or time-limited funding can become stranded if there is no plan or funding for maintenance.
By using the links between the CDS and the 2009 policy statement we efficiently identified all CDS rules that required changes to account for the 2014 update. Without this link, identifying the sections of the CDS requiring changes would have taken significantly more time, and in all likelihood the entire source code would have needed to be reviewed. We did not re-GEM cut the entire 2014 policy statement (the approach used to extract the recommendations from the 2009 AAP policy statement), so if there is another update it is unclear how this will affect the effort required for a second revision. Instead, we used our prior GEM cutting work to greatly speed up the “reconciliation” of rules to the new guideline. Additionally, as EHRs become more supportive of clinical terminology standards and standards for accessing remote knowledge services, it will be helpful for CDS developers and maintainers to proactively use these standards. This may help with dissemination and maintenance of CDS in the long term. However, even with these standards there may remain situations where changes in guidelines require unanticipated changes in the contents or structure of data consumed or produced by remote knowledge services.
Two months after the start of the 2014–2015 RSV season, the AAP published an erratum to the 2014 policy statement [[28]]. This erratum was published only in the journal Pediatrics and was not indexed in PubMed; it therefore took some time to come to our project team’s attention. The erratum resulted in only a single change to the eligibility criteria, which was easily identified in the CDS code and SQL queries and therefore was not included in our estimates of person-hours for this project. However, it is important to note that when developing CDS from evidence-based sources it is necessary to continuously monitor for changes to the evidence base and to published recommendations.
Running both the old and new report queries provided a clear picture of the effect the eligibility criteria changes would have within our network at the start of the 2014 RSV season. It also allowed our team to check validity and to ensure that the reporting rules and the CDS were once again aligned. Other studies have investigated the patient care and financial impacts of this change and noted cost savings with no significant impact on RSV hospitalization rates [[16], [29]]. The controversy surrounding these updated recommendations is also acknowledged [[30]].
Limitations
We have described our experience successfully maintaining CDS through a significant change in clinical guidelines for only one specific clinical problem. Additionally, while we report the time required to update this system, we were unable to identify a baseline time for CDS updating for comparison. There are also limitations to the generalizability of our approach. Many organizations are not yet using, or not yet able to use, a web-services-based approach for CDS rules. Organizations dependent on EHR-provided tools might not be able to use the link between evidence and the CDS to identify areas requiring change. Additionally, organizations using sharable CDS may face additional barriers to updating, including maintaining multiple versions simultaneously, addressing differences in workflows, and more strenuous testing.
Our strategy of using GEM to maintain a tight linkage between our CDS and the original guidelines may be less successful for CDS maintenance in other clinical domains. Notably, the same professional organization authored all versions of the palivizumab policy statement; consequently, the two versions of the document involved in our project had a similar structure. In situations where the structure of a published guideline changes more drastically, as may occur when different professional organizations become involved in authoring a guideline, our approach to CDS maintenance may be less efficient. Fortunately, the structure of published guidelines has become increasingly standardized through checklists and other tools that have recently emerged for guideline authors [[10]–[12], [31]–[33]]. Consequently, drastic changes in guideline publications between versions should become less common. This trend should allow our approach, which tightly links executable CDS to narrative guideline publications, to be an increasingly successful strategy for maintaining CDS as published guidelines are revised.
Conclusion
Tightly linking executable CDS rules to narrative palivizumab recommendations using the GEM facilitated timely maintenance of the CDS when the published recommendations changed. This strategy of tightly linking executable CDS to narrative guidelines may help knowledge engineers efficiently maintain CDS in the face of incremental changes to published guidelines in many clinical domains. Developing strategies to handle more significant structural changes in published guidelines remains an important area of concern for future work.
Clinical Relevance Statement
When clinical recommendations change, any CDS based on those recommendations needs to be updated. Updating software can be a time-consuming process, and the effort required to update CDS in response to a change in evidence has not previously been described. Using a systematic method to develop CDS from published recommendations facilitated our efforts to update the CDS when the published recommendations changed.
Abbreviations
AAP – American Academy of Pediatrics
RSV – Respiratory Syncytial Virus
CDS – Clinical Decision Support
EHR – Electronic Health Record
GEM – Guideline Elements Model
Conflict of Interest
Dr. Grundmeier is a co-inventor of the Care Assistant decision support framework, which was used to implement portions of the decision support described in this manuscript. No patent or licensing agreement exists for this technology and the invention has generated no revenue. As an inventor of the Care Assistant, Dr. Grundmeier may have a perceived conflict of interest. However, members of the study team who have no conflicts of interest reviewed all data and analyses.
Acknowledgements
We thank the network of primary care physicians and their patients and families for their contributions to clinical research through the Pediatric Research Consortium at CHOP.
Protection of Human and Animal Subjects
Human and animal subjects were not used in this project.
References
- 1 Sucher JF, Moore FA, Todd SR, Sailors RM, McKinley BA. Computerized clinical decision support: a technology to implement and validate evidence based guidelines. J Trauma 2008; 64 (02) 520-537.
- 2 Damiani G, Pinnarelli L, Colosimo SC, Almiento R, Sicuro L, Galasso R, Sommella L, Ricciardi W. The effectiveness of computerized clinical guidelines in the process of care: a systematic review. BMC Health Serv Res 2010; 10: 2.
- 3 Latoszek-Berendsen A, Tange H, van den Herik HJ, Hasman A. From clinical practice guidelines to computer-interpretable guidelines. A literature overview. Methods Inf Med 2010; 49 (06) 550-570.
- 4 Forrest CB, Fiks AG, Bailey LC, Localio R, Grundmeier RW, Richards T, Karavite DJ, Elden L, Alessandrini EA. Improving Adherence to Otitis Media Guidelines With Clinical Decision Support and Physician Feedback. Pediatrics. 2013
- 5 American Academy of Pediatrics. From the American Academy of Pediatrics: Policy statements--Modified recommendations for use of palivizumab for prevention of respiratory syncytial virus infections. Pediatrics 2009; 124 (06) 1694-1701.
- 6 Utidjian LH, Hogan A, Michel J, Localio AR, Karavite D, Song L, Ramos MJ, Fiks AG, Lorch S, Grundmeier RW. Clinical Decision Support and Palivizumab: A Means to Protect from Respiratory Syncytial Virus. Appl Clin Inform 2015; 6 (04) 769-784.
- 7 GoodRx. Palivizumab Prices and Palivizumab Coupons. 2016 [cited 2016 June 11]; Available from: http://www.goodrx.com/palivizumab?drug-name=palivizumab
- 8 Ambrose CS, McLaurin KK. The Medicaid Cost of Palivizumab. J Pediatric Infect Dis Soc 2015; 4 (01) 83-84.
- 9 Updated guidance for palivizumab prophylaxis among infants and young children at increased risk of hospitalization for respiratory syncytial virus infection. Pediatrics 2014; 134 (Suppl. 02) e620-e638.
- 10 Dixon MMG, Shiffman RN. GuideLines Into DEcision Support (GLIDES). Yale University: Yale University; 2013
- 11 Shiffman RN, Michel G, Essaihi A, Thornquist E. Bridging the guideline implementation gap: a systematic, document-centered approach to guideline implementation. J Am Med Inform Assoc 2004; 11 (05) 418-426.
- 12 Shiffman RN, Karras BT, Agrawal A, Chen R, Marenco L, Nath S. GEM: A Proposal for a More Comprehensive Guideline Document Model Using XML. J Am Med Inform Assoc 2000; 7: 11.
- 13 Karavite D. GEM cut representation of the Policy Statement—Modified Recommendations for Use of Palivizumab for Prevention of Respiratory Syncytial Virus Infections. Center for Biomedical Informatics. 2010
- 14 Proctor M. Drools: A Rule Engine for Complex Event Processing. In: Schürr A, Varró D, Varró G, editors. Applications of Graph Transformations with Industrial Relevance: 4th International Symposium, AGTIVE 2011, Budapest, Hungary, October 4–7, 2011, Revised Selected and Invited Papers. Berlin, Heidelberg: Springer; 2012. p. 2.
- 15 GitHub. GitHub Inc.; 2016 [cited 2016]; Available from: https://www.github.com
- 16 Zembles TN, Gaertner KM, Gutzeit MF, Willoughby RE. Implementation of American Academy of Pediatrics guidelines for palivizumab prophylaxis in a pediatric hospital. Am J Health Syst Pharm 2016; 73 (06) 405-408.
- 17 Wright A, Sittig DF, Ash JS, Bates DW, Feblowitz J, Fraser G, Maviglia SM, McMullen C, Nichol WP, Pang JE, Starmer J, Middleton B. Governance for clinical decision support: case studies and recommended practices from leading institutions. J Am Med Inform Assoc 2011; 18 (02) 187-194.
- 18 Ash JS, Sittig DF, Dykstra R, Wright A, McMullen C, Richardson J, Middleton B. Identifying best practices for clinical decision support and knowledge management in the field. Stud Health Technol Inform 2010; 160 Pt 2 806-810.
- 19 Peleg M. Computer-interpretable clinical guidelines: a methodological review. J Biomed Inform 2013; 46 (04) 744-763.
- 20 CMS. Selecting a Development Approach. In: Services OoI, editor.: Center for Medicare and Medicaid Service (CMS); 2008 p. 1-10.
- 21 Shah H, Allard RD, Enberg R, Krishnan G, Williams P, Nadkarni PM. Requirements for guidelines systems: implementation challenges and lessons from existing software-engineering efforts. BMC Med Inform Decis Mak. 2012: 12
- 22 Boehm B. A spiral model of software development and enhancement. SIGSOFT Softw Eng Notes 1986; 11 (04) 14-24.
- 23 Sittig DF, Wright A, Osheroff JA, Middleton B, Teich JM, Ash JS, Campbell E, Bates DW. Grand challenges in clinical decision support. J Biomed Inform 2008; 41 (02) 387-392.
- 24 Sittig DF, Wright A, Simonaitis L, Carpenter JD, Allen GO, Doebbeling BN, Sirajuddin AM, Ash JS, Middleton B. The state of the art in clinical knowledge management: an inventory of tools and techniques. Int J Med Inform 2010; 79 (01) 44-57.
- 25 Wright A, Ash JS, Erickson JL, Wasserman J, Bunce A, Stanescu A, St Hilaire D, Panzenhagen M, Gebhardt E, McMullen C, Middleton B, Sittig DF. A qualitative study of the activities performed by people involved in clinical decision support: recommended practices for success. J Am Med Inform Assoc 2014; 21 (03) 464-472.
- 26 Hulse NC, Galland J, Borsato EP. Evolution in clinical knowledge management strategy at Intermountain Healthcare. AMIA Annu Symp Proc 2012; 2012: 390-399.
- 27 Grandi F, Mandreoli F, Martoglia R. Efficient management of multi-version clinical guidelines. J Biomed Inform 2012; 45 (06) 1120-1136.
- 28 American Academy of Pediatrics. Errata. Pediatrics 2014; 134 (06) 1221.
- 29 Grindeland CJ, Mauriello CT, Leedahl DD, Richter LM, Meyer AC. Association Between Updated Guideline-Based Palivizumab Administration and Hospitalizations for Respiratory Syncytial Virus Infections. Pediatr Infect Dis J. 2016 Epub 2016/04/15.
- 30 McLaurin KK, Chatterjee A, Makari D. Modeling the Potential Impact of the 2014 American Academy of Pediatrics Respiratory Syncytial Virus Prophylaxis Guidance on Preterm Infant RSV Outcomes. Infect Dis Ther 2015 Epub 2015/10/27.
- 31 Shiffman RN, Dixon J, Brandt C, Essaihi A, Hsiao A, Michel G, O’Connell R. The GuideLine Implementability Appraisal (GLIA): development of an instrument to identify obstacles to guideline implementation. BMC Med Inform Decis Mak 2005; 5 (23) 1-8.
- 32 Hajizadeh N, Kashyap N, Michel G, Shiffman RN. GEM at 10: A decade’s experience with the Guideline Elements Model. AMIA Annu Symp Proc 2011: 520-529.
- 33 Shiffman RN, Michel G, Rosenfeld RM, Davidson C. Building better guidelines with BRIDGE-Wiz: development and evaluation of a software assistant to promote clarity, transparency, and implementability. J Am Med Inform Assoc 2012; 19 (01) 94-101.