Appl Clin Inform 2021; 12(02): 383-390
DOI: 10.1055/s-0041-1729164
Research Article

Infobuttons for Genomic Medicine: Requirements and Barriers

Luke V. Rasmussen
1   Department of Preventive Medicine, Northwestern University, Chicago, Illinois, United States

John J. Connolly
2   The Center for Applied Genomics, The Children's Hospital of Philadelphia, Philadelphia, Pennsylvania, United States

Guilherme Del Fiol
3   Department of Biomedical Informatics, University of Utah, Salt Lake City, Utah, United States

Robert R. Freimuth
4   Department of Health Sciences Research, Mayo Clinic, Rochester, Minnesota, United States

Douglas B. Pet
5   Department of Neurology, University of California San Francisco, San Francisco, California, United States

Josh F. Peterson
6   Department of Biomedical Informatics, Vanderbilt University School of Medicine, Nashville, Tennessee, United States

Brian H. Shirts
7   Department of Laboratory Medicine, University of Washington, Seattle, Washington, United States

Justin B. Starren
1   Department of Preventive Medicine, Northwestern University, Chicago, Illinois, United States

Marc S. Williams
8   Genomic Medicine Institute, Geisinger, Danville, Pennsylvania, United States

Nephi Walton
8   Genomic Medicine Institute, Geisinger, Danville, Pennsylvania, United States
9   Intermountain Precision Genomics, Intermountain Healthcare, St George, Utah, United States

Casey Overby Taylor
8   Genomic Medicine Institute, Geisinger, Danville, Pennsylvania, United States
10   Department of Medicine and Biomedical Engineering, Johns Hopkins School of Medicine, Baltimore, Maryland, United States
Funding This phase of the eMERGE network was initiated and funded by the NHGRI through the following grants: U01HG008657 (Group Health Cooperative/University of Washington); U01HG008685 (Brigham and Women's Hospital); U01HG008672 (Vanderbilt University Medical Center); U01HG008666 (Cincinnati Children's Hospital Medical Center); U01HG006379 (Mayo Clinic); U01HG008679 (Geisinger Clinic); U01HG008680 (Columbia University Health Sciences); U01HG008684 (Children's Hospital of Philadelphia); U01HG008673 (Northwestern University); U01HG008701 (Vanderbilt University Medical Center serving as the Coordinating Center); U01HG008676 (Partners Healthcare/Broad Institute); U01HG008664 (Baylor College of Medicine); and U54MD007593 (Meharry Medical College).
 

Abstract

Objectives The study aimed to understand potential barriers to the adoption of health information technology projects that are released as free and open source software (FOSS).

Methods We conducted a survey of research consortia participants engaged in genomic medicine implementation to assess perceived institutional barriers to the adoption of three systems: ClinGen electronic health record (EHR) Toolkit, DocUBuild, and MyResults.org. The survey included eight barriers from the Consolidated Framework for Implementation Research (CFIR), with additional barriers identified from a qualitative analysis of open-ended responses.

Results We analyzed responses from 24 research consortia participants from 18 institutions. In total, 14 categories of perceived barriers were evaluated, which were consistent with barriers to FOSS adoption observed elsewhere. The most frequent perceived barriers included lack of adaptability of the system, lack of institutional priority to implement, lack of trialability, lack of a clear advantage over alternative systems, and complexity.

Conclusion In addition to understanding potential barriers, we recommend some strategies to address them (where possible), including considerations for genomic medicine. Overall, FOSS developers need to ensure systems are easy to trial and implement and need to clearly articulate benefits of their systems, especially when alternatives exist. Institutional champions will remain a critical component to prioritizing genomic medicine projects.



Background and Significance

Within the space of health and biomedical informatics, many systems are deployed as publicly accessible resources, and a growing number are made available as free and open source software (FOSS). However, even with the promise of a “free” solution, not all of these systems see significant adoption. That is to say, simply developing and releasing FOSS is not itself sufficient to ensure its use. Given the funding and effort often put into developing these systems, it is important to identify, address, and reduce barriers to their adoption so that their potential may be realized.

One systematic review of barriers to FOSS adoption identified 19 factors across four dimensions: technological, organizational, environmental, and individual.[1] The combination of technical and nontechnical considerations is also seen as important in health and biomedical informatics implementations,[2] and given the multiple infrastructure layers and settings in which FOSS can be found,[3] different barriers may be observed. For example, the adoption of a Bioconductor package[4] by a research laboratory may face different barriers than the adoption of a clinical decision support (CDS) system within the clinical enterprise, due to the different types of infrastructure needs and governance processes that are required. Although medical specialty may not always affect potential barriers, it may still be important to consider. For example, one study determined that barriers to the implementation of pharmacogenomic CDS did not differ greatly from those seen in other health information technology (HIT) implementations.[5]

In this study, we surveyed individuals from genomic medicine consortia that were potential or current implementers of infobuttons as one type of FOSS for CDS. Infobuttons[6] are context-sensitive links embedded within the electronic health record (EHR), which act as a form of passive CDS to deliver more targeted and relevant information resources to both clinicians and patients. The goal for our survey was to better understand potential barriers to the adoption of FOSS for infobuttons to support genetic and genomic medicine.
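To make the mechanism concrete, the following is a minimal sketch (in R, for consistency with the analysis code described later) of how an EHR might assemble a context-aware knowledge request. The parameter names follow the HL7 Context-Aware Knowledge Retrieval (Infobutton) URL-based conventions used by systems such as OpenInfobutton, but the endpoint URL, task code, and search code shown here are illustrative placeholders rather than values from any of the surveyed systems.

    # Minimal sketch of assembling an HL7 Infobutton-style knowledge request.
    # The base URL and code values are illustrative placeholders; parameter names
    # follow the HL7 Infobutton URL-based conventions (e.g., mainSearchCriteria).
    base_url <- "https://knowledge.example.org/infobutton/search"   # hypothetical endpoint

    params <- c(
      "taskContext.c.c"                          = "LABRREV",       # task: laboratory results review (illustrative)
      "mainSearchCriteria.v.c"                   = "1234-5",        # placeholder code for the result/concept of interest
      "mainSearchCriteria.v.cs"                  = "2.16.840.1.113883.6.1",  # LOINC code system OID
      "mainSearchCriteria.v.dn"                  = "Example genetic test result",
      "patientPerson.administrativeGenderCode.c" = "F",
      "informationRecipient"                     = "PROV",          # provider-facing content
      "knowledgeResponseType"                    = "text/html"
    )

    # URL-encode each value and build the query string.
    encoded <- vapply(params, utils::URLencode, character(1), reserved = TRUE)
    query   <- paste(names(params), encoded, sep = "=", collapse = "&")
    infobutton_url <- paste0(base_url, "?", query)
    cat(infobutton_url, "\n")

An infobutton manager receiving such a request matches the supplied context against its configured resource profiles and returns links to the knowledge resources judged most relevant for that patient, task, and user.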



Methods

A 20-question web-based survey was developed by the electronic Medical Records and Genomics (eMERGE) Consortium's Infobutton workgroup to assess perceptions about three infobutton-related technologies for genetic and genomic medicine: (1) the ClinGen EHR toolkit,[7] which is built upon the OpenInfobutton system;[8] (2) DocUBuild, a content authoring and management system; and (3) MyResults.org, a collection of information resources targeted to patients for interpreting pharmacogenomic results. These systems were built by different institutions to address different needs and, at the time of the survey, were at differing levels of maturity, so a brief summary was included to introduce each. Respondents were not asked to implement the systems as part of their responses, but rather to consider barriers to adoption given what was described for each.

Questions covered respondent background (four questions), familiarity with infobuttons (five questions), opinions on the utility of the three systems (seven questions), and perceived barriers to adoption of the systems at their institution (three questions). The list of barriers presented was based on the Consolidated Framework for Implementation Research (CFIR), a framework developed to improve implementation effectiveness.[9] Validation and reliability testing was deemed unnecessary given the descriptive and qualitative nature of the survey. The survey is included as [Supplementary Material A] (available in the online version).

A convenience sample of consortia was selected by the authors, and the list was expanded through snowball sampling based on recommendations from survey recipients. Invitations to participate were limited to participants in consortium-driven initiatives that focused on implementing genomic medicine. If applicable, within a consortium, we identified specific groups that focused on informatics, EHR integration, and/or the return of clinical genomics results; we believed that these groups would have more knowledge of HIT as well as implementation considerations. The full list of consortia and corresponding groups that received the survey invitation, as well as the category of institutions that participate in each group, is provided in [Table 1]. Survey invitations were distributed starting in February 2017, and the survey was rolled out to consortia via group mailing lists as each consortium was identified. No reminders were sent after the initial announcement. The survey was closed to all responses in September 2017.

Table 1

Genomic medicine consortia that received invitations to participate in the survey, any specific working groups in each consortium that were contacted (if applicable), and the class of members that participated in the identified groups

Consortium | Group(s) contacted | Member composition
Clinical Genomics Resource (https://clinicalgenome.org/) | EHR Working Group | Academic Medical Centers; Health Systems; Industry
Clinical Pharmacogenetics Implementation Consortium (https://cpicpgx.org/) | Informatics Working Group | Academic Medical Centers; Health Systems
Clinical Sequencing Evidence-Generating Research (https://cser-consortium.org/) | EMR Working Group | Academic Medical Centers; Health Systems
Displaying and Integrating Genetic Information through the EHR (http://www.nationalacademies.org/hmd/Activities/Research/GenomicBasedResearch/Innovation-Collaboratives/DIGITizE.aspx) | N/A | Academic Medical Centers; Health IT Vendors; Health Systems; Industry
Electronic Medical Records and Genomics (https://emerge.mc.vanderbilt.edu/) | Return of Results Workgroup; EHR Integration Workgroup; Implementation Workgroup | Academic Medical Centers; Health Systems
Global Alliance for Genomics and Health (https://www.ga4gh.org/) | eHealth Task Team | Academic Medical Centers; Health IT Vendors; Health Systems; Industry
Implementing Genomics In Practice (https://gmkb.org/ignite/) | Clinical Informatics Interest Group | Academic Medical Centers; Health Systems
Inter-Society Coordinating Committee for Practitioner Education in Genomics (https://www.genome.gov/iscc/) | N/A | Academic Medical Centers; Health Systems; Industry; Professional Societies

Abbreviations: EHR, electronic health record; N/A, not applicable.


We removed incomplete responses from the final data analyses. To account for abandoned and incomplete surveys, we only included responses recorded with a progress of 100% within the survey system (Qualtrics, Provo, Utah, United States). We further excluded responses that showed as completed but did not contain answers to any questions; we assumed these were surveys where the respondent clicked through all the questions but did not attempt to reply.

Open-ended responses regarding perceived barriers were extracted from the survey data and placed in a separate Excel file. Two of the authors (L.V.R. and J.B.S.) independently reviewed the responses and used open coding to classify described barriers. Coders determined if the described barriers fit into one or more of the CFIR barriers or, if not, proposed a new barrier. The two coders collaboratively resolved discrepancies and reached consensus on barrier name, definition, and application across the open-ended responses.

Given the time between the initial survey and the reporting of those data and given the inability to recontact the original respondents, the authors collectively reassessed the current state of the identified barriers within their institutions in February 2021. The authors critically reviewed the results of the survey and determined whether the results were largely unchanged, or whether the barriers had decreased or increased over time. This review was conducted via e-mail with no blinding of responses. As we were unable to directly compare responses from 2017 to current state, we instead summarized the overall identified trends.

Quantitative analyses consisted of descriptive statistics. Both data preparation and analysis were performed using R 3.6.3,[10] and results were integrated into the manuscript using StatTag for macOS v3.0.6.[11]
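To show how inclusion criteria of this kind translate into analysis code, below is a minimal R sketch of the filtering rules and a simple descriptive summary. The file name and column names (Progress, the Q-prefixed question columns, and Q_infobuttons_available) are assumptions about a typical Qualtrics export, not the authors' actual variables.

    # Minimal sketch of the inclusion criteria and a descriptive summary in base R.
    # File name and column names are assumptions about a typical Qualtrics export.
    responses <- read.csv("survey_export.csv", stringsAsFactors = FALSE)

    # Keep only responses the survey system recorded as 100% complete.
    complete <- responses[responses$Progress == 100, ]

    # Drop "click-through" responses: marked complete but with no questions answered.
    question_cols <- grep("^Q", names(complete), value = TRUE)
    answered_any  <- apply(complete[, question_cols, drop = FALSE], 1,
                           function(x) any(!is.na(x) & x != ""))
    analytic_set  <- complete[answered_any, ]

    # Descriptive statistics, e.g., counts and percentages for infobutton availability.
    availability <- table(analytic_set$Q_infobuttons_available)   # hypothetical column
    cbind(n = availability, pct = round(100 * prop.table(availability), 1))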



Results

Respondent Characteristics

During the survey period, we received 81 responses of which 24 (29.6%) were considered complete per our criteria and were included for analysis. Respondents represented 18 distinct institutions, of which 16 are academic medical centers or health systems affiliated with academic institutions and 2 are HIT vendors ([Table 2]). A total of four institutions (with one respondent each) were affiliated with the development of one or more of the surveyed systems. Within their respective consortia, 13 respondents (54.2%) self-declared participation in at least one informatics/HIT-related workgroup. Additional characteristics of the respondents are available in the online version ([Supplementary Material B]).

Table 2

List of institution names, institution category, and the number of respondents from that institution

Institution name | Institution category | Number of respondents
Children's Hospital of Philadelphia[a] | Academic/Health | 1
Children's Mercy | Academic/Health | 1
Columbia University | Academic/Health | 1
Concert Genetics | Vendor | 1
Duke University | Academic/Health | 2
Geisinger[a] | Academic/Health | 1
Intermountain Healthcare[a] | Academic/Health | 1
Kaiser Permanente Washington | Academic/Health | 2
MEDITECH | Vendor | 1
Mayo Clinic | Academic/Health | 3
Mount Sinai | Academic/Health | 1
Northwestern University[a] | Academic/Health | 1
Partners HealthCare | Academic/Health | 2
St. Jude Children's Research Hospital | Academic/Health | 1
University of Iowa | Academic/Health | 1
University of Pittsburgh | Academic/Health | 1
Vanderbilt University Medical Center | Academic/Health | 2
Weill Cornell Medicine | Academic/Health | 1

a Institutions were responsible for the development of one or more of the surveyed systems.

Note: Academic/Health: academic medical centers, as well as health systems affiliated with academic institutions. Vendor: health information technology solution developer.


Of the 24 respondents, 16 (66.7%) described being knowledgeable (marking "agree" or "strongly agree" on a 5-point Likert scale) about infobuttons in general, but were overall less knowledgeable about the surveyed tools (OpenInfobutton: 11, 45.8%; ClinGen EHR Toolkit: 5, 20.8%; DocUBuild: 7, 29.2%; MyResults.org: 9, 37.5%). Respondents had even less experience using the tools, with the majority not using them at all as of the time of the survey. The highest self-reported use of the tools came from respondents at the institution where the tool was developed ([Supplementary Material B] [available in the online version]). [Table 3] shows that infobuttons are in use at eight respondents' institutions, and three of these (37.5%) use them for genomic medicine.

Table 3

Summary of responses regarding the current use of infobuttons at the respondent institution, the use of infobuttons for genomic medicine, and any plans to expand the general availability of infobuttons (n = 24)

Response | Infobuttons available at institution | Infobuttons for genomic medicine content | Plans to expand infobutton availability
Yes | 8 (33.33%) | 3 (12.5%) | 4 (16.67%)
No | 8 (33.33%) | 16 (66.67%) | 7 (29.17%)
Not sure | 8 (33.33%) | 5 (20.83%) | 13 (54.17%)



Perceived Barriers to Adoption

Of the eight CFIR barriers to adoption presented within the survey, only one barrier (“poor design quality and packaging”) was not rated as a potential issue for any of the three tools by the respondents. [Table 4] lists the eight CFIR barriers rated in order of prevalence across all three resources.

Table 4

Respondents' perceived barriers to adoption for the three surveyed resources (n = 24)

CFIR perceived barrier | ClinGen EHR toolkit | DocUBuild | MyResults.org | Total | Description
Lack of adaptability | 8 (33.3%) | 5 (20.8%) | 6 (25%) | 19 | The degree to which the tool can be adapted, tailored, refined, or reinvented to meet local needs
Lack of trialability | 5 (20.8%) | 2 (8.3%) | 0 (0%) | 7 | Ability to test the tool on a small scale initially
Not clearly better than alternatives | 1 (4.2%) | 3 (12.5%) | 3 (12.5%) | 7 | The tool does not appear to offer any advantage when compared with an alternative solution
Too complex | 4 (16.7%) | 1 (4.2%) | 2 (8.3%) | 7 | Perceived intricacy or difficulty
Lack of evidence strength and quality | 2 (8.3%) | 2 (8.3%) | 1 (4.2%) | 5 | Quality and validity of evidence supporting the belief that the tool will have desired outcomes
Lack of a legitimate source | 0 (0%) | 0 (0%) | 2 (8.3%) | 2 | The legitimacy of the source of the tool
Too costly | 1 (4.2%) | 0 (0%) | 0 (0%) | 1 | Overall cost of ownership, including implementation costs for personnel, maintenance, etc.
Poor design quality and packaging | 0 (0%) | 0 (0%) | 0 (0%) | 0 | How the tool is bundled, presented, and assembled

Abbreviations: CFIR, Consolidated Framework for Implementation Research; EHR, electronic health record.

Note: Total across the three surveyed resources may include the same respondent and so are not shown with a percentage.


In addition to the CFIR barriers, six more barriers were identified from the qualitative analysis of open-ended responses. Of these, three had only one supporting quote but were seen to be more broadly applicable and therefore warranted listing as distinct barriers. These additional barriers are listed in [Table 5] in order of prevalence across all three resources.

Table 5

Additional perceived barriers to adoption for the three surveyed resources (n = 24) as specified by respondents in open-ended responses

Respondent-identified perceived barrier | ClinGen EHR toolkit | DocUBuild | MyResults.org | Total | Description and example quotes
Not an institutional priority | 5 (20.8%) | 3 (12.5%) | 2 (8.3%) | 10 | Interest may exist, but doubt that the institution would prioritize this above other initiatives to complete implementation. "Business priority; IT department priority" / "I have not been provided the opportunity to invest time or resources into exploring or implementing this feature."
Lack of demand | 2 (8.3%) | 3 (12.5%) | 0 (0%) | 5 | Insufficient demand for the tool capabilities within the organization. "I think the main thing that will drive this into use is a mission critical use case - which I do think we emerge over time."
Lack of EHR integration | 2 (8.3%) | 1 (4.2%) | 1 (4.2%) | 4 | Desire to integrate the tool (via service interfaces and/or the user interface) into the EHR. "This would have to be integrated into EHR framework and be embedded in workflow vs. jsut [sic] being used as an external resource."
Lack of coded data | 1 (4.2%) | 0 (0%) | 0 (0%) | 1 | Insufficient structured data (e.g., genetic test results) available within the EHR to warrant implementation. "We find in the institutions we work with, most genetic tests are ordered under miscellaneous codes and not tracked within the EHR in a meaningful way."
Limited institutional resources | 1 (4.2%) | 0 (0%) | 0 (0%) | 1 | Interest may exist; however, health IT resources are sufficiently allocated to other projects. "Limited resources due to Epic roll out."
Provider workflow issues | 0 (0%) | 0 (0%) | 1 (4.2%) | 1 | Unable to integrate the tool appropriately into the health care provider workflow where it would be adopted. "Lack of time available to providers."

Abbreviation: EHR, electronic health record.

Note: Total across the three surveyed resources may include the same respondent and so are not shown with a percentage.


[Tables 4] and [5] list perceived barriers in order of the total frequency reported across the three surveyed systems; the frequency for each individual system is also shown. Since the purpose and capabilities of each system differed, barriers could and did differ by system. For example, "not clearly better than alternatives" was identified more frequently for DocUBuild and MyResults.org than for the ClinGen EHR Toolkit.

With respect to the current state of these perceived barriers, we note heterogeneous perspectives at our own institutions. Improvements have been made to the three systems since 2017, encompassing both technical advancements and enhanced informational content. Perceived barriers related to adaptability, trialability, and complexity remain largely unchanged, but emerging and improved vendor products have provided new alternatives (although alternatives are less desirable at institutions where a particular tool was developed). Additionally, our institutions vary with respect to local prioritization of genetic and genomic medicine projects, a prerequisite to prioritizing the implementation of any of these systems. Some institutions continue to face prioritization issues, while others have been able to garner broader institutional support as implementation of genomics was identified as a strategic priority.



Discussion

For three open source systems related to infobuttons for genomic medicine, we evaluated a total of 14 perceived barriers to adoption from potential and current users. Overall, no barriers were identified that were unique to genomic medicine, although some barriers may currently be amplified within genomic medicine. For example, "lack of coded data" is a particular challenge to genomic medicine because many genomic results are returned to and stored in the EHR as PDF documents. Advances in the past few years have brought increased support for the transmission and storage of coded genomic data,[12] [13] but this capability is currently available at relatively few institutions.
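To make the contrast with PDF-based reporting concrete, the sketch below represents a single coded genomic result as structured, FHIR-style data in R and serializes it with jsonlite. It is intentionally simplified and is not a complete FHIR Genomics profile; the codes and the gene symbol shown are illustrative placeholders.

    # Minimal sketch of a coded genomic result as FHIR-style structured data,
    # in contrast to a scanned PDF report. Not a complete FHIR Genomics resource;
    # codes and the gene symbol below are illustrative placeholders.
    library(jsonlite)

    observation <- list(
      resourceType = "Observation",
      status = "final",
      code = list(coding = list(list(
        system  = "http://loinc.org",
        code    = "69548-6",                 # illustrative: genetic variant assessment
        display = "Genetic variant assessment"
      ))),
      valueCodeableConcept = list(coding = list(list(
        system  = "http://loinc.org",
        code    = "LA9633-4",                # illustrative answer code: present
        display = "Present"
      ))),
      component = list(list(
        code = list(coding = list(list(
          system  = "http://loinc.org",
          code    = "48018-6",               # illustrative: gene studied
          display = "Gene studied"
        ))),
        valueCodeableConcept = list(text = "EXAMPLE-GENE")   # placeholder gene symbol
      ))
    )

    cat(toJSON(observation, auto_unbox = TRUE, pretty = TRUE))

Coded results of this kind can be matched by terminology rather than free text, which is what an infobutton needs in order to assemble a context-specific knowledge request.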

Given that the barriers are not unique to genomic medicine, we can leverage existing models that describe the facets of information technology adoption.[14] This allows us to draw upon established literature and recommendations for addressing the barriers; however, there are some specific considerations that can be applied within the health care domain, and more specifically to genetic and genomic medicine. Here we describe recommendations and considerations for the highest ranking barriers, collapsing for brevity those that share similar strategies:

  • Lack of adaptability: Respondents identified that the tools did not appear to be adapted or adaptable to meet their local institutional needs. Although not explicitly stated in the responses, we believe adaptability is such a large concern because of organizational differences in how care is delivered and how organizational culture is established around clinician and patient engagement. Beyond differences in how health care is provided across institutions, we note that a consistent theme across these research consortia is the heterogeneity in implementation of genetic and genomic medicine.[5] While tools such as the ClinGen Toolkit and DocUBuild do allow customization of the information resources retrieved, other aspects of adaptability may not have been expressly considered. Examples may include how results are formatted and displayed (e.g., branding).[15]

  • Not an institutional priority and lack of demand: Genetic and genomic medicine are already practiced within health care organizations, but models to make precision medicine more widespread, together with an increase in demand for informatics innovation, are increasing requests for implementation efforts across the surveyed research consortia. This finding is consistent with a previous study within the eMERGE network with respect to pharmacogenomic CDS implementation.[5] Given a large number of competing priorities, health care organizations cannot prioritize everything concurrently. For initiatives such as genetic and genomic medicine, careful planning within the institution to identify a leader/champion may address this,[16] especially if the project is seen as a "research project" (often receiving a lower priority). Beyond organizational leadership, similar strategies are needed to clearly describe the benefit of a system to practicing clinicians; both are needed to build a combination of top-down and bottom-up buy-in for a system.

  • Lack of trialability: Although all three tools had publicly accessible versions that could be trialed by potential users, trialability is much broader and can include customization for local needs and workflows, as well as actual evaluation within the clinical workflow. System developers seeking adopters may need to identify partners and support them to reduce the burden of implementing use cases to perform a trial. In this regard, system developers should plan sufficient resources to carry out such trials with their partners.

  • Not clearly better than alternatives: Potential adopters need a clear narrative on why a particular tool provides novel capabilities or advantages compared with other seemingly similar systems. Prior to embarking on the development of a tool, the team must clearly understand what is currently available, as well as what is in use. A challenge here may be that, even where a tool is truly novel, the benefit of adoption is not made clear to the potential adopter in the brief time they have to perform a cursory evaluation. This may have been the case in the descriptions presented to respondents within this survey. FOSS projects need to consider not only market research but also marketing material for their developed tool that clearly and succinctly compares and contrasts it with other competing systems.

  • Too complex: A system must be perceived as easy to use in order for it to be adopted.[14] An additional challenge for overall system evaluations, then, is not just the complexity of the user interface, but also the perceived complexity of getting a system up and running within an organization. For FOSS, this is driven largely by the technology used and the quality of the documentation provided. FOSS projects should allocate sufficient time to develop and verify their setup instructions[17] and be prepared to respond quickly to installation problems. However, some factors are outside the developer's control, such as the IT culture at the adopting organization and its internal governance and approval processes. For example, an organization that routinely deploys software to servers running Microsoft Windows may see a Linux-based system as "complex." Even solutions such as containers (e.g., Docker) and setup automation scripts may not reduce the perception of complexity if there is a technology learning curve.

  • Lack of evidence strength and quality and Lack of a legitimate source: Especially within a clinical setting, FOSS must clearly demonstrate that it is of sufficient quality and robustness that it is ready for adoption.[18] [19] With general adopters of software, this is typically seen when the software is observed to be in use across a large number of institutions (“popularity” is equated with “quality”). The challenge presented to innovative tools therefore is guiding the diffusion of the idea across organizational leaders.[16] In addition to the technical quality of a tool, the legitimacy of any evidence within the tool must also be clearly presented. Level of evidence for genetic and genomic medicine is constantly evolving, and efforts such as ClinGen[20] and ClinVar[21] not only provide legitimate sources that knowledge may be linked to, but also examples for how to sufficiently annotate and describe the level of evidence.

  • Lack of EHR integration: Integration of any external system (FOSS or otherwise) with an institution's EHR system poses technical challenges, as well as effort and risk.[22] This can be challenging for FOSS developers to overcome, as seamless integration has typically required knowledge of EHR system internals. Acquiring knowledge of multiple EHR systems (which often involves proprietary information) is typically not feasible for most FOSS developers without partnering with multiple institutions. There may also be challenges with the integration points an EHR system allows; for example, an infobutton may not be available within some EHR systems where genetic and genomic results are displayed. Technologies such as Substitutable Medical Applications, Reusable Technologies on Fast Healthcare Interoperability Resources (SMART on FHIR)[23] and CDS Hooks[24] have worked to address this by providing standards-based integration points through which custom applications can be integrated (see the sketch following this list). Integration via these technologies, as opposed to purely standalone systems, may aid FOSS adoption, but this approach requires that the standards are already implemented within the clinical systems.
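As one illustration of a standards-based integration point, the sketch below assembles a CDS Hooks-style response in R containing a single informational card that links out to a gene-specific knowledge page. The summary text, source label, and URL are hypothetical; a production service would also be registered with the EHR's discovery endpoint and would read the hook's FHIR context before deciding whether to return any cards.

    # Minimal sketch of a CDS Hooks-style card response assembled in R.
    # Summary text, source label, and link URL are hypothetical placeholders.
    library(jsonlite)

    response <- list(
      cards = list(list(
        summary   = "Genomic result available: gene-specific guidance",   # hypothetical text
        indicator = "info",
        source    = list(label = "Genomic knowledge resource (example)"),
        links     = list(list(
          label = "Open gene-specific information",
          url   = "https://knowledge.example.org/genes/EXAMPLE-GENE",     # hypothetical target
          type  = "absolute"
        ))
      ))
    )

    cat(toJSON(response, auto_unbox = TRUE, pretty = TRUE))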

Limitations

We acknowledge limitations within our study, including the low number of respondents, a reliance on one to two representatives of each organization, and the inclusion of respondents not necessarily responsible for HIT implementation projects, who may have answered from their experience in research projects as opposed to enterprise-wide implementations. In addition, we did not ask respondents to attest whether they had any potential conflicts of interest (COI), which may bias their responses; therefore, we cannot control for the possibility that a respondent had a significant COI based on development or implementation of one of the systems. While the overall number of respondents is small in proportion to the number of requests made, it reflects perspectives across 18 distinct institutions, many of which are large academic medical centers and health systems in the United States and leaders in the implementation of genetic and genomic medicine programs. Therefore, the results are likely to generalize to other similar institutions, although they may not be as applicable to more traditional health care delivery systems. Finally, we acknowledge that in a rapidly changing field, the elapsed time between the survey and dissemination of the findings may result in overstating the magnitude of barriers in the current state, potentially reducing relevance. To mitigate that potential effect, we reviewed those conclusions and provided an assessment of the current state of these barriers from the standpoint of the authors (many of whose institutions are represented within the original survey). The authors determined that many of the originally reported barriers remain unchanged today.



Conclusion

Of the eight CFIR barriers surveyed and the six additional identified barriers, the most common relate to perceptions of how easy FOSS systems are to trial and adapt for local implementation, as well as the perceived benefits of the system over existing alternatives. While our survey focused on infobutton-related FOSS systems for genetic and genomic medicine, these findings are consistent with barriers observed generally for FOSS. Developers of FOSS can address these barriers in part through succinct and clear documentation and collaborative partnerships during initial implementations.



Clinical Relevance Statement

Developers of FOSS for CDS need to consider and address potential barriers to adoption of their systems. Multiple barriers exist, some of which can be addressed by clear documentation, and partnerships with implementers. Systems for genomic medicine can introduce additional barriers if not prioritized by the health care institution.



Multiple Choice Questions

  1. When developing FOSS for CDS, which barrier can appear if a system cannot easily be tested at a small scale within an institution?

    • Lack of adaptability

    • Lack of trialability

    • Too complex

    • Lack of evidence strength and quality

    Correct Answer: The correct answer is option b. “Lack of trialability” is a potential or perceived barrier if it is not clear how a system could be tested by the institution before it is more widely rolled out to the entire enterprise. Trialability can include initial testing to learn system capabilities, as well as preliminary rollout in a controlled environment for feasibility testing.

  2. What type of CDS does an infobutton provide?

    • Complex

    • Costly

    • Active

    • Passive

    Correct Answer: The correct answer is option d. Infobuttons are a form of passive CDS because they wait for the clinician (or patient) to click on the infobutton link before providing them with targeted information.



Conflict of Interest

L.V.R. reports grants from NIH/NHGRI during the conduct of the study and also has a Provisional Patent related to the design of Ancillary Genomic Systems that is no longer being pursued. J.B.S. reports a grant from NIH/NHGRI 1U01 HG008673–01 during the conduct of the study and also has a provisional patent related to the design of Ancillary Genomic Systems that is no longer being pursued. R.R.F. reports grants from NIH/NHGRI during the conduct of the study.

Protection of Human and Animal Subjects

The work described was deemed nonhuman subjects research by the Johns Hopkins University Institutional Review Board.


  • References

  • 1 Petrov D, Obwegeser N. Adoption barriers of open-source software. Syst Rev 2018
  • 2 Cresswell K, Sheikh A. Organizational issues in the implementation and adoption of health information technology innovations: an interpretative review. Int J Med Inform 2013; 82 (05) e73-e86
  • 3 Paton C, Karopka T. The role of free/Libre and open source software in learning health systems. Yearb Med Inform 2017; 26 (01) 53-58
  • 4 Bioconductor. Bioconductor - Home. Published 2020. Accessed November 11, 2020 at: https://www.bioconductor.org/
  • 5 Herr TM, Bielinski SJ, Bottinger E. et al. Practical considerations in genomic decision support: the eMERGE experience. J Pathol Inform 2015; 6: 50
  • 6 Cimino JJ, Elhanan G, Zeng Q. Supporting infobuttons with terminological knowledge. Proc AMIA Annu Fall Symp 1997; 528-532
  • 7 Heale BS, Overby CL, Del Fiol G. et al. Integrating genomic resources with electronic health records using the HL7 Infobutton standard. Appl Clin Inform 2016; 7 (03) 817-831
  • 8 Del Fiol G, Curtis C, Cimino JJ. et al. Disseminating context-specific access to online knowledge resources within electronic health record systems. Stud Health Technol Inform 2013; 192: 672-676
  • 9 Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci 2009; 4: 50
  • 10 R: A Language and Environment for Statistical Computing [computer program]. Vienna, Austria: R Foundation for Statistical Computing. Accessed 2018 at: https://www.gbif.org/tool/81287/r-a-language-and-environment-for-statistical-computing
  • 11 StatTag [computer program]. Chicago, Illinois, United States: Galter Health Sciences Library; 2016. Available at: https://sites.northwestern.edu/stattag/
  • 12 Walton NA, Johnson DK, Person TN, Reynolds JC, Williams MS. Pilot implementation of clinical genomic data into the native electronic health record: challenges of scalability. ACI Open. 2020; 04 (02) e162-e166
  • 13 Lau-Min KS, Asher SB, Chen J. et al. Real-world integration of genomic data into the electronic health record: the PennChart Genomics Initiative. Genet Med 2020
  • 14 Venkatesh V, Morris MG, Davis GB, Davis FD. User acceptance of information technology: toward a unified view. Manage Inf Syst Q 2003; 27 (03) 425-478
  • 15 Rasmussen LV, Overby CL, Connolly J. et al. Practical considerations for implementing genomic information resources. Experiences from eMERGE and CSER. Appl Clin Inform 2016; 7 (03) 870-882
  • 16 Greenhalgh T, Robert G, Bate P, Macfarlane F, Kyriakidou O. Diffusion and dissemination. In: Diffusion of Innovations in Health Service Organisations: A Systematic Literature Review. Oxford, UK: Blackwell Publishing Ltd; 2005
  • 17 Cresswell KM, Bates DW, Sheikh A. Ten key considerations for the successful implementation and adoption of large-scale health information technology. J Am Med Inform Assoc 2013; 20 (e1): e9-e13
  • 18 Chruściel-Nogalska M, Smektała T, Tutak M, Sporniak-Tutak K, Olszewski R. Open-source software in dentistry: a systematic review. Int J Technol Assess Health Care 2017; 33 (04) 487-493
  • 19 Silva LB, Jimenez RC, Blomberg N, Luis Oliveira J. General guidelines for biomedical software development. F1000 Res 2017; 6: 273
  • 20 Rehm HL, Berg JS, Brooks LD. et al; ClinGen. ClinGen--the clinical genome resource. N Engl J Med 2015; 372 (23) 2235-2242
  • 21 Landrum MJ, Lee JM, Benson M. et al. ClinVar: improving access to variant interpretations and supporting evidence. Nucleic Acids Res 2018; 46 (D1): D1062-D1067
  • 22 Payne T, Fellner J, Dugowson C, Liebovitz D, Fletcher G. Use of more than one electronic medical record system within a single health care organization. Appl Clin Inform 2012; 3 (04) 462-474
  • 23 Mandel JC, Kreda DA, Mandl KD, Kohane IS, Ramoni RB. SMART on FHIR: a standards-based, interoperable apps platform for electronic health records. J Am Med Inform Assoc 2016; 23 (05) 899-908
  • 24 Dolin RH, Boxwala A, Shalaby J. A pharmacogenomics clinical decision support service based on FHIR and CDS hooks. Methods Inf Med 2018; 57 (S 02): e115-e123

Address for correspondence

Luke V. Rasmussen, MS
Department of Preventive Medicine
Suite 1100, 750 North Lake Shore Drive, Chicago, IL 60611
United States

Publication History

Received: 23 December 2020

Accepted: 12 March 2021

Article published online:
12 May 2021

© 2021. Thieme. All rights reserved.

Georg Thieme Verlag KG
Rüdigerstraße 14, 70469 Stuttgart, Germany
