DOI: 10.1055/a-1942-6889
Effect of Notes' Access and Complexity on OpenNotes' Utility
- Abstract
- Background and Significance
- Methods
- Results
- Discussion
- Conclusion
- Clinical Reference Statement
- Multiple choice questions
- References
Abstract
Background Health care providers are now required to provide their patients access to their consultation and progress notes. Early research on this concept, known as “OpenNotes,” showed promising results in terms of provider acceptability and patient adoption, yet objective evaluations of patients' interactions with the notes are limited.
Objectives To assess the effect of the complexity level of notes and number of accesses (initial read vs. continuous access) on the user's performance, perceived usability, cognitive workload, and satisfaction with the notes.
Methods We used a 2 × 2 mixed-subjects experimental design with two independent variables: (1) note's complexity at two levels (simple vs. complex) and (2) number of accesses to notes at two levels (initial vs. continuous). Fifty-three participants were randomly assigned to receive a simple versus complex radiation oncology clinical note and were tested on their performance for understanding the note content after an initial read, and then with continuous access to the note. Performance was quantified by comparing each participant's answers to the ones developed by the research team and assigning a score of 0 to 100 based on participants' understanding of the notes. Usability, cognitive workload, and satisfaction scores of the notes were quantified using validated tools.
Results Performance for understanding was significantly better in simple versus complex notes with continuous access (p = 0.001). Continuous access to the notes was also positively associated with satisfaction scores (p = 0.03). The overall perceived usability, cognitive workload, and satisfaction scores were considered low for both simple and complex notes.
Conclusion Simplifying notes can improve understanding of notes for patients/families. However, perceived usability, cognitive workload, and satisfaction with even the simplified notes were still low. To make notes more useful for patients and their families, there is a need for dramatic improvements to the overall usability and content of the notes.
Background and Significance
Electronic health records (EHRs) have now been widely adopted in the United States.[1] One potential advantage of EHRs is that patients can readily access their own health data.[2] Indeed, starting in 2021, government regulations require health care providers to provide patients access to all health information in their electronic medical records, including patient consultation and progress notes[3] (often referred to as “OpenNotes”). This requirement followed many studies suggesting that access to notes leads to patients reporting greater control of their care.[4] In addition, studies showed that patients claimed that access to consultation and progress notes helped to educate them and increased adherence to medications.[3] [4] [5]
Overall, most studies report that patients wish to continue to have access to their notes, and only a few patients report feeling confused, worried, or offended by the notes.[3] [4] [6] [7] [8] [9] [10] [11] For example, in one survey analysis of 96 oncology clinicians and 3,418 patients with a cancer diagnosis, 70% of clinicians and 98% of patients indicated that OpenNotes is a “good idea,”[6] although 44% of the clinicians indicated that patients would be confused by their notes.[6] [12] [13] In another survey study of 88 patients being seen in radiation oncology, 96% of patients reported that accessing notes improved their understanding of their condition, 94% reported an improved understanding of side effects, and 96% felt more confident about their treatment. On the other hand, some patients reported being more worried (11%), getting more confused (6%), or regretting reading the notes (4%).[8]
Most of the prior studies report clinicians' and patients' subjective opinions about access to notes with no studies objectively measuring patient understanding of the notes. In addition, there is limited research on how the complexity of these notes and the time spent with the note would affect patients' performance, perceived usability, cognitive workload, and satisfaction with the notes. Thus, the aim of this work was to assess the effect of the complexity level of notes, and the impact of the degree of access to the note, on the user's performance for understanding the note content, perceived usability, cognitive workload, and satisfaction with the notes.
Methods
Participants
This study was completed by healthy volunteers, as surrogates for caregivers. We conducted a pilot analysis with 10 participants to calculate the sample size needed for this study. Our analysis suggested that ≈45 participants would be needed to detect statistical differences between the independent variables at a statistical power of 0.80 and a significance level of 0.05.[14] [15] Fifty-three participants completed this study (55% response rate); age range: 19 to 63, mean: 29.5, standard deviation: 9.4. [Table 1] provides a summary of participants' demographics. Recruitment was done by sending emails to a variety of listservs at two large academic institutions in the United States, and through Research for Me,[16] a platform that provides research participant recruitment services. Participants were each compensated with a $10 gift card upon completion of the study.
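The pilot-based power analysis is not detailed in the text; as a rough illustration of the approach, the per-group sample size needed for a between-subjects comparison can be estimated by simulation. The effect size and critical value below are assumptions for the sketch, not values from the study:

```python
import math
import random

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

def simulated_power(n_per_group, effect_size, n_sims=2000, t_crit=2.01):
    """Fraction of simulated experiments that detect a true group
    difference of `effect_size` standard deviations (alpha ~ 0.05)."""
    random.seed(0)  # reproducible sketch
    hits = sum(
        abs(welch_t([random.gauss(0.0, 1.0) for _ in range(n_per_group)],
                    [random.gauss(effect_size, 1.0) for _ in range(n_per_group)])) > t_crit
        for _ in range(n_sims))
    return hits / n_sims
```

Sweeping `n_per_group` upward until `simulated_power` exceeds 0.80 mirrors the textbook procedure; a total of ≈45 participants reaches that power only under a fairly large assumed effect size.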
Study Setting
We used a radiation oncology setting since the delivery of radiation therapy is often anxiety-provoking and the notes might thus be particularly reassuring (or worrisome) to patients. For the first independent variable (notes' complexity), an experienced radiation oncology provider wrote two versions of the same patient's note intended to represent “simple” versus “complex” versions (see [Supplementary Appendix A] [available in the online version] for the notes used in this study). The radiation oncology provider wrote a simple consultation note (the simple level), and then added more technical terminology and detail to make the note more complex (the complex level). To quantify the complexity of the simple versus the complex note, two human factors engineering researchers, without any previous medical training, conducted a content analysis (i.e., a qualitative research approach commonly used to categorize qualitative data).[17] [18] [19] The researchers categorized each sentence in the notes as (1) information known to the patient, (2) technical information, or (3) provider recommendations. Each researcher then coded each sentence as simple (if they could readily understand it) or complex (if they could not). The researchers coded the notes independently and resolved any disagreements by consensus.[18] [20] For each of the notes, we calculated a complexity score based on the ratio of simple versus complex sentences. Prior to conducting the study, the radiation oncology provider generated a few iterations of the notes to achieve a difference in complexity level that was satisfactory to both the researchers and the provider.
The difference was considered satisfactory when the complex:simple ratios for the information known to patients and provider recommendation categories (i.e., nontechnical information categories) in the complex note were more than twice as high as those in the simple note (see [Table 2] for details and summary results of the content analysis).
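A minimal sketch of this complexity scoring, assuming hypothetical per-sentence codes (the categories are the paper's; the data are illustrative, not the study's):

```python
from collections import Counter

def complexity_ratios(coded_sentences):
    """Given (category, code) pairs, where code is 'simple' or
    'complex', return the complex:simple ratio per category."""
    counts = Counter(coded_sentences)
    return {cat: counts[(cat, 'complex')] / max(counts[(cat, 'simple')], 1)
            for cat in {c for c, _ in coded_sentences}}

# Illustrative codes for a short note (hypothetical data)
note = [('known_to_patient', 'simple'), ('known_to_patient', 'complex'),
        ('technical', 'complex'), ('recommendation', 'simple')]
```

The criterion in the text then amounts to checking that, for the nontechnical categories, the ratio computed for the complex note is more than twice that of the simple note.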
For the second independent variable (number of accesses to notes), we asked the participants to answer questions about their understanding of the notes twice: once after an initial reading of the note, and again with continuous access to the note. This helped us understand whether performance was limited merely by reading the note once (and not being able to recall information) versus not being able to understand the information even with unlimited access to the note.
Experimental Design
We used a 2 × 2 mixed-subjects experimental design with the two independent variables: (1) note's complexity at two levels (simple vs. complex), and (2) the number of accesses to notes at two levels (initial vs. continuous). While controlling for prior experience with reading clinical notes, participants were randomly assigned to one of two conditions: simple versus complex notes. Participants were instructed to read the assigned note as if they were a patient's family member (caregiver). They first read the note once (initial) and answered “performance” questions assessing their understanding of the note without being able to go back and read it again. They were then provided continuous access to the note and asked the same series of performance questions again. After completing each performance evaluation (initial vs. continuous), participants completed validated questionnaires measuring perceived usability, cognitive workload, and satisfaction.
Performance
Performance was quantified by comparing each participant's answers to the performance questionnaire (see [Supplementary Appendix B], available in the online version) and scored as the percent of “correct answers” (as developed by the research team; thus in a range of 0 to 100%). Therefore, correctly answering the performance questions would reflect higher understanding of the notes. In addition, we conducted a subanalysis on four of the performance questions that were considered clinically critical (physical exam key findings, recommended treatment options, recommended medications, and specialist referral).
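The scoring reduces to percent agreement with the reference key; a sketch using a hypothetical key (the question names and answers below are illustrative, not from the actual questionnaire):

```python
def performance_score(answers, key):
    """Percent of questions (0-100) matching the research team's
    reference answers."""
    correct = sum(answers.get(q) == ref for q, ref in key.items())
    return 100.0 * correct / len(key)

# Hypothetical clinically critical items and one participant's answers
key = {'exam_findings': 'b', 'treatment': 'a', 'medications': 'c', 'referral': 'a'}
answers = {'exam_findings': 'b', 'treatment': 'a', 'medications': 'd', 'referral': 'a'}
```

Running the same function over only the four clinically critical items gives the subanalysis score described above.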
Perceived Usability
Perceived usability was quantified using the Systems Usability Scale (SUS).[21] [22] [23] SUS is a valid, reliable, and widely used tool to measure the usability of patient-facing interfaces.[24] [25] SUS is a 10-item questionnaire, with a five-point rating scale for each item ranging from strongly disagree to strongly agree. The outcome is a 0 [low] to 100 [high] score calculated from the user's ratings of the 10 items, with higher scores indicating better perceived usability.[26] [27]
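SUS scoring is standardized: odd-numbered items are scored as (rating − 1), even-numbered items as (5 − rating), and the sum is scaled by 2.5 to the 0 to 100 range:

```python
def sus_score(ratings):
    """Standard SUS score (0-100) from ten 1-5 responses, item 1 first.
    Odd-numbered items contribute (r - 1), even-numbered (5 - r)."""
    assert len(ratings) == 10
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(ratings))
    return total * 2.5

# All-neutral responses (3 on every item) yield exactly 50
```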
Cognitive Workload
Cognitive workload was quantified using the National Aeronautical and Space Administration's Task Load Index (NASA-TLX),[28] a valid and reliable subjective measure. The NASA-TLX questionnaire evaluates cognitive workload using six dimensions (mental demand, physical demand, temporal demand, frustration, effort, and performance). It provides a global cognitive workload score from 0 [low] to 100 [high].
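The study reports a global 0 to 100 score; in the widely used “raw TLX” variant this is simply the mean of the six dimension ratings (the original instrument optionally weights dimensions via pairwise comparisons; whether weighting was used here is not stated, so this is a sketch of the unweighted variant):

```python
DIMENSIONS = ('mental', 'physical', 'temporal', 'performance',
              'effort', 'frustration')

def raw_tlx(ratings):
    """Unweighted (raw) NASA-TLX global score: the mean of the six
    0-100 dimension ratings."""
    assert set(ratings) == set(DIMENSIONS)
    return sum(ratings.values()) / len(DIMENSIONS)
```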
Satisfaction
Satisfaction was quantified using a slightly modified version of a previously developed survey.[29] Participants rated their satisfaction with the information in the notes, the time required to read the notes, language used in the notes, and the overall design of the notes using a 5-point Likert scale (see [Supplementary Appendix C] [available in the online version] for the satisfaction survey used in this study). Total satisfaction scores were calculated by averaging scores of items for satisfaction with the information in the notes, the time required to read the notes, language used in the notes, and the overall design.
Results
Data Analysis
IBM SPSS Statistics 28.0.1.0 was used to analyze the data. We conducted multiple mixed-subject ANOVAs (analyses of variance) to determine the effect of the independent variables (simple vs. complex note [between subjects] and initial vs. continuous access to the note [within-subject]) on performance, usability, cognitive workload, and satisfaction. The z-scores from the skewness and kurtosis were used to check for normality of data. Mauchly's test of sphericity was used to test the assumption of sphericity. Least significant difference adjustments were applied to test simple main effects at a statistical significance of p < 0.05. All simple pairwise comparisons were evaluated at an alpha level of 0.05. We included the demographics variables (e.g., age, education, and previous experience with reading clinical notes) in the analysis as covariates to account for their effect on the dependent variables. However, none of the covariates showed a statistically significant effect. A summary of the findings is provided in [Table 3].
Abbreviations: NASA-TLX, National Aeronautical and Space Administration's Task Load Index; SD, standard deviation; SUS, Systems Usability Scale.
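The skewness/kurtosis z-score normality screen can be sketched as follows, using the conventional standard-error formulas (an illustration of the method, not the authors' SPSS output):

```python
import math

def normality_z(data):
    """z-scores for sample skewness and excess kurtosis; |z| within
    about 1.96 is consistent with normality at alpha = 0.05."""
    n = len(data)
    mean = sum(data) / n
    m2 = sum((x - mean) ** 2 for x in data) / n
    m3 = sum((x - mean) ** 3 for x in data) / n
    m4 = sum((x - mean) ** 4 for x in data) / n
    skew = m3 / m2 ** 1.5
    excess_kurt = m4 / m2 ** 2 - 3
    # Conventional standard errors for skewness and kurtosis
    se_skew = math.sqrt(6 * n * (n - 1) / ((n - 2) * (n + 1) * (n + 3)))
    se_kurt = 2 * se_skew * math.sqrt((n * n - 1) / ((n - 3) * (n + 5)))
    return skew / se_skew, excess_kurt / se_kurt
```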
Performance
Participants randomized to the simple notes with continuous access to the notes had better performance when compared to all the other experimental conditions (F (1, 50) = 11.78, p = 0.001, η2 = 0.19; [Fig. 1]). Descriptive statistics are provided in [Table 3]. The same result was seen when the analysis was limited to the four clinically critical questions (F (1, 50) = 6.16, p = 0.017, η2 = 0.11; [Fig. 2]).
Perceived Usability
There were no significant results associated with perceived usability (SUS scores; p > 0.05; [Table 3]).
Cognitive Workload
There were no significant results associated with cognitive workload (p > 0.05; [Table 3]), including analysis of individual dimensions of the NASA-TLX. Descriptive statistics of the combined data are provided in [Fig. 3] and [Table 4]. The results are combined since there were no significant differences.
Abbreviations: NASA-TLX, National Aeronautical and Space Administration's Task Load Index; SD, standard deviation.
Satisfaction
Participants indicated higher satisfaction when they had continuous access to the notes (M = 3.2) than when they did not (M = 2.9), with a significant mean difference of 0.24 (95% confidence interval: 0.06–0.36), p = 0.008 (F (1, 50) = 0.11, p = 0.74, η2 = .002; [Table 5]). The analysis of individual satisfaction items showed no statistically significant differences. Descriptive statistics of the combined data are provided in [Fig. 4] and [Table 5]. The results are combined since there were no significant differences.
| Satisfaction items | Information | Time | Language | Design | Overall satisfaction |
|---|---|---|---|---|---|
| Mean | 3.06 | 3.60 | 2.89 | 2.78 | 3.09 |
| SD | 1.38 | 1.35 | 1.41 | 1.37 | 1.19 |
Abbreviation: SD, standard deviation.
Discussion
Overall, our results suggest that performance was better with the simple note and with continuous access. Continuous access to the notes was also positively associated with better satisfaction. The overall perceived usability, cognitive workload, and satisfaction scores were low. None of the covariates, including participants' previous experience with reading clinical notes, showed a statistically significant effect. This could be because our means of defining “experience” are imperfect, and, even if participants had experience reading clinical notes in other fields, that experience may not have helped them understand a radiation oncology note specifically. In the subsections below, we discuss each dependent variable.
Performance
Participants' performance (i.e., understanding) was best with continuous access to simple notes. This is most likely due to the participants being able to go back to the notes and find answers to the performance questions. Results also suggest that participants were not able to recall even the clinically most critical information, which might be somewhat worrisome.[30] Even with continuous access to simple notes, performance was relatively low (overall ≈75% of questions were answered correctly), and no participants answered all questions correctly. Overall, our findings suggest that concerted efforts are needed to further simplify and improve usability of the notes. This might also suggest that, at least for some patients, it might not be practical to expect complete understanding via the note alone, and might potentially limit the ultimate utility of OpenNotes.[31] [32] [33] [34]
Perceived Usability
There were no statistical differences in usability between simple and complex notes, nor between initial and continuous access. In all groups, the overall mean perceived usability score was low, i.e., ≈50–54 on the 100-point scale.[35] Thus, simplifying the notes' content in this study was insufficient to improve perceived usability. In the literature, the average usability score is 68,[36] and it is common practice in the human–computer interaction field to consider any score below this average unsatisfactory. Thus, the usability of notes needs further improvement.
Cognitive Workload
Participants perceived that reading the notes imposed a high cognitive workload, i.e., higher than what is considered “optimal” in the literature, especially for tasks with very low physical demands such as reading.[37] This could be due to the heavy use of medical jargon in the notes, which adds burden on the user, even though the simple note was written explicitly to reduce jargon. Thus, techniques to further reduce cognitive workload during interactions with notes are needed.
Satisfaction
Participants were slightly more satisfied when they had continuous access to the notes. However, satisfaction scores were generally low in both experimental conditions. Ideally, satisfaction scores need to be 4 and above (i.e., satisfied or extremely satisfied). While multiple studies report that patients were highly satisfied to receive access to their notes,[4] [8] [10] [31] [38] [39] [40] [41] these conclusions were generally based on a broad subjective question, rather than a formal tool designed to assess satisfaction. Our findings suggest that participants are not satisfied with the content of the notes (information, time to read, language used, and design). Thus, strategies to further improve satisfaction with the notes are needed.
Limitations of Findings
Several limitations constrain the generalizability of our findings. First, this was a remote study without a moderator present while the participants completed the study. Thus, we asked participants to complete the study in a setting with limited exposure to distractions and interruptions, though compliance cannot be confirmed. Second, recruitment was done through an online platform. Thus, our sample consisted of people who are familiar and comfortable with using online tools, and their perception of reading providers' notes could differ from that of those who do not use technology frequently. Third, participants were asked to assume they were the caregiver of a patient, but they had no background information about the patient. In the real world, caregivers would have known the patient to at least some degree and thus would be more familiar with their medical history. To help with this, we used relatively generic notes and standardized the notes among all participants. Fourth, the comparisons between initial versus continuous access to the notes were not randomized (as it is not possible to randomize this variable), and the subjects knew the performance questions by the time they had continuous access to the notes (since they had just completed the assessments after their initial reading). Despite this, performance for understanding was low even in the continuous access setting. Finally, we used only radiation oncology notes, making our findings perhaps most applicable to this particular clinical domain. Findings from this study may or may not generalize to other fields.
Conclusion
While participants randomized to the simple notes with continuous access performed better than participants randomized to the complex notes, their understanding, perceived usability, and satisfaction with the notes were still low, and their cognitive workload was high. While patients and their families have a strong interest in accessing their clinical notes, and the majority of patients expect meaningful benefits from reading the notes, our results suggest that to make the notes useful there is a need for dramatic improvements to the usability and content of the notes. Doing so might facilitate the use of clinical notes to also serve as a means to provide instructions and resources for patients (and families).
Clinical Reference Statement
This study suggests that the current way of writing clinical notes does not meet patients' needs. The current version of clinical notes is associated with low performance (understanding of the notes), low perceived usability and satisfaction, and high cognitive workload. This could potentially lead patients to misinterpret the information in the notes and make poor decisions regarding their health. To improve patient engagement and decision making by providing patients access to their clinical notes, the usability and content of the notes must be improved to better address patients' needs.
Multiple choice questions
1. When patients read clinical notes, which of the following has an impact on performance (understanding the content of the notes)?

   a. Notes' complexity (simple vs. complex)
   b. Access to notes (initial vs. continuous)
   c. Both notes' complexity and access to notes
   d. None

   Correct Answer: The correct answer is option c. Participants' performance (i.e., understanding) was best with continuous access and simple notes.

2. When patients read clinical notes, which of the following has an impact on their cognitive workload?

   a. Notes' complexity (simple vs. complex)
   b. Access to notes (initial vs. continuous)
   c. Both notes' complexity and access to notes
   d. None

   Correct Answer: The correct answer is option d. None of these variables had a significant effect on patients' cognitive workload. All participants reported high cognitive workload regardless of condition.
Conflict of Interest
None declared.
Protection of Human and Animal Subjects
Participation was voluntary and posed no undue risk. All human participants read and signed the informed consent form and were given all needed information when deciding whether to participate in the study. This was a remote, unmoderated study fully implemented in Qualtrics.[42] The study protocol was reviewed and approved by the University of North Carolina at Chapel Hill Institutional Review Board (IRB) under reference ID: 338124.
References
- 1 HealthIT.gov. Office-based physician electronic health record adoption. Accessed May 18, 2021 at: https://dashboard.healthit.gov/quickstats/pages/physician-ehr-adoption-trends.php
- 2 Roehrs A, da Costa CA, Righi RD, de Oliveira KSF. Personal health records: a systematic literature review. J Med Internet Res 2017; 19 (01) e13
- 3 Delbanco T, Walker J, Darer JD. et al. Open notes: doctors and patients signing on. Ann Intern Med 2010; 153 (02) 121-125
- 4 Kayastha N, Pollak KI, LeBlanc TW. Open notes: a qualitative study of oncology patients' experiences reading their cancer care notes. J Clin Oncol 2017; 35 (31, suppl): 33-33
- 5 Murugan A, Gooding H, Greenbaum J. et al. Lessons learned from OpenNotes learning mode and subsequent implementation across a pediatric health system. Appl Clin Inform 2022; 13 (01) 113-122
- 6 Salmi L, Dong ZJ, Yuh B, Walker J, DesRoches CM. Open notes in oncology: patient versus oncology clinician views. Cancer Cell 2020; 38 (06) 767-768
- 7 Shaverdian N, Wang X, Hegde JV. et al. The patient's perspective on breast radiotherapy: initial fears and expectations versus reality. Cancer 2018; 124 (08) 1673-1681
- 8 Shaverdian N, Chang EM, Chu F-I. et al. Impact of open access to physician notes on radiation oncology patients: results from an exploratory survey. Pract Radiat Oncol 2019; 9 (02) 102-107
- 9 Turer RW, DesRoches CM, Salmi L, Helmer T, Rosenbloom ST. Patient perceptions of receiving COVID-19 test results via an online patient portal: an open results survey. Appl Clin Inform 2021; 12 (04) 954-959
- 10 Sarabu C, Lee T, Hogan A, Pageler N. The value of OpenNotes for pediatric patients, their families and impact on the patient-physician relationship. Appl Clin Inform 2021; 12 (01) 76-81
- 11 Ponathil AP, Khasawneh A, Byrne K, Madathil KC. Factors affecting the choice of a dental care provider by older adults based on online consumer reviews. IISE Trans Healthc Syst Eng 2021; 11 (01) 51-69
- 12 Blease C, Salmi L, DesRoches CM. Open notes in cancer care: coming soon to patients. Lancet Oncol 2020; 21 (09) 1136-1138
- 13 NIH. Cancer patients say clinical notes access valuable - National Cancer Institute. Accessed May 18, 2021 at: https://www.cancer.gov/news-events/cancer-currents-blog/2020/open-clinical-notes-access-by-cancer-patients
- 14 Cohen J. The effect size. In: Statistical Power Analysis for the Behavioral Sciences. Mahwah, NJ: Lawrence Erlbaum Associates; 1988: 8-13
- 15 Khasawneh A, Chalil Madathil K, Dixon E, Wisniewski P, Zinzow H, Roth R. An investigation on the portrayal of blue whale challenge on youtube and twitter. Proc Hum Factors Ergon Soc Annu Meet 2019; 63 (01) 887-888
- 16 Research for Me - Home. Accessed May 10, 2022 at https://researchforme.unc.edu/index.php/en/
- 17 Harwood TG, Garry T. An overview of content analysis. Marketing Rev 2003; 3 (04) 479-498
- 18 Richards KAR, Hemphill MA. A practical guide to collaborative qualitative data analysis. J Teach Phys Educ 2018; 37 (02) 225-231
- 19 Khasawneh A, Chalil Madathil K, Dixon E, Wiśniewski P, Zinzow H, Roth R. Examining the self-harm and suicide contagion effects of the blue whale challenge on YouTube and Twitter: qualitative study. JMIR Ment Health 2020; 7 (06) e15973
- 20 Khasawneh A, Madathil KC, Zinzow H. et al. An investigation of the portrayal of social media challenges on youtube and twitter. Trans Soc Comput 2021; 4 (01) 1-23
- 21 Peres SC, Pham T, Phillips R. Validation of the system usability scale (SUS). Proc Hum Factors Ergon Soc Annu Meet 2013; 57 (01) 192-196
- 22 Wilson MK, Khasawneh A, Ponathil A. et al. A preliminary study investigating patients' perceptions of research consenting methods. Proc Hum Factors Ergon Soc Annu Meet 2019; 63 (01) 1931-1935
- 23 Khasawneh A, Rogers H, Bertrand J, Madathil KC, Gramopadhye A. Human adaptation to latency in teleoperated multi-robot human-agent search and rescue teams. Autom Construct 2019; 99: 265-277
- 24 Gomes KM, Ratwani RM. Evaluating improvements and shortcomings in clinician satisfaction with electronic health record usability. JAMA Netw Open 2019; 2 (12) e1916651
- 25 Cole AC, Adapa K, Khasawneh A, Richardson DR, Mazur L. Codesign approaches involving older adults in the development of electronic healthcare tools: a systematic review. BMJ Open 2022; 12 (07) e058390
- 26 Lewis JR, Sauro J. Item benchmarks for the system usability scale. Journal of Usability Studies 2018; 13 (03) 158-167
- 27 Sauro J, Lewis JR. Quantifying the User Experience: Practical Statistics for User Research. Amsterdam: Elsevier; 2004
- 28 Hart SG. Nasa-Task Load Index (NASA-TLX); 20 years later. Proc Hum Factors Ergon Soc Annu Meet 2006; 50 (09) 904-908
- 29 Hamad J, Fox A, Kammire MS, Hollis AN, Khairat S. Evaluating the experiences of new and existing teledermatology patients during the COVID-19 pandemic: cross-sectional survey study. JMIR Dermatol 2021; 4 (01) e25999
- 30 Cowan N. What are the differences between long-term, short-term, and working memory?. Prog Brain Res 2008; 169: 323-338
- 31 Mishra VK, Hoyt RE, Wolver SE, Yoshihashi A, Banas C. Qualitative and quantitative analysis of patients' perceptions of the patient portal experience with OpenNotes. Appl Clin Inform 2019; 10 (01) 10-18
- 32 Fossa AJ, Bell SK, DesRoches C. OpenNotes and shared decision making: a growing practice in clinical transparency and how it can support patient-centered care. J Am Med Inform Assoc 2018; 25 (09) 1153-1159
- 33 Leveille SG, Walker J, Ralston JD, Ross SE, Elmore JG, Delbanco T. Evaluating the impact of patients' online access to doctors' visit notes: designing and executing the OpenNotes project. BMC Med Inform Decis Mak 2012; 12: 32
- 34 Bialostozky M, Huang JS, Kuelbs CL. Are you in or are you out? provider note sharing in pediatrics. Appl Clin Inform 2020; 11 (01) 166-171
- 35 Adobe XD Ideas. The System Usability Scale & How it's Used in UX. Accessed February 7, 2022 at: https://xd.adobe.com/ideas/process/user-testing/sus-system-usability-scale-ux/
- 36 Usability.gov. System Usability Scale (SUS). Accessed February 8, 2022 at: https://www.usability.gov/how-to-and-tools/methods/system-usability-scale.html
- 37 Prabaswari AD, Basumerda C, Utomo BW. The mental workload analysis of staff in study program of private educational organization. IOP Conf Ser: Mater Sci Eng 2019; 528: 012018
- 38 Walker J, Leveille S, Bell S. et al. Opennotes after 7 years: patient experiences with ongoing access to their clinicians' outpatient visit notes. J Med Internet Res 2019; 21 (05) e13876
- 39 Panattoni L, Stone A, Chung S, Tai-Seale M. Patients report better satisfaction with part-time primary care physicians, despite less continuity of care and access. J Gen Intern Med 2015; 30 (03) 327-333
- 40 Alpert JM, Morris BB, Thomson MD, Matin K, Geyer CE, Brown RF. OpenNotes in oncology: oncologists' perceptions and a baseline of the content and style of their clinician notes. Transl Behav Med 2019; 9 (02) 347-356
- 41 Joseph A, Chalil Madathil K, Jafarifiroozabadi R. et al. Communication and teamwork during telemedicine-enabled stroke care in an ambulance. Hum Factors 2022; 64 (01) 21-41
- 42 QualtricsXM . The leading experience management software. Accessed November 2, 2021 at: https://www.qualtrics.com/
Publication History
Received: 09 June 2022
Accepted: 11 September 2022
Accepted Manuscript online: 14 September 2022
Article published online: 26 October 2022
© 2022. Thieme. All rights reserved.
Georg Thieme Verlag KG
Rüdigerstraße 14, 70469 Stuttgart, Germany
-
References
- 1 HealthIT.gov. Office-based physician electronic health record adoption. Accessed May 18, 2021 at: https://dashboard.healthit.gov/quickstats/pages/physician-ehr-adoption-trends.php
- 2 Roehrs A, da Costa CA, Righi RD, de Oliveira KSF. Personal health records: a systematic literature review. J Med Internet Res 2017; 19 (01) e13
- 3 Delbanco T, Walker J, Darer JD. et al. Open notes: doctors and patients signing on. Ann Intern Med 2010; 153 (02) 121-125
- 4 Kayastha N, Pollak KI, LeBlanc TW. Open notes: a qualitative study of oncology patients' experiences reading their cancer care notes. J Clin Oncol 2017; 35 (31, suppl): 33-33
- 5 Murugan A, Gooding H, Greenbaum J. et al. Lessons learned from OpenNotes learning mode and subsequent implementation across a pediatric health system. Appl Clin Inform 2022; 13 (01) 113-122
- 6 Salmi L, Dong ZJ, Yuh B, Walker J, DesRoches CM. Open notes in oncology: patient versus oncology clinician views. Cancer Cell 2020; 38 (06) 767-768
- 7 Shaverdian N, Wang X, Hegde JV. et al. The patient's perspective on breast radiotherapy: initial fears and expectations versus reality. Cancer 2018; 124 (08) 1673-1681
- 8 Shaverdian N, Chang EM, Chu F-I. et al. Impact of open access to physician notes on radiation oncology patients: results from an exploratory survey. Pract Radiat Oncol 2019; 9 (02) 102-107
- 9 Turer RW, DesRoches CM, Salmi L, Helmer T, Rosenbloom ST. Patient perceptions of receiving COVID-19 test results via an online patient portal: an open results survey. Appl Clin Inform 2021; 12 (04) 954-959
- 10 Sarabu C, Lee T, Hogan A, Pageler N. The value of OpenNotes for pediatric patients, their families and impact on the patient-physician relationship. Appl Clin Inform 2021; 12 (01) 76-81
- 11 Ponathil AP, Khasawneh A, Byrne K, Madathil KC. Factors affecting the choice of a dental care provider by older adults based on online consumer reviews. IISE Trans Healthc Syst Eng 2021; 11 (01) 51-69
- 12 Blease C, Salmi L, DesRoches CM. Open notes in cancer care: coming soon to patients. Lancet Oncol 2020; 21 (09) 1136-1138
- 13 NIH. Cancer patients say clinical notes access valuable - National Cancer Institute. Accessed May 18, 2021 at: https://www.cancer.gov/news-events/cancer-currents-blog/2020/open-clinical-notes-access-by-cancer-patients
- 14 Cohen J. The effect size. In: Statistical Power Analysis for the Behavioral Sciences. Mahwah, NJ: Lawrence Erlbaum Associates; 1988: 8-13
- 15 Khasawneh A, Chalil Madathil K, Dixon E, Wisniewski P, Zinzow H, Roth R. An investigation on the portrayal of blue whale challenge on YouTube and Twitter. Proc Hum Factors Ergon Soc Annu Meet 2019; 63 (01) 887-888
- 16 Research for Me - Home. Accessed May 10, 2022 at: https://researchforme.unc.edu/index.php/en/
- 17 Harwood TG, Garry T. An overview of content analysis. Marketing Rev 2003; 3 (04) 479-498
- 18 Richards KAR, Hemphill MA. A practical guide to collaborative qualitative data analysis. J Teach Phys Educ 2018; 37 (02) 225-231
- 19 Khasawneh A, Chalil Madathil K, Dixon E, Wiśniewski P, Zinzow H, Roth R. Examining the self-harm and suicide contagion effects of the blue whale challenge on YouTube and Twitter: qualitative study. JMIR Ment Health 2020; 7 (06) e15973
- 20 Khasawneh A, Madathil KC, Zinzow H. et al. An investigation of the portrayal of social media challenges on YouTube and Twitter. Trans Soc Comput 2021; 4 (01) 1-23
- 21 Peres SC, Pham T, Phillips R. Validation of the system usability scale (SUS). Proc Hum Factors Ergon Soc Annu Meet 2013; 57 (01) 192-196
- 22 Wilson MK, Khasawneh A, Ponathil A. et al. A preliminary study investigating patients' perceptions of research consenting methods. Proc Hum Factors Ergon Soc Annu Meet 2019; 63 (01) 1931-1935
- 23 Khasawneh A, Rogers H, Bertrand J, Madathil KC, Gramopadhye A. Human adaptation to latency in teleoperated multi-robot human-agent search and rescue teams. Autom Construct 2019; 99: 265-277
- 24 Gomes KM, Ratwani RM. Evaluating improvements and shortcomings in clinician satisfaction with electronic health record usability. JAMA Netw Open 2019; 2 (12) e1916651
- 25 Cole AC, Adapa K, Khasawneh A, Richardson DR, Mazur L. Codesign approaches involving older adults in the development of electronic healthcare tools: a systematic review. BMJ Open 2022; 12 (07) e058390
- 26 Lewis JR, Sauro J. Item benchmarks for the system usability scale. Journal of Usability Studies 2018; 13 (03) 158-167
- 27 Sauro J, Lewis JR. Quantifying the User Experience: Practical Statistics for User Research. Amsterdam: Elsevier; 2004
- 28 Hart SG. NASA-Task Load Index (NASA-TLX); 20 years later. Proc Hum Factors Ergon Soc Annu Meet 2006; 50 (09) 904-908
- 29 Hamad J, Fox A, Kammire MS, Hollis AN, Khairat S. Evaluating the experiences of new and existing teledermatology patients during the COVID-19 pandemic: cross-sectional survey study. JMIR Dermatol 2021; 4 (01) e25999
- 30 Cowan N. What are the differences between long-term, short-term, and working memory?. Prog Brain Res 2008; 169: 323-338
- 31 Mishra VK, Hoyt RE, Wolver SE, Yoshihashi A, Banas C. Qualitative and quantitative analysis of patients' perceptions of the patient portal experience with OpenNotes. Appl Clin Inform 2019; 10 (01) 10-18
- 32 Fossa AJ, Bell SK, DesRoches C. OpenNotes and shared decision making: a growing practice in clinical transparency and how it can support patient-centered care. J Am Med Inform Assoc 2018; 25 (09) 1153-1159
- 33 Leveille SG, Walker J, Ralston JD, Ross SE, Elmore JG, Delbanco T. Evaluating the impact of patients' online access to doctors' visit notes: designing and executing the OpenNotes project. BMC Med Inform Decis Mak 2012; 12: 32
- 34 Bialostozky M, Huang JS, Kuelbs CL. Are you in or are you out? provider note sharing in pediatrics. Appl Clin Inform 2020; 11 (01) 166-171
- 35 Adobe XD Ideas. The System Usability Scale & How it's Used in UX. Accessed February 7, 2022 at: https://xd.adobe.com/ideas/process/user-testing/sus-system-usability-scale-ux/
- 36 Usability.gov. System Usability Scale (SUS). Accessed February 8, 2022 at: https://www.usability.gov/how-to-and-tools/methods/system-usability-scale.html
- 37 Prabaswari AD, Basumerda C, Utomo BW. The mental workload analysis of staff in study program of private educational organization. IOP Conf Ser: Mater Sci Eng 2019; 528: 012018
- 38 Walker J, Leveille S, Bell S. et al. OpenNotes after 7 years: patient experiences with ongoing access to their clinicians' outpatient visit notes. J Med Internet Res 2019; 21 (05) e13876
- 39 Panattoni L, Stone A, Chung S, Tai-Seale M. Patients report better satisfaction with part-time primary care physicians, despite less continuity of care and access. J Gen Intern Med 2015; 30 (03) 327-333
- 40 Alpert JM, Morris BB, Thomson MD, Matin K, Geyer CE, Brown RF. OpenNotes in oncology: oncologists' perceptions and a baseline of the content and style of their clinician notes. Transl Behav Med 2019; 9 (02) 347-356
- 41 Joseph A, Chalil Madathil K, Jafarifiroozabadi R. et al. Communication and teamwork during telemedicine-enabled stroke care in an ambulance. Hum Factors 2022; 64 (01) 21-41
- 42 QualtricsXM. The leading experience management software. Accessed November 2, 2021 at: https://www.qualtrics.com/