DOI: 10.1055/a-2224-8000
Evaluation of a Patient Decision Aid for Refractive Eye Surgery
Funding: Barrett, The Honors College at Arizona State University
- Abstract
- Background and Significance
- Objectives
- Methods
- Results
- Discussion
- Conclusion
- Clinical Relevance Statement
- Multiple Choice Questions
- References
Abstract
Background We developed a prototype patient decision aid, EyeChoose, to assist college-aged students in selecting a refractive surgery. EyeChoose can educate patients on refractive errors and surgeries, generate evidence-based recommendations based on a user's medical history and personal preferences, and refer patients to local refractive surgeons.
Objectives We conducted an evaluative study on EyeChoose to assess the alignment of surgical modality recommendations with a user's medical history and personal preferences, and to examine the tool's usefulness and usability.
Methods We designed a mixed methods study on EyeChoose through simulations of test cases to provide a quantitative measure of the customized recommendations, an online survey to evaluate the usefulness and usability, and a focus group interview to obtain an in-depth understanding of user experience and feedback.
Results We used stratified random sampling to generate 245 test cases. Simulated execution indicated that EyeChoose's recommendations aligned with the reference standard in 243 cases (99%). A survey of 55 participants with 16 questions on usefulness, usability, and general impression showed that 14 questions recorded more than 80% positive responses. A follow-up focus group with 10 participants confirmed the usefulness of EyeChoose's patient education, decision assistance, and surgeon referral features, as well as its good usability, reflected in the multimedia resources, the visual comparison among surgical modalities, and the overall aesthetically pleasing design. Potential areas for improvement included offering nuance in soliciting user preferences, providing additional details on the pricing, effectiveness, and reversibility of surgeries, expanding the surgeon referral function, and fixing specific usability issues.
Conclusion The initial evaluation of EyeChoose suggests that it could provide effective patient education, generate appropriate recommendations, connect to local refractive surgeons, and demonstrate good system usability in a test environment. Future research is required to enhance the system functions, fully implement and evaluate the tool in naturalistic settings, and examine the findings' generalizability to other populations.
Keywords
refractive surgical procedures - computer-assisted decision-making - patient education as topic - benefits and impact assessments - application of evaluation methodology - evaluation guidelines
Background and Significance
Refractive errors are eye disorders wherein the length of the eye, the shape of the cornea, and/or the rigidity of the lens prevents light from focusing on the retina, resulting in blurry vision.[1] Refractive surgeries, such as Laser-assisted In Situ Keratomileusis (LASIK), Photorefractive Keratectomy (PRK), and Phakic Intraocular Lenses (Phakic IOLs), permanently correct refractive errors and offer an alternative to prescription eyeglasses and contact lenses.[2] [3] [4] [5] Over 2.3 billion patients diagnosed with refractive errors have access to ophthalmic care, creating a large market for refractive eye surgery.[6] Global demand for refractive surgical procedures is expected to increase from 3.6 million in 2020 to 5.8 million by 2025.[6] The total cost of refractive surgery is expected to grow from $6.5 billion in 2019 to $10.3 billion in 2025.[6]
Between the ages of 16 and 21, when their vision begins to stabilize, patients with refractive errors start to explore the option of refractive eye surgery with their optometrist or ophthalmologist. College-aged patients, who fall within this range, must decide (1) whether to undergo surgery and, (2) if so, which surgical modality to select.
A potential tool to assist patients with refractive errors is a decision aid. Decision aids are applications that facilitate patient decision-making regarding particular health problems. Features of these tools can include (1) patient education, (2) patient preference elicitation, and (3) customized recommendations based on an individual patient's preferences.[7] [8] Two existing patient tools on refractive eye surgery, developed by Healthwise and the Mayo Clinic, provide many useful features to assist patient decisions.[9] [10] However, these tools lack important functions, such as comprehensive information about refractive errors and refractive surgeries, customized recommendations of refractive surgery modalities, and recommendations of local surgeons, which our previous needs assessment study deemed necessary.[11]
To address the user needs and the limitations in the existing tools, we developed a patient decision aid, EyeChoose, to assist college-aged students in selection of a refractive eye surgery modality. The major functions of EyeChoose include (1) providing patient education on existing surgical modalities for correction of myopia, hyperopia, and astigmatism, (2) constructing individualized ranked recommendations for surgical modalities, and (3) referring patients to the top surgeon(s) for the recommended surgical modalities.[11] The development of EyeChoose was reported in our previous publication.[11] Partial screenshots of the EyeChoose tool are shown in [Fig. 1].[11]


Objectives
In this paper, we report an evaluative study of EyeChoose to assess the alignment of surgical modality recommendations with a user's medical history and personal preferences and to examine the tool's usefulness and usability.
Methods
Overview
We designed a mixed methods study with three components: (1) a quantitative assessment of the customized recommendations generated by EyeChoose, through the use of case simulations; (2) a qualitative evaluation of the usefulness and usability of EyeChoose, via an online survey; (3) an in-depth understanding of user experience and feedback, through a focus group interview.
Simulations with Test Cases
The individualized ranked recommendation of surgical modalities generated by EyeChoose was based on a scoring algorithm (see our previous paper)[11] with a series of input variables, including (1) user's medical history, such as uncontrolled eye diseases, uncontrolled autoimmune diseases, uveitis, and other related contraindications to one or more of the refractive eye surgeries; (2) user's vision prescription indicating the refractive errors; (3) user's preferences and lifestyles, such as invasiveness of surgery, cost-effectiveness, risk of dry eye after surgery, influence upon subsequent cataract surgery, recovery period, pain, follow-ups, reversibility of surgery, career in military, and participation in contact sports.[11] [12] [13] [14] [15] From a total of 15,881 possible combinations of these variables, we used stratified sampling to obtain a set of 245 test cases to ensure a fair representation of different contraindications, refractive errors, and user preferences. For each test case, we manually developed the most appropriate recommendations, based upon (1) the American Academy of Ophthalmology's (AAO) Preferred Practice Patterns[16] and (2) consultations with three practicing ophthalmologists in the Phoenix metropolitan area. We then shared these test cases and recommendations with two of the ophthalmologists for review and revision. The final dataset of the test cases and the approved recommendations were used as the reference standard in assessing the performance of the system.
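The sampling step described above can be sketched in a few lines. The sketch below is an illustrative assumption, not the authors' code: it enumerates a deliberately tiny variable space (the real tool has 15,881 combinations) and draws a proportional stratified sample, with the stratum key chosen here as (contraindication, refractive error).

```python
import itertools
import random
from collections import defaultdict

# Hypothetical, simplified decision variables; the actual tool uses a much
# larger variable set (15,881 combinations) described in the prior paper.
CONTRAINDICATION = ["absolute", "relative", "none"]
REFRACTIVE_ERROR = ["none", "myopia", "hyperopia", "mixed_astigmatism", "unknown"]
PREFERENCE = ["minimally_invasive", "cost_effective", "low_dry_eye_risk"]

def stratified_sample(cases, key, n_total, seed=0):
    """Draw about n_total cases, allocating draws proportionally per stratum
    and guaranteeing at least one case from every stratum."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for case in cases:
        strata[key(case)].append(case)
    sample = []
    for members in strata.values():
        k = max(1, round(n_total * len(members) / len(cases)))
        sample.extend(rng.sample(members, min(k, len(members))))
    return sample

all_cases = [
    {"contra": c, "error": e, "pref": p}
    for c, e, p in itertools.product(CONTRAINDICATION, REFRACTIVE_ERROR, PREFERENCE)
]
test_cases = stratified_sample(
    all_cases, key=lambda c: (c["contra"], c["error"]), n_total=20
)
```

With this allocation rule, every (contraindication, refractive error) stratum contributes at least one test case, mirroring the study's goal of fair representation across strata.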
For validity testing, we simulated use of the EyeChoose tool by applying each test case, manually entering the medical history and personal preferences. We then compared the recommendation generated by EyeChoose against the reference standard. EyeChoose's accuracy was the number of test cases in which the recommendation aligned with the reference standard, divided by the total number of test cases.
Survey
After the validity testing with simulated cases, we conducted a survey to gauge users' self-reported usefulness and usability of EyeChoose. We recruited the participants through (1) the e-newsletters of the Barrett Honors College and the College of Health Solutions (CHS) at Arizona State University (ASU), (2) professors from the W.P. Carey School of Business and CHS at ASU, and (3) the social media platforms Instagram and Snapchat. Participants were included if they (1) were between the ages of 18 and 24, (2) were attending or had previously attended college, (3) had myopia, hyperopia, and/or astigmatism, and (4) consented to participation. Participants were excluded if they had uncontrolled eye diseases or uncontrolled autoimmune disorders, which are absolute contraindications to refractive surgery. Five randomly selected survey participants were compensated with a $15 Amazon e-gift card.
We administered the survey through Google Forms. After obtaining consent, we provided a link to the patient decision aid and instructed the participants to use the tool prior to answering the survey questions. The survey started with three questions about users' demographics (race, ethnicity, and sex) and three questions on users' medical history (presence or absence of myopia, hyperopia, and/or astigmatism). It then presented 16 statements about the EyeChoose tool (8 on usefulness, 5 on usability, and 3 on general impressions), with responses on a 5-point Likert scale (strongly agree, agree, undecided, disagree, or strongly disagree). The construct of the survey questions was based upon the International Patient Decision Aids Standards and the Technology Acceptance Model.[17] [18] The survey closed with an open, free-text question requesting general feedback on the application. The consent form and the survey questionnaire can be found in the [Supplementary Material] (available in the online version only).
For data analyses of the Likert scale questions, we grouped the answers of strongly agree and agree as the positive responses, and the answers of disagree and strongly disagree as the negative responses. In addition, we examined the distribution of responses among the strata of users' race, ethnicity, sex, and medical history. For statements with less than 80% positive responses, we further conducted covariate analyses with the demographic backgrounds and medical history. For the last question, we reviewed the free-text responses to derive the common themes deductively and semantically.[19] [20] [21] [22] [23] [24]
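The Likert grouping and the 80% flagging rule described above can be sketched as follows; the responses below are fabricated for illustration.

```python
from collections import Counter

# Hypothetical responses of several participants to one survey statement.
responses = ["strongly agree", "agree", "agree", "undecided",
             "disagree", "agree", "strongly agree", "strongly disagree"]

POSITIVE = {"strongly agree", "agree"}
NEGATIVE = {"disagree", "strongly disagree"}

def summarize(responses):
    """Group a 5-point Likert item into positive/negative shares and flag
    items below the 80% positive threshold for covariate analysis."""
    counts = Counter(responses)
    n = len(responses)
    positive = sum(counts[r] for r in POSITIVE)
    negative = sum(counts[r] for r in NEGATIVE)
    return {
        "positive_pct": 100 * positive / n,
        "negative_pct": 100 * negative / n,
        "flag_for_covariate_analysis": positive / n < 0.80,
    }

summary = summarize(responses)  # 62.5% positive, so this item is flagged
```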
Focus Group
Following the survey, we conducted a focus group interview to gain an in-depth understanding of user feedback on EyeChoose. When recruiting participants, we used the same inclusion/exclusion criteria as the survey and similar outreach approaches. We specifically selected the focus group participants to ensure adequate representation across sex and race/ethnicity. We obtained consent from the participants. Each participant was compensated with a $30 Amazon e-gift card for the 2-hour focus group session.
We prepared the focus group questions based upon the International Patient Decision Aids Standards.[17] These questions were designed to solicit participants' opinions on (1) the completeness of the decision factors; (2) the elicitation of user preferences for decision factors; (3) the integration of user preferences with the recommendations; (4) the education on the various surgical modalities; (5) the layout, ease of use, and clear presentation of information of the application. The consent form and the focus group interview questions can be found in the [Supplementary Material] (available in the online version only).
We recorded the focus group session through Zoom. We assigned a pseudonym to each focus group participant for the session and instructed them to leave their cameras off through the session to ensure anonymity. During the focus group session, we took note of participants' responses. For data analysis, we reviewed the notes and the Zoom recording for commonly repeated phrases, which were then used to derive themes both inductively (by synthesizing the themes from the recorded data) and semantically (by extracting the explicit content from the verbiage).[19] [20] [21] [22] [23] [24]
Results
Case Simulations
The 245 test cases, generated through stratified sampling, represented a variety of decision factors that included contraindications, refractive errors, and user preferences, as shown in [Table 1].
| Decision factors | Number of test cases | Percentage of total cases |
|---|---|---|
| Contraindications | | |
| Absolute | 1 | 0.41% |
| Relative | 122 | 49.80% |
| None | 122 | 49.80% |
| Refractive errors[a] | | |
| None | 4 | 1.64% |
| Myopia and Myopic Astigmatism | 96 | 39.33% |
| Hyperopia and Hyperopic Astigmatism | 84 | 34.44% |
| Mixed Astigmatism | 28 | 11.48% |
| Unknown | 32 | 13.11% |
| Preferences and lifestyles[b] | | |
| Minimally invasive | 72 | 29.88% |
| Cost-effective | 88 | 36.51% |
| Lower risk of dry eye postoperatively | 136 | 56.43% |
| No influence on subsequent cataract surgery | 8 | 3.32% |
| Shorter recovery period | 88 | 36.51% |
| Less painful recovery period | 72 | 29.88% |
| Fewer follow-up appointments | 120 | 49.79% |
| Reversible | 24 | 9.96% |
| Suited for military or participants of contact sports | 96 | 39.83% |

a Only for patients without an absolute contraindication to refractive surgery.
b Three preferences and lifestyle statements can be selected.
Simulated execution indicated that EyeChoose generated surgical modality recommendations in compliance with the reference standard for 243 of the 245 test cases (99%). We reviewed the two cases with incorrect recommendations and found that in both, two surgical modalities had received equal scores; EyeChoose broke the tie alphabetically and therefore recommended a modality that was not the best answer according to the reference standard.
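The failure mode above can be illustrated with a small sketch. The scores and the secondary priority below are invented for illustration, not taken from EyeChoose's actual algorithm; the point is that an alphabetical tie-break is arbitrary, while a secondary criterion (here, a hypothetical clinician-supplied priority) makes ties deterministic and clinically meaningful.

```python
# Hypothetical heuristic scores for the three modalities, with a tie on top.
scores = {"LASIK": 7, "PRK": 7, "Phakic IOL": 5}

# Alphabetical tie-break (the behavior behind the two mismatched cases):
# among equal scores, the alphabetically first modality wins.
alphabetical = max(sorted(scores), key=lambda m: scores[m])

# A possible fix: break ties with a secondary, clinically motivated priority
# (values here are assumed for illustration).
priority = {"PRK": 2, "LASIK": 1, "Phakic IOL": 0}
recommended = max(scores, key=lambda m: (scores[m], priority[m]))
```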
Survey
We recruited 55 participants for the survey. All responded to every survey question, with no missing values. Among the participants, 39 (71%) were female, 15 (27%) male, and 1 (2%) preferred not to share. In terms of racial/ethnic background, 28 (51%) participants were Asian, 25 (45%) Caucasian, 2 (4%) Black, and 4 (7%) Hispanic/Latinx. With regard to refractive errors, 47 (86%) participants were myopic, 4 (7%) hyperopic, and 30 (55%) astigmatic.
The overall responses to the 16 survey questions on usefulness, usability, and general impression were very positive, with 14 questions recording more than 80% of the answers as strongly agree or agree. In terms of usefulness, 95% of responses agreed/strongly agreed that the tool provided an unbiased description of the pros and cons of each surgical modality; 91% agreed/strongly agreed that the decision aid informed the user of the rationale behind its recommendations. With regard to usability, 87% of responses agreed/strongly agreed that the general instructions accompanying the decision aid were easy to understand; 87% agreed/strongly agreed that the instructions teaching users to state preferences for/against each set of decision factors were clear. For general impression, 87% of responses agreed/strongly agreed that they would use the tool; 87% agreed/strongly agreed that they liked the tool. A summary of the survey results is shown in [Fig. 2].


For the two questions that received less than 80% positive responses, we conducted covariate analysis on race, ethnicity, sex, and medical history. One-way analysis of variance indicated that respondents with a medical history of astigmatism were more receptive to a surgical consultation with a recommended surgeon (p = 0.01).
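For reference, the one-way ANOVA reported above compares the mean of a coded Likert response across groups. The pure-Python sketch below computes the F statistic from first principles; the coded responses are fabricated for illustration and do not reproduce the study's data.

```python
def one_way_anova_f(*groups):
    """F statistic of a one-way ANOVA: between-group mean square
    divided by within-group mean square."""
    all_vals = [x for g in groups for x in g]
    grand_mean = sum(all_vals) / len(all_vals)
    ss_between = sum(
        len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups
    )
    ss_within = sum(
        (x - sum(g) / len(g)) ** 2 for g in groups for x in g
    )
    df_between = len(groups) - 1
    df_within = len(all_vals) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Hypothetical Likert scores (coded 1-5) for the receptiveness item,
# split by history of astigmatism.
astigmatism = [5, 4, 5, 4, 5, 4]
no_astigmatism = [3, 2, 4, 3, 3, 2]
f_stat = one_way_anova_f(astigmatism, no_astigmatism)
```

A large F relative to the critical value of the F(1, n - 2) distribution corresponds to a small p-value; in practice, a library routine such as scipy.stats.f_oneway returns both the statistic and the p-value directly.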
For the last open question in the survey, we collected 20 comments. Among them, five praised specific features of the EyeChoose tool, such as comprehensive patient education materials, personalized surgical modality recommendation, and usefulness to users as well as their friends and families. In 16 comments, the participants suggested potential improvement of the tool, such as clarifying the navigation of the tool, replacing medical jargon with layman's terms in patient education and medical history elicitation, reducing the length of additional information pertaining to surgical modality recommendations, and embedding a map within the surgeon recommendation feature to help locate the referred surgeons.
Focus Group
We recruited 10 participants for the focus group. Among them, three had participated in a previous focus group to develop EyeChoose, five had responded to the survey, and the remaining two were new participants who had used the application before the interview.[11] The group included Caucasian, Black, Asian, and Hispanic/Latinx participants, with five males and five females. Regarding refractive errors, nine participants had myopia, of whom four also had myopic astigmatism; the remaining participant had hyperopia.
The participants confirmed EyeChoose's features related to patient education, decision-assistance, and surgeon referral to be useful. Specifically, they found that (1) the patient education clarified challenging concepts and improved participants' confidence in further discussion with refractive surgeons, (2) the elicitation of participants' preferences addressed users' values through the range of options provided, and (3) the referral tool was helpful in finding a surgeon for consultation, especially for those participants from Arizona. In terms of usability, the participants found that (1) the information was easy to understand through the embedded multimedia resources; (2) the tabular format provided visual comparison among the surgical modalities and the patients' preferences aligned with each; (3) the overall design was aesthetically pleasing. The participants had a positive general impression of the EyeChoose tool, as indicated by their willingness to share it with friends and family.
The focus group participants also identified specific aspects of the EyeChoose tool that could be further improved, including (1) offering nuances in soliciting users' preferences, for example, differentiating “must haves” from “would like to haves,” (2) providing additional details on pricing, effectiveness, and reversibility of surgeries, (3) expanding the function of surgeon referral beyond the state of Arizona, and (4) fixing certain usability issues.
A summary of the focus group findings can be found in [Table 2].
Abbreviation: LASIK, laser-assisted in situ keratomileusis.
Discussion
We leveraged a mixed methods design with three components, i.e., case simulation, survey, and focus group interview, in this evaluative study of EyeChoose. Similar to several other studies,[25] [26] each component here provided unique perspectives in the evaluation of EyeChoose. Many findings from the different components complemented each other, as shown in the results. Compared to the two existing tools developed by Healthwise and the Mayo Clinic,[9] [10] EyeChoose has a few unique features, including (1) providing information on the three surgical modalities (LASIK, PRK, and Phakic IOLs) approved by the U.S. Food and Drug Administration at the time, along with their side-by-side comparisons; (2) addressing a comprehensive set of factors to assist patient decision-making; (3) generating customized recommendations based on a patient's medical history and personal preferences; (4) delivering effective patient education through multimedia resources; (5) connecting patients to local surgeons for further consultations. Here we discuss some of the findings related to these unique features.
In evaluating EyeChoose's customized recommendations of an appropriate surgical modality, the results from the simulation of test cases demonstrated a high level of accuracy (99%). The survey data corroborated this finding, with 87% of respondents agreeing or strongly agreeing that the decision aid presented recommendations relevant to their preferences. Related to the customized recommendations, the survey data indicated that users had very positive feedback on EyeChoose's representation of the factors in selecting a refractive surgery modality (84%) and its help in clarifying these decision factors (89%). The focus group data further confirmed these findings. However, not all users were satisfied with the recommendations. Although the elicitation of user preferences was based upon the needs assessment, some participants suggested that nuance should be provided, for example, by differentiating “must haves” from “would like to haves.”[11] Given the complexity of selecting a refractive surgery and the heuristic scoring system used, even a top-scored recommendation generated by EyeChoose may not address all the preferences specified by a user. Building a better decision algorithm is, therefore, a direction for our future work.
EyeChoose provided effective patient education through text, embedded multimedia resources, as well as side-by-side comparisons of refractive errors and refractive surgery modalities. This was shown in both the survey data, such as the high-level positive responses (95%) to presentation of pros and cons of each surgical modality, and the focus group findings, such as the informational text and the multimedia resources that clarified challenging concepts. In particular, the participants felt more confident to conduct educated discussions with a refractive surgeon after using this patient decision aid.
In addition to providing patient education and decision assistance, EyeChoose connected users to local health care resources by listing the three refractive surgeons top-ranked by Google within the Phoenix metropolitan area for the recommended surgical modality. It also embedded Google reviews for the listed surgical practices. Many users found these links to local surgeons useful, as indicated in both the survey (65% positive responses) and the focus group. Users outside of Arizona wished that this function could be expanded to include surgeons in their local communities, as shown in the focus group findings. Ultimately, the use of these local resources would depend on individual users' specific situations; for example, the covariate analysis from the survey indicated that users with a medical history of astigmatism were more receptive to considering surgical consultations.
There were several limitations in this study. First, the survey and focus group participants were recruited from the Barrett Honors College and CHS at ASU. Many of them studied health-related disciplines and were therefore better prepared with the health knowledge needed to use EyeChoose. Additionally, the focus group approach has an intrinsic limitation in participants' hesitance to express opinions differing from the majority. The sample size for the survey was also relatively small. Thus, the generalizability of the findings from this study needs to be further examined in future research by recruiting participants from more diverse backgrounds and increasing the sample size. Second, two of the three ophthalmologists who reviewed the reference standard for validation of the case simulation were also involved in the development of the tool.[11] Although we leveraged other resources (such as AAO's guidelines) in developing the reference standard and conducted two rounds of review and revision, this might still introduce potential biases in the evaluation of the system's performance.[16] Third, the scoring algorithm was designed based upon heuristics. Preliminary validation testing yielded high accuracy; however, the robustness of the system should be further examined using a larger sample of test cases, addressing specific user preferences, and based on a gold standard for evaluation. Last, the current implementation of the EyeChoose tool was a prototype client-side application for proof of concept.[11] The evaluation was conducted in a laboratory setting.
While these were reasonable approaches for initial development and assessment, future research is required to fully implement the tool and evaluate it in naturalistic settings with real-world patients, addressing additional issues such as different technical platforms, users' health literacy and technology competency, inclusion of refractive surgeons across the nation, patient outreach, and data security and interoperability.[27] [28] [29]
Conclusion
An initial evaluation of EyeChoose using simulated test cases, a survey, and a focus group interview suggests this tool could provide effective patient education, generate appropriate recommendations customized to individual users, connect them to local refractive surgeons, and demonstrate good system usability in a test environment. Future research is required to (1) enhance the system functions with a better decision system and expanded coverage of local health care resources; (2) fully implement the system and to conduct evaluations in natural settings; (3) examine the generalizability of the findings from this study to other populations and settings.
Clinical Relevance Statement
EyeChoose is a novel patient decision aid to assist college-aged patients in selection of a refractive surgery modality. Its features include patient education, decision assistance, and surgeon referral. The positive user experiences with EyeChoose indicate that the application has the potential to support patients' decision-making with further improvement and evaluation.
Multiple Choice Questions
1. Which of the following approaches are typically used to solicit user feedback to evaluate a patient decision aid?

   a. Focus group

   b. Survey

   c. Music video

   d. Both a. and b.

   Correct answer: d. Patient decision aids should be evaluated to ensure that they meet the needs of their target demographic. This can be done by running a focus group and a survey.
2. Which of the following are typical features of patient decision aids?

   a. Artificial intelligence

   b. Patient education

   c. Best–worst scaling

   d. Extended reality

   Correct answer: b. Patient decision aids assist patients with health care decision-making. Patient education is a typical function of a patient decision aid, helping the user make an informed decision by providing educational resources.
Conflict of Interest
None declared.
Acknowledgments
We would like to thank the following contributors to this study: (1) ASU Barrett Honors College for providing funding support, (2) Swagel Wooton Eye Institute for surgical consultation, (3) Dr. James O'Neil for connections to local refractive surgeons, (4) Molly Redman for supporting application development, and (5) Daniel Sezanoff for application deployment onto the cloud.
Protection of Human and Animal Subjects
The study was performed in compliance with the World Medical Association Declaration of Helsinki on ethical principles for medical research involving human subjects and was reviewed and approved by Arizona State University Institutional Review Board.
* These authors are considered co-second authors.
References
- 1 Wedner S, Dineen B. Refractive errors. Trop Doct 2003; 33 (04) 207-209
- 2 National Institutes of Health, National Eye Institute. Refractive errors. Updated June 10, 2022. Accessed March 26, 2023 at: https://www.nei.nih.gov/learn-about-eye-health/eye-conditions-and-diseases/refractive-errors
- 3 Turbert D. Alternative refractive surgery procedures. American Academy of Ophthalmology. Reviewed April 25, 2023. Accessed March 26, 2023 at: https://www.aao.org/eye-health/treatments/refractive-surgery-alternative-procedures
- 4 Dunkin MA. A Guide to Refractive and Laser Eye Surgery. WebMD. Reviewed March 6, 2023 . Accessed March 26, 2023 at: https://www.webmd.com/eye-health/overview-refractive-laser-eye-surgery
- 5 Kohnen T, Strenger A, Klaproth OK. Basic knowledge of refractive surgery: correction of refractive errors using modern surgical procedures. Dtsch Arztebl Int 2008; 105 (09) 163-170 , quiz 170–172
- 6 Eyewire News. Market scope: refractive surgery to grow 9.6% a year through 2025, despite COVID-19. 2021. Accessed September 9, 2022 at: https://eyewire.news/articles/market-scope-refractive-surgery-to-grow-9-6-a-year-through-2025-despite-covid-19/?c4src=article:infinite-scroll
- 7 An introduction to patient decision aids. Drug Ther Bull 2012; 50 (08) 90-92
- 8 National Institute for Health and Care Excellence. Medicines Optimisation: The Safe and Effective Use of Medicines to Enable the Best Possible Outcomes. https://www.nice.org.uk/guidance/ng5. Published March 2015. Accessed March 26, 2023
- 9 Healthwise. Nearsightedness: should I have laser surgery? Updated October 12, 2022. Accessed September 9, 2022 at: https://www.healthwise.net/ohridecisionaid/Content/StdDocument.aspx?DOCHWID=aa127024
- 10 Mayo Clinic. LASIK surgery: Is it right for you?. 2021 . Accessed September 9, 2022 at: https://www.mayoclinic.org/tests-procedures/lasik-eye-surgery/in-depth/lasik-surgery/art-20045751
- 11 Subbaraman B, Ahmed K, Heller M, Essary AC, Patel VL, Wang D. Development of a patient decision aid for refractive eye surgery. AMIA Annu Symp Proc 2023; 2022: 1022-1031
- 12 Wilkinson JM, Cozine EW, Kahn AR. Refractive eye surgery: helping patients make informed decisions about LASIK. Am Fam Physician 2017; 95 (10) 637-644
- 13 Eydelman M, Hilmantel G, Tarver ME. et al. Symptoms and satisfaction of patients in the Patient-Reported Outcomes With Laser In Situ Keratomileusis (PROWL) studies. JAMA Ophthalmol 2017; 135 (01) 13-22
Address for correspondence
Publication History
Received: July 31, 2023
Accepted: December 6, 2023
Accepted Manuscript online:
December 8, 2023
Article published online:
January 24, 2024
© 2024. Thieme. All rights reserved.
Georg Thieme Verlag KG
Rüdigerstraße 14, 70469 Stuttgart, Germany
References
- 1 Wedner S, Dineen B. Refractive errors. Trop Doct 2003; 33 (04) 207-209
- 2 National Institutes of Health, National Eye Institute. Refractive errors. Updated June 10, 2022. Accessed March 26, 2023 at: https://www.nei.nih.gov/learn-about-eye-health/eye-conditions-and-diseases/refractive-errors
- 3 Turbert D. Alternative refractive surgery procedures. American Academy of Ophthalmology. Reviewed April 25, 2023. Accessed March 26, 2023 at: https://www.aao.org/eye-health/treatments/refractive-surgery-alternative-procedures
- 4 Dunkin MA. A Guide to Refractive and Laser Eye Surgery. WebMD. Reviewed March 6, 2023. Accessed March 26, 2023 at: https://www.webmd.com/eye-health/overview-refractive-laser-eye-surgery
- 5 Kohnen T, Strenger A, Klaproth OK. Basic knowledge of refractive surgery: correction of refractive errors using modern surgical procedures. Dtsch Arztebl Int 2008; 105 (09) 163-170, quiz 170-172
- 6 Eyewire+. Market Scope: refractive surgery to grow 9.6% a year through 2025, despite COVID-19. 2021. Accessed September 9, 2022 at: https://eyewire.news/articles/market-scope-refractive-surgery-to-grow-9-6-a-year-through-2025-despite-covid-19/?c4src=article:infinite-scroll
- 7 An introduction to patient decision aids. Drug Ther Bull 2012; 50 (08) 90-92
- 8 National Institute for Health and Care Excellence. Medicines Optimisation: The Safe and Effective Use of Medicines to Enable the Best Possible Outcomes. Published March 2015. Accessed March 26, 2023 at: https://www.nice.org.uk/guidance/ng5
- 9 Healthwise. Nearsightedness: Should I have laser surgery? Updated October 12, 2022. Accessed September 9, 2022 at: https://www.healthwise.net/ohridecisionaid/Content/StdDocument.aspx?DOCHWID=aa127024
- 10 Mayo Clinic. LASIK surgery: Is it right for you? 2021. Accessed September 9, 2022 at: https://www.mayoclinic.org/tests-procedures/lasik-eye-surgery/in-depth/lasik-surgery/art-20045751
- 11 Subbaraman B, Ahmed K, Heller M, Essary AC, Patel VL, Wang D. Development of a patient decision aid for refractive eye surgery. AMIA Annu Symp Proc 2023; 2022: 1022-1031
- 12 Wilkinson JM, Cozine EW, Kahn AR. Refractive eye surgery: helping patients make informed decisions about LASIK. Am Fam Physician 2017; 95 (10) 637-644
- 13 Eydelman M, Hilmantel G, Tarver ME. et al. Symptoms and satisfaction of patients in the Patient-Reported Outcomes With Laser In Situ Keratomileusis (PROWL) studies. JAMA Ophthalmol 2017; 135 (01) 13-22
- 14 U.S. Food and Drug Administration. Before, during & after surgery. Updated January 8, 2018. Accessed March 26, 2023 at: https://www.fda.gov/medical-devices/phakic-intraocular-lenses/during-after-surgery
- 15 Wachler BB. Phakic IOLs (implantable lenses). All About Vision. 2019. Accessed March 26, 2023 at: https://www.allaboutvision.com/lens-implant-surgery/
- 16 Chuck RS, Jacobs DS, Lee JK. et al; American Academy of Ophthalmology Preferred Practice Pattern Refractive Management/Intervention Panel. Refractive Errors & Refractive Surgery Preferred Practice Pattern®. Ophthalmology 2018; 125 (01) P1-P104
- 17 Elwyn G, O'Connor A, Stacey D. et al; International Patient Decision Aids Standards (IPDAS) Collaboration. Developing a quality criteria framework for patient decision aids: online international Delphi consensus process. BMJ 2006; 333 (7565) 417
- 18 Rahimi B, Nadri H, Lotfnezhad Afshar H, Timpka T. A systematic review of the Technology Acceptance Model in health informatics. Appl Clin Inform 2018; 9 (03) 604-634
- 19 Boyatzis RE. Transforming Qualitative Information: Thematic Analysis and Code Development. Thousand Oaks, CA: Sage Publications; 1998
- 20 Nowell LS, Norris JM, White DE, Moules NJ. Thematic analysis: striving to meet the trustworthiness criteria. Int J Qual Methods 2017; 16 (01) 1-13
- 21 Vashistha V, Poonnen PJ, Snowdon JL. et al. Medical oncologists' perspectives of the Veterans Affairs National Precision Oncology Program. PLoS ONE 2020; 15 (07) e0235861
- 22 Abraham J, Nguyen V, Almoosa KF, Patel B, Patel VL. Falling through the cracks: information breakdowns in critical care handoff communication. AMIA Annu Symp Proc 2011; 2011: 28-37
- 23 Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol 2006; 3 (02) 77-101
- 24 Maguire M, Delahunt B. Doing a thematic analysis: a practical, step-by-step guide for learning and teaching scholars. AISHE-J 2017; 9 (03) 3-5
- 25 Le XH, Doll T, Barbosu M, Luque A, Wang D. Evaluation of an Enhanced Role-Based Access Control model to manage information access in collaborative processes for a statewide clinical education program. J Biomed Inform 2014; 50: 184-195
- 26 Redman M, Brian J, Wang D. Evaluation of an online decision aid for selection of contraceptive methods. Appl Clin Inform 2023; 14 (01) 153-163
- 27 Kushniruk AW, Patel VL. Cognitive and usability engineering methods for the evaluation of clinical information systems. J Biomed Inform 2004; 37 (01) 56-76
- 28 Puccinelli-Ortega N, Cromo M, Foley KL. et al. Facilitators and barriers to implementing a digital informed decision making tool in primary care: a qualitative study. Appl Clin Inform 2022; 13 (01) 1-9
- 29 Dharod A, Bellinger C, Foley K, Case LD, Miller D. The reach and feasibility of an interactive lung cancer screening decision aid delivered by patient portal. Appl Clin Inform 2019; 10 (01) 19-27