Appl Clin Inform 2024; 15(01): 075-084
DOI: 10.1055/a-2224-8000
Research Article

Evaluation of a Patient Decision Aid for Refractive Eye Surgery

Authors

  • Bhavani Subbaraman

    1   College of Health Solutions, Arizona State University, Tempe, Arizona, United States
  • Kamran Ahmed*

    2   College of Health Solutions, Phoenix Children's Hospital, Phoenix, Arizona, United States
  • Matthew Heller*

    3   College of Health Solutions, Eye Doctors of Arizona, Phoenix, Arizona, United States
  • Alison C. Essary

    1   College of Health Solutions, Arizona State University, Tempe, Arizona, United States
  • Vimla L. Patel

    1   College of Health Solutions, Arizona State University, Tempe, Arizona, United States
  • Dongwen Wang

    1   College of Health Solutions, Arizona State University, Tempe, Arizona, United States

Funding: Barrett, The Honors College at Arizona State University
 

Abstract

Background We developed a prototype patient decision aid, EyeChoose, to assist college-aged students in selecting a refractive surgery. EyeChoose can educate patients on refractive errors and surgeries, generate evidence-based recommendations based on a user's medical history and personal preferences, and refer patients to local refractive surgeons.

Objectives We conducted an evaluative study on EyeChoose to assess the alignment of surgical modality recommendations with a user's medical history and personal preferences, and to examine the tool's usefulness and usability.

Methods We designed a mixed methods study on EyeChoose through simulations of test cases to provide a quantitative measure of the customized recommendations, an online survey to evaluate the usefulness and usability, and a focus group interview to obtain an in-depth understanding of user experience and feedback.

Results We used stratified random sampling to generate 245 test cases. Simulated execution indicated that EyeChoose's recommendations aligned with the reference standard in 243 (99%) of the cases. A survey of 55 participants, with 16 questions on usefulness, usability, and general impression, showed that 14 questions recorded more than 80% positive responses. A follow-up focus group with 10 participants confirmed the usefulness of EyeChoose's patient education, decision assistance, and surgeon referral features, as well as its good usability, including multimedia resources, visual comparison among the surgical modalities, and an overall aesthetically pleasing design. Potential areas for improvement included adding nuance to the elicitation of user preferences, providing additional details on the pricing, effectiveness, and reversibility of surgeries, expanding the surgeon referral function, and fixing specific usability issues.

Conclusion The initial evaluation of EyeChoose suggests that it could provide effective patient education, generate appropriate recommendations, connect to local refractive surgeons, and demonstrate good system usability in a test environment. Future research is required to enhance the system functions, fully implement and evaluate the tool in naturalistic settings, and examine the findings' generalizability to other populations.


Background and Significance

Refractive errors are eye disorders in which the length of the eye, the shape of the cornea, and/or the rigidity of the lens prevent light from focusing on the retina, resulting in blurry vision.[1] Refractive surgeries, such as laser-assisted in situ keratomileusis (LASIK), photorefractive keratectomy (PRK), and phakic intraocular lenses (Phakic IOLs), permanently correct refractive errors and offer an alternative to prescription eyeglasses and contact lenses.[2] [3] [4] [5] More than 2.3 billion people diagnosed with refractive errors have access to ophthalmic care, creating a large market for refractive eye surgery.[6] Global demand for refractive surgical procedures is expected to increase from 3.6 million in 2020 to 5.8 million by 2025, and the total cost of refractive surgery is expected to grow from $6.5 billion in 2019 to $10.3 billion in 2025.[6]

Between the ages of 16 and 21, when vision begins to stabilize, patients with refractive errors start to explore the option of refractive eye surgery with their optometrist or ophthalmologist. College-aged patients in this age range need to decide (1) whether to undergo surgery and, (2) if so, which surgical modality to select.

A decision aid is one potential tool to assist patients with refractive errors. Decision aids are applications that facilitate patient decision-making regarding particular health problems. Features of these tools can include (1) patient education, (2) patient preference elicitation, and (3) customized recommendations based on an individual patient's preferences.[7] [8] Two existing patient tools on refractive eye surgery, developed by Healthwise and the Mayo Clinic, provide many useful features to assist patient decisions.[9] [10] However, these tools lack important functions, such as comprehensive information about refractive errors and refractive surgeries, customized recommendations of refractive surgery modalities, and recommendations of local surgeons, which our previous needs assessment study showed to be necessary.[11]

To address the user needs and the limitations of the existing tools, we developed a patient decision aid, EyeChoose, to assist college-aged students in selecting a refractive eye surgery modality. The major functions of EyeChoose include (1) providing patient education on existing surgical modalities for correction of myopia, hyperopia, and astigmatism; (2) constructing individualized, ranked recommendations of surgical modalities; and (3) referring patients to the top surgeon(s) for the recommended surgical modalities.[11] The development of EyeChoose was reported in our previous publication.[11] Partial screenshots of the EyeChoose tool are shown in [Fig. 1].[11]

Fig. 1 Partial screenshots of the major functions of the EyeChoose tool, including patient education, medical history, personal preferences, customized recommendation of surgical modalities, and referral to surgeons.

Objectives

In this paper, we report an evaluative study of EyeChoose to assess the alignment of surgical modality recommendations with a user's medical history and personal preferences and to examine the tool's usefulness and usability.


Methods

Overview

We designed a mixed methods study with three components: (1) a quantitative assessment of the customized recommendations generated by EyeChoose, through the use of case simulations; (2) a qualitative evaluation of the usefulness and usability of EyeChoose, via an online survey; (3) an in-depth understanding of user experience and feedback, through a focus group interview.


Simulations with Test Cases

The individualized ranked recommendation of surgical modalities generated by EyeChoose was based on a scoring algorithm (see our previous paper)[11] with a series of input variables, including (1) user's medical history, such as uncontrolled eye diseases, uncontrolled autoimmune diseases, uveitis, and other related contraindications to one or more of the refractive eye surgeries; (2) user's vision prescription indicating the refractive errors; (3) user's preferences and lifestyles, such as invasiveness of surgery, cost-effectiveness, risk of dry eye after surgery, influence upon subsequent cataract surgery, recovery period, pain, follow-ups, reversibility of surgery, career in military, and participation in contact sports.[11] [12] [13] [14] [15] From a total of 15,881 possible combinations of these variables, we used stratified sampling to obtain a set of 245 test cases to ensure a fair representation of different contraindications, refractive errors, and user preferences. For each test case, we manually developed the most appropriate recommendations, based upon (1) the American Academy of Ophthalmology's (AAO) Preferred Practice Patterns[16] and (2) consultations with three practicing ophthalmologists in the Phoenix metropolitan area. We then shared these test cases and recommendations with two of the ophthalmologists for review and revision. The final dataset of the test cases and the approved recommendations were used as the reference standard in assessing the performance of the system.
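The stratified sampling described above can be sketched in a few lines of Python. The variable space, stratum definitions, and counts below are illustrative assumptions for a toy example (much smaller than the 15,881 combinations in the study), not the actual EyeChoose implementation:

```python
import itertools
import random

# Hypothetical sketch of stratified sampling over a decision-variable space;
# the real study's variables, strata, and stratum sizes may differ.
random.seed(42)

contraindications = ["absolute", "relative", "none"]
refractive_errors = ["none", "myopic", "hyperopic", "mixed", "unknown"]
preferences = ["invasiveness", "cost", "dry_eye", "cataract", "recovery",
               "pain", "follow_ups", "reversibility", "military_sports"]

# Each toy case: one contraindication, one refractive error, three preferences.
all_cases = [
    (c, r, p)
    for c in contraindications
    for r in refractive_errors
    for p in itertools.combinations(preferences, 3)
]

# Stratify by (contraindication, refractive error), then sample within strata
# so that rare profiles are still represented among the test cases.
strata = {}
for case in all_cases:
    strata.setdefault(case[:2], []).append(case)

samples_per_stratum = 245 // len(strata) + 1
test_cases = []
for cases in strata.values():
    test_cases.extend(random.sample(cases, min(samples_per_stratum, len(cases))))
test_cases = test_cases[:245]

print(len(test_cases))  # 245
```

Sampling within each (contraindication, refractive error) stratum, rather than uniformly over the full space, is what ensures a fair representation of rare profiles such as absolute contraindications.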

For validity testing, we simulated the use of the EyeChoose tool by applying each test case, with manual inputs of the medical history and personal preferences. We then compared the recommendation generated by EyeChoose against the reference standard. EyeChoose's accuracy was determined by dividing the number of test cases when the recommendation aligned with the reference standard by the total number of test cases.
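As a worked example of this accuracy calculation, using the counts reported in the Results:

```python
# Accuracy: test cases whose EyeChoose recommendation aligned with the
# reference standard, divided by the total number of test cases.
aligned, total = 243, 245
accuracy = aligned / total
print(f"{accuracy:.1%}")  # 99.2%, reported as 99% in the paper
```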


Survey

After the validity testing with simulated cases, we conducted a survey to gauge users' self-reported usefulness and usability of EyeChoose. We recruited participants through (1) the e-newsletters of the Barrett Honors College and the College of Health Solutions (CHS) at Arizona State University (ASU), (2) professors from the W.P. Carey School of Business and CHS at ASU, and (3) the social media platforms Instagram and Snapchat. Participants were included if they (1) were between the ages of 18 and 24, (2) were attending or had previously attended college, (3) had myopia, hyperopia, and/or astigmatism, and (4) consented to participation. Participants were excluded if they had uncontrolled eye diseases or uncontrolled autoimmune disorders, which are absolute contraindications to refractive surgery. Five randomly selected survey participants were compensated with a $15 Amazon e-gift card.

We administered the survey through Google Forms. After obtaining the consent, we provided a link to the patient decision aid and instructed the participants to use the tool prior to answering survey questions. The survey started with three questions about users' demographics (race, ethnicity, and sex) and three questions on users' medical history (presence or absence of myopia, hyperopia, and/or astigmatism). It then provided 16 statements of the EyeChoose tool—8 on usefulness, 5 on usability, and 3 on general impressions, with the responses formulated on a 5-point Likert scale (strongly agree, agree, undecided, disagree, or strongly disagree). The construct of the survey questions was based upon the International Patient Decision Aids Standards and the Technology Acceptance Model.[17] [18] The survey closed with an open, free-text question requesting general feedback on the application. The consent form and the survey questionnaire can be found in the [Supplementary Material] (available in the online version only).

For data analyses of the Likert scale questions, we grouped the answers of strongly agree and agree as the positive responses, and the answers of disagree and strongly disagree as the negative responses. In addition, we examined the distribution of responses among the strata of users' race, ethnicity, sex, and medical history. For statements with less than 80% positive responses, we further conducted covariate analyses with the demographic backgrounds and medical history. For the last question, we reviewed the free-text responses to derive the common themes deductively and semantically.[19] [20] [21] [22] [23] [24]
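The grouping of Likert responses into positive, neutral, and negative categories can be sketched as follows (the sample responses below are hypothetical):

```python
from collections import Counter

# Group 5-point Likert answers as described above: strongly agree/agree are
# positive, disagree/strongly disagree are negative, undecided is neutral.
POSITIVE = {"strongly agree", "agree"}
NEGATIVE = {"disagree", "strongly disagree"}

def summarize(responses):
    """Return positive/neutral/negative counts for one survey question."""
    counts = Counter()
    for r in responses:
        if r in POSITIVE:
            counts["positive"] += 1
        elif r in NEGATIVE:
            counts["negative"] += 1
        else:
            counts["neutral"] += 1
    return dict(counts)

# Hypothetical responses for a single question (n = 55).
example = (["strongly agree"] * 30 + ["agree"] * 18 +
           ["undecided"] * 4 + ["disagree"] * 3)
print(summarize(example))  # {'positive': 48, 'neutral': 4, 'negative': 3}
```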


Focus Group

Following the survey, we conducted a focus group interview to gain an in-depth understanding of user feedback on EyeChoose. When recruiting participants, we used the same inclusion/exclusion criteria as the survey and similar outreach approaches. We specifically selected the focus group participants to ensure adequate representation across sex and race/ethnicity. We obtained consent from the participants. Each participant was compensated with a $30 Amazon e-gift card for the 2-hour focus group session.

We prepared the focus group questions based upon the International Patient Decision Aids Standards.[17] These questions were designed to solicit participants' opinions on (1) the completeness of the decision factors; (2) the elicitation of user preferences for decision factors; (3) the integration of user preferences with the recommendations; (4) the education on the various surgical modalities; (5) the layout, ease of use, and clear presentation of information of the application. The consent form and the focus group interview questions can be found in the [Supplementary Material] (available in the online version only).

We recorded the focus group session through Zoom. We assigned a pseudonym to each focus group participant for the session and instructed them to leave their cameras off through the session to ensure anonymity. During the focus group session, we took note of participants' responses. For data analysis, we reviewed the notes and the Zoom recording for commonly repeated phrases, which were then used to derive themes both inductively (by synthesizing the themes from the recorded data) and semantically (by extracting the explicit content from the verbiage).[19] [20] [21] [22] [23] [24]



Results

Case Simulations

The 245 test cases, generated through stratified sampling, represented a variety of decision factors that included contraindications, refractive errors, and user preferences, as shown in [Table 1].

Table 1

Profiles of the 245 test cases

Decision factors                                   Number of test cases   Percentage of total cases
Contraindications
 Absolute                                                  1                       0.41%
 Relative                                                122                      49.80%
 None                                                    122                      49.80%
Refractive errors[a]
 None                                                      4                       1.64%
 Myopia and myopic astigmatism                            96                      39.33%
 Hyperopia and hyperopic astigmatism                      84                      34.44%
 Mixed astigmatism                                        28                      11.48%
 Unknown                                                  32                      13.11%
Preferences and lifestyles[b]
 Minimally invasive                                       72                      29.88%
 Cost-effective                                           88                      36.51%
 Lower risk of dry eye postoperatively                   136                      56.43%
 No influence on subsequent cataract surgery               8                       3.32%
 Shorter recovery period                                  88                      36.51%
 Less painful recovery period                             72                      29.88%
 Fewer follow-up appointments                            120                      49.79%
 Reversible                                               24                       9.96%
 Suited for military careers or contact sports            96                      39.83%

a Only for patients without an absolute contraindication to refractive surgery.

b Three preference and lifestyle statements can be selected.


Simulated execution indicated that EyeChoose generated surgical modality recommendations in compliance with the reference standard for 243 of the 245 test cases (99%). We reviewed the two cases with incorrect recommendations and found that, in each, two surgical modalities received equal scores and the tie was broken alphabetically, which selected a modality that was not the best answer according to the reference standard.
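The tie-breaking behavior described above can be illustrated with a minimal sketch (the modality scores here are hypothetical, not from the actual EyeChoose algorithm):

```python
# Hypothetical scores from a heuristic scoring pass: LASIK and PRK tie at 7.
scores = {"LASIK": 7, "PRK": 7, "Phakic IOL": 5}

# Sorting by (-score, name) breaks the 7-7 tie alphabetically, so LASIK is
# recommended even if the reference standard prefers PRK for this case.
ranked = sorted(scores, key=lambda m: (-scores[m], m))
print(ranked[0])  # LASIK
```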


Survey

We recruited 55 participants for the survey, all of whom responded to every survey question, with no missing values. Among the participants, 39 (71%) were female, 15 (27%) male, and 1 (2%) preferred not to share. In terms of racial/ethnic background, 28 (51%) participants were Asian, 25 (45%) Caucasian, 2 (4%) Black, and 4 (7%) Hispanic/Latinx. With regard to refractive errors, 47 (86%) participants were myopic, 4 (7%) hyperopic, and 30 (55%) astigmatic.

The overall responses to the 16 survey questions on usefulness, usability, and general impression were very positive, with 14 questions recording more than 80% of answers as strongly agree or agree. In terms of usefulness, 95% of responses agreed or strongly agreed that the tool provided an unbiased description of the pros and cons of each surgical modality, and 91% agreed or strongly agreed that the decision aid informed users of the rationale behind its recommendations. With regard to usability, 87% of responses agreed or strongly agreed that the general instructions accompanying the decision aid were easy to understand, and 87% agreed or strongly agreed that the instructions for providing preferences for or against each set of decision factors were clear. For general impression, 87% of responses agreed or strongly agreed that they would use the tool, and 87% agreed or strongly agreed that they liked the tool. A summary of the survey results is shown in [Fig. 2].

Fig. 2 A summary of the survey results by positive, neutral, and negative responses (n = 55).

For the two questions that received less than 80% positive responses, we conducted covariate analysis on race, ethnicity, sex, and medical history. One-way analysis of variance indicated that respondents with a medical history of astigmatism were more receptive to a surgical consultation with a recommended surgeon (p = 0.01).

For the last open question in the survey, we collected 20 comments. Five praised specific features of the EyeChoose tool, such as the comprehensive patient education materials, the personalized surgical modality recommendations, and the tool's usefulness to users as well as their friends and families. In 16 comments, participants suggested potential improvements to the tool, such as clarifying the navigation, replacing medical jargon with layman's terms in the patient education and medical history elicitation, reducing the length of the additional information accompanying surgical modality recommendations, and embedding a map within the surgeon recommendation feature to help locate the referred surgeons.


Focus Group

We recruited 10 participants for the focus group. Among them, three had participated in a previous focus group to develop EyeChoose, five had responded to the survey, and the remaining two were new participants who had used the application before the interview.[11] The group included Caucasian, Black, Asian, and Hispanic/Latinx participants, with five males and five females. Regarding refractive errors, nine participants had myopia, of whom four had myopic astigmatism; the remaining participant had hyperopia.

The participants confirmed that EyeChoose's patient education, decision assistance, and surgeon referral features were useful. Specifically, they found that (1) the patient education clarified challenging concepts and improved their confidence in further discussions with refractive surgeons; (2) the elicitation of preferences addressed users' values through the range of options provided; and (3) the referral tool was helpful in finding a surgeon for consultation, especially for participants from Arizona. In terms of usability, the participants found that (1) the information was easy to understand through the embedded multimedia resources; (2) the tabular format provided a visual comparison among the surgical modalities and the patient preferences aligned with each; and (3) the overall design was aesthetically pleasing. The participants had a positive general impression of the EyeChoose tool, as indicated by their willingness to share it with friends and family.

The focus group participants also identified specific aspects of the EyeChoose tool that could be further improved, including (1) adding nuance to the solicitation of user preferences, for example, by differentiating "must haves" from "would like to haves"; (2) providing additional details on the pricing, effectiveness, and reversibility of surgeries; (3) expanding the surgeon referral function beyond the state of Arizona; and (4) fixing certain usability issues.

A summary of the focus group findings can be found in [Table 2].

Table 2

Summary of the focus group findings and illustrative quotes


Usefulness

• Patient education: Users found that the embedded multimedia educational resources clarified challenging concepts.

• “...I feel like for a lot of the surgeries, going in, I was like I don't even know what this is, so the fact that there were pictures was good. I also liked the pictures for the nearsightedness and farsightedness—the pictures that show the normal eye to a person who has farsightedness or nearsightedness.”

• “…just because [my grandmother] does tend to be a visual learner, and I think the videos would be very helpful to her for understanding the differences between the processes, and then being able to reread a chart again over and over. I think it would be very useful.”

• Patient education: Users found that the text was informational.

• “I actually thought that the description of each surgical process was very informative.”

• “I think the way that everything was described was good…”

• Patient education: After reading through the patient education materials provided, users felt confident enough to have an informed discussion with a refractive surgeon.

• “I would say that, absolutely. That having read through this, I have a better platform, if not to be fully educated, at least, to understand the terminology, that identifies with my current situation: So I would understand that nearsightedness is myopia, or etcetera, etcetera. Using this application provides me the basis to have that [surgical consultation].”

• “I mean, I really knew nothing about the surgeries, except for, I had heard a little about LASIK. But after doing the EyeChoose, I definitely knew much more about it; I was much more informed.”

• Decision-making: Users thought that all the decision factors they valued were represented and found that the exercise of selecting decision factors informed them of their values.

• “So I primarily focused on the factors of invasiveness and cost effectiveness, and I thought that those were very well represented through your selectional process. So for me there were no [additional] factors that I needed represented through that decision aid. I thought it hit all of the different facets that I valued.”

• “…Additionally, those factors listed factors I hadn't even thought about, such as pain levels and also recovery times. So that provided me more information just in picking between them.”

• “…it did a good job of clarifying [my preferences]. Seeing the different list of values and having me rank them actually made me think about what I want from [my preferred surgical modality].”

• Surgeon referral: Users liked the referrals to reputable surgeons.

• “I would recommend this, because a lot of my friends have asked where to get LASIK surgery, and I did not know anyone who had had it, besides hearing negative reviews from a third-party source. So I would have sent them to this resource to find information on the topic.”

• “I liked being able to see the Google reviews for the suggested centers to perform the surgery...”

Usability

• Patient education: Users liked that the multimedia educational resources were embedded.

• “I really appreciated how the videos were embedded into the webpage, and that it wasn't a link to click to take to a different tab, just because I find that very annoying. So I did enjoy that everything that was in the tool was embedded into the tab I was already in.”

• Decision-making: Users found the questions to be concise and pertinent.

• “I liked how the decision aid had concise wording. It did not ask extraneous information. This made it easy to use.”

• Decision-making: Users thought that the tables provided an easy way to visually compare the surgical modalities and the patient preferences that align with each.

• “I felt the comparison tables were very strong between both the [surgical modality] Recommendation page and the Additional Information page.”

• “I also liked the table of all the checkmarks and Xs when talking about the different surgeries in comparison to our choices and values.”

• Overall design: Users found the design to be aesthetically pleasing.

• “As a blind person trying to read this, I did like that your webpage was white, but then the information is generally in these grey boxes. I think it's very easy to read, and that's great.”

• “It was aesthetically pleasing to look at, while also being easy to use.”

• “I liked the user-friendly approach of the decision aid. It had a very simple visual style that was not overwhelming. It kept me focused on answering questions on my preferences without making me feel bored.”

General impressions

• Users would share EyeChoose with their friends and family.

• “Blindness and both farsightedness and nearsightedness are very common in my family, and we've talked about getting refractive surgeries before. So using this, especially to an aunt or someone I don't speak frequently, I would absolutely sent this to them …”

• “I would definitely recommend this to my grandmother ….”

Areas of improvement

• Decision-making: Users would prefer an alternative to partial ranking for expressing their preferences.

• “I think, for me, it worked fine, but, for others, where things like reversibility or invasiveness are non-negotiable things, it might be better to have stuff like 'I would prefer to have something' versus 'I must have something'.”

• “To have a category like 'these are my non-negotiables' and 'these are my strong preferences'—to be able to separate it out in that manner—would probably be more beneficial, instead of having to pick between my third top choice.”

• Decision-making: Users would have liked to see certain specific information about the surgical modalities.

• “…it could go into a little more detail about pricing. I know it varies a lot…I think a general ballpark would be useful to have.”

• “One thing that I wished had a little more clarification on…the reversibility of the surgery…kind of defining what reversibility means…why people would want to reverse it?”

• “…you could add a statistic on how effective each surgery is or how frequently you see regression in particular demographics.”

• Surgeon referral: Users outside of Arizona found the referred surgeons geographically inaccessible.

• “Well, for me, I'm not from Arizona, so I didn't really look at that section of the thing too much.”

• Usability: Users would have liked the surgical centers' websites to be embedded as hyperlinks.

• “For me, I would prefer if I could immediately go to those specific websites—so each of the organizations' websites—through this specific page.”

• Usability: Users found it challenging that medical terminology was used on pages separate from where it was defined.

• “You do a very good job of listing out what you mean by medical terminology; however, it's the page of Medical History that has a lot of those terms isn't on the same page that you define and list terminology. So, for me, it was a just little annoying to have flip back and forth or to have to Google terms I wasn't sure about.”

• Usability: Users would have liked clearer navigation.

• “If I went to the corner of the website, and selected a different page from the menu, the menu options would not close out; I had to manually close it, and [the patient decision aid] would already be on the corresponding page.”

• “I think there should be arrows at both the top and the bottom of the page.”

• “…when you mouse over the Additional Resources, they go to links, which I think is really slick. I just think it would be really great if it said that in Additional Resources.”

• Usability: Users would have liked text and icons to be better visible.

• “To be honest, I thought the font could just be a bit bigger. It was, kind of, a bit small.”

• “On the first page, the black text over the stethoscope picture is a little hard to read. I would just change the aesthetics of that.”

• “Looking at the charts of what was produced for the recommendations, because the fonts of the X and the check are black, it takes a little bit more effort for me to look through the chart, than it would be if, let's say, if the checks were highlighted in a different color that would still match the color scheme, but still would still be able to scan the page and see a summary of, well this is where there's the most light green, so that's where I'm going to be directed to with all the checks.”

Abbreviation: LASIK, laser-assisted in situ keratomileusis.




Discussion

We leveraged a mixed methods design with three components, i.e., case simulation, survey, and focus group interview, in this evaluative study of EyeChoose. Similar to several other studies,[25] [26] each component provided a unique perspective on EyeChoose, and many findings from the different components complemented each other, as shown in the results. Compared with the two existing tools developed by Healthwise and the Mayo Clinic,[9] [10] EyeChoose has a few unique features, including (1) providing information on the three surgical modalities (LASIK, PRK, and Phakic IOLs) approved by the Food and Drug Administration at the time, as well as their side-by-side comparisons; (2) addressing a comprehensive set of factors to assist patient decision-making; (3) generating customized recommendations based on a patient's medical history and personal preferences; (4) delivering effective patient education through multimedia resources; and (5) connecting patients to local surgeons for further consultations. Here we discuss some of the findings related to these unique features.

In the evaluation of EyeChoose's customized recommendations of an appropriate surgical modality, the results from the simulation of test cases demonstrated a high level of accuracy (99%). The survey data corroborated this finding, with 87% of respondents agreeing or strongly agreeing that the decision aid presented recommendations relevant to their preferences. Related to the customized recommendations, the survey data indicated very positive user feedback on EyeChoose's representation of the factors involved in selecting a refractive surgery modality (84%) and on its help in clarifying these decision factors (89%). The focus group data further confirmed these findings. However, not all users were satisfied with the recommendations. Although the elicitation of user preferences was based upon the needs assessment, some participants suggested adding nuance, for example, by differentiating "must haves" from "would like to haves."[11] Given the complexity of selecting a refractive surgery and the heuristic scoring system used, it is possible that even a top-scored recommendation generated by EyeChoose may not address all the preferences specified by a user. Building a better decision algorithm is, therefore, a direction for our future work.

EyeChoose provided effective patient education through text, embedded multimedia resources, as well as side-by-side comparisons of refractive errors and refractive surgery modalities. This was shown in both the survey data, such as the high-level positive responses (95%) to presentation of pros and cons of each surgical modality, and the focus group findings, such as the informational text and the multimedia resources that clarified challenging concepts. In particular, the participants felt more confident to conduct educated discussions with a refractive surgeon after using this patient decision aid.

In addition to providing patient education and decision assistance, EyeChoose connected users to local health care resources by listing the three refractive surgeons top-ranked by Google within the Phoenix metropolitan area for the recommended surgical modality. It also embedded Google reviews for the listed surgical practices. Many users found these links to local surgeons useful, as indicated in both the survey (65% positive responses) and the focus group. Users outside of Arizona wished that this function could be expanded to include surgeons in their local communities, as shown in the focus group findings. Ultimately, the use of these local resources would depend on individual users' specific situations; for example, the covariate analysis from the survey indicated that users with a medical history of astigmatism were more receptive to surgical consultations.

There were several limitations in this study. First, the survey and focus group participants were recruited from the Barrett Honors College and CHS at ASU. Many of them studied in health-related disciplines, and therefore were better prepared with the health knowledge needed to use EyeChoose. Additionally, the focus group approach has its intrinsic limitations in participants' hesitance to express opinions differing from the majority. The sample size for the survey was also relatively small. Thus, generalization of the findings from this study needs to be further examined in future research by recruiting participants from more diversified backgrounds and increasing the sample size. Second, two of the three ophthalmologists who reviewed the reference standard for validation of the case simulation were also involved in the development of the tool.[11] Although we leveraged other resources (such as AAO's guidelines) in development of the reference standard and conducted two rounds of review and revision, it might still introduce potential biases in evaluation of the system's performance.[16] Third, the scoring algorithm was designed based upon heuristics. Preliminary validation testing yielded high accuracy; however, the robustness of the system should be further examined using a larger sample of test cases, addressing specific user preferences, and based on a gold standard for evaluation. Last, the current implementation of the EyeChoose tool was a prototype client-side application for proof of concept.[11] The study design for the evaluation was in a lab-setting. 
While these were reasonable approaches for initial development and assessment, future research is required for full-scale implementation of the tool and its evaluation in natural settings, with real-world use by patients, addressing additional issues such as different technical platforms, users' health literacy and technology competency, inclusion of refractive surgeons across the nation, patient outreach, and data security and interoperability.[27] [28] [29]
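To illustrate the general idea of a heuristic scoring approach for ranking surgical modalities, the following minimal sketch sums weighted points for how well each modality matches a user's medical history and preferences. Note that the modality list, attribute flags, rules, and point values here are hypothetical examples for exposition only and are not EyeChoose's actual algorithm or weights.

```python
# Illustrative sketch only: NOT EyeChoose's actual rules or weights.
# Each rule maps a user attribute to hypothetical per-modality points;
# the final ranking sorts modalities by their total heuristic score.

MODALITIES = ["LASIK", "PRK", "SMILE", "ICL"]

RULES = [
    # Hypothetical: thin corneas disfavor LASIK, favor PRK or ICL.
    (lambda u: u.get("thin_cornea"),
     {"LASIK": -3, "PRK": 2, "SMILE": -1, "ICL": 2}),
    # Hypothetical: high myopia favors ICL.
    (lambda u: u.get("high_myopia"),
     {"LASIK": -1, "PRK": -1, "SMILE": 1, "ICL": 3}),
    # Hypothetical: preference for fast visual recovery disfavors PRK.
    (lambda u: u.get("prefers_fast_recovery"),
     {"LASIK": 2, "PRK": -2, "SMILE": 2, "ICL": 1}),
]

def rank_modalities(user):
    """Return modalities sorted by descending heuristic score."""
    scores = {m: 0 for m in MODALITIES}
    for applies, points in RULES:
        if applies(user):
            for modality, pts in points.items():
                scores[modality] += pts
    # sorted() is stable, so ties keep the original modality order.
    return sorted(MODALITIES, key=lambda m: scores[m], reverse=True)

# Example: a user with thin corneas who values quick recovery.
print(rank_modalities({"thin_cornea": True, "prefers_fast_recovery": True}))
# → ['ICL', 'SMILE', 'PRK', 'LASIK']
```

A rule-based scorer like this is transparent and easy for clinicians to audit, but, as noted above, its robustness should be validated against a larger test-case sample and a gold standard.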


Conclusion

An initial evaluation of EyeChoose using simulated test cases, a survey, and a focus group interview suggests that this tool can provide effective patient education, generate appropriate recommendations customized to individual users, connect them to local refractive surgeons, and demonstrate good system usability in a test environment. Future research is required to (1) enhance the system's functions with a better decision model and expanded coverage of local health care resources; (2) fully implement the system and conduct evaluations in natural settings; and (3) examine the generalizability of the findings from this study to other populations and settings.


Clinical Relevance Statement

EyeChoose is a novel patient decision aid that assists college-aged patients in selecting a refractive surgery modality. Its features include patient education, decision assistance, and surgeon referral. The positive user experiences with EyeChoose indicate that, with further improvement and evaluation, the application has the potential to support patients' decision-making.


Multiple Choice Questions

  1. Which of the following approaches are typically used to solicit user feedback to evaluate a patient decision aid?

    a. Focus group

    b. Survey

    c. Music video

    d. Both a. and b.

    Correct answer: d. Patient decision aids should be evaluated to ensure that they meet the needs of their target demographic. This can be done by conducting a focus group and a survey.

  2. Which of the following are typical features of patient decision aids?

    a. Artificial intelligence

    b. Patient education

    c. Best–worst scaling

    d. Extended reality

    Correct answer: b. Patient decision aids assist patients with health care decision-making. Patient education is a typical function of a patient decision aid, helping the user make an informed decision by providing educational resources.



Conflict of Interest

None declared.

Acknowledgments

We would like to thank the following contributors to this study: (1) ASU Barrett Honors College for providing funding support, (2) Swagel Wooton Eye Institute for surgical consultation, (3) Dr. James O'Neil for connections to local refractive surgeons, (4) Molly Redman for supporting application development, and (5) Daniel Sezanoff for application deployment onto the cloud.

Protection of Human and Animal Subjects

The study was performed in compliance with the World Medical Association Declaration of Helsinki on ethical principles for medical research involving human subjects and was reviewed and approved by Arizona State University Institutional Review Board.


* These authors are considered co-second authors.


Supplementary Material


Address for correspondence

Bhavani Subbaraman, BS
College of Health Solutions, Arizona State University
3910 South Emerson Street, Chandler, AZ 85248
United States   

Publication History

Received: 31 July 2023

Accepted: 06 December 2023

Accepted Manuscript online:
08 December 2023

Article published online:
24 January 2024

© 2024. Thieme. All rights reserved.

Georg Thieme Verlag KG
Rüdigerstraße 14, 70469 Stuttgart, Germany


Fig. 1 Partial screenshots of the major functions of the EyeChoose tool, including patient education, medical history, personal preferences, customized recommendation of surgical modalities, and referral to surgeons.
Fig. 2 A summary of the survey results by positive, neutral, and negative responses (n = 55).