DOI: 10.1055/s-0043-1777413
The Big Data Gap: Asymmetric Information in the Ophthalmology Residency Match Process and the Argument for Transparent Residency Data
Abstract
Background The ophthalmology match is an important step for graduating medical students that defines their future career. Residency programs demonstrate significant variability due to differences in size, location, research output, subspecialty exposure, surgical case load, and alumni fellowship/practice placement. Despite the importance of informed decision-making, applicants often find limited, inconsistent information about potential programs.
Purpose The purpose of this study was to characterize and identify gaps in the information available to applicants in the 2022 to 2023 Match.
Methods The SFMatch Web site was reviewed to identify programs included as well as characteristics cited on each program's webpage. Program webpages were used to evaluate availability and consistency of data on site surgical caseload, fellowship slots, and teaching staff.
Results Of the 121 programs included on SFMatch, 23 (19%) provided no data on August 15, 2022 (15 days prior to application submission deadline) and 9 (7%) lacked program data on October 15, 2022. Though most programs provided mean cataract volume, data on volume of other procedures for graduating residents was highly variable and occasionally misleading. Programs did not provide information on several academic and social considerations that may influence match ranking choice.
Conclusion Applicants often must read “between the lines” to identify residency program strengths and weaknesses. Data crucial to informing the application process remain sparse, unavailable, or spread across resources. Limited data increases applicant dependence on word-of-mouth knowledge to inform decision-making. This might reduce diversity by limiting successful applicants to those with existing connections within the field.
The proportion of fourth-year medical students applying to ophthalmology has increased dramatically, rising 18% from 2020 (635 applicants) to 2023 (742 applicants).[1] Yet, the number of available spots in the Match has only increased by 3%.[1] As a result, the ophthalmology match has become dramatically more competitive, with match rates dropping from 78% in 2020 to 69% in 2023.[1] Furthermore, the match rate for applicants who failed to match the first time is dramatically lower, suggesting that failure to match can be career-ending for burgeoning ophthalmologists. Given the consequences of failing to match in an increasingly competitive specialty, students are casting wider nets in order to maximize their chances of matching anywhere.[2] [3] The number of applications submitted per student has increased from 48 in 2008 to 88 in 2023, representing an 87% increase, or an extra $1,400 spent per student.[1] [4] Thus, it is not surprising that a study by Venincasa et al determined that the greatest driver of the increased number of applications is “fear of failure to match.”[4]
The increased number of applications per student has created considerable additional burden for both students and residency program directors. Students take on more financial burden to apply to more programs, while program directors must review and screen a greater number of applications for a relatively stagnant number of positions. Notably, the number of interview invitations received by students does not increase in proportion to the number of programs to which they apply.[1] [5] The average number of interviews offered was 9.3 for all students submitting > 40 applications, with a range of 5 to 15.[6] Per SFMatch, historical data demonstrate that applicants with ≥ 10 interviews have a 90% rate of matching successfully.[6] This suggests diminishing returns to applying to a greater number of programs.[5] It also suggests that there is room for applicants to cut down on the number of applications by screening out programs where they are unlikely to be a strong fit.
Given that the match process is inefficient, stakeholders have proposed or implemented several solutions to improve efficiency and reduce burden on students and faculty. Commonly proposed improvements across all specialties include implementation of an application cap, creation of a standardized program database, utilization of standardized letters of evaluation, and preinterview screening.[7] Within the ophthalmology match, SFMatch has made recent strides by introducing graduated application costs, setting a limit on the number of accepted interviews, and implementing virtual interviews for all programs, all of which act by theoretically altering applicant behavior. Despite these proposed and attempted improvements, the current match process still leaves considerable gaps for students who lack ophthalmology connections.
It is an unfortunate paradox that medical students, who have undergone years of rigorous training focused on practicing evidence-based medicine, must make arguably one of the most important choices of their career based on incomplete or potentially biased information and without the benefit of objective data.[8] [9] [10] In an effort to improve transparency, SFMatch began publishing data on participating ophthalmology residency programs in 2020.[11] The intent of such a site is to consolidate relevant information, allowing applicants to tailor where they apply based on objective program characteristics that fit their interests. To date, there has been limited analysis of the approach SFMatch has taken and its success. Thus, the purpose of this study was to characterize and identify gaps in the information available to applicants in the 2022 to 2023 Match on the SFMatch site.
Methods
Explanation of the SFMatch Tool
SFMatch hosts a Web site that is updated yearly to provide medical students with access to a list of programs participating in that year's residency match, as well as select pieces of data provided by each residency program.[11] Data provided by SFMatch include the type of residency program (i.e., integrated vs. joint), the number and specialty of teaching staff, surgical caseload, and the proportion of residents pursuing certain postgraduate career paths (i.e., fellowship vs. direct-to-practice).
Data Acquisition and Analysis
A total of 121 U.S. ophthalmology residency programs were identified using the 2022 SFMatch Ophthalmology Program Profile Information page.[11] The Web site was accessed on two separate dates (August 15, 2022, and October 15, 2022) to catalogue the general availability of residency program information before and after residency applications were released to programs on September 1, 2022. SFMatch profiles from participating residency programs were evaluated on October 15, 2022 to assess the availability and consistency of data on location, accreditation status, intern year type, association with a medical school, class size, fellowship availability, research faculty, and surgical volume. Surgical volumes of participating SFMatch programs were catalogued based on verbatim descriptions of surgery type (e.g., “Laser Surgery” and “Panretinal Laser Photocoagulation” were considered different types of surgery despite using similar technologies). Finally, the home webpages of participating residency programs were reviewed to identify other characteristics not provided on the SFMatch site that applicants may deem useful in determining where to apply. All collected data were catalogued in a Microsoft Excel spreadsheet and analyzed using IBM SPSS.
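The cataloguing step lends itself to a simple tabular workflow. The sketch below (Python with pandas; the published analysis used Microsoft Excel and IBM SPSS) illustrates how availability on the two review dates and the completeness of individual fields might be summarized, assuming a hypothetical CSV export of the manually collected data with illustrative column names.

```python
# Minimal sketch of the cataloguing step, assuming a hypothetical CSV export of
# the manually collected SFMatch profile data (file and column names are illustrative;
# the published analysis was performed in Microsoft Excel and IBM SPSS).
import pandas as pd

programs = pd.read_csv("sfmatch_profiles_2022.csv")  # one row per residency program

# Availability of any profile data on the two review dates (boolean columns)
for date_col in ["data_available_aug15", "data_available_oct15"]:
    n_available = int(programs[date_col].sum())
    print(f"{date_col}: {n_available}/{len(programs)} ({n_available / len(programs):.0%})")

# Completeness of selected fields among programs that had provided data by October 15
provided = programs[programs["data_available_oct15"]]
fields = ["medical_school_affiliation", "class_size", "research_faculty",
          "fellowship_availability", "cataract_volume"]
print(provided[fields].notna().mean().mul(100).round(1))  # percent reporting each field
```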
Results
Of the 121 programs included on SFMatch, 98 (81%) provided data by August 15, 2022 (15 days prior to the application submission deadline) and 112 (93%) had provided residency program data by October 15, 2022 ([Table 1]). Of the 112 programs providing data, 108 (96.4%) reported an association with a medical school. All included programs had transitioned to a joint or integrated postgraduate year 1. The average class size was 4.6, and the average number of research faculty was 6.0.
Table 1 abbreviations: SD, standard deviation; VA, Veterans Affairs.
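Because the denominators shift between figures (121 listed programs for availability, 112 data-providing programs for medical school affiliation), the short check below restates the proportions above with the denominators made explicit; the counts are taken directly from the results.

```python
# Proportions reported above, with their respective denominators made explicit.
total_listed = 121           # programs listed on SFMatch
available_aug15 = 98         # any data provided by August 15, 2022
available_oct15 = 112        # any data provided by October 15, 2022
med_school_affiliated = 108  # of the 112 programs providing data

print(f"August 15 availability:  {available_aug15 / total_listed:.0%}")               # 81%
print(f"October 15 availability: {available_oct15 / total_listed:.0%}")               # 93%
print(f"Medical school affiliation: {med_school_affiliated / available_oct15:.1%}")   # 96.4%
```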
Each program was required to provide cataract surgical volume and had the option to provide up to three additional surgical volumes ([Table 2]). The 112 participating programs provided surgical volumes for 23 different surgeries, which included a range of surgical classifications (e.g., laser surgery) as well as specific procedures (e.g., chalazia excision). A total of 111 (92%) programs provided data on cataract volume. Notably, two programs indicated that they did not have any cataract surgical volume by entering “0” for the average annual number of surgeries performed with the resident as primary surgeon. The most commonly reported surgeries other than cataracts were oculoplastics (67/121, 55%), glaucoma (55/121, 45%), and strabismus (52/121, 43%).
Table 2 abbreviations: LASIK, laser-assisted in situ keratomileusis; N/A, not available; PRK, photorefractive keratectomy; SLT, selective laser trabeculoplasty.
Program home Web sites were reviewed, and additional residency considerations not provided on SFMatch were characterized ([Table 3]). Of the 122 program Web sites evaluated, only a small minority provided applicable datapoints regarding applicant considerations, with the majority of those programs stating that they used a holistic application review process to determine which applicants to interview. No program provided a discrete STEP 1 or 2 cutoff, though two programs implied a favorable outcome for applicants at or above the 75th percentile. An additional 12 programs specifically indicated that there was no United States Medical Licensing Examination (USMLE) score cutoff for consideration. The most frequently cited program characteristics were the location and timing of rotations during residency (82.0%) as well as gender balance (80.3%). Notably, only 21.3% of programs provided any data on resident surgical volume. Approximately two-thirds of programs shared program benefits, though a large proportion of these shared the benefits on the hospital system Web site rather than on the specific residency program Web site. Unionized programs were more likely than nonunionized programs to provide a comprehensive list of benefits available to residents.
| Missing datapoints provided on the residency program Web sites | Programs providing datapoint, N (%) |
|---|---|
| Applicant considerations | |
| Enlist a “Holistic” application review process | 22 (18.0) |
| Average STEP 1 score for current residents | 4 (3.3) |
| STEP 1 score cutoff | 13 (10.7) |
| - No cutoff | 11 (9.0) |
| - Score > 75th percentile viewed favorably | 2 (1.6) |
| Average STEP 2 score for current residents | 2 (1.6) |
| STEP 2 score cutoff | 14 (11.5) |
| - No cutoff | 12 (9.8) |
| - Score > 75th percentile viewed favorably | 2 (1.6) |
| 3rd and 4th year grade cutoff | 0 (0) |
| Typical matched programs[a] | 1 (0.8) |
| Letters of recommendation requirements[b] | 7 (5.7) |
| Additional application requirements[c] | 6 (4.9) |
| Program characteristics | |
| Average STEP 3 score | 0 (0) |
| Gender balance | 98 (80.3) |
| - Explicit percentage provided | 2 (1.6) |
| - Provided via resident profiles | 96 (78.7) |
| Research/Publication volume | 12 (9.8) |
| International rotations | 30 (24.6) |
| Patient population size | 32 (26.2) |
| Resident surgical volume | 26 (21.3) |
| - Cataract volume only | 7 (5.7) |
| - Multiple surgical volumes | 19 (15.6) |
| Area cost of living | 14 (11.5) |
| Rotation locations | 100 (82.0) |
| Rotation time allocation | 100 (82.0) |
| Call structure | 51 (41.8) |
| Research requirements | 67 (54.9) |
| - Required but no specifics provided | 35 (28.7) |
| - Participation in scholarly research project | 16 (13.1) |
| - Publication of paper or case report | 11 (9.0) |
| - Poster presentation | 8 (6.6) |
| - Quality improvement project | 3 (2.5) |
| Program benefits[d] | |
| Salary information | 74 (60.7) |
| Medical/Dental benefits | 73 (59.8) |
| Vacation/PTO | 78 (63.9) |
| Parental leave | 39 (32.0) |
| Retirement programs | 40 (32.8) |
| Travel funds | 50 (41.0) |
| Research funds | 55 (45.1) |
| Other benefits[e] | 58 (47.5) |
Abbreviation: PTO, paid time off.
a Though no program provided data regarding “target” medical schools, one program provided percentages on regions where matched applicants resided.
b Seven programs provided specific requirements or preferences for letters of recommendation; typically, 3 letters of recommendation (of which at least 2 were from ophthalmologists) were required. Three programs also required a Dean's letter.
c The most common additional application requirement was a program-specific statement of interest, to be sent by the applicant directly to the program director.
d Of note, unionized residency programs were more likely to have a comprehensive list of benefits publicly available.
e Additional benefits mentioned by programs included: adoption assistance, addiction services, blood donor club, business cards, campus currency, care coordination, cell phone discounts, childcare services, commuter services discounts, discount entertainment tickets, financial counseling, genetic testing, guaranteed resident housing, housing stipends, in-house gym, life/disability insurance, MBA program, meal stipends, mentorship programs, moving allowances, professional organization memberships, protected research time, scrubs, smartphone stipends, tax-sheltered annuities, tuition reduction, unionization, work hour limitations, and white coats.
Discussion
Calls for reform within the ophthalmology match have grown in response to greater competitiveness and increasing application volume, which have increased burden on both students and residency program directors.[7] Though a number of changes have been proposed, implemented reform measures (e.g., interview caps, virtual interviews, and graduated application costs) have focused on influencing applicant behavior by increasing the cost burden of superfluous applications and improving applicant cost equity.[6] However, these do not address the core drivers of applicant behavior, including the fear of not matching, lack of transparent data, or inequity in the residency selection process.[12] Thus, further reform is necessary and should seek to target aspects of the match process that continue to drive inefficiencies in applicant behavior and/or review.
One key inefficiency that can be influenced directly by programs and residency directors is the lack of transparency of program data. Programs use filters that can automatically reject student applications if they do not meet certain “nonnegotiables” (e.g., a minimum STEP 1 or STEP 2 score).[8] [9] A student who does not meet such a requirement will have wasted time and money on an application that was guaranteed not to bear fruit. Furthermore, students with sufficient board scores may still not be a good fit for a program, given that programs may have soft requirements or additional considerations for extending interviews (e.g., research, geographic preference, third-year rotation grades, Alpha Omega Alpha [AOA] status, or program preference).[8] [9] Conversely, applicants may wish to weigh considerations such as program size, geographic location, research focus, training sites, alumni placement, subspecialty exposure, surgical caseload, and cultural preferences in deciding whether to apply to or accept an interview at a specific program.[4] However, information on program applicant considerations or characteristics is often not readily available to most students, and thus students rely on word of mouth and advising from administration to guide their application decisions.[2] Students with connections in the field or who attend a medical school with an associated residency program immediately have an advantage, as program directors may be able to guide them on both soft and hard requirements for application to certain residency programs. Unsurprisingly, students with a home residency program are 40% more likely to match than those without one.[9]
Improved data transparency could take two forms: disclosure of residency characteristics and benefits, which may reduce the number of applications by allowing applicants to filter programs based on personal criteria, and disclosure of absolute board score or rotation grade cutoffs, which may also reduce the number of applications by identifying programs where it would be futile for certain applicants to apply. SFMatch attempted to improve data transparency by publicizing specific program characteristics in 2020. Given that there has not been a formal peer-reviewed evaluation of this effort, we sought to characterize its effectiveness and identify options that may provide the best path forward.
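As a rough illustration of how consolidated, structured program data could shorten application lists, the sketch below filters a hypothetical consolidated table by applicant-defined criteria. The programs, field names, and thresholds are invented for illustration; no such interface currently exists on SFMatch.

```python
# Illustrative only: filtering a hypothetical consolidated program dataset by
# applicant-defined criteria. Programs, fields, and values are all invented.
import pandas as pd

programs = pd.DataFrame([
    {"program": "Program A", "step2_cutoff": 240, "region": "Northeast",
     "mean_cataract_volume": 180, "parental_leave_weeks": 6},
    {"program": "Program B", "step2_cutoff": None, "region": "Midwest",
     "mean_cataract_volume": 220, "parental_leave_weeks": 0},
    {"program": "Program C", "step2_cutoff": 255, "region": "Northeast",
     "mean_cataract_volume": 150, "parental_leave_weeks": 4},
])

applicant_step2 = 248
shortlist = programs[
    (programs["step2_cutoff"].isna() | (programs["step2_cutoff"] <= applicant_step2))
    & (programs["region"] == "Northeast")
    & (programs["mean_cataract_volume"] >= 150)
]
print(shortlist["program"].tolist())  # ['Program A']: the only program meeting all criteria
```

In a fully transparent system, applicants could run this kind of screen before paying a single application fee, and programs that publish no cutoff would simply remain in the candidate pool.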
Our study demonstrated that there was significant variation in the type and usability of data provided by ophthalmology residency programs to the SFMatch. Approximately 81% of programs had provided data prior to the application release date, suggesting that a sizeable number of programs had not provided any information to help applicants tailor their applications prior to the application release date on September 1. Of the programs that provided information, there was significant variation in the way that data was expressed, potentially forcing students to read between the lines.
Surgical volume, at face value, is one of several objective measures that can help applicants gauge a program's focus and the training opportunities available to residents. Programs participating in the SFMatch were required to provide the average volume of cataract surgeries in which residents were the primary surgeon, as well as volumes for up to three other procedures. Per the Accreditation Council for Graduate Medical Education (ACGME), residents are required to demonstrate competence as the primary surgeon in a number of surgeries in order to graduate, including cataract, cornea, glaucoma, globe trauma, oculoplastics/orbit, retina/vitreous, strabismus, and laser procedures (e.g., YAG capsulotomy, laser trabeculoplasty, laser iridotomy, panretinal laser photocoagulation).[13] However, despite clear guidelines on achieving competence in specific surgeries, the surgical volume data provided were variable and occasionally did not reflect the ACGME definitions for achieving competence. Interestingly, while an overwhelming majority of programs provided resident surgical volume on SFMatch, only 21% provided any data on surgical volumes on their home Web sites.
Furthermore, the types and quantities of procedures reported on SFMatch were varied and difficult to compare. The four most commonly reported surgeries outside of cataracts were oculoplastics (55%), glaucoma (45%), strabismus (43%), and globe trauma (23%), all of which are included in the ACGME requirements. Programs more often specified a single procedure within a given class rather than the broader class itself (e.g., vitrectomy rather than retina/vitreous). In total, 23 different procedures or classes of procedures were mentioned. This lack of consistency in reporting creates ambiguity when cross-comparing residency programs, as a program that indicates an average of 70 oculoplastics cases is difficult to compare with a program that cites an average of 20 eyelid laceration repairs. Thus, applicants must read between the lines to determine which subspecialties a program might be “stronger” in or what surgical training to expect at a given program. Accurate and reliable data are essential to the validity and usefulness of an online tool. Given the lack of consistency and reliability in the data provided to SFMatch, it is difficult to leverage the tool as it stands to compare programs on key metrics of importance to applicants.
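One way to reduce this ambiguity would be to map each program's verbatim entry onto a broader ACGME-style category before comparing volumes. The sketch below uses a hand-curated, purely illustrative mapping; it is not an official crosswalk, and even after mapping, a 70-case oculoplastics total and a 20-case eyelid laceration total are still not directly equivalent.

```python
# Illustrative normalization of verbatim surgical labels to broader ACGME-style
# categories. The mapping is hand-curated and hypothetical, not an official crosswalk.
VERBATIM_TO_CATEGORY = {
    "oculoplastics": "oculoplastics/orbit",
    "eyelid laceration repair": "oculoplastics/orbit",
    "chalazia excision": "oculoplastics/orbit",
    "vitrectomy": "retina/vitreous",
    "laser surgery": "laser procedures",
    "panretinal laser photocoagulation": "laser procedures",
    "selective laser trabeculoplasty": "laser procedures",
}

def normalize(verbatim_label: str) -> str:
    """Map a program's verbatim label to a broader category, or flag it as unmapped."""
    return VERBATIM_TO_CATEGORY.get(verbatim_label.strip().lower(), "unmapped")

# Two hypothetical entries that land in the same category but remain hard to equate.
reported = [("Program X", "Oculoplastics", 70), ("Program Y", "Eyelid laceration repair", 20)]
for program, label, volume in reported:
    print(f"{program}: {normalize(label)} ({volume} cases/year)")
```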
In addition to concerns regarding the reliability of data presented on SFMatch, key gaps remain in the data provided. The data published by SFMatch are largely focused on clinical experience and subspecialty exposure. This can be useful in helping applicants determine whether they are interested in a program, especially if they have a specific subspecialty in mind. Indeed, a study by Venincasa et al demonstrated that surgical caseload and prior fellowship match results were key drivers of the choice to apply to a specific program.[4] However, the database lacks information on academic, social, and economic considerations, which were found to be at least a moderate influence on students' interest in a specific program.[4] A majority (54.9%) of programs provided information on their individual research requirements, which were sometimes above and beyond the ACGME requirement (e.g., publication of a first-author manuscript or annual poster presentations). Furthermore, residency program Web sites often provided data on benefits that could be directly compared between programs, with 60.7% providing data on salaries/stipends, 59.8% on medical and dental benefits, and 63.9% on vacation policies. Fewer programs provided data on parental leave (32.0%) or retirement programs (32.8%). Such data would be relatively straightforward to provide on the SFMatch Program Profiles and would offer an objective way to compare programs within the same geographic region with similar cost of living.
Ultimately, the SFMatch Program Profiles fail to solve some of the core inefficiencies in the match process. Without data on cutoffs or “requirements” to be considered for an interview, students are still unable to determine whether they would be a candidate for a specific program, and thus may send an application that is likely to be rejected. Specifically, data on STEP 1 and STEP 2 averages and other potential sources of hard cutoffs were lacking. Notably, only a few programs were willing to provide these data on their Web sites, with 3.3% providing current resident averages on USMLE examinations, 9.8% specifically stating there was no cutoff for consideration, and two stating that USMLE scores > 75th percentile would be “viewed favorably.” Given that applicants are largely driven by the ranking and reputation of programs, knowledge of cutoffs is necessary to judge the feasibility of applying to specific programs.[4]
Consolidation of residency data is a critical and necessary step toward ensuring transparency in the residency match process. Key datapoints are currently spread across several platforms with variable accessibility and accuracy. For example, the Texas STAR is a student-reported application database that provides insight into average characteristics (e.g., STEP scores, AOA percentage, research experience, number of honored clerkships) of students who interviewed and matched at participating programs.[14] The generalizability of the data provided by the platform is questionable, as not every medical school participates and student participation ranged from 38% to 47% between 2018 and 2022.[14] Individual program Web sites may provide data on research stipends, benefits, housing, and other residency characteristics, but are highly variable in what they publicly share.
The scattered nature of data critical to applicant decision-making may reinforce inequities in the application process similar to those faced by students at medical schools without an associated residency program. Students at medical schools willing to pay for access to, or participate in, such databases will have broader access to these data, whereas students at nonparticipating schools may be forced to pay out-of-pocket for database access or forgo the data and make less-informed choices. These students are thus still required to rely on word of mouth, or to individually research hundreds of programs to determine which might be a good fit and worth applying to.
The SFMatch Web site is uniquely positioned to shape and reform the ophthalmology match process, given that it is the only platform used for the match and all applicants presumably have access to it. Thus, it is the best candidate to collect and provide consolidated information on residency programs. However, the onus is largely on program directors to share accurate and complete information, and achieving the desired level of transparency will likely require a cultural shift. Residency programs may be hesitant to share such data, given that some data will be less flattering for certain programs and disclosure may therefore be politically untenable at an institutional level.[10] As such, it may be necessary for SFMatch to require data disclosure, in the interest of transparency, as a condition of program participation in the match. A shift to a more transparent match process may have additional benefits as well: sharing of characteristics such as resident benefits may accelerate adoption of benefits that applicants find important (e.g., access to childcare, educational stipends) but that are not yet universal across programs.
As with any study, there are limitations. This study evaluated only the SFMatch program Web site, though other studies have explored separate databases in greater detail.[2] Additionally, this study evaluated only the presence of specific datapoints (e.g., applicant considerations, program characteristics, and benefits) on program Web sites. Thus, the study may underestimate the total number of programs that offer a given benefit, as programs may not have listed a specific item (e.g., parental leave, international rotations) yet still offer it.
Conclusion
Applicants often must read “between the lines” to identify residency program strengths. Data crucial to informing the application process remain sparse, unavailable, or spread across resources. Limited data increases applicant dependence on word-of-mouth knowledge to inform decision-making. This might reduce diversity by limiting successful applicants to those with existing connections within the field.
Erratum: An erratum has been published for this article (DOI: 10.1055/s-0044-1778713).
Conflict of Interest
None declared.
* Investigation performed at New England Eye Center in Boston, MA. Stephen Le Breton is a medical student at Tufts University School of Medicine and a research assistant at the New England Eye Center. Dr. Shilpa Desai is a retinal specialist at New England Eye and Ear and an Assistant Professor at Tufts University School of Medicine. In addition to seeing patients, she is the Director of Medical Student Education for the department of ophthalmology.
References
- 1 2023 Summary Report - Ophthalmology Residency Match. SFMatch; 2023
- 2 Markle JC, Ahmed H, Pandya K, et al. Transparency in the ophthalmology residency match: background, study, and implications. Cureus 2021; 13 (11) e19826
- 3 Berger JS, Cioletti A. Viewpoint from 2 graduate medical education deans: application overload in the residency match process. J Grad Med Educ 2016; 8 (03) 317-321
- 4 Venincasa MJ, Cai LZ, Gedde SJ, Uhler T, Sridhar J. Current applicant perceptions of the ophthalmology residency match. JAMA Ophthalmol 2020; 138 (05) 460-466
- 5 Siatkowski RM, Mian SI, Culican SM, et al; Association of University Professors of Ophthalmology. Probability of success in the ophthalmology residency match: three-year outcomes analysis of San Francisco Matching Program data. J Acad Ophthalmol (2017) 2018; 10 (01) e150-e157
- 6 2022-2023 Ophthalmology Residency Match FAQs. SFMatch.org: Association of University Professors of Ophthalmology; 2022. Accessed February 20, 2023 at: https://sfmatch.org/files/fcb2882dbb2b4c5bb394f87d089bc1df
- 7 Zastrow RK, Burk-Rafel J, London DA. Systems-level reforms to the US resident selection process: a scoping review. J Grad Med Educ 2021; 13 (03) 355-370
- 8 SuzieKaran. Confessions of a Program Director: Interacting with the Electronic Residency Application Service (ERAS) on September 15th; 2019. Accessed November 23, 2023 at: https://thalamusgme.com/electronic-residency-application-service/
- 9 Loh AR, Joseph D, Keenan JD, Lietman TM, Naseri A. Predictors of matching in an ophthalmology residency program. Ophthalmology 2013; 120 (04) 865-870
- 10 Texas STAR [Internet]. UT Southwestern Medical Center; 2023
- 11 Liebman DL, Armstrong GW, Shah AS, Lorch AC, Miller JW, Chodosh J. The case for transparency in the ophthalmology residency match. Ophthalmology 2021; 128 (02) 185-187
- 12 SFMatch Ophthalmology Program Profile Info [Internet]; 2022. Accessed November 23, 2023 at: https://sfmatch.org/specialty-programs-compare/97baf738-9b5b-4b50-b715-444111a28b6d
- 13 Caretta-Weyer HA. An outcomes-oriented approach to residency selection: implementing novel processes to align residency programs and applicants. Acad Med 2022; 97 (05) 626-630
- 14 ACGME. ACGME Program Requirements for Graduate Medical Education in Ophthalmology; 2020
Publication History
Received: 28 February 2023
Accepted: 09 November 2023
Article published online:
11 December 2023
© 2023. The Author(s). This is an open access article published by Thieme under the terms of the Creative Commons Attribution-NonDerivative-NonCommercial License, permitting copying and reproduction so long as the original work is given appropriate credit. Contents may not be used for commercial purposes, or adapted, remixed, transformed or built upon. (https://creativecommons.org/licenses/by-nc-nd/4.0/)
Thieme Medical Publishers, Inc.
333 Seventh Avenue, 18th Floor, New York, NY 10001, USA