Keywords
OSCE - assessment - medical students - evaluation - fundoscopy
The Objective Structured Clinical Examination (OSCE) has been established as a reliable and effective way to evaluate competence in medical education. It was first introduced in 1975 as a series of rotating stations and has been used in multiple diverse educational environments, including the evaluation of medical students and residents.[1] [2] In addition, the OSCE format is used in the Clinical Skills (CS) portion of the USMLE Step 2 examination.
Evaluation of medical students' competence in ophthalmology is usually performed with paper-based scenarios. While written case scenarios can assess knowledge, they are poor at evaluating clinical competence.[3] Correct use of the direct ophthalmoscope and performance of an ocular examination are essential skills for every graduating medical student and can be crucial in conditions such as papilledema.[4] Paper-based scenarios cannot assess competence in performing the ocular examination. Web-based[5] and simulator-based[6] evaluations of clinical skills do provide reproducible, skills-based assessment, but they lack the natural setting of a standardized patient encounter. Standardized patients, in contrast, provide an accurate and reproducible method for evaluating medical students' clinical skills.[7] The purpose of this report is to describe the development of a novel and robust ophthalmology OSCE to evaluate third- and fourth-year medical students' competence in obtaining a history of headache and performing the ocular examination after a one-week ophthalmology clerkship.
Methods
Ophthalmology at the University of Wisconsin-Madison School of Medicine and Public Health required a one-week clerkship until May 2013. It was part of a six-week Neurosciences Clerkship which also included rotations in Neurology, Neurosurgery, Neuroradiology, and Rehabilitation Medicine. Students could choose to take the Neurosciences Clerkship in either their third or fourth year of medical school.
Prior to 2008, evaluation of medical student performance on the ophthalmology clerkship included faculty evaluation of students in clinic, faculty evaluation of student performance in small group sessions/case presentations, a multiple choice test, and a paper-based OSCE on papilledema. While these performance measures provided a good evaluation of a student's knowledge, it was felt that they did not provide a good measure of a student's clinical skills. Competent use of the direct ophthalmoscope and performance of the ocular examination is an essential skill of every graduating medical student. Therefore, in 2008, an ophthalmology standardized patient-based OSCE replaced the paper-based papilledema OSCE. The ophthalmology OSCE is part of a three-station Neurosciences OSCE, with Neurology, Rehabilitation Medicine, Neurosurgery, and Ophthalmology alternating in providing stations.
Development of the ophthalmology OSCE began with careful delineation of the ophthalmology history and physical exam skills that every graduating medical student should be able to perform. A case of headache was chosen since this is a problem commonly seen by primary care providers and emphasizes a situation in which examination of the ocular fundus would be a natural part of the physical examination. The author then taught standardized patients how to present the history of headache as well as how to act during the physical examination. Standardized patients were also taught how to feign a predetermined visual field defect. A photo of the right optic nerve of each standardized patient was taken and the author examined each standardized patient to ensure that there was no large discrepancy in difficulty in examining the optic nerves between standardized patients. It was decided that the standardized patients would not be dilated for the OSCE to more fully simulate a primary care type of clinical encounter.
Students are given 10 minutes to complete the ophthalmology OSCE. A door scenario ([Fig. 1]) is posted on the closed door of the standardized patient's room, and the student reads it before entering. The student then performs a history and ocular examination. As [Fig. 1] shows, the student is also asked to choose a photo of the standardized patient's right optic nerve from four candidate photos. [Fig. 2] shows a representative photo.
Fig. 1 Door scenario used during the OSCE. Students read this before entering the room.
Fig. 2 Representative photo of a standardized patient's optic nerve. The students match the standardized patient's optic nerve to one of four photos in the room.
Students are graded on a checklist ([Figs. 3] [4] [5] [6]), with their overall performance reported as pass, marginal, or fail. History taking, confrontational visual field testing, pupillary testing, use of the direct ophthalmoscope, and communication skills are emphasized on the checklist. Students may miss up to three items on the history checklist and still pass that portion. On the physical examination checklist, students receive a pass if they miss no more than one item, a marginal score if they miss two items, and a failing grade if they miss three or more items. They receive an overall failing grade if they fail either the history or the physical examination portion of the checklist.
Fig. 3 History skills checklist.
Fig. 4 Physical examination checklist.
Fig. 5 Overall performance checklist. This is used to assess the overall quality of the students' physical examination skills.
Fig. 6 Overall communication checklist. This is used to assess the overall quality of the students' history taking skills.
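The scoring rules described above amount to a small decision procedure. The sketch below is purely illustrative (the function name and its inputs are hypothetical, not part of the actual grading workflow); it encodes the stated thresholds: up to three missed history items still passes that portion, while the physical examination portion is pass at one or fewer missed items, marginal at two, and fail at three or more, with a failure in either portion failing the OSCE overall.

```python
def grade_osce(history_missed: int, physical_missed: int) -> str:
    """Illustrative encoding of the OSCE grading rules described above."""
    # History portion: missing up to three checklist items is still a pass.
    history_failed = history_missed > 3

    # Physical examination portion: pass (<=1 missed), marginal (2),
    # fail (>=3 missed items).
    if physical_missed <= 1:
        physical_grade = "pass"
    elif physical_missed == 2:
        physical_grade = "marginal"
    else:
        physical_grade = "fail"

    # Failing either portion produces an overall failing grade.
    if history_failed or physical_grade == "fail":
        return "fail"
    return physical_grade

print(grade_osce(1, 0))  # pass
print(grade_osce(2, 2))  # marginal
print(grade_osce(4, 1))  # fail
```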
The OSCE is video recorded and graded remotely by the author and one other faculty member. When the OSCE was first implemented, the author reviewed 10 of the other faculty member's students to ensure inter-rater reliability. The author also reviews the video of any student who fails. Students who fail are remediated by a separate faculty member; remediation involves performing the ocular examination with that faculty member as well as examining the faculty member's optic nerve.
Results
All medical students were able to complete the scheduled tasks within the allotted 10 minutes. As can be seen in [Table 1], 384 students have taken the OSCE since 2008. [Table 1] illustrates the pass, marginal, and fail rates over the academic years from 2008 to 2012. Overall, 84% of the students have passed, 11% received a marginal score, and 5% failed. Nearly all the medical students who have taken the OSCE since 2008 passed the history portion of the OSCE. All of the students who failed the OSCE were successfully remediated.
Table 1 OSCE results 2008–2012

| Academic year | Pass | Marginal | Fail |
| --- | --- | --- | --- |
| 2008–2009 (N = 114 students) | 110 (96.5%) | 4 (3.5%) | 0 (0%) |
| 2009–2010 (N = 75 students) | 65 (87%) | 8 (10%) | 2 (3%) |
| 2010–2011 (N = 98 students) | 79 (81%) | 12 (12%) | 7 (7%) |
| 2011–2012 (N = 97 students) | 69 (71%) | 18 (19%) | 10 (10%) |
| Overall (N = 384 students) | 323 (84%) | 42 (11%) | 19 (5%) |
Discussion
Background
The OSCE format has been used to evaluate clinical skills in many diverse academic settings.[1] [2] To the author's knowledge, this is the first paper to describe a robust ophthalmology OSCE for assessing competence in performing the ocular examination in the United States. The OSCE format is also used in the CS portion of the USMLE Step 2 examination, and clinical scenarios in which competent use of the direct ophthalmoscope could be needed (i.e., neurological cases) are presented on the CS exam. The ophthalmology standardized patient-based OSCE was implemented at the University of Wisconsin-Madison School of Medicine and Public Health in 2008. It provides a means of evaluating clinical competence that paper-based tests cannot address.
Future Directions
One of the most positive aspects of the OSCE format is its flexibility. For example, changing the number or complexity of the photos used can increase or decrease the difficulty of the exam. In addition, it has been shown that standardized patients can be trained to complete the checklist accurately themselves, rather than relying on a separate grader.[8] This would greatly decrease the work needed to grade the OSCE.
Other Applications
A robust OSCE such as this one offers another benefit: ophthalmology education continues to compete with other educational activities at medical schools across the nation,[9] [10] and an ophthalmology OSCE is easily adapted to school-wide graduation skills examinations. For example, the ophthalmology OSCE was adapted for the Year-End Professional Skills Assessment (YEPSA) that all medical students at the University of Wisconsin-Madison School of Medicine and Public Health take at the end of their third year. Trends could be followed should the amount of ophthalmology education be increased or decreased, and if performance were deficient, the evaluation data could be used to emphasize the need for more ophthalmology education.
Final Thoughts
In 2007, the Association of University Professors in Ophthalmology Medical Student Educators Task Force developed a core ophthalmology curriculum for medical students (available at http://www.aupomse.org), which the American Academy of Ophthalmology endorsed in 2008. Core examination skills include evaluation of the pupils, ocular motility, confrontational visual fields, and funduscopy. The ophthalmology OSCE provides a reliable educational tool to evaluate competence in performing these essential clinical skills.