Keywords: publication speed - ophthalmology - bibliometric - journal impact factor - Eigenfactor score - CiteScore
With an exponentially increasing amount of available peer-reviewed literature, publishing in a highly regarded journal has become even more important for the visibility of one's research and may be a factor in academic promotion.[1] In recent years, the majority of medical journals have made the transition to online publication of peer-reviewed manuscripts ahead of the printed format. As a result, access to published research is theoretically easier and timelier than ever; however, high-impact journals report increasing numbers of submissions every year, which could lead to longer times to publication because of the arduous peer-review process each article must undergo. Slower publication speeds may delay the dissemination of crucial information, and as the novel coronavirus disease 2019 (COVID-19) pandemic has illustrated, rapid availability of new research is critical for instituting new diagnostic and treatment measures.
There is little research tying speed of publication to the "impact" of a journal in the ophthalmology literature. In 2013, Chen et al evaluated the time from submission to printed publication, from acceptance to printed publication, and from submission to acceptance for articles published in 2010 to determine whether these parameters correlated with the journal impact factor (JIF) in ophthalmology journals. They found no statistically significant correlations between JIF and these parameters; however, journals with advance online publication had significantly higher JIFs than print-only journals.[2] For context, only 26 journals had online advance publication in 2010,[2] and it is unknown whether greater speed of online publication is correlated with JIF.
Today, every ophthalmology journal offers online publication, and the current study
evaluates how time from submission to online publication, from acceptance to online
publication, and from online publication to print publication are correlated with
JIF. In addition, this study examines two other bibliometric measures, CiteScore (CS) and Eigenfactor score (ES), which have become more prominent in recent literature, and assesses whether study design influenced speed of publication.[3] [4] [5] [6] [7] [8]
Methods
The Journal Citation Reports for 2018 was accessed and filtered for the category of ophthalmology (available at: http://www.webofknowledge.com/JCR; accessed May 29, 2020). Sixty journals were present in the report. Review journals were excluded to diminish the effects of the distinct publication timelines of invited review articles. A list of bibliometric measures for all identified ophthalmology journals was obtained, and JIF and ES were selected for evaluation. Additionally, the CS for these 55 journals was obtained through Scopus (available at: https://www.scopus.com/sources.uri; accessed May 29, 2020). When sufficient articles were available, 12 articles were randomly selected from the 2018 volumes of each journal. For journals with 12 issues, one article was chosen from each issue using a random number generator. For journals with six issues, two articles were selected at random from each issue. For journals with neither 6 nor 12 issues, one article was randomly chosen from each issue, and a random number generator was then used to select the issues from which the remaining articles were drawn. Supplementary issues were excluded. All articles were selected from the original investigations section of an issue. Review articles were excluded for the same reason as review journals.
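The sampling scheme above can be sketched in Python. This is an illustrative reconstruction, not the authors' actual code; the function name and data layout are assumptions for the example.

```python
import random

def select_articles(issue_article_ids, target=12, seed=0):
    """Randomly select up to `target` articles from a journal's issues.

    `issue_article_ids` maps issue number -> list of article IDs.
    Journals with 12 issues contribute one article per issue; journals
    with 6 issues contribute two per issue; otherwise one article is
    drawn per issue and the remainder come from randomly chosen issues.
    """
    rng = random.Random(seed)
    issues = sorted(issue_article_ids)
    n_issues = len(issues)
    per_issue = target // n_issues if n_issues in (6, 12) else 1
    selected = []
    for issue in issues:
        pool = issue_article_ids[issue]
        selected.extend(rng.sample(pool, min(per_issue, len(pool))))
    # Journals with neither 6 nor 12 issues: fill the remainder from
    # randomly selected issues until `target` is reached (or the journal
    # runs out of articles, as happened for one journal in this study).
    total = sum(len(ids) for ids in issue_article_ids.values())
    while len(selected) < min(target, total):
        issue = rng.choice(issues)
        remaining = [a for a in issue_article_ids[issue]
                     if a not in selected]
        if remaining:
            selected.append(rng.choice(remaining))
    return selected
```

For a 12-issue journal this yields exactly one article per issue; for a 6-issue journal, two per issue; a journal with only nine available articles yields all nine.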
The submission, revision, acceptance, online publication, and print publication dates were recorded from the full text of each article when available. All randomly selected articles were included even if some information was missing, unless the submission date was not provided while other articles in the same journal did provide submission dates; in these cases, another article from the same issue was chosen at random. If the journal did not report submission dates for any articles, all initially selected articles were included. The article study type (basic or clinical), study design (observational, interventional, or laboratory experiment), and study results (positive or negative) were recorded and tallied for stratification. The Kruskal–Wallis test was performed to assess for parameter differences among the three study designs, and the Mann–Whitney U-test was then used post hoc to analyze pairwise differences between the study designs identified by the Kruskal–Wallis test.
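A minimal sketch of this two-stage comparison, using scipy rather than the SPSS workflow the authors used; the lag values are illustrative placeholders, not data from the study.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical submission-to-online lags in days, one array per design
lags = {
    "laboratory":     rng.normal(160, 30, 40),
    "observational":  rng.normal(190, 35, 40),
    "interventional": rng.normal(195, 35, 40),
}

# Omnibus Kruskal-Wallis test across the three study designs
h_stat, p_kw = stats.kruskal(*lags.values())
print(f"Kruskal-Wallis: H = {h_stat:.2f}, p = {p_kw:.4f}")

# Post hoc pairwise Mann-Whitney U tests
for a, b in [("laboratory", "observational"),
             ("laboratory", "interventional"),
             ("observational", "interventional")]:
    u_stat, p_mw = stats.mannwhitneyu(lags[a], lags[b],
                                      alternative="two-sided")
    print(f"{a} vs {b}: U = {u_stat:.1f}, p = {p_mw:.4f}")
```

In practice a multiple-comparison correction (e.g., Bonferroni) would usually be applied to the post hoc p-values; the article does not state whether one was used.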
The time lag (in days) from submission to revision, acceptance, online publication, and print publication was calculated, along with the periods from acceptance to online publication and to print publication. The median and interquartile range (IQR) of each parameter were calculated for each journal. Spearman's correlations between each parameter and the JIF, ES, and CS were analyzed, and Spearman's correlations among JIF, ES, and CS were analyzed to assess agreement among the three measures. Wilcoxon's signed-rank test was used to compare online publication times and print publication times for each journal, and was also performed to compare the present data with those of Chen et al.[2] Spearman's correlations reported by Chen et al were likewise compared with those of this study. All statistical analyses were performed with SPSS software version 26.0 (SPSS, Inc., Chicago, IL).
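The lag computation and correlation step could look as follows in pandas/scipy. This is a sketch under assumed column names and invented dates and JIF values, not the study's data or code.

```python
import pandas as pd
from scipy import stats

# Hypothetical article records for three journals
articles = pd.DataFrame({
    "journal":   ["A", "A", "B", "B", "C", "C"],
    "submitted": pd.to_datetime(["2018-01-10", "2018-02-01",
                                 "2018-01-05", "2018-03-12",
                                 "2018-02-20", "2018-04-01"]),
    "online":    pd.to_datetime(["2018-05-01", "2018-06-10",
                                 "2018-06-20", "2018-09-01",
                                 "2018-10-10", "2018-11-25"]),
})
# Time lag in days between two publication milestones
articles["sub_to_online"] = (articles["online"]
                             - articles["submitted"]).dt.days

# Per-journal median and IQR, the summary reported in Table 1
summary = articles.groupby("journal")["sub_to_online"].agg(
    median="median",
    q1=lambda s: s.quantile(0.25),
    q3=lambda s: s.quantile(0.75),
)

# Spearman correlation of each journal's median lag with its JIF
jif = pd.Series({"A": 7.7, "B": 3.6, "C": 1.5})  # hypothetical JIFs
rho, p = stats.spearmanr(summary["median"], jif.loc[summary.index])
print(f"Spearman rho = {rho:.3f}")
```

With these invented values the shorter-lag journals have higher JIFs, so the correlation is negative, matching the direction of the associations reported in the Results.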
Results
Journal Characteristics
After exclusion of all review journals, 55 journals were analyzed ([Table 1]). Every journal in the study made its articles available online. Five journals (9%) did not report submission dates for any article and were excluded from the final analysis of all time points except online-to-print publication time. Thirty-six journals (65.4%) did not report revision dates. Six journals (10.9%) did not report acceptance dates. Nine journals (16%) did not report their online publication dates; however, online-ahead-of-print publication was present for all nine. Nine journals (16%) did not report print publication dates, of which four were online-only journals.
Table 1 Time lag and bibliometric measures in ophthalmology journals. All time lags are median (IQR), days.

| Journal title | JIF | CS | ES | Submission to acceptance | Acceptance to online | Acceptance to print | Submission to online | Submission to print | Online to print |
|---|---|---|---|---|---|---|---|---|---|
| Ocul Surf | 9.108 | 7.32 | 0.00506 | 77 (58.8–119.5) | 2 (1–4.5) | 90.5 (76.3–104) | 83 (65–124.8) | 180.5 (138–221) | 88 (75–102.3) |
| Ophthalmology | 7.732 | 4.19 | 0.05026 | 160 (92.5–194.5) | 39.5 (36.5–43) | 183 (165.8–190) | 203.5 (133.3–246) | 343 (281–375.5) | 139.5 (123.8–149) |
| JAMA Ophthalmol | 6.167 | 2.46 | 0.02153 | 118.5 (94.3–144.3) | 57 (52–60) | 107 (95.3–120.3) | 172.5 (159.8–189.8) | 229.5 (193.5–252.8) | 54 (37.5–67.8) |
| Am J Ophthalmol | 4.483 | 3.27 | 0.02818 | 99 (81.5–125.5) | N/A | N/A | 107.5 (90.8–132.8) | 194 (157.3–207.8) | 76.5 (60.3–82) |
| Retina | 3.815 | 2.83 | 0.02076 | N/A | N/A | N/A | N/A | N/A | N/A |
| Invest Ophth Vis Sci | 3.812 | 3.21 | 0.0658 | 130 (83.3–140) | 31.5 (21.5–37.3) | 31.5 (21.5–37.3) | 148 (121.8–177.5) | 148 (121.8–177.5) | 0 |
| Br J Ophthalmol | 3.615 | 3.38 | 0.02131 | 123.5 (101.8–147.3) | 19 (14.8–43.3) | 267.5 (144.5–287.5) | 160.5 (118.5–181.5) | 360 (333.8–409.3) | 234 (120–258.3) |
| Clin Exp Ophthalmol | 3.411 | 1.48 | 0.00475 | 103.5 (79.3–167.3) | 13 (6.8–24.5) | 249.5 (242–258.3) | 130 (98–180.8) | 371.5 (318.3–413.5) | 237 (229–251) |
| Acta Ophthalmol | 3.153 | 2.22 | 0.0119 | 144.5 (114.3–163.5) | 97.5 (89.3–109.5) | 270.5 (243.8–291.8) | 259.5 (209.8–315.3) | 414 (349–523) | 170.5 (91.8–203.5) |
| J Refract Surg | 3 | 2.39 | 0.00621 | 136 (115.5–184) | 77 (51.8–85) | 77 (51.8–85) | 222 (192–269.8) | 222 (192–269.8) | 0 |
| Exp Eye Res | 2.998 | 3.09 | 0.01425 | 122 (77.3–166) | 2.5 (1.8–4) | 93.5 (84–107.5) | 125 (79–169.8) | 216.5 (173.3–268.8) | 91.5 (80.3–103.5) |
| Ophthal Epidemiol | 2.868 | 2.51 | 0.00258 | 210.5 (178.5–292.8) | 21.5 (17.5–42.5) | 147.5 (111.5–196.8) | 229.5 (201.8–330.8) | 355 (306.8–441.3) | 113.5 (93–143) |
| Eye Vision | 2.683 | N/A | 0.00149 | 139.5 (122.8–169) | 15.5 (15–16) | N/A | 154.5 (138.3–184.3) | N/A | N/A |
| Ophthal Physl Opt | 2.561 | 2.01 | 0.0039 | 129 (75.3–151.3) | 44 (37.5–47) | 63.5 (45.8–75.3) | 183 (109–194.5) | 183 (158.5–204) | 0 |
| J Neuroophthalmol | 2.509 | 1.17 | 0.00252 | N/A | N/A | N/A | N/A | N/A | N/A |
| Trans Vis Sci Techn | 2.399 | 2.45 | 0.00326 | 175.5 (148.3–194.8) | 68 (47.8–74.3) | N/A | 261.5 (201.3–277) | N/A | N/A |
| Eye Contact Lens | 2.386 | 1.66 | 0.00275 | N/A | N/A | 550 (366.3–624.3) | N/A | N/A | N/A |
| Eye | 2.366 | 1.72 | 0.01104 | 201 (124.8–290) | 43.5 (37.5–66) | 153 (143.8–168) | 253.5 (181.8–367.5) | 362.5 (292.5–499.8) | 107 (96.3–116) |
| Cornea | 2.313 | 2.06 | 0.01314 | 86.5 (66–109.8) | 46 (45–51.5) | 136.5 (130–169.8) | 146 (116–163) | 242 (228.5–254) | 83 (71–88.5) |
| Graef Arch Clin Exp | 2.25 | 1.94 | 0.01188 | 129 (83.5–192) | 12 (8.75–21) | 67.5 (58.3–71.3) | 138.5 (99–198.8) | 194.5 (158–254.8) | 53 (44–62) |
| J Cataract Refr Surg | 2.238 | 1.57 | 0.01141 | 110 (59–142) | N/A | 82.5 (76.3–94.8) | N/A | 209 (130.5–248.5) | N/A |
| Ocul Immunol Inflamm | 2.231 | 1.52 | 0.0031 | 166 (131.5–218.5) | 49 (42.8–57.8) | N/A | 216 (192.8–260.5) | N/A | N/A |
| Vision Res | 2.178 | 2.44 | 0.01065 | 186.5 (131–246.3) | 27 (23–83) | 70 (56.5–200.3) | 244 (209–309.5) | 337.5 (245.5–431.8) | 43 (35.3–55.8) |
| Mol Vis | 2.174 | 2.32 | 0.00592 | 184 (138.5–203.3) | 2 (2–2) | N/A | 186 (140.5–205.3) | N/A | N/A |
| J Vision | 2.089 | 2.01 | 0.01951 | N/A | N/A | N/A | 205.5 | N/A | N/A |
| Contact Lens Anterio | 1.985 | 1.74 | 0.00235 | 190.5 (113.8–249) | 6.5 (5–16.8) | 128.5 (117–178.3) | 201 (122.5–257.8) | 343.5 (273.5–382.8) | 115 (99–165.8) |
| J Ocul Pharmacol Th | 1.792 | 1.73 | 0.00328 | 130 (91–194.5) | 56.5 (46.5–65.3) | 131.5 (115.8–140.3) | 183.5 (137.5–271.5) | 270.5 (238.8–328.8) | 73 (53.8–92) |
| Ophthalmologica | 1.781 | 1.82 | 0.00234 | 102.5 (86.3–153.8) | 67 (51.2–75.3) | 142.5 (109.3–161.5) | 187 (159.5–209.8) | 266 (228.5–298.8) | 64 (52.8–95.3) |
| Eur J Ophthalmol | 1.716 | 1.23 | 0.00317 | 56.5 (29.3–103) | 62 (40.5–120) | 214 (187.8–226.5) | 149 (69.5–241.3) | 280.5 (227.8–329) | 162 (95–176.3) |
| Ophthalmic Res | 1.685 | 1.85 | 0.00177 | 124.5 (85–195) | 61 (57.8–80) | 134 (107–194.3) | 207.5 (143.8–255.8) | 252.5 (200.3–362.5) | 65.5 (29.5–99.3) |
| Curr Eye Res | 1.672 | 1.68 | 0.00639 | 163.5 (104.5–197.8) | 17.5 (15.8–27.5) | 110 (104.8–122.5) | 188.5 (120.8–228.3) | 280 (200.5–320.8) | 86.5 (73.8–102.3) |
| J Glaucoma | 1.661 | 1.66 | 0.00605 | 126.5 (117–142.5) | N/A | 68.5 (56.3–75.5) | N/A | 201 (186.8–217.8) | N/A |
| Jpn J Ophthalmol | 1.653 | 1.96 | 0.00214 | 197.5 (176.5–242) | 49 (40.5–57.3) | 104.5 (86.8–120) | 251.5 (218.5–290.5) | 309.5 (283.8–336.3) | 53 (37.8–70.3) |
| Visual Neurosci | 1.645 | N/A | 0.00151 | 109 (85–124.5) | 80 (52.1–98.5) | N/A | 202 (164–217.3) | 310 (296.9–323.1) | 54 (45.9–62.1) |
| J Ophthalmol | 1.58 | 1.78 | 0.00694 | 95 (68.8–150) | 40.5 (33.5–54.5) | N/A | 133.5 (109.5–196) | N/A | N/A |
| Optometry Vision Sci | 1.577 | 1.56 | 0.00713 | 246.5 (181.8–269.5) | N/A | 97.5 (71.8–123.5) | N/A | 334.5 (288.8–375.3) | N/A |
| Clin Exp Optom | 1.559 | 1.34 | 0.00241 | 128.5 (86.3–164) | 64.5 (37.3–112) | 248 (218.8–260) | 204.5 (164.8–251) | 385.5 (329–465.8) | 172.5 (129–211) |
| Perception | 1.503 | 1.34 | 0.00345 | 135 (69–212) | 24.5 (20–31.3) | 77.5 (66.5–91) | 165 (93.3–236.5) | 212.5 (162.3–289.5) | 52 (40.5–68.5) |
| Int Ophthalmol | 1.496 | 1.18 | 0.00317 | 140.5 (115.8–193.8) | 13 (6.8–29.3) | 422 (408.8–428.8) | 177.5 (151–216.5) | 560 (540–609) | 402.5 (392.3–409.8) |
| Doc Ophthalmol | 1.46 | 2.33 | 0.00181 | 110 (96.5–170.5) | 9.5 (7.3–17.3) | 64.5 (54.3–70.3) | 119.5 (105–180.8) | 196 (160–221.3) | 51.5 (21.3–62.5) |
| BMC Ophthalmol | 1.431 | 1.87 | 0.00558 | 180 (125.5–308.3) | 12.5 (9.5–17.3) | N/A | 199.5 (136.5–317.3) | N/A | N/A |
| Ophthal Surg Las Im | 1.422 | 1.37 | 0.00457 | N/A | N/A | N/A | 151 (130.5–175) | 318 (295–331.3) | 168 (63.3–199.5) |
| J Eye Movement Res | 1.315 | 1.72 | 0.00084 | N/A | N/A | N/A | 206 (173.5–232.8) | N/A | N/A |
| Can J Ophthalmol | 1.305 | 0.79 | 0.00287 | 128 (84.5–170) | 69.5 (65–75.8) | 245 (228.5–286) | 195 (158–239) | 385 (336.3–421.3) | 184 (163–209.5) |
| Ophthalmic Genet | 1.285 | 0.97 | 0.00134 | 159.5 (97–212.5) | 29 (19–38.8) | 82 (50–129.8) | 198.5 (114.5–251.5) | 218 (164–353.3) | 52.5 (19.3–103.5) |
| Int J Ophthalmol | 1.189 | 1.18 | 0.0046 | 105 (58.8–140.8) | N/A | 111 (72.3–126.8) | N/A | 221.5 (177.5–249) | N/A |
| Ophthal Plast Recons | 1.134 | 0.75 | 0.00284 | N/A | N/A | 408 (327.3–454) | N/A | N/A | N/A |
| Cutan Ocul Toxicol | 1.079 | N/A | 0.00109 | 51.5 (36–67) | 24.5 (21–31.5) | 218.5 (185–261.3) | 78.5 (63.8–97.5) | 287.5 (234–314.3) | 194.5 (160.8–219.8) |
| J AAPOS | 1.056 | 1.03 | 0.00382 | 215.5 (151–283.3) | 97.5 (83.8–109) | 145 (123.5–147.8) | 324.5 (243.3–377.3) | 351.5 (285.3–447) | 40 (25.3–52) |
| J Pediat Ophth Strab | 1.054 | 0.61 | 0.0011 | 188.5 (115–224.3) | 144.5 (134.8–177.8) | 263.5 (241.3–300.5) | 312 (269–381) | 431 (366.8–476) | 103.5 (83–119) |
| Indian J Ophthalmol | 0.977 | 0.81 | 0.00306 | 83.5 (67–116.3) | 88 (78.3–97) | 96 (86.8–100.8) | 167 (151.3–209.3) | 176.5 (159–215.3) | 7 (5.5–10.3) |
| Arq Bras Oftalmol | 0.859 | 0.96 | 0.00135 | 155 (88–241) | N/A | 146 (142.5–204.8) | N/A | 323 (294.3–384.3) | N/A |
| Klin Monatsbl Augenh | 0.792 | 0.44 | 0.0012 | 55 (51.8–165.5) | 74 (54.5–87.8) | 236 (86.5–344.3) | 168 (99.8–236) | 443.5 (130.3–474.8) | 137.5 (11.8–259.3) |
| Ophthalmologe Der | 0.679 | 0.53 | 0.0013 | N/A | N/A | N/A | N/A | N/A | 343 (325–366.3) |
| J Fr Ophthalmol | 0.557 | 0.29 | 0.00115 | 75.5 (48.5–99.5) | 193.5 (163.3–229.3) | 199.5 (163.3–211.8) | 279 (218.8–325.3) | 291.5 (228.3–312.5) | 0 (−15.75 to 10.5) |

Abbreviations: CS, CiteScore; ES, Eigenfactor score; IQR, interquartile range; JIF, journal impact factor; N/A, not applicable.
Article Characteristics
A total of 657 articles from 55 journals were included for analysis; one journal (Visual Neuroscience) had only nine available original research articles for 2018. Of the randomly selected articles, 541 were clinical, 110 were basic science papers, and 6 were not assigned to either category because of their theoretical/mathematical nature. Forty-one study results were negative (the null hypothesis was not rejected) and 610 were positive. By study design, 185 articles were interventional, 362 were observational, and 110 were laboratory experiments.
Impact of Study Design on Speed to Publication
Laboratory experiments differed from both observational and interventional studies in time lag from submission to online publication (p = 0.002) and from acceptance to online publication (p < 0.001), but there was no difference between observational and interventional studies ([Fig. 1]). Laboratory experiments were published approximately 30 to 36 days faster than other study designs, with a median time of 161.5 days from submission to online publication compared with 190 days for observational studies (p < 0.001) and 196 days for interventional studies (p < 0.001). After acceptance, laboratory experiments were published online twice as fast as observational and interventional studies, with a median time of 22 days compared with 44 days for observational (p = 0.002) and 46 days for interventional studies (p = 0.002). No other statistically significant differences existed between study designs for time from acceptance to print publication (p = 0.341), submission to print publication (p = 0.07), submission to acceptance (p = 0.115), submission to revision (p = 0.617), revision to acceptance (p = 0.608), or online publication to print publication (p = 0.267).
Fig. 1 Box-and-whisker plots of time lag differences between laboratory, observational, and interventional study designs. (A) Time lag from submission to online publication; (B) time lag from acceptance to online publication. The Kruskal–Wallis test showed a statistically significant difference in both parameters (p = 0.002 and p = 0.001, respectively). Post hoc analysis with the Mann–Whitney U-test found that laboratory experiments differed significantly from both other groups, with no difference between observational and interventional studies. Outliers are represented with circles.
Publication Timing Characteristics
The median time lag across all journals from submission to online publication was 187 days (IQR: 150–211.75 days). The median time lag from submission to print publication was 284 days (IQR: 215–352 days). The median time from online publication to print publication was 84.75 days (IQR: 52.3–163.5 days). The difference between time from submission to online publication and time from submission to print publication across all journals was statistically significant (p < 0.001). The median times from submission to acceptance and from acceptance to online publication were 129 days (IQR: 105–166) and 43.5 days (IQR: 16.5–67), respectively. The median time from acceptance to print publication was 134 days (IQR: 86.5–227.25 days). The median times from submission to revision and from revision to acceptance were 103.3 days (IQR: 83.9–147.1) and 12 days (IQR: 6.5–16.25), respectively.
Correlations of Publication Speed with Journal Impact Factor, CiteScore, Eigenfactor Score
All three examined journal measures correlated significantly and positively with each other: JIF to CS, r = 0.815 (p < 0.001); JIF to ES, r = 0.667 (p < 0.001); and CS to ES, r = 0.618 (p < 0.001). A statistically significant correlation was found between JIF and time from acceptance to online publication (r = –0.332, p = 0.034). Statistically significant correlations were found between CS and three parameters: submission to print publication (r = –0.326, p = 0.04), acceptance to print publication (r = –0.375, p = 0.017), and acceptance to online publication (r = –0.447, p = 0.005). No statistically significant correlations were found for ES, or for any other parameters with CS and JIF ([Fig. 2]).
Fig. 2 Scatterplots demonstrating the correlations between CiteScore and various time periods. Spearman's correlation of CiteScore with (A) time from submission to online publication, r = –0.224 (p = 0.154); (B) time from submission to print publication, r = –0.326 (p = 0.04); (C) time from acceptance to online publication, r = –0.447 (p = 0.005); and (D) time from acceptance to print publication, r = –0.375 (p = 0.017).
Comparison to 2010 Study
Comparative analysis with Chen et al revealed no statistically significant differences in the changes from 2010 to 2018 for the three parameters measured in their study. The time lag from acceptance to print across all journals was 87 days (IQR: 58.5–166.8) in 2010 compared with 142.5 days (IQR: 96–218.5) in 2018 (p = 0.113). Time from submission to acceptance was 133.5 days in 2010 and 128.5 days in 2018 (p = 0.635), and time from submission to print publication was 244 days in 2010 and 284 days in 2018 (p = 0.215). Similar to Chen et al, our study found no correlation between JIF and days from acceptance to print publication, submission to print publication, or submission to acceptance (p = 0.245, 0.167, and 0.768, respectively).
Discussion
As technology and innovation continue to accelerate in ophthalmology, faster times to publication are necessary to keep the rate of knowledge dissemination on par with the rate of knowledge acquisition. Some argue, however, that faster publication times may be detrimental, citing a lower-quality peer-review process as a possible repercussion of seeking faster publication.[9] [10] This debate has become more prominent during the COVID-19 pandemic. Major medical journals, including the New England Journal of Medicine and Lancet, have issued retractions for COVID-19-related articles that were fast-tracked to publication and whose results influenced trial design by major institutions, including the World Health Organization.[11] Relevant to ophthalmology, an expedited article in Lancet characterizing optical coherence tomography (OCT) findings in patients with COVID-19 infection has received criticism for misinterpretation of normal retinal anatomy in the images, indicating that perhaps the review process was not thorough enough.[12] [13]
Finding the right balance between timeliness and accuracy is critical for any peer-reviewed journal. Our results suggest that this pressure is greatest on the most prominent journals in ophthalmology, as online publication speed was significantly correlated with higher bibliometric measures of a journal's "impact" and "prestige." Because their higher impact and prestige indicate that they are the most widely consumed sources of information in the field, these journals may feel the need to publish data as rapidly as possible when new developments arise. This can be seen in the current pandemic, during which most articles involving COVID-19 were submitted to and published in these journals. Unfortunately, likely owing to attempts to disseminate this information immediately, key data were overlooked and articles were ultimately retracted. The expectation to publish articles as soon as possible for the public may explain both the fast turnaround times and, consequently, the errors made. Alternatively, these journals may have more resources available, allowing them to expedite publication faster than less regarded journals. This could also explain the correlation with faster online publication speeds found in this study, or the explanation may be a combination of the pressure to publish quickly and greater access to resources. More research is required to clearly delineate this correlation.
It is noteworthy that JIF was not correlated with submission-to-print, acceptance-to-print, or submission-to-acceptance times. This implies that the greatest acceleration by "higher impact" journals occurs in taking an article from acceptance to online publication. Indeed, the time from submission to online publication was significantly shorter than the time to print publication, allowing editors more time to proof articles before committing them to print. Still, once available online, peer-reviewed manuscripts are considered vetted by the publishing journal and are consumed by the scientific public regardless of pending print status.[14] [15] [16] Thus, we would argue that ophthalmology's highly regarded journals must be cautious and should treat the time from submission to online publication as the critical "rate-limiting step" during which errors and research gaps must be caught.
One would expect that, because of the prolonged time and effort required to carry out laboratory experiments, especially additional validation experiments requested after peer review, the time to publish laboratory experiments would be longer than for clinical studies. In fact, the opposite was found to be true. There are various possible explanations. One study reported that the "impact" of clinical studies may be deemed smaller than that of basic science studies in the literature.[17] This perception of higher impact may shorten publication times: laboratory studies submitted to and published in ophthalmology journals, which tend to be more clinically oriented, may be viewed as important findings that would benefit clinicians immediately or encourage prompt pursuit of applied research in these areas. Laboratory studies are also less commonly published in ophthalmology journals, possibly inflating their perceived importance when they are submitted and published, and leading to faster publication times. Similarly, the shift toward widespread interest in translational medicine in recent years may contribute to faster publication times for the basic sciences.[18] Research on rates of basic science submission and on the immediate impact of basic science in the clinical setting is scarce, and future research investigating these matters is warranted.
While the JIF was not initially intended as a method for researchers to rank the top journals in a field of study, over the years it has become the gold standard for this purpose.[19] Likewise, other bibliometric measures have gained traction for the same purpose. Although each measure accounts for different characteristics of a journal, these measures are strongly and positively correlated with one another, as seen here for ophthalmology journals and in the literature for other fields of study.[20] For this reason, one suggested method for evaluating a journal's "prestige" is to rate several bibliometric measures simultaneously and rank the journals in a given field by their ratings across these measures.[5] This would counter some of the inherent flaws of each bibliometric measure and allow for a more robust evaluation of the top journals in a field.
Limitations
There were several limitations to this study. The bibliometric measures assessed are only three of many available measures. The use of three measures arose from recent recommendations to rely less on the JIF alone and instead include several bibliometric measures when assessing a journal.[21] [22] [23] Owing to random selection, this study did not capture a high enough proportion of basic science and negative-result studies to determine whether those parameters affect publication times. Future studies may actively search for such articles during data collection to obtain a more representative sample of basic science and negative studies. Another limitation was the absence of revision dates and counts in the majority of journals. Time to revision likely varies widely from article to article because of several factors, and this valuable information is not readily available. For this reason, we did not continue with our assessment of revision times, although we recorded these values whenever available. Future studies addressing these variables would likely provide more definitive answers about the role these possible confounders may play. Lastly, this study was limited to ophthalmology journals. The speed at which articles are published likely plays a role in every field of study, and repeating this study in other fields may provide invaluable information for those specialties. Additionally, there are ophthalmic/eye research articles published in nonophthalmology journals, and we did not assess those articles.
Conclusion
In conclusion, our study demonstrates that ophthalmology journals with higher bibliometric measure scores have significantly faster online publication speeds, emphasizing the importance these most widely read journals place on rapid dissemination of knowledge. Publishing manuscripts faster must be balanced against the risks of retractions, corrections, and other errors, which may have major downstream effects given the potential impact of these journals' articles on clinical care and ongoing scientific research.[24] [25] Future studies may shed light on how common such speed-influenced errors are and may help determine the optimal publication speed to balance the simultaneous objectives of scientific accuracy and expediency.