Mega Journals 2: Promising or Predatory?

— by Fanny Liu

Introduction 

In the previous post, we discussed the major characteristics and niche of mega journals: open access, wide disciplinary scope, soundness-only peer review, and higher acceptance rates. In this post, we focus on some of the controversies. 

Concerns 

“Soundness-only” peer review: Yet to be defined? 

Senior executives and editors of mega journals broadly agreed that soundness-only peer review means evaluating the rigour and ethics of the research (Spezi et al., 2018). In the arts and humanities, however, there was no agreed definition of soundness; some respondents felt the concept simply could not be translated into these disciplines. 

Volume and value 

Strictly speaking, a mega journal should publish anything technically sound, regardless of its novelty or significance. However, negative views of soundness-only peer review arose, particularly among historians and education researchers, often because the concept was perceived to drive up the number of published articles and thus contribute to a form of information overload (Wakeling, Spezi, et al., 2019). 

PLOS ONE, initially known for publishing any sound research, faced credibility issues after an influx of “replications of meta-analyses”, papers aimed only at boosting authors’ publication records (Spezi et al., 2018). In response, the journal introduced a requirement that papers be “worthy of inclusion in the published scientific record”, a change that could represent a concession that not all sound science should be published. 

[Image courtesy of いらすとや]

Journal Impact Factor, again 

While the Journal Impact Factor (JIF) is based on citations and has well-known limitations, it remains influential and is widely used in hiring, tenure, and promotion decisions. 

A survey (see figure 1) revealed that Chinese, Taiwanese, and Spanish mega-journal authors were more likely to value the JIF when selecting a journal, with 78.5% (±3.5%) of respondents from these countries rating the JIF as “very” or “extremely” important, compared with 60.7% (±1.3%) of those from other countries (Wakeling, Creaser, et al., 2019, p. 761). 
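For context, the standard two-year JIF for a year Y is a simple ratio (this definition comes from the Journal Citation Reports methodology, not from the sources cited in this post):

```latex
\mathrm{JIF}_Y = \frac{\text{citations received in year } Y \text{ by items published in years } Y{-}1 \text{ and } Y{-}2}{\text{number of citable items published in years } Y{-}1 \text{ and } Y{-}2}
```

For example, a 2023 JIF of 5 means the journal’s 2021–2022 items were cited, on average, five times during 2023. Because the denominator grows with output while citations per article tend to fall, this ratio is exactly where mega journals’ high volumes work against them.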

Mega journals, which remove selectivity criteria such as novelty, significance, and potential impact, are likely to publish large numbers of articles that attract comparatively few citations, which is not conducive to achieving and maintaining a high JIF (Spezi et al., 2018). This may create a tension between the journals’ vision and prospective authors’ desire to publish in high-impact-factor journals. 

Figure 1: Importance of the Journal Impact Factor in mega journal authors’ decisions to submit to a specific journal; created with data from Creaser et al. (2018) 

Selectivity and standard 

Because mega journals conduct soundness-only peer review, some authors may view them as lower-quality outlets or even “dumping grounds”. While refuting that characterization, some senior executives and editors of mega journals considered it acceptable for their journals to publish papers across the whole quality spectrum, from top-tier to marginal (Spezi et al., 2018). 

In the survey by Wakeling, Creaser, et al. (2019), mega journal authors considered the quality of the journal the most important factor in journal selection. Publishers’ reputation was also rated highly by respondents, possibly because perceptions of journal quality are closely tied to publisher prestige (see figure 2). 

Figure 2: Proportion (%) of mega journal authors selecting “very important” or “extremely important” for each factor in their decision to submit to a specific journal (n=5,751); created with data from Wakeling, Creaser, et al. (2019, Appendix 4) 

Talent war? 

As the volume of academic journal articles grows over time, recruiting peer reviewers could become a limiting factor. For mega journals in particular, several factors may discourage scholars from accepting review invitations (Björk, 2021): 

  • Scholars may find it intellectually less motivating to assess only the scientific soundness of a manuscript, compared with evaluating soundness, novelty, and scholarly significance together. 
  • Mega journals are not the scientifically leading journals in their fields. 
  • While leading journals in niche areas are usually embedded in a social network of scholars, or serve as flagship journals of influential academic societies, mega journals usually have broad scopes and are not closely knit with scholarly communities. 

While peer review is a key process in ensuring quality, concerns have been raised that some mega journals cannot uphold this standard and may have become predatory. One publisher facing such controversy is MDPI. One study found that in 2019 MDPI launched 15 new journals and published 64.1% more articles than in 2018, yet its average time from submission to first decision remained 19 days in both years (Oviedo-García, 2021). In 2022, MDPI’s median time from submission to acceptance was 37 days, markedly shorter than the roughly 200 days at the PLOS family of journals, another large open-access publisher (Brainard, 2023). This gave rise to criticism that such speed makes rigorous peer review impossible, given the time needed to recruit reviewers and to revise manuscripts. 

Conclusion 

Since the inception of PLOS ONE in 2006, mega journals have emerged and secured a niche in scholarly publishing while also sparking debate. Mega journals are characterized by open access, soundness-only peer review, large publishing volumes, wide disciplinary scopes, and higher acceptance rates. 

Soundness-only peer review may attract research that does not meet the novelty or significance criteria of traditional journals. However, this approach has raised concerns about a potential decline in the perceived quality of published research. 

While mega journals are not necessarily predatory, authors should remain cautious and critical, selecting journals that are trusted and transparent. 

References 

Björk, B.-C. (2021). Publishing speed and acceptance rates of open access megajournals. Online Information Review, 45(2), 270-277. https://doi.org/10.1108/OIR-04-2018-0151

Brainard, J. (2023). Fast-growing open-access journals stripped of coveted impact factors. Science, 379(6639), 1283-1284. https://doi.org/10.1126/science.adi0098

Creaser, C., Fry, J., Wakeling, S., Pinfield, S., Willett, P., & Spezi, V. (2018). Open Access Mega Journal survey data. Loughborough University. https://doi.org/10.17028/rd.lboro.7211924.v1 

Oviedo-García, M. Á. (2021). Journal citation reports and the definition of a predatory journal: The case of the Multidisciplinary Digital Publishing Institute (MDPI). Research Evaluation, 30(3), 405-419a. https://doi.org/10.1093/reseval/rvab020

Spezi, V., Wakeling, S., Pinfield, S., Fry, J., Creaser, C., & Willett, P. (2018). “Let the community decide”? The vision and reality of soundness-only peer review in open-access mega-journals. Journal of Documentation, 74(1), 137-161. https://doi.org/10.1108/JD-06-2017-0092

Wakeling, S., Creaser, C., Pinfield, S., Fry, J., Spezi, V., Willett, P., & Paramita, M. (2019). Motivations, understandings, and experiences of open-access mega-journal authors: Results of a large-scale survey. Journal of the Association for Information Science and Technology, 70(7), 754-768. https://doi.org/10.1002/asi.24154

Wakeling, S., Spezi, V., Fry, J., Creaser, C., Pinfield, S., & Willett, P. (2019). Academic communities: The role of journals and open-access mega-journals in scholarly communication. Journal of Documentation, 75(1), 120-139. https://doi.org/10.1108/JD-05-2018-0067
