Predatory journals/publishers are becoming an increasingly urgent concern in academia, with some estimates suggesting 420,000 articles published by 8,000 journals in 2014. The issue is complex and controversial. In this post, I describe some of the problems and implications associated with predatory publishing, how our College is attempting to deal with the issue, and some suggestions that can hopefully help faculty and administrators alike. We are still in the process of refining and implementing our policies and procedures in this rapidly evolving environment, and as such we invite our colleagues at other institutions to share their experiences and suggestions as well.
What are predatory journals?
Predatory open access (OA) publishing is an “exploitative open-access publishing business model that involves charging publication fees to authors without providing the editorial and publishing services associated with legitimate journals (open access or not)”. These outlets are profit-driven rather than devoted to rigorous research and knowledge dissemination; they are typically low quality, with questionable (or absent) peer-review practices. Bohannon (2013) conducted an elaborate sting operation, sending a “hopelessly flawed” study to hundreds of journals and achieving a disturbing acceptance rate of 61.57%. A typical outcome was a final decision with no sign of peer review (60% of all submissions) and a rapid response (acceptance took 40 days on average) requesting payment to publish the paper.
It is important to note that many legitimate and high-quality journals also charge publishing fees, so fees alone are not an indicator of predatory publishing practices; in some disciplines, publishing fees are the norm. Another consideration is that simply being ‘subscription-based’ (i.e., not open-access) does not make a journal reputable per se. Low-quality and questionable outlets existed long before OA or online journals were developed, but OA platforms have enabled these practices to grow exponentially.
Until recently, one way to identify predatory journals and predatory publishers was through something called “Beall’s List.” Librarian Jeffrey Beall published an online list of what he termed “potential, possible, or probable predatory scholarly open-access publishers.” Beall used extensive criteria to define what is considered “predatory,” including, but not limited to, characteristics of the editorial team, false claims about impact or indexing, unethical review practices, republishing papers from other outlets, and false identifiers.
Beall has been praised worldwide for his work, but his list and inclusion criteria are not without their critics. Some have argued that the list is biased against outlets from less-developed countries, that the criteria are ambiguous, or that Beall lacks disciplinary knowledge of many of the journals. Nonetheless, it was perhaps one of the best sources of information to protect scholars from disreputable publishers/journals. I say “was” because Beall’s List was taken down in January 2017. Surely we are not alone in waiting to hear what will happen to this resource going forward. The plot thickens.
How the Edwards School of Business is currently managing the issue
In May 2016, our Faculty Council passed a motion to disallow publications in predatory outlets from collegial processes considerations (like merit review, tenure, or promotion), reflecting a desire to ensure the legitimacy and quality of our research. A policy and associated procedures were subsequently created to fulfill that motion and were passed unanimously at Faculty Council. The intent of our policy is to (a) assist researchers in protecting their work and targeting legitimate scholarly outlets, (b) guide collegial process committees in their decision making, and (c) support and enhance the quality and impact of our research achievements.
We had been using Beall’s List as the basis for identifying predatory journals, which provided a reasonable starting point. However, the journals and publishers on Beall’s List (and likewise for nearly all lists and rankings of journal impact or quality) are regularly reviewed and change over time. That is, a journal that is not listed one year may be listed the next year, and vice versa. This can happen for many reasons, including changes in editorial practices or a journal being purchased by a different publisher. Recognizing the challenge this can create, we offered some additional procedures (described next) to help faculty navigate this complexity.
Although the statistics seem scary and the landscape may feel intimidating or unpredictable, there are a number of ways to avoid predatory journals. Some schools take the opposite approach to the one we outlined above; that is, rather than having a “no” list (like Beall’s, for instance), they might have a “yes” list (or “white list”) of approved, acceptable, or exclusive journals that will be recognized, rewarded, or otherwise counted.
If using something like Beall’s List, and recognizing that such lists (and any given journal’s status) can change over time, authors might consider evaluating and establishing a case for journal/publisher legitimacy by documenting evidence throughout the review/publication process, such as:
• journal/publisher does not appear on Beall’s List at the time of submission/publication;
• journal appears on accepted ratings or rankings lists (e.g., ABDC, ABS);
• journal adheres to appropriate editorial and peer-review practices;
• records of communication (e.g., email) with Editorial team that demonstrate appropriate timeframes and peer-review processes;
• the Editorial Board comprises reputable scholars;
• journal is affiliated with a recognized scholarly or professional association.
These same practices should be helpful to anyone who is conducting or consuming research (e.g., individual scholars, graduate students, practitioners). The legitimacy of journals can no longer be assumed: sometimes it is obvious in the multi-coloured and badly-spelled spam email solicitations, but sometimes it is not so obvious. For instance, some predatory publishers create ‘mimic’ websites to look like similarly (or even identically) named legitimate outlets. Careful background checking in advance will save a lot of pain later. (P.S. I created a “quick reference guide” of journal impact and quality indicators that our faculty and students have found useful, and I’m happy to share it. Feel free to email me for a copy.)
The issue of predatory publishers and journals is complex and rapidly shifting, and now predatory conferences are a topic of concern as well, as evidenced by an accepted conference paper written by iOS autocorrect. It is unlikely that there is a one-size-fits-all approach for schools; white lists, establishing journal legitimacy, and excluding journals on Beall’s List are all options that can help scholars protect their work, as well as protect the credibility and legitimacy of science and research more broadly.
Other sources and interesting stories:
Shen, C., & Björk, B.-C. (2015). ‘Predatory’ open access: A longitudinal study of article volumes and market characteristics. BMC Medicine, 13:230, 1–15.
Berger, M., & Cirasella, J. (2015). Beyond Beall's List. College & Research Libraries News, 76(3), 132–135. Available at: http://crln.acrl.org/content/76/3/132.full.
Some of this information comes from the Wikipedia page https://en.wikipedia.org/wiki/Predatory_open_access_publishing, but here are some of the original sources for critiques of Beall’s List:
Murray-Rust, P. (February 18, 2014). "Beall's criticism of MDPI lacks evidence and is irresponsible".
© CSIOP 2017.