Research Ethics Monthly (ISSN 2206-2483)

Australasian Human Research Ethics Consultancy Services Pty Ltd (AHRECS)

In a world of hijacked, clone and zombie publishing, where shouldn’t I publish?

 


When we talk to research higher degree candidates and early career researchers about publication ethics, one question comes up repeatedly. Indeed, it is a question we are frequently asked by experienced researchers, particularly those who wish to publish in a new field – where should I publish? That is a difficult question to answer in the abstract, so first we would like to remove some distractions from the decisions that need to be made. In this piece, we look at the other side of the coin and explore where researchers should not publish.

Research institutions often provide their staff with incentives to publish in top-ranking journals as determined by impact factor. Publishing in these journals can boost the university’s standing in some international rankings and national research assessment exercises. Consequently, performance indicators, promotion and recruitment criteria, track records for grant assessment and even financial bonuses may be aligned with these outlets.

Good research takes a long time and we should take care where we place our outputs. If we want our papers to be read, we need to look for a journal that reaches our prospective audience. In some fields, this might mean a niche but highly rated journal linked to a particular professional association; in other cases, we seek a journal that is covered by reputable indexes and databases like Medline, PubMed, Scopus or the Web of Science. Only then, for example, is a paper likely to be included in a subsequent systematic review or meta-analysis.

However, many researchers may find it tough to break into the top 25%, let alone the top 10%, of journals. Even if they can, the process can prove lengthy and frustrating as journals use robust peer review processes and may call for repeated, extensive and perhaps even unwarranted revisions. In the face of this, some scholars may come under pressure to publish quickly, particularly if the award of a doctorate or confirmation of a first job is dependent on having something in print. And, for some purposes (including Australian institutional block research funding until quite recently), quantity may trump quality.

There are traps for the unwary who find themselves in this position. Everyone wants to avoid predatory journals and publishers and, yet, not everyone does. Not even some top researchers manage to avoid these outlets, according to one study of academic economists (Wallace and Perri, 2016). Researchers, it seems, can be seduced by an invitation from journal editors, an invitation sometimes filled with ‘flattering salutations, claims that they had read the recipient’s papers despite being out of the journal’s claimed area of study, awkward sentence structure and spelling mistakes, and extremely general topics’ (Moher and Srivastava, 2015).

While many researchers have been duped, publication scammers are not always given a free ride. A few have come under some pressure from legal authorities. In 2016, the Federal Trade Commission (FTC) filed a brief in the US District Court against the OMICS Group and related entities. The brief reveals a little about what is known about these journals. OMICS, for instance, is a Nevada-registered, Hyderabad-based entity that claims to run 700 journals. The FTC alleged that OMICS deliberately misled potential authors by misrepresenting the composition of the editorial board, the process of review, the journal’s impact factor and the fee for publication:

…the academic experts identified by Defendants lack any connection with Defendants’ journals. Further, in many instances, articles submitted for publishing do not undergo standard peer review before publishing. And Defendants’ journals’ impact factors are not calculated by Thomson Reuters, nor are their journals included in PubMed Central. Moreover, Defendants fail to disclose, or disclose adequately, that consumers must pay a publishing fee for each published article. (p.5)

Recently, OMICS has diversified its strategy. In 2016, Canadian journalists reported that OMICS had bought at least the trading name of reputable Canadian publishers and appeared to have also picked up their publishing contracts with well-regarded journals. This, it seems, was done so that OMICS could use these names as a front to attract articles to its predatory publishing stable (Puzic, 2016). Some professional associations that found their publishing contracts taken over have declared their intention to break their connection with OMICS.

When assessing which journals to target for your work, you might:

  1. Read recent issues of the journal. Are the papers of a quality you would cite? Can you find evidence of good editorial standards? Would your work fit among the papers published there? Macquarie University, for instance, counsels its staff to consider a journal’s relevance, reputation, visibility and validity.
  2. Check the standing of the publication’s editors. Are they members of the Committee on Publication Ethics (COPE) or, if their journals are online, the Open Access Scholarly Publishers’ Association (OASPA)? Predatory publishers are less likely to be members. The Think. Check. Submit. campaign, supported by COPE among a number of other groups, has also been created to support authors’ decision-making.
  3. Talk to a research librarian, your peers and mentors about the potential publisher. If you know anyone who is on the Editorial Advisory Board, ask them about the journal at the same time that you seek to establish whether the journal might be interested in your work. Some leading academics have found their names on Editorial Advisory Boards of predatory journals and have discovered that it is easier to join these lists than to have their name removed.
  4. Read the publisher’s policies and editorial review practices. Are they coherent? Do they provide detailed information on submission guidelines and peer review processes? If they guarantee a speedy turnaround, that is often a warning sign. Check whether they impose ‘article processing charges’ (APCs). Then, check again.

  5. Reach out to researchers who have previously published there to discuss their experiences and impressions.

Not every legitimate journal can extract itself from predatory publishers. Where once-respected journals are hijacked by criminal enterprises, they can continue their existence as ‘zombie journals’, trading off the reputation built up in the past but behaving like any other predatory journal. There are other kinds of dishonest practices. Some predatory publishers have established ‘clone journals’, which use the same title as a legitimate journal and reproduce the original journal’s website, or which make minor changes to the title in an attempt to deceive unwary authors (Prasad, 2017). Hijacked, clone and zombie publishing can turn a glowing recommendation into a trap for the unwitting. Analysis of criminal activity in publishing has taken a little time to catch up with offending patterns. In recent work, Moher and Shamseer (2017) argued that the term ‘predatory journal’ should be replaced by ‘illegitimate entities’, rejecting the idea that such entities were entitled to clothe themselves in the language of the legitimate publication industry.

So, what advice should we give to researchers about being prudent with the treasured fruit of their labours? Until recently, one quick answer might have been to avoid journals on Beall’s List, a ‘blacklist’ of predatory journals and publishers compiled by Jeffrey Beall, a US-based librarian. The list always had its critics (Neylon, 2017). Variables used by Beall such as open access, fees to publish, location in low- to medium-income countries, and novel peer review practices are not automatic predictors of a predatory publisher. Nor does the converse necessarily guarantee that a publisher is a safe choice. However, whatever its longstanding flaws, Beall’s List is rapidly losing its currency. Earlier this year Beall decided to ‘unpublish’ his list; it is no longer updated and is only available on cache sites. Institutions seeking a successor to Beall’s List can look towards a commercial provider, Cabells, which has announced its own Blacklist. Anyone using a blacklist should also check journals against a ‘white list’ like the Directory of Open Access Journals or even the old Excellence in Research Australia journal rankings (removed from the research council websites, but still circulated discreetly like a samizdat newsletter among Australian academics).

Unfortunately, unless black and white lists are updated continuously, they can never keep up with changes in the publication industry. Some publishers once regarded as predatory genuinely improve their practices over time. On the other hand, illegitimate practices have also changed. Over the last few years, we have seen the movement of organised and unorganised crime into the industry, attracted by the fees that Shamseer et al. (2017) very roughly estimated predatory publishers might be obtaining – in the order of US$100m.

So, the quick answer to the question ‘where shouldn’t I publish?’ is that since the demise of Beall’s List, researchers need to engage in critical enquiry and reflection about a potential publisher. This should not come as a shock – the same advice would have been true long before the end of Beall’s List.

In recent weeks, we’ve been including in the Resource Library discussion pieces, papers and strategies that propose how to assess publishers.

These include:

Not the ‘Beall’ and end-all: the death of the blacklist, AOASG Webinar Series (Dr Andy Pleffer & Susan Shrubb | April 2017)

Beyond Beall’s List: Better understanding predatory publishers, Association of College & Research Libraries (Monica Berger and Jill Cirasella | March 2015)

Black lists, white lists and the evidence: exploring the features of ‘predatory’ journals, BioMed Central Blog (David Moher & Larissa Shamseer | March 2017)

Warning: conmen and shameless scholars operate in this area. Times Higher Education (James McCrostie | January 2017)

Blacklists are technically infeasible, practically unreliable and unethical. Period. – LSE Blog (Cameron Neylon | January 2017)

Beware! Academics are getting reeled in by scam journals – UA/AU (Alex Gillis | January 2017)

References

Moher, D. and Shamseer, L. (2017) Black lists, white lists and the evidence: exploring the features of ‘predatory’ journals. BioMed Central Blog 16 Mar 2017. https://blogs.biomedcentral.com/bmcblog/2017/03/16/black-lists-white-lists-and-the-evidence-exploring-the-features-of-predatory-journals/

Moher, D. and Srivastava, A. (2015) You are invited to submit…. BMC Medicine, 13(1), p.180. https://bmcmedicine.biomedcentral.com/articles/10.1186/s12916-015-0423-3

Neylon C. (2017) Blacklists are technically infeasible, practically unreliable and unethical. Period. LSE Blog. https://cameronneylon.net/blog/blacklists-are-technically-infeasible-practically-unreliable-and-unethical-period/

Prasad, R. (2017) Predatory journal clones of Current Science spring up. The Hindu, 14 July. http://www.thehindu.com/sci-tech/science/predatory-journal-clones-of-current-science-spring-up/article19277858.ece

Puzic, S. (2016) Offshore firm accused of publishing junk science takes over Canadian journals. CTV News. 28 September. http://www.ctvnews.ca/health/health-headlines/offshore-firm-accused-of-publishing-junk-science-takes-over-canadian-journals-1.3093472?hootPostID=00bc7834da5380548a8b2d58e40c8b29

Shamseer, L, Moher, D., Maduekwe, O., Turner, L., Barbour, V., Burch, R., Clark, J., Galipeau, J., Roberts J. and Shea, B.J. (2017) Potential predatory and legitimate biomedical journals: can you tell the difference? A cross-sectional comparison. BMC Medicine 15:28. https://doi.org/10.1186/s12916-017-0785-9

Wallace, F. and Perri, T. (2016) Economists behaving badly: publications in predatory journals. MPRA Paper No. 73075, posted 15 August. https://mpra.ub.uni-muenchen.de/73075/1/MPRA_paper_73075.pdf

Contributors
Prof. Mark Israel, senior consultant AHRECS, Mark’s AHRECS bio – mark.israel@ahrecs.com
Dr Gary Allen, senior consultant AHRECS, Gary’s AHRECS bio – gary.allen@ahrecs.com

This post may be cited as:
Israel M & Allen G. (2017, 26 July) In a world of hijacked, clone and zombie publishing, where shouldn’t I publish? Research Ethics Monthly. Retrieved from: https://ahrecs.com/research-integrity/world-hijacked-clone-zombie-publishing-shouldnt-publish

Andy Pleffer says:

Many thanks Gary and Mark for the plug, and to Ginny and AOASG for their ongoing support of the work Susan and I have been doing for the last two years.

The quick answer is that there are no quick answers. Just as there are no quick research papers. And if you’re after quick answers, perhaps you’re asking the wrong questions.

Ginny Barbour says:

I agree with all you say here and have also been a long-term critic of black lists. Beall especially was problematic because he was on record as being very anti-OA generally. There is no doubt that problematic publishing practices predate OA.
Your point that “researchers need to engage in critical enquiry and reflection about a potential publisher. This should not come as a shock – the same advice would have been true long before the end of Beall’s List.” is well made and as you note is part of the Think Check Submit campaign (which is not just a COPE initiative but has the support of a number of groups, including publishers, who actually started it).
I am always astonished that anyone would publish in a place they had never heard of. In many ways we need to just ensure the same commonsense principles that we would apply to any transaction are applied to publishing.
Finally, just to note, academic libraries are always an excellent source of advice on these topics.
COI – I am the immediate past chair of COPE, which is mentioned here

Admin says:

Many thanks Ginny for the great contributions and all you do in the research integrity sphere. Given the time, effort and intellectual capital involved in producing a research output, it’s astounding that researchers would scrutinise the bona fides of an online retailer more closely than they check the practices of a potential publisher. Gary


