Resource Library

Australian universities must wake up to the risks of researchers linked to China’s military – The Conversation (Clive Hamilton | July 2019)

Posted by Admin on July 28, 2019

Two Australian universities, University of Technology Sydney and Curtin University, are conducting internal reviews of their funding and research approval procedures after Four Corners revealed their links to researchers whose work has materially assisted China’s human rights abuses against the Uyghur minority in Xinjiang province.

UTS, in particular, is in the spotlight because of a major research collaboration with CETC, the Chinese state-owned military research conglomerate. In a response to Four Corners, UTS expressed dismay at the allegations of human rights violations in Xinjiang, which were raised in a Human Rights Watch report earlier this year.

Yet, UTS has been aware of concerns about its collaboration with CETC for two years. When I met with two of the university’s deputy vice chancellors in 2017 to ask them about their work with CETC, they dismissed the concerns.

Read the rest of this discussion piece

Knowledge and motivations of researchers publishing in presumed predatory journals: a survey (Papers: Kelly D Cobey, et al | March 2019)

Posted by Admin on July 27, 2019

Abstract
Objectives
To develop effective interventions to prevent publishing in presumed predatory journals (ie, journals that display deceptive characteristics, markers or data that cannot be verified), it is helpful to understand the motivations and experiences of those who have published in these journals.

Design
An online survey delivered to two sets of corresponding authors, covering demographic information and questions about researchers’ perceptions of publishing in the presumed predatory journal, the type of article processing fees paid and the quality of peer review received. The survey also included six open-ended items about researchers’ motivations and experiences.

Participants
Using Beall’s lists, we identified two groups of individuals who had published empirical articles in biomedical journals that were presumed to be predatory.

Results
Eighty-two authors partially responded to our survey (~14% overall response rate: 11.4% [44/386] from the initial sample and 19.3% [38/197] from the second sample). The top three countries represented were India (n=21, 25.9%), USA (n=17, 21.0%) and Ethiopia (n=5, 6.2%). Three participants (3.9%) thought the journal they published in was predatory at the time of article submission. The majority of participants first encountered the journal via an email invitation to submit an article (n=32, 41.0%) or through an online search to find a journal with relevant scope (n=22, 28.2%). Most participants indicated their study received peer review (n=65, 83.3%) and that this was helpful and substantive (n=51, 79.7%). More than a third (n=32, 45.1%) indicated they did not pay fees to publish.

Conclusions
This work provides some evidence to inform policy to prevent future research from being published in predatory journals. Our research suggests that common views about predatory journals (eg, no peer review) may not always be true, and that a grey zone between legitimate and presumed predatory journals exists. These results are based on self-reports and may be biased, thus limiting their interpretation.

This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made are indicated, and the use is non-commercial.

Cobey KD, Grudniewicz A, Lalu MM, et al. Knowledge and motivations of researchers publishing in presumed predatory journals: a survey. BMJ Open 2019;9:e026516. doi: 10.1136/bmjopen-2018-026516
Publisher (Open Access): https://bmjopen.bmj.com/content/9/3/e026516

It’s Time to Lift the Veil on Peer Review – Undark (Dalmeet Singh Chawla | June 2019)

Posted by Admin on July 24, 2019

Data analysis can improve the vetting of scientific papers, but first publishers must agree to make the information public.

The job of a peer reviewer is thankless. Collectively, academics spend around 70 million hours every year evaluating each other’s manuscripts on behalf of scholarly journals — and they usually receive no monetary compensation and little if any recognition for their effort. Some do it as a way to keep abreast of developments in their field; some simply see it as a duty to the discipline. Either way, academic publishing would likely crumble without them.

In recent years, some scientists have begun posting their reviews online, mainly to claim credit for their work. Sites like Publons allow researchers to either share entire referee reports or simply list the journals for which they’ve carried out a review. Just seven years old, Publons already boasts more than 1.7 million users.

The rise of Publons suggests that academics are increasingly placing value on the work of peer review and asking others, such as grant funders, to do the same. While that’s vital in the publish-or-perish culture of academia, there’s also immense value in the data underlying peer review. Sharing peer review data could help journals stamp out fraud, inefficiency, and systemic bias in academic publishing. In fact, there’s a case to be made that open peer review — in which the content of reviews is published, sometimes with the names of the reviewers who carried out the work — should become the default option in academic publishing.

Read the rest of this discussion piece

The Rise of Junk Science – The Walrus (Alex Gillis | July 2019)

Posted by Admin on July 23, 2019

Fake publications are corrupting the world of research—and influencing real news

In early 2017, Eduardo Franco, a professor in the Faculty of Medicine at McGill University, sent an email to his colleagues, warning them of a global “epidemic” of scams by academic journals that was corrupting research and, in effect, endangering the public. As head of the oncology department, where he oversees approximately 230 people, Franco promised to comb through every CV and annual evaluation in the department to flag any colleagues’ resumés that listed journals and conferences that weren’t reputable or, in some cases, even real. He didn’t spell out the consequences, but the implication was clear: the faculty members would be held accountable.

The AHRECS team have started to observe this worrying trend in our other roles. It is essential that research institutions direct researchers (via policy, guidance material and professional development strategies) away from junk science. Funding bodies also need to play a key role in this regard.

A scholar for forty years, Franco has followed the rise of junk publishers for about a decade. He has seen them go from anomalous blights on academics’ credentials to widespread additions on scholarly resumés, nearly indistinguishable from legitimate work. Now, he says, “there’s never been a worse time to be a scientist.” Typically, when a scholar completes work they want to see published, they submit a paper to a reputable journal. If the paper is accepted, it undergoes a rigorous editing process—including peer review, in which experts in the field evaluate the work and provide feedback. Once the paper is published, it can be cited by others and inspire further research or media attention. The process can take years. Traditionally, five publishers have dominated this $25 billion industry: Wiley-Blackwell, Springer, Taylor & Francis, RELX Group (formerly Reed Elsevier), and Sage. But, before the turn of the century, a new model of online publishing, “open access,” began opening doors for countless academics—and for thousands of scams in the process.

The new online model created an opportunity for profits: the more papers publishers accepted, the more money they generated from authors who paid to be included—$150 to $2,000 per paper, if not more, and often with the support of government grants. Researchers also saw substantial benefits: the more studies they posted, the more positions, promotions, job security, and grant money they received from universities and agencies. Junk publishers—companies that masquerade as real publishers but accept almost every submission and skip quality editing—elbowed their way in.

Read the rest of this discussion piece
