Australasian Human Research Ethics Consultancy Services Pty Ltd (AHRECS)

Resource Library

Journals’ Plagiarism Detectors May Flag Papers in Error – The Scientist (Diana Kwon | June 2019)

Posted by Admin on August 1, 2019
 

One recent case, in which a scientist claims his submitted manuscript was rejected despite a lack of actual plagiarism, highlights the limitations of automated tools.

If the researcher’s claims are true, this case points to an uncomfortable situation: institutional research misconduct processes need to be more robust and must not rely solely on automated detection tools.

Last week, Jean-François Bonnefon, a behavioral scientist at the French Centre National de la Recherche Scientifique, tweeted that a scientific manuscript he submitted to a journal had been rejected by a bot. The program had flagged his paper for plagiarism, highlighting the methods, references, and authors’ affiliations. “It would have taken 2 [minutes] for a human to realize the bot was acting up,” Bonnefon wrote in one of his tweets. “But there is obviously no human in the loop here.”

In a massive Twitter thread that followed, several other academics noted having similar experiences.

“I found [Bonnefon’s] experience quite disconcerting,” Bernd Pulverer, chief editor of The EMBO Journal, writes in an email to The Scientist. “Despite all the AI hype, we are miles from automating such a process.” Plagiarism is a complex issue, he adds, and although tools to identify text duplication are an invaluable resource for routine screening, they should not be used in lieu of a human reviewer.

Read the rest of this discussion piece
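
The failure mode described here is easy to reproduce. As a rough illustration of why a similarity score alone is a poor proxy for plagiarism, the sketch below computes a naive word n-gram overlap of the kind text-matching tools build on; the function names and example sentences are hypothetical, and this is not the algorithm used by any particular journal or vendor. Boilerplate passages such as methods, references and affiliations legitimately repeat across papers, so they inflate the score unless a human looks at what was actually matched.

```python
# A minimal sketch, not any journal's actual screening tool: a naive
# word n-gram overlap score of the kind similarity checkers rely on.
# Everything here (function names, example sentences) is illustrative.

def ngrams(text: str, n: int = 5) -> set:
    """Return the set of word n-grams contained in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}


def overlap_score(submission: str, prior_work: str, n: int = 5) -> float:
    """Fraction of the submission's n-grams that also occur in prior work."""
    sub = ngrams(submission, n)
    if not sub:
        return 0.0
    return len(sub & ngrams(prior_work, n)) / len(sub)


if __name__ == "__main__":
    # A stock methods sentence scores highly against an earlier paper
    # even though reusing such phrasing is not plagiarism.
    submission = ("Participants were recruited online and gave informed "
                  "consent before completing the task in a web browser.")
    prior = ("Participants were recruited online and gave informed "
             "consent before completing the questionnaire at home.")
    print(f"overlap: {overlap_score(submission, prior):.2f}")  # ~0.58
```

A score like this says nothing about intent or about which sections were matched, which is exactly the gap a human reviewer fills.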

Retracted papers die hard: Diederik Stapel and the enduring influence of flawed science (Papers – preprint: Luis Morís Fernández and Miguel Vadillo | June 2019)

Posted by Admin on July 31, 2019
 

Abstract
Self-correction is a defining feature of science. However, science’s ability to correct itself is far from optimal as shown, for instance, by the persistent influence of papers that have been retracted due to faulty methods or research misconduct. In this study, we track citations to the retracted work of Diederik Stapel. These citations provide a powerful indication of the enduring influence of flawed science, as the (admittedly fabricated) data reported in these retracted papers provide no evidence for or against any hypothesis, and this case of fraud was widely known due to the extensive media coverage of the scandal. Our data show that Stapel’s papers are still cited in a favorable way within and beyond the psychological literature. To ameliorate this problem, we propose that papers should be screened during the review process to monitor citations to retracted papers.

Tags
Citation, Retraction, Self-correction, Stapel

Morís Fernández, L., & Vadillo, M. A. (2019, June 19). Retracted papers die hard: Diederik Stapel and the enduring influence of flawed science. https://doi.org/10.31234/osf.io/cszpy
Preprint (Creative Commons): https://psyarxiv.com/cszpy
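
The screening step the authors propose lends itself to simple automation. The sketch below checks the DOIs cited in a manuscript against a locally held list of retracted DOIs; the file name, column name and example DOIs are hypothetical placeholders, and this is an illustration of the idea rather than the authors’ own tooling or any journal’s workflow.

```python
# A minimal sketch, assuming a local CSV of retracted DOIs is available
# (for example, one exported from a retraction database) with a header
# row containing a "doi" column. File name and DOIs are placeholders.

import csv


def load_retracted_dois(path: str) -> set:
    """Read retracted DOIs from a CSV with a 'doi' column into a set."""
    with open(path, newline="", encoding="utf-8") as handle:
        return {row["doi"].strip().lower() for row in csv.DictReader(handle)}


def flag_retracted_citations(cited_dois, retracted: set) -> list:
    """Return the cited DOIs that appear in the retracted-DOI set."""
    return [doi for doi in cited_dois if doi.strip().lower() in retracted]


if __name__ == "__main__":
    retracted = load_retracted_dois("retracted_dois.csv")    # hypothetical file
    cited = ["10.1234/example.0001", "10.1234/example.0002"]  # placeholder DOIs
    for doi in flag_retracted_citations(cited, retracted):
        print(f"Warning: cited work {doi} has been retracted")
```

A check of this kind only catches citations with resolvable DOIs and an up-to-date retraction list; the cultural point the authors make, that retracted findings keep being cited approvingly, still needs editors and reviewers to act on the flag.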

Knowledge and attitudes among life scientists towards reproducibility within journal articles (Papers: Evanthia Kaimaklioti Samota and Robert P. Davey | June 2019)

Posted by Admin on July 16, 2019
 

Abstract

We constructed a survey to understand how authors and scientists view the issues around reproducibility, and how solutions such as interactive figures could enable the reproducibility of experiments from within a research article. This manuscript reports the results of this survey on the views of 251 researchers, including authors who have published in eLife, and those who work at the Norwich Biosciences Institutes (NBI). The survey also outlines to what extent researchers are occupied with reproducing experiments themselves and what features they consider desirable in an interactive figure. Respondents considered various features for an interactive figure within a research article that would allow them to better understand and reproduce in situ the experiment presented in the figure. Respondents said that the most important element enabling better reproducibility of published research would be for authors to describe their methods and analyses in detail. Respondents also believe that having interactive figures in published papers would be beneficial. Whilst interactive figures are potential solutions for demonstrating technical reproducibility, we find that there are equally pressing cultural demands on researchers that need to be addressed to achieve greater success in reproducibility in the life sciences.

Samota, E. K. and R. P. Davey (2019). Knowledge and attitudes among life scientists towards reproducibility within journal articles. bioRxiv: 581033. doi: https://doi.org/10.1101/581033
Publisher: https://www.biorxiv.org/content/10.1101/581033v2
This article is a preprint and has not been peer-reviewed

SPEECH: Actions to advance research integrity – Dr Alan Finkel AO (6th World Conference on Research Integrity | June 2019)

Posted by Admin on June 17, 2019
 

Looking around the room today, I’m reminded that research truly is a human pursuit: it thrives on face-to-face connections.

It’s easy to forget that, when you’re a student, and it’s late at night, and you’re the last person left in the lab – again.

So, every so often, it’s worth pausing to remember just how many people are out there, working hard, gathering data – just like you.

Worldwide, there are more than eight million researchers.

Every year, we produce well over a quarter of a million new PhDs.

China alone has added more than a million people to its research workforce since 2011.

Not all of these researchers will work in academia – but those who do are highly productive.

They publish in the order of four million academic journal articles every year, spread across more than 40,000 journals.

Read the rest of this speech
