Australasian Human Research Ethics Consultancy Services Pty Ltd (AHRECS)

Resource Library

Fake Citations Kill a Career – Inside Higher Ed (Colleen Flaherty | September 2019)

Posted by Admin on September 13, 2019
 

Columbia says a historian’s acclaimed book on North Korea was plagiarized, and its publisher says it’s been taken out of print.

Charles Armstrong, Korea Foundation Professor of Korean Studies in the Social Sciences at Columbia University, plagiarized parts of his award-winning book on North Korea, Tyranny of the Weak: North Korea and the World, 1950-1992. He’s currently on sabbatical and will retire at the end of 2020, the university told Armstrong’s colleagues this week.

“These findings were made in accordance with our policy, which required a confidential preliminary review by an inquiry committee, an investigation by a separate ad hoc faculty committee, oversight and recommendations by the university’s standing Committee on the Conduct of Research, and final decisions by the executive vice president for research and the provost,” Maya Tolstoy, dean of the Faculty of Arts and Sciences, wrote in an email to professors that was obtained by Inside Higher Ed.

Findings of research misconduct are generally “communicated to the public through retractions or corrections published in the scholarly literature,” Tolstoy wrote. “Where such a retraction is not feasible, the university may choose to notify the relevant community.”

Read the rest of this discussion piece

(China) A 10-year follow up of publishing ethics in China: what is new and what is unchanged (Papers: Katrina A. Bramstedt & Jun Xu | September 2019)

Posted by Admin on September 13, 2019
 

Abstract

Background
Organ donation and transplantation in China are ethically complex due to questionable informed consent and the use of prisoners as donors. Publishing works from China can be problematic. The objective of this study was to perform a 10-year follow up on Chinese journals active in donation and transplant publishing regarding the evolution of their publishing guidelines.

Methods
Eleven Chinese journals were analyzed for 7 properties: (1) ethics committee approval; (2) procedure consent; (3) publishing consent; (4) authorship criteria; (5) conflict of interest; (6) duplicate publication; and (7) data integrity. Results were compared with our 2008 study data. Additionally, open access status, impact factor, and MEDLINE-indexing were explored.

Results
Most journals heightened the ethical requirements for publishing, compared to the results of 2008. All 11 now require their published manuscripts to have data integrity. Ten of 11 require ethics committee approval and informed consent for the publication of research studies, whereas in the original study only 2 journals evidenced these requirements. Nine of 11 have criteria for authorship, require conflict of interest disclosure, and forbid duplicate publishing. None of the journals have a policy to exclude data that was obtained from unethical organ donation practices. Nine of 11 journals are MEDLINE-indexed but only 2 are open-access.

Conclusions
Most journals have improved their general ethical publishing requirements but none address unethical organ donation practices.

Keywords:
China; Informed consent; Organ donation; Publishing; Research ethics; Research integrity

Bramstedt, K. A. and Xu, J. (2019) A 10-year follow up of publishing ethics in China: what is new and what is unchanged. Research Integrity and Peer Review 4(17). https://doi.org/10.1186/s41073-019-0077-3
Publisher (Open Access): https://researchintegrityjournal.biomedcentral.com/articles/10.1186/s41073-019-0077-3

European universities dismal at reporting results of clinical trials – Nature (Nic Fleming | April 2019)

Posted by Admin on September 11, 2019
 

Analysis of 30 leading institutions found that just 17% of study results had been posted online as required by EU rules.

Failing to post the results of a clinical trial is not merely a technical breach: it wastes resources, places an unwarranted burden on volunteers and is a public health issue.  Does your institution follow up to check whether results have been reported?  Is action taken when they haven't?

Many of Europe’s major research universities are ignoring rules that require them to make public the results of clinical trials.

A report published on 30 April found that the results of only 162 of 940 clinical trials (17%) that were due to be published by 1 April had been posted on the European Union’s trials register. The 30 universities surveyed are those that sponsor the most clinical trials in the EU. Fourteen of these institutions had failed to publish a single results summary.

If three high-performing UK universities are excluded from the figures, the results of just 7% of the trials were made public on time. Campaigners say the resulting lack of transparency harms patients by undermining the efforts of doctors and health authorities to provide the best treatments, slows medical progress and wastes public funds.

Read the rest of this discussion piece

Why we shouldn’t take peer review as the ‘gold standard’ – The Washington Post (Paul D. Thacker and Jon Tennant | August 2019)

Posted by Admin on September 10, 2019
 

It’s too easy for bad actors to exploit the process and mislead the public

In July, India’s government dismissed a research paper finding that the country’s economic growth had been overestimated, saying the paper had not been “peer reviewed.” At a conference for plastics engineers, an economist from an industry group dismissed environmental concerns about plastics by claiming that some of the underlying research was “not peer reviewed.” And the Trump administration — not exactly known for its fealty to science — attempted to reject a climate change report by stating, incorrectly, that it lacked peer review.

Researchers commonly refer to peer review as the “gold standard,” which makes it seem as if a peer-reviewed paper — one sent by journal editors to experts in the field who assess and critique it before publication — must be legitimate, and one that’s not reviewed must be untrustworthy. But peer review, a practice dating to the 17th century, is neither golden nor standardized. Studies have shown that journal editors prefer reviewers of the same gender, that women are underrepresented in the peer review process, and that reviewers tend to be influenced by demographic factors like the author’s gender or institutional affiliation. Shoddy work often makes it past peer reviewers, while excellent research has been shot down. Peer reviewers often fail to detect bad research, conflicts of interest and corporate ghostwriting.

Meanwhile, bad actors exploit the process for professional or financial gain, leveraging peer review to mislead decision-makers. For instance, the National Football League used the words “peer review” to fend off criticism of studies by the Mild Traumatic Brain Injury Committee, a task force the league founded in 1994, which found little long-term harm from sport-induced brain injuries in players. But the New York Times later discovered that the scientists involved had omitted more than 100 diagnosed concussions from their studies. What’s more, the NFL’s claim that the research had been rigorously vetted ignored that the process was incredibly contentious: Some reviewers were adamant that the papers should not have been published at all.

Read the rest of this discussion piece
