ACN - 101321555 Australasian Human Research Ethics Consultancy Services Pty Ltd (AHRECS)

(China) A 10-year follow up of publishing ethics in China: what is new and what is unchanged (Papers: Katrina A. Bramstedt & Jun Xu | September 2019)

Posted by Admin on September 13, 2019


Organ donation and transplantation in China are ethically complex due to questionable informed consent and the use of prisoners as donors. Publishing works from China can be problematic. The objective of this study was to perform a 10-year follow up on Chinese journals active in donation and transplant publishing regarding the evolution of their publishing guidelines.

Eleven Chinese journals were analyzed for 7 properties: (1) ethics committee approval; (2) procedure consent; (3) publishing consent; (4) authorship criteria; (5) conflict of interest; (6) duplicate publication; and (7) data integrity. Results were compared with our 2008 study data. Additionally, open access status, impact factor, and MEDLINE-indexing were explored.

Most journals heightened the ethical requirements for publishing, compared to the results of 2008. All 11 now require their published manuscripts to have data integrity. Ten of 11 require ethics committee approval and informed consent for the publication of research studies, whereas in the original study only 2 journals evidenced these requirements. Nine of 11 have criteria for authorship, require conflict of interest disclosure, and forbid duplicate publishing. None of the journals have a policy to exclude data that was obtained from unethical organ donation practices. Nine of 11 journals are MEDLINE-indexed but only 2 are open-access.

Most journals have improved their general ethical publishing requirements but none address unethical organ donation practices.

China; Informed consent; Organ donation; Publishing; Research ethics; Research integrity

Bramstedt, K. A. and Xu, J. (2019) A 10-year follow up of publishing ethics in China: what is new and what is unchanged. Research Integrity and Peer Review 4(17)
Publisher (Open Access)

European universities dismal at reporting results of clinical trials – Nature (Nic Fleming | April 2019)

Posted by Admin on September 11, 2019

Analysis of 30 leading institutions found that just 17% of study results had been posted online as required by EU rules.

Failing to post the results of a clinical trial is not only a technical breach; it wastes resources, places an unwarranted burden on volunteers, and is a public health issue. Does your institution follow up to check whether results have been reported? Is action taken if they haven't been?

Many of Europe’s major research universities are ignoring rules that require them to make public the results of clinical trials.

A report published on 30 April found that the results of only 162 of 940 clinical trials (17%) that were due to be published by 1 April had been posted on the European Union’s trials register. The 30 universities surveyed are those that sponsor the most clinical trials in the EU. Fourteen of these institutions had failed to publish a single results summary.

If three high-performing UK universities are excluded from the figures, the results of just 7% of the trials were made public on time. Campaigners say the resulting lack of transparency harms patients by undermining the efforts of doctors and health authorities to provide the best treatments, slows medical progress and wastes public funds.

Read the rest of this discussion piece

Why we shouldn’t take peer review as the ‘gold standard’ – The Washington Post (Paul D. Thacker and Jon Tennant | August 2019)

Posted by Admin on September 10, 2019

It’s too easy for bad actors to exploit the process and mislead the public

In July, India’s government dismissed a research paper finding that the country’s economic growth had been overestimated, saying the paper had not been “peer reviewed.” At a conference for plastics engineers, an economist from an industry group dismissed environmental concerns about plastics by claiming that some of the underlying research was “not peer reviewed.” And the Trump administration — not exactly known for its fealty to science — attempted to reject a climate change report by stating, incorrectly, that it lacked peer review.

Researchers commonly refer to peer review as the “gold standard,” which makes it seem as if a peer-reviewed paper — one sent by journal editors to experts in the field who assess and critique it before publication — must be legitimate, and one that’s not reviewed must be untrustworthy. But peer review, a practice dating to the 17th century, is neither golden nor standardized. Studies have shown that journal editors prefer reviewers of the same gender, that women are underrepresented in the peer review process, and that reviewers tend to be influenced by demographic factors like the author’s gender or institutional affiliation. Shoddy work often makes it past peer reviewers, while excellent research has been shot down. Peer reviewers often fail to detect bad research, conflicts of interest and corporate ghostwriting.

Meanwhile, bad actors exploit the process for professional or financial gain, leveraging peer review to mislead decision-makers. For instance, the National Football League used the words “peer review” to fend off criticism of studies by the Mild Traumatic Brain Injury Committee, a task force the league founded in 1994, which found little long-term harm from sport-induced brain injuries in players. But the New York Times later discovered that the scientists involved had omitted more than 100 diagnosed concussions from their studies. What’s more, the NFL’s claim that the research had been rigorously vetted ignored that the process was incredibly contentious: Some reviewers were adamant that the papers should not have been published at all.

Read the rest of this discussion piece

Could a New Project Expose Predatory Conferences? – Technology Networks (Paul Killoran, Ex Ordo | September 2019)

Posted by Admin on September 9, 2019

By now, predatory conferences should be on your radar. These “scholarly” events are organized on a strictly for-profit basis, pay lip service to peer review, and publish almost anything sent their way — for a fee, of course. (An associate professor submitted a nuclear physics paper written using iOS autocomplete to one such conference. It passed review with flying colors.)

For years, shady individuals have been exploiting early-career researchers’ eagerness to publish. But unless you were desperate — or painfully naive — fake conferences were pretty easy to spot and avoid. Until now.

Effective predators adapt, and today’s breed of predatory conference is a much better mimic of the real deal. Their organizers are tech-savvy enough to create counterfeit websites that masquerade as those belonging to learned societies. I know of at least one medical association that had its conference website cloned by scammers and placed online at a web address that was just close enough to the real thing to be believable.

Read the rest of this discussion piece