Reply to de Winter and Dodou (2014): Growing bias and the hierarchy are actually supported, despite different design, errors, and disconfirmation-biases (Papers: Daniele Fanelli | 2014)

Posted by Admin on April 23, 2017
 

I appreciate the efforts that de Winter and Dodou (2014) have put into replicating and challenging claims made by Fanelli (2010, 2012), as well as those of Pautasso (2010). This is how all sciences should make progress, and it is therefore both a duty and an honour to respond to this challenge.

The results presented are largely in agreement with claims by Fanelli (2012 and 2010), but this fact is obfuscated by a somewhat selective interpretation of findings, reinforced by differences in study design, and major flaws in the sampling and analytical design.

FLAWS IN INTERPRETATION:

1) Fanelli (2012) claimed that negative results are disappearing as a percentage of the literature, which is exactly what is found here. Even de Winter and Dodou (2014) quote Fanelli (2012) as using percentage figures, so I am quite baffled as to why they consider their results at odds with mine. For the record, the absolute number of negative results in Fanelli (2012) did not show a decline, and it was never claimed in the paper that it did.
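To see why these two claims are compatible: when the total number of papers grows quickly enough, the share of negative results can fall even while their absolute count rises. A minimal illustration with made-up figures (not data from either paper):

```python
# Hypothetical counts, purely for illustration (not data from Fanelli 2012
# or de Winter and Dodou 2014).
totals = {"1990": 1000, "2010": 4000}      # total papers published
negatives = {"1990": 300, "2010": 600}     # papers reporting negative results

for year in totals:
    share = 100 * negatives[year] / totals[year]
    print(f"{year}: {negatives[year]} negative out of {totals[year]} papers ({share:.0f}%)")

# The percentage halves (30% -> 15%) even though the absolute number of
# negative results doubles (300 -> 600).
```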

Fanelli D (2014) Reply to de Winter and Dodou (2014): Growing bias and the hierarchy are actually supported, despite different design, errors, and disconfirmation-biases. PeerJ commentary (not peer reviewed). This is the original preprint the comment replies to.
Access this preprint

Conservative Tests under Satisficing Models of Publication Bias (Papers: Justin McCrary, et al | 2016)

Posted by Admin on April 22, 2017
 

Abstract
Publication bias leads consumers of research to observe a selected sample of statistical estimates calculated by producers of research. We calculate critical values for statistical significance that could help to adjust after the fact for the distortions created by this selection effect, assuming that the only source of publication bias is file drawer bias. These adjusted critical values are easy to calculate and differ from unadjusted critical values by approximately 50%—rather than rejecting a null hypothesis when the t-ratio exceeds 2, the analysis suggests rejecting a null hypothesis when the t-ratio exceeds 3. Samples of published social science research indicate that on average, across research fields, approximately 30% of published t-statistics fall between the standard and adjusted cutoffs.
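The roughly 50% larger cutoff can be reproduced under the simplest version of the file drawer assumption: if only results with |t| above the nominal cutoff get published, keeping a 5% false-positive rate among published results under the null requires a larger threshold. A minimal sketch in Python with SciPy (illustrative only; the paper's satisficing model is more general):

```python
from scipy.stats import norm

alpha = 0.05                          # nominal two-sided significance level
t_pub = norm.ppf(1 - alpha / 2)       # ~1.96: assumed publication threshold (file drawer bias)

# Under H0, only |t| > t_pub gets published. To keep the false-positive rate
# among published results at alpha, solve P(|t| > c | |t| > t_pub, H0) = alpha:
c_adj = norm.ppf(1 - alpha * (1 - norm.cdf(t_pub)))

print(f"unadjusted cutoff: {t_pub:.2f}")   # ~1.96
print(f"adjusted cutoff:   {c_adj:.2f}")   # ~3.02, roughly 50% larger
```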

McCrary J, Christensen G, Fanelli D (2016) Conservative Tests under Satisficing Models of Publication Bias. PLoS ONE 11(2): e0149590. https://doi.org/10.1371/journal.pone.0149590
Publisher: http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0149590

Academic Research in the 21st Century: Maintaining Scientific Integrity in a Climate of Perverse Incentives and Hypercompetition (Papers: Marc A. Edwards and Siddhartha Roy | 2017)

Posted by Admin on April 21, 2017
 

We argue that over the last 50 years, incentives for academic scientists have become increasingly perverse in terms of competition for research funding, development of quantitative metrics to measure performance, and a changing business model for higher education itself. Furthermore, decreased discretionary funding at the federal and state level is creating a hypercompetitive environment between government agencies (e.g., EPA, NIH, CDC), for scientists in these agencies, and for academics seeking funding from all sources. The combination of perverse incentives and decreased funding increases pressures that can lead to unethical behavior. If a critical mass of scientists become untrustworthy, a tipping point is possible in which the scientific enterprise itself becomes inherently corrupt and public trust is lost, risking a new dark age with devastating consequences to humanity. Academia and federal agencies should better support science as a public good, and incentivize altruistic and ethical outcomes, while de-emphasizing output.

Edwards, Marc A. and Roy, Siddhartha. Academic Research in the 21st Century: Maintaining Scientific Integrity in a Climate of Perverse Incentives and Hypercompetition. Environmental Engineering Science. January 2017, 34(1): 51-61. doi:10.1089/ees.2016.0223.
Publisher: http://online.liebertpub.com/doi/abs/10.1089/ees.2016.0223

Fraud by bone researcher takes down two meta-analyses, a clinical trial, and review – Retraction Watch (Victoria Stern | April 2017)

Posted by Admin on April 20, 2017
 

The troubles continue for a bone researcher, who’s lost multiple papers in recent months due to problems ranging from data issues to including authors without their consent.

Now, journals have retracted two more papers by Yoshihiro Sato. And in a sign of the downstream effects that fraud can have, another journal has retracted two meta-analyses by other authors that cited his work.

Earlier this month, the journal Current Medical Research and Opinion retracted the two meta-analyses because they were based on recently retracted papers by Sato, affiliated with Mitate Hospital. The two new retractions of Sato’s papers are a review and a randomized controlled trial.

Read the rest of this news story
