Australasian Human Research Ethics Consultancy Services Pty Ltd (AHRECS)

Resource Library: Psychology

Seven Costs of the Money Chase: How Academia’s Focus on Funding Influences Scientific Progress – APS (James McKeen | September 2017)

Posted by Admin on January 13, 2018
 

This essay is adapted from the article “Psychology’s Replication Crisis and the Grant Culture: Righting the Ship,” published as part of the Special Symposium on the Future of Psychological Science in the July 2017 issue of Perspectives on Psychological Science.

You may recall Willie Sutton, the thief who, when asked by a reporter why he robbed banks, purportedly replied, “because that’s where the money is.” Whether or not Sutton actually said this (he denied it), the Willie Sutton Principle makes a point self-evident to those familiar with the matching law: When organisms, including academicians, are confronted with two or more choices that differ substantially in reinforcement value (read: grant dollars), they will apportion more of their efforts to the alternative possessing the highest reinforcement value. This pattern of behavior is amplified when administrators impose incentives (e.g., tenure, promotions, awards, salary increases, resources) and penalties (e.g., threats of being denied tenure, loss of laboratory space) tied to the acquisition of grant dollars.
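For readers unfamiliar with the matching law invoked above, the sketch below restates it in a few lines: under strict matching, the share of effort devoted to each alternative equals that alternative's share of the total reinforcement value. It is purely illustrative; the function name and the grant figures are hypothetical and not drawn from the essay.

# Illustrative sketch of the matching law: effort is apportioned in
# proportion to reinforcement value. All figures are hypothetical.
def matching_allocation(reinforcement_values):
    """Predicted share of effort for each alternative under strict matching."""
    total = sum(reinforcement_values)
    return [value / total for value in reinforcement_values]

# Two hypothetical "choices" facing an academic: a $500k grant line vs. a $100k one.
grant_dollars = [500_000, 100_000]
print(matching_allocation(grant_dollars))  # -> [0.833..., 0.166...]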

As our field gradually rights the ship — addressing questionable research practices (QRPs) that have contributed to the replication crisis — we have been insufficiently proactive in confronting institutional obstacles that stand in the way of our scientific progress.

Read the rest of this discussion piece

AI Gaydar Study Gets Another Look – Inside Higher Ed (Colleen Flaherty | September 2017)

Posted by Admin on September 18, 2017
 

A prominent journal that had already accepted a controversial study about using computers to “read” sexuality based on a photo is further scrutinizing the paper after intense public backlash.

Michal Kosinski, a psychologist and assistant professor of business at Stanford University, knew his new study about training a computer to recognize gays and lesbians by their photos would be controversial: so much so that he sat on the paper for months before submitting it for publication.

But while Kosinski expected backlash, he didn’t expect the journal that had already accepted his paper — a preliminary version of which has been widely viewed online — to do what it did this week: initiate another review of parts of the study, citing new concerns about ethics.

Read the rest of this discussion piece

Also see

Using AI to determine queer sexuality is misconceived and dangerous – The Conversation (Alex Sharpe and Senthorun Raj | September 2017)

How do we know if someone is gay? A recent Stanford University study has claimed that Artificial Intelligence (AI) using a facial recognition algorithm can more accurately guess whether a person is gay or lesbian than human beings can.

The study has proved controversial not because of our apparent mediocrity in the face of computer algorithms, but because of its dubious methodology – among other things, its exclusive focus on white subjects and its exclusion of bisexual, transgender, and intersex participants. It also highlights the dangers AI poses of “outing” sexual minorities against their will, exposing people to possible discrimination.

We strongly object to the use of an algorithmic “gaydar” to predict a person’s sexual orientation and believe studies such as this are both misconceived and pose very real and present dangers for LGBTQI human rights around the world.

Read the rest of this discussion piece

Also see

Journal Will Publish AI Gaydar Study After All – Inside Higher Ed (Colleen Flaherty | September 2017)

After some additional review, the Journal of Personality and Social Psychology will publish a controversial study about training a computer to predict someone’s sexual orientation based on a photo. An editor for the American Psychological Association-owned journal last week informed co-author Michal Kosinski, an assistant professor of business at Stanford University, that it would proceed with publishing the already accepted paper. That seemed somewhat up in the air earlier in the week, when the journal said it needed to address the “ethical status” of the project — namely issues related to copyright of publicly available photos and how Stanford’s Institutional Review Board had assessed the project.

Read the rest of this discussion piece

Five reasons blog posts are of higher scientific quality than journal articles – The 20% Statistician (Daniel Lakens | April 2017)

Posted by Admin on June 29, 2017
 

The Dutch toilet cleaner ‘WC-EEND’ (literally: ‘Toilet Duck’) aired a famous commercial in 1989 that had the slogan ‘We from WC-EEND advise… WC-EEND’. It is now a common saying in The Netherlands whenever someone gives an opinion that is clearly aligned with their self-interest. In this blog, I will examine the hypothesis that blogs are, on average, of higher quality than journal articles. Below, I present 5 arguments in favor of this hypothesis. [EDIT: I’m an experimental psychologist. Mileage of what you’ll read below may vary in other disciplines].

1. Blogs have Open Data, Code, and Materials

When you want to evaluate scientific claims, you need access to the raw data, the code, and the materials. Most journals do not (yet) require authors to make their data publicly available (whenever possible). The worst case example when it comes to data sharing is the American Psychological Association. In the ‘Ethical Principles of Psychologists and Code of Conduct’ of this professional organization that supported torture, point 8.14 says that psychologists only have to share data when asked to by ‘competent professionals’ for the goal to ‘verify claims’, and that these researchers can charge money to compensate any costs that are made when they have to respond to a request for data. Despite empirical proof that most scientists do not share their data when asked, the APA considers this ‘ethical conduct’. It is not. It’s an insult to science. But it’s the standard that many relatively low quality scientific journals, such as the Journal of Experimental Psychology: General, hide behind to practice closed science.

Read the rest of this discussion piece

Some Social Scientists Are Tired of Asking for Permission – The New York Times (Kate Murphy | May 2017)

Posted by Admin on May 25, 2017
 

Sometimes a change to national policy isn’t enough to alter institutional practice – especially when that practice has been entrenched for a few decades and is wrapped in institutional risk. This New York Times story highlights why there’s so much chatter around the change to the US ‘Common Rule’.

If you took Psychology 101 in college, you probably had to enroll in an experiment to fulfill a course requirement or to get extra credit. Students are the usual subjects in social science research — made to play games, fill out questionnaires, look at pictures and otherwise provide data points for their professors’ investigations into human behavior, cognition and perception.

But who gets to decide whether the experimental protocol — what subjects are asked to do and disclose — is appropriate and ethical? That question has been roiling the academic community since the Department of Health and Human Services’ Office for Human Research Protections revised its rules in January.

The revision exempts from oversight studies involving “benign behavioral interventions.” This was welcome news to economists, psychologists and sociologists who have long complained that they need not receive as much scrutiny as, say, a medical researcher.

Read the rest of this discussion piece
