ACN - 101321555 Australasian Human Research Ethics Consultancy Services Pty Ltd (AHRECS)
Facebook Says It Will Help Academics Use Its Data. Here’s How That’s Supposed to Work – The Chronicle of Higher Education (Nell Gluckman | April 2018)

Posted by Admin on April 19, 2018

There has never been a time when so much data existed about human behavior. What many of us buy, sell, like, dislike, read, and tell our friends is recorded on the internet thanks to sites like Facebook. To social scientists, the company is sitting on a gold mine.

Some of that information is public, but much is not, and the company’s reach is so vast most people don’t know how far it extends. Several research projects that use Facebook data have ended as high-profile privacy-breach scandals in part because subjects didn’t know they were being studied. In the most recent and possibly the largest data breach at the company, an academic harvested information about millions of Facebook users and shared it with Cambridge Analytica, a firm that advised the Trump campaign.

One might think that in the wake of that scandal, Facebook would lock academics out. That’s what Gary King, a political scientist at Harvard University who has pitched Facebook about opening up its data for research, expected. He met with Facebook officials right before the Cambridge Analytica news broke and, to his surprise, he got a call a few days later. They wanted him to study the company’s impact on elections.

Read the rest of this discussion piece

A real-life Lord of the Flies: the troubling legacy of the Robbers Cave experiment – The Guardian (David Shariatmadari | April 2017)

Posted by Admin on April 17, 2018

In the early 1950s, the psychologist Muzafer Sherif brought together a group of boys at a US summer camp – and tried to make them fight each other. Does his work teach us anything about our age of resurgent tribalism?
Read an extract from The Lost Boys

July 1953: late one evening in the woods outside Middle Grove, New York state, three men are having a furious argument. One of them, drunk, draws back his fist, ready to smash it into his opponent’s face. Seeing what is about to happen, the third grabs a block of wood from a nearby pile. “Dr Sherif! If you do it, I’m gonna hit you,” he shouts.

A useful example of work that not only fails modern ethical standards but whose results were cherry-picked and stage-managed. We note again our caution about using such cases to justify current human research ethics/research integrity arrangements. Also see James Kehoe’s recent post.

The man with the raised fist isn’t just anybody. He is one of the world’s foremost social psychologists, Muzafer Sherif. The two others are his research assistants. Sherif is angry because the experiment he has spent months preparing for has just fallen apart.
Born in the summer of 1905 and raised in İzmir province, Turkey, during the dying days of the Ottoman empire, Sherif won a place at Harvard to study psychology. But he found himself frustrated by the narrowness of the discipline, which mainly involved tedious observation of lab rats. He was drawn instead to the emerging field of social psychology, which looks at the way human behaviour is influenced by others. In particular, he became obsessed by group dynamics: how individuals band together to form cohesive units and how these units can find themselves at each other’s throats.

Read the rest of this discussion piece

‘Cult’ Universal Medicine practices promoted by researchers, UQ launches investigation – ABC News (Josh Robertson | Apr 2017)

Posted by Admin on April 17, 2018

Researchers who promoted an alleged cult and showcased its bizarre healing claims in published studies have embroiled one of Australia’s top universities in an academic misconduct probe.

The University of Queensland (UQ) and two international medical journals are investigating alleged ethical violations in research around Universal Medicine (UM), an organisation based in Lismore in New South Wales, which touts the healing power of “esoteric breast massage” and other unproven treatments.

Founded by Serge Benhayon — a former bankrupt tennis coach with no medical qualifications who claims to be the reincarnation of Leonardo Da Vinci — UM is a multi-million-dollar enterprise with 700 mostly female followers in 15 countries.

Read the rest of this news story

Algorithms Are Opinions Embedded in Code – Scholarly Kitchen (David Crotty | January 2018)

Posted by Admin on April 6, 2018

Recent discussions about peer review brought me back to thinking about Cathy O’Neil’s book, Weapons of Math Destruction, reviewed on this site in 2016. One of the complaints about peer review is that it is not objective — in fact, much of the reasoning behind the megajournal approach to peer review is meant to eliminate the subjectivity in deciding how significant a piece of research may be.

As algorithms play an increasing role in the design, conduct (including the collection and analysis of data), reporting and evaluation of research, it is essential to recognise that they can be built upon values and beliefs that distort the body of knowledge. These tools are often treated as more objective than entirely human-based techniques, yet sometimes not even their original coders understand how they work or the degree to which they echo very subjective attitudes.

I’m not convinced that judging a work’s “soundness” is any less subjective than judging its “importance”. Both are opinions, and how one rates a particular manuscript will vary from person to person. I often see papers in megajournals that are clearly missing important controls, but despite this, the reviewers and editor involved judged them to be sound. I’m not sure this is all that different from asking why some reviewer thought a paper was significant enough to be in Nature. Peer reviews, like letters of recommendation, are opinions.
Discussions along these lines inevitably lead to suggestions that with improved artificial intelligence (AI), we’ll reduce subjectivity through machine reading of papers and create a fairer system of peer review. O’Neil, in the TED Talk below, would argue that this is not likely to happen. Algorithms, she tells us, are not objective, true, or scientific and they do not make things fair. “That’s a marketing trick.”

Read the rest of this discussion piece