ACN - 101321555 Australasian Human Research Ethics Consultancy Services Pty Ltd (AHRECS)

Resource Library


‘Silicon Valley is ethically lost’: Google grapples with reaction to its new ‘horrifying’ and uncanny AI tech – Financial Post (Mark Bergen | May 2018)

Posted by Admin on June 8, 2018
 

The most talked-about, futuristic product from Google’s developer show isn’t even finished yet — and Google hasn’t agreed how to do it.

After watching the demo you might be left wondering: How long until large-scale telephone surveys are conducted by digital assistants? How should we handle disclosure/deception? Should the assistant be named in the research output?

At its I/O conference on Tuesday, Alphabet Inc.’s Google previewed Duplex, an experimental service that lets its voice-based digital assistant book appointments on its own. It was part of a slate of features, such as automated writing in emails, where Google touted how its artificial intelligence technology saves people time and effort. In a demonstration on stage, the Google Assistant spoke with a hair salon receptionist, mimicking the “ums” and “hmms” pauses of human speech. In another demo, it chatted with a restaurant employee to book a table. The audience of software coders cheered.

Outside the Google technology bubble, critics pounced. The company is placing robots in conversations with humans without those people realizing it. The obvious question soon followed: should AI software that’s smart enough to trick humans be forced to disclose itself? Google executives don’t have a clear answer yet. Duplex emerged at a sensitive time for technology companies, and the feature hasn’t helped alleviate questions about their growing power over data, automation software and the consequences for privacy and work.

Read the rest of this discussion piece

The Case of the Girl from La Noria: Implications for Ethics in Research with Human Remains – Etilmercurio (Special Guest | April 2018)

Posted by Admin on June 5, 2018
 

A recent article describing the whole-genome sequencing of a body of alleged «extraterrestrial» origin according to UFO organisations (1), journalists (2), and other media outlets (3), has initiated an important controversy regarding adherence to scientific, legal and ethical standards for studies involving human skeletal remains. This controversy began with the commentary published by Etilmercurio (4), which was followed by press reports (5,6,7), public statements released by local and international scientific organisations (8,9,10), the authors of the original article (11), and the journal where it was published (12).

Further commentary on the archaeological project that prompted a UFO conspiracy theory and a media storm. Do your institution’s guidelines speak to such projects (including the legal frameworks of the source country)? We’ve included a link to an earlier item about this case.

The basic issues raised by researchers questioning the article are clearly summarised in a tweet by Professor Tom Higham (School of Archaeology, University of Oxford, UK): «Accepting a human sample sent via TV film crew from a private owner in Spain; not seen or viewed by them – without any checks for provenance or permission, let alone ethical considerations… what were they thinking?». This is exactly what the authors (Nolan and Butte) claimed in their statement, as part of their argument disavowing responsibility, without acknowledging that this lack of involvement is itself the root of the problem.

In their statement (11), the authors attempted to rebut these claims, pointing to the absence of criticism or legal action from the Chilean press and authorities when the remains first became the subject of public attention in 2013. Moreover, they claim to have followed U.S. regulations in this regard, while completely ignoring Chilean law.

Read the rest of this discussion piece

Why detailed retraction notices are important (according to economists) – Retraction Watch (Alison McCook | March 2018)

Posted by Admin on June 4, 2018
 

When journals retract a paper but don’t explain why, what should readers think? Was the problem as simple as an administrative error by the publisher, or more concerning, like fraud? In a recent paper in Research Policy, economists led by Adam Cox at the University of Portsmouth, UK, analyzed 55 retractions from hundreds of economics journals, mostly issued between 2001 and 2016. (Does that number sound low? It should — a 2012 analysis of retractions in business and economics found they are a relatively rare occurrence.) In the new paper, Cox and his colleagues analyzed how many notices failed to provide detailed information, the potential costs of these information gaps, and what journals should do about it.

Retraction Watch: You used “rational crime theory” to analyze retraction notices and their consequences for offenders in economics. Could you explain briefly how rational crime theory works in this context?

Adam Cox: Rational crime theory is a framework for explaining why an individual may commit a crime. This involves an (implicit) cost-benefit analysis by (prospective) perpetrators of crime, or in our case, (prospective) perpetrators of research impropriety. If the benefits exceed the costs then a rational individual may be tempted to participate in the crime.

Read the rest of this interview

New tool looks for signs of image doctoring – Retraction Watch interview (Alison McCook | March 2018)

Posted by Admin on June 1, 2018
 

One of the most common reasons for retractions is image manipulation. When searching for evidence of it, researchers often rely on what their eyes tell them. But what if screening tools could help? Last week, researchers described a new automated tool to screen images for duplication (reported by Nature News); with help from publishing giant Elsevier, another group at Harvard Medical School is developing a different approach. We spoke with creators Mary Walsh, Chief Scientific Investigator in the Office for Professional Standards and Integrity, and Daniel Wainstock, Associate Director of Research Integrity, about how the tool works, and why — unlike the other recently described automated tool — they want to make theirs freely available.

Retraction Watch: What prompted you to develop this tool?

Mary Walsh and Daniel Wainstock: When reviewing concerns that two published images, representing the results of different experiments, might actually be the same, we typically assess whether the images are too similar to derive from different samples. The answer is often obvious to the naked eye, but not always, and we wanted to determine if it was possible to quantify the similarities.

Read the rest of the interview
