Australasian Human Research Ethics Consultancy Services Pty Ltd (AHRECS) | ACN 101 321 555
Universities told to appoint research integrity ‘counsellors’ – Times Higher Education (Ellie Bothwell | January 2020)

Posted by Admin on February 10, 2020

But paper from League of European Research Universities says that anonymous reports of research misconduct should be avoided

Universities should appoint “confidential counsellors” at a faculty level to advise staff and students on research integrity issues, according to a new report from some of Europe’s leading institutions.

Australian institutions will be familiar with the value of Research Integrity Advisers and will hopefully also have a network of collegiate Research Ethics Advisers.

An advice paper published by the League of European Research Universities says it is “important that researchers are able to…obtain strictly confidential advice”, adding that “in many cases researchers face problems that they do not immediately want to share with their colleagues”.

This is particularly an issue when the researcher’s career is partly dependent on their colleague, such as in the case of a PhD student and their supervisor, it says.

Antoine Hol, professor of jurisprudence at Utrecht University, chair of the Leru research integrity group and co-author of the paper, said that it was relatively common for universities to have confidential counsellors or advisers at a university level, but it was important for institutions to make such appointments at a faculty level so that they are easily accessible and understand the specific culture that researchers might be facing.

Read the rest of this discussion piece

The battle for ethical AI at the world’s biggest machine-learning conference – Nature (Elizabeth Gibney | January 2020)

Posted by Admin on February 9, 2020

Bias and the prospect of societal harm increasingly plague artificial-intelligence research — but it’s not clear who should be on the lookout for these problems.

Diversity and inclusion took centre stage at one of the world’s major artificial-intelligence (AI) conferences in 2018. But at last month’s Neural Information Processing Systems (NeurIPS) conference in Vancouver, Canada, once a meeting with a controversial reputation, attention shifted to another big issue in the field: ethics.

If your institution is involved in AI, algorithm or big data research, who advises on its ethical dimensions? Given the potential for societal harm, perhaps it is time to seriously consider the need for research ethics review of such work.

The focus comes as AI research increasingly deals with ethical controversies surrounding the application of its technologies — such as in predictive policing or facial recognition. Issues include tackling biases in algorithms that reflect existing patterns of discrimination in data, and avoiding affecting already vulnerable populations. “There is no such thing as a neutral tech platform,” warned Celeste Kidd, a developmental psychologist at University of California, Berkeley, during her NeurIPS keynote talk about how algorithms can influence human beliefs. At the meeting, which hosted a record 13,000 attendees, researchers grappled with how to meaningfully address the ethical and societal implications of their work.

Ethics gap
Ethicists have long debated the impacts of AI and sought ways to use the technology for good, such as in health care. But researchers are now realizing that they need to embed ethics into the formulation of their research and understand the potential harms of algorithmic injustice, says Meredith Whittaker, an AI researcher at New York University and founder of the AI Now Institute, which seeks to understand the social implications of the technology. At the latest NeurIPS, researchers couldn’t “write, talk or think” about these systems without considering possible social harms, she says. “The question is, will the change in the conversation result in the structural change we need to actually ensure these systems don’t cause harm?”

Read the rest of this discussion piece

Friday afternoon’s funny – The specifics of recruitment

Posted by Admin on February 7, 2020

Cartoon by Don Mayne www.researchcartoons.com
Full-size image for printing (right-click and save the file)

When considering the ethical dimensions of a project’s recruitment strategy (e.g. as per Element 2 of Chapter 3.1 of the National Statement), specific details matter. Are the formatting, wording and placement of the recruitment text, and of whatever material accompanies it, appropriate for the potential participants? Is appropriate attention given to the experimental nature of the agent being trialled and to the project design?

Research intelligence: how to sniff out errors and fraud – Times Higher Education (Jack Grove | January 2020)

Posted by Admin on January 27, 2020

A growing number of data detectives are on the hunt for sloppy science and dodgy statistics. Jack Grove examines the methods they use

These days it is not just co-authors or peer reviewers who are checking journal papers for errors: a growing number of self-appointed fraud busters are scanning scientific literature for flaws.

This unpaid and mostly anonymous endeavour has led to the retractions of hundreds of papers and even disciplinary action where wrongdoing is exposed.

So how can scholars catch errors when reviewing others’ papers, or when double-checking their own work or that of collaborators?

Read the rest of this discussion piece
