ACN - 101321555 Australasian Human Research Ethics Consultancy Services Pty Ltd (AHRECS)
The Secretive Company That Might End Privacy as We Know It – New York Times (Kashmir Hill | January 2020)

Posted by Admin on January 26, 2020

A little-known start-up helps law enforcement match photos of unknown people to their online images — and “might lead to a dystopian future or something,” a backer says.

Until recently, Hoan Ton-That’s greatest hits included an obscure iPhone game and an app that let people put Donald Trump’s distinctive yellow hair on their own photos.

This is another piece of work that almost certainly never went anywhere near a research ethics committee, but it will have massive ethical implications for society.

Then Mr. Ton-That — an Australian techie and onetime model — did something momentous: He invented a tool that could end your ability to walk down the street anonymously, and provided it to hundreds of law enforcement agencies, ranging from local cops in Florida to the F.B.I. and the Department of Homeland Security.

His tiny company, Clearview AI, devised a groundbreaking facial recognition app. You take a picture of a person, upload it and get to see public photos of that person, along with links to where those photos appeared. The system — whose backbone is a database of more than three billion images that Clearview claims to have scraped from Facebook, YouTube, Venmo and millions of other websites — goes far beyond anything ever constructed by the United States government or Silicon Valley giants.

Read the rest of this discussion piece

Reasoning “Uncharted Territory”: Notions of Expertise Within Ethics Review Panels Assessing Research Use of Social Media (Papers: Chelsea Sellers, et al | February 2019)

Posted by Admin on January 5, 2020
 

Abstract

Of late, AHRECS has been asked numerous times to conduct professional development for HREC members and in-meeting briefings on online research. But there is a broader point here about committee expertise in project design. For example, if an HREC does not have access to at least one member who has run a clinical trial or been involved in trials groups and trial management, then it has insufficient expertise to review trials. When preparing a meeting agenda, the Chair and Secretary should discuss the committee's relevant expertise and needs.

The fast-changing field of social media (SM) research presents unique challenges for research ethics committees (RECs). This article examines notions of experience and expertise in the context of REC members reviewing proposals for SM research and considers the role of the RECs in this area of review. We analyze 19 interviews with REC members to highlight that a lack of personal and professional experience of SM, compounded by a lack of institutional and professional guidelines, means many REC members feel they do not possess sufficient expertise to review SM research. This view was supported by 14 interviews with SM researchers. REC members drew on strategies to overcome their lack of experience, although most SM researchers still found this problematic, to varying degrees. We recommend several steps to ensure REC expertise in SM research keeps pace with this fast-developing field, taking a pro-active, dialogic approach.

Keywords
social media, research ethics committee, ethics, experience, expertise

Sellers, C., Samuel, G. and Derrick, G. (2019). Reasoning “Uncharted Territory”: Notions of Expertise Within Ethics Review Panels Assessing Research Use of Social Media. Journal of Empirical Research on Human Research Ethics, Special Issue: Ethical Issues in Social Media Research.
Publisher (Open Access): https://journals.sagepub.com/doi/pdf/10.1177/1556264619837088

(US) Rounding up the Belmont Report Retrospectives – Amp@sand (May 2019)

Posted by Admin on June 27, 2019
 

Last month brought the 40th anniversary of the publishing of the Belmont Report, and along with that milestone came a reflection on how its values, conclusions, and imperatives have changed in the intervening years. A celebration of its durability has been accompanied by a necessary reckoning with the ways that a 40-year-old document may be ill-equipped to process the ethical issues brought about by technological, cultural, and political changes. Here, we’ve gathered a range of resources that look back on 40 years of the Belmont Report.

Safeguards for human studies can’t cope with big data
Nature
This provocative piece explores the ways in which the Belmont Report is insufficient for dealing with revolutionary digital technologies, arguing that “data science overlooks risks to human participants by default” and that it is “past time for a Belmont 2.0.” That new summit, the author argues, would need to engage with the currently “poorly understood risks and harms” that big data research poses to humans.

A Belmont Report for Health Data (abstract available)
The New England Journal of Medicine
HIPAA offers robust protection of a limited range of data, but in 2019, the demands on humans’ health data come from far more directions than the 1996 legislation could anticipate. The authors of this NEJM piece call for a coordinated expansion of the scope of ethical review of the gathering, use, and manipulation of health data to account for sources such as “social-media platforms, health and wellness apps, smartphones [and] life insurers,” citing concerns about reidentification of deidentified data, discrimination, health profiling, and more.

Read the rest of this discussion piece

(US) Safeguards for human studies can’t cope with big data – Nature (Nathaniel Raymond | April 2019)

Posted by Admin on April 19, 2019
 

Forty years on from a foundational report on how to protect people participating in research, cracks are showing, warns Nathaniel Raymond.

One of the primary documents aiming to protect human research participants was published in the US Federal Register 40 years ago this week. The Belmont Report was commissioned by Congress in the wake of the notorious Tuskegee syphilis study, in which researchers withheld treatment from African American men for years and observed how the disease caused blindness, heart disease, dementia and, in some cases, death.

This item obviously relates very specifically to the origins of the US human research ethics arrangements and the operation of IRBs, but the questions it poses are salient to Australasia. The oft-repeated assertion that “the information is already published, and so is in the public domain, and so is exempt” is no longer helpful. We have provided a list of related items.

The Belmont Report lays out core principles now generally required for human research to be considered ethical. Although technically governing only US federally supported research, its influence reverberates across academia and industry globally. Before academics with US government funding can begin research involving humans, their institutional review boards (IRBs) must determine that the studies comply with regulation largely derived from a document that was written more than a decade before the World Wide Web and nearly a quarter of a century before Facebook.

It is past time for a Belmont 2.0. We should not be asking those tasked with protecting human participants to single-handedly identify and contend with the implications of the digital revolution. Technological progress, including machine learning, data analytics and artificial intelligence, has altered the potential risks of research in ways that the authors of the first Belmont report could not have predicted. For example, Muslim cab drivers can be identified from patterns indicating that they stop to pray; the Ugandan government can try to identify gay men from their social-media habits; and researchers can monitor and influence individuals’ behaviour online without enrolling them in a study.

Read the rest of this discussion piece
