Australasian Human Research Ethics Consultancy Services Pty Ltd (AHRECS)

Resource Library

Wildlife Cameras Are Accidentally Capturing Humans Behaving Badly – Nature (James Dinneen | November 2019)

Posted by Admin on November 25, 2019
 

Scientists face an ethical dilemma over what to do with their ‘human bycatch’

To study wildlife, Dr. Nyeema Harris, an assistant professor in the Ecology and Evolutionary Biology Department at the University of Michigan, uses camera traps — remotely triggered cameras that take pictures when they detect movement and body heat. Harris, a wildlife biologist, is not typically interested in humans, but sometimes they still end up in her photographs.

This is another example of researchers who may not be accustomed to thinking about human research ethics matters (in this case, wildlife researchers accidentally capturing images of people) and the question of how to inform their practice. It is a really useful and important discussion. The issues in play are no different from governments and others using CCTV, which they do without consent. We have created a somewhat artificial divide between research and real life, yet any useful research reflects and interacts with real life. In this case, the cameras are capturing some bad behaviour that is useful to know about and to act upon. The social good outweighs privacy rights. We should all be discussing this more.

Between 2016 and 2018, Harris led the first published camera trap survey ever conducted in Burkina Faso and Niger, originally conceived to focus on the critically endangered West African lion. But Harris ended up capturing so much human activity that she expanded the focus of her study to include how humans were using the area. Research on human activity in the wildlife preserve had typically relied on humans reporting their own actions, but with the cameras, Harris could see what they were actually doing. “The data emerged to be a really interesting story that I felt compelled to tell,” Harris says.

When camera traps inadvertently capture human activity, it’s called “human bycatch.” And according to a 2018 University of Cambridge study, Harris is far from the only researcher to have ended up with humans in the data. The study included a survey of 235 scientists across 65 countries about their experiences with human bycatch, and 90% of them reported capturing some images of people in their most recent projects. Even in studies conducted in remote nature reserves, meant to capture wildlife at its wildest, people showed up.

As in Harris’s study, this human data doesn’t always stay “bycatch.” Nearly half of respondents to the Cambridge survey said they had used images of people apparently involved in illegal activity to inform wildlife management efforts. Many of them had reported images to law enforcement, others to conservation staff, and some to the media. All this, despite only 8% of projects having set out to capture images of people.

Read the rest of this discussion piece

23andMe, moving beyond consumer DNA tests, is building a clinical trial recruitment business – STAT (Rebecca Robbins | September 2019)

Posted by Admin on November 23, 2019
 

SAN FRANCISCO — Consumer genetics giant 23andMe announced Thursday that it would move deeper into the business of clinical trial recruitment, partnering with a fast-growing startup to help match its customers with nearby study sites based on their diseases, demographics, and DNA.

This story touches on a tricky problem: the use, for recruitment purposes, of a service that people would have understood to be private and not intended for research.

The Silicon Valley company has for months been quietly making inroads into clinical trial recruitment by emailing customers who’ve opted in with recommendations about studies that might be appropriate for them. It has recruited for studies, both interventional and observational, in disease areas including Alzheimer’s, Parkinson’s, attention-deficit hyperactivity disorder, eczema, and liver disease, a spokesperson for the company confirmed.

But the new partnership with TrialSpark, which offers a tech-powered alternative to traditional contract research organizations, may help 23andMe address one of the biggest challenges in clinical trial recruitment: geography. The idea is that patients who want to enroll in a clinical trial centered out of, say, Memorial Sloan Kettering Cancer Center, won’t have to fly to New York and can instead participate by visiting their local doctor’s office.

Read the rest of this discussion piece

Can dynamic consent facilitate the protection of biomedical big data in biobanking in Malaysia? (Papers: Mohammad Firdaus Abdul Aziz & Aimi Nadia Mohd Yusof | May 2019)

Posted by Admin on October 20, 2019
 

Abstract
As with many other countries, Malaysia is also developing and promoting biomedical research to increase the understanding of human diseases and possible interventions. To facilitate this development, there is a significant growth of biobanks in the country to ensure continuous collection of biological samples for future research, which contain extremely important personal information and health data of the participants involved. Given the vast amount of samples and data accumulated by biobanks, they can be considered as reservoirs of precious biomedical big data. It is therefore imperative for biobanks to have in place regulatory measures to ensure ethical use of the biomedical big data. Malaysia has yet to introduce specific legislation for the field of biobanking. However, it can be argued that its existing Personal Data Protection Act 2010 (PDPA) has laid down legal principles that can be enforced to protect biomedical big data generated by the biobanks. Consent is a mechanism to enable data subjects to exercise their autonomy by determining how their data can be used and to ensure compliance with legal principles. However, there are two main concerns surrounding the current practice of consent in biomedical big data in Malaysia. First, it is uncertain that the current practice would be able to respect the underlying notion of autonomy, and second, it is not in accordance with the legal principles of the PDPA. Scholars have deliberated on different strategies of informed consent, and a more interactive approach has recently been introduced: dynamic consent. It is argued that a dynamic consent approach would be able to address these concerns.

Keywords
Biobanking, Autonomy, Data protection, Informed consent, Dynamic consent

Abdul Aziz, Mohammad Firdaus, and Aimi Nadia Mohd Yusof. 2019. Can dynamic consent facilitate the protection of biomedical big data in biobanking in Malaysia? Asian Bioethics Review 11(2): 1–14. https://doi.org/10.1007/s41649-019-00086-2
Publisher: https://link.springer.com/article/10.1007%2Fs41649-019-00086-2

(US) Google and the University of Chicago Are Sued Over Data Sharing – New York Times (Daisuke Wakabayashi | June 2019)

Posted by Admin on October 17, 2019
 

SAN FRANCISCO — When the University of Chicago Medical Center announced a partnership to share patient data with Google in 2017, the alliance was promoted as a way to unlock information trapped in electronic health records and improve predictive analysis in medicine.

On Wednesday, the University of Chicago, the medical center and Google were sued in a potential class-action lawsuit accusing the hospital of sharing hundreds of thousands of patients’ records with the technology giant without stripping identifiable date stamps or doctor’s notes.

The suit, filed in United States District Court for the Northern District of Illinois, demonstrates the difficulties technology companies face in handling health data as they forge ahead into one of the most promising — and potentially lucrative — areas of artificial intelligence: diagnosing medical problems.

Read the rest of this discussion piece
