Australasian Human Research Ethics Consultancy Services Pty Ltd (AHRECS) | ACN 101321555


AI Gaydar Study Gets Another Look – Inside Higher Ed (Colleen Flaherty | September 2017)



A prominent journal that already accepted a controversial study about using computers to “read” sexuality based on a photo is further scrutinizing the paper after intense public backlash.

Michal Kosinski, a psychologist and assistant professor of business at Stanford University, knew his new study about training a computer to recognize gays and lesbians by their photos would be controversial: so much so that he sat on the paper for months before submitting it for publication.

But while Kosinski expected backlash, he didn’t expect the journal that had already accepted his paper — a preliminary version of which has been widely viewed online — to do what it did this week: initiate another review of parts of the study, citing new concerns about ethics.

Read the rest of this discussion piece

Also see

Using AI to determine queer sexuality is misconceived and dangerous – The Conversation (Alex Sharpe and Senthorun Raj | September 2017)

How do we know if someone is gay? A recent Stanford University study has claimed that Artificial Intelligence (AI) using a facial recognition algorithm can more accurately guess whether a person is gay or lesbian than human beings can.

The study has proved controversial not because of our apparent mediocrity in the face of computer algorithms, but because of its dubious methodology – among other things, its exclusive focus on white subjects and its exclusion of bisexual, transgender, and intersex participants. It also highlights the dangers AI poses to the “outing” of sexual minorities against their will, exposing people to possible discrimination.

We strongly object to the use of an algorithmic “gaydar” to predict a person’s sexual orientation and believe studies such as this are both misconceived and pose very real and present dangers for LGBTQI human rights around the world.

Read the rest of this discussion piece

Also see

Journal Will Publish AI Gaydar Study After All – Inside Higher Ed (Colleen Flaherty | September 2017)

After some additional review, the Journal of Personality and Social Psychology will publish a controversial study about training a computer to predict someone’s sexual orientation based on a photo. An editor for the American Psychological Association-owned journal last week informed co-author Michal Kosinski, an assistant professor of business at Stanford University, that it would proceed with publishing the already accepted paper. That seemed somewhat up in the air earlier in the week, when the journal said it needed to address the “ethical status” of the project — namely, issues related to the copyright of publicly available photos and how Stanford’s Institutional Review Board had assessed the project.

Read the rest of this discussion piece
