Artificial intelligence can distinguish better than humans whether someone is gay or straight — but the "gaydar machine" could lead to unwanted cyber-outing, two researchers say.
The research — first reported last year and published in this month's Journal of Personality and Social Psychology — found that, using a single facial image, an A.I. "classifier" could correctly distinguish between gay and heterosexual men in 81 percent of cases, and between gay and heterosexual women in 71 percent of cases.
Human judges, on the other hand, were 61 percent correct for men, and 54 percent for women, the researchers found. Furthermore, the accuracy of the algorithm increased to 91 percent and 83 percent, respectively, given five facial images per person.
Stanford researchers Michal Kosinski and Yilun Wang sounded an alarm, however, about such cyber-outing, writing: "Given that companies and governments are increasingly using computer vision algorithms to detect people's intimate traits, our findings expose a threat to the privacy and safety of gay men and women."
The researchers' work has been fiercely criticized by LGBTQ organizations, the U.K.-based Guardian reported.
But the researchers' colleague, J.D. Schramm of Stanford's Graduate School of Business, wrote Monday that ignoring the findings would be disastrous.
"Kosinski and Wang's research looked only at gays and lesbians," Schramm wrote in a commentary for The Washington Post. "But I fear the greater risk may be to the most vulnerable in our community: the transgender individuals who may well carry identifiable facial features from the gender they were assigned at birth — and who are already at high risk for hate crimes."
"The advances in A.I. and machine learning make it increasingly difficult to hide such intimate traits as sexual orientation, political and religious affiliations, and even intelligence level," Schramm added. "The post-privacy future Kosinski examines in his research is upon us. Never has the work of eliminating discrimination been so urgent."
© 2023 Newsmax. All rights reserved.