LGBT Groups Blast Study Claiming AI Can Determine Sexual Orientation from Photos

From EDGE Media Network

Responding to reports of a Stanford University study claiming that artificial intelligence (AI) can be used to detect sexual orientation, LGBTQ media advocacy group GLAAD and LGBTQ civil rights organization the Human Rights Campaign on Friday called on Stanford University and responsible media outlets to expose what they label “dangerous and flawed research” that could cause harm to LGBTQ people around the world.

The controversy surrounds a Stanford University professor who published a research study claiming that artificial intelligence (AI) can be used to detect sexual orientation through facial recognition.

The study claims that gay men and women tend to have “gender-atypical” features, expressions and “grooming styles.” This basically means gay men appeared more feminine and gay women more masculine. The data also found trends among gay men: They tended to have narrower jaws, longer noses and larger foreheads compared to straight men. Gay women apparently had larger jaws and smaller foreheads compared to straight women.

GLAAD and HRC called on media outlets that have covered the study, or plan to cover it, to include the myriad flaws in the study’s methodology in their reporting. Both groups cited the study’s “inaccurate assumptions” and the fact that it excluded non-white subjects. Moreover, GLAAD and HRC pointed out that the study was not peer reviewed.

“Technology cannot identify someone’s sexual orientation. What their technology can recognize is a pattern that found a small subset of out white gay and lesbian people on dating sites who look similar. Those two findings should not be conflated,” said Jim Halloran, GLAAD’s Chief Digital Officer. “This research isn’t science or news, but it’s a description of beauty standards on dating sites that ignores huge segments of the LGBTQ community, including people of color, transgender people, older individuals, and other LGBTQ people who don’t want to post photos on dating sites.”

“At a time where minority groups are being targeted, these reckless findings could serve as a weapon to harm both heterosexuals who are inaccurately outed, as well as gay and lesbian people who are in situations where coming out is dangerous,” Halloran continued.

HRC Director of Public Education and Research Ashland Johnson said:

This is dangerously bad information that will likely be taken out of context, is based on flawed assumptions, and threatens the safety and privacy of LGBTQ and non-LGBTQ people alike. Imagine for a moment the potential consequences if this flawed research were used to support a brutal regime’s efforts to identify and/or persecute people they believed to be gay. Stanford should distance itself from such junk science rather than lending its name and credibility to research that is dangerously flawed and leaves the world — and in this case, millions of people’s lives — worse and less safe than before.

Among the flaws in the research pointed out by GLAAD and HRC are:

* The study did not look at any non-white individuals.

* The study did not independently verify crucial information including age and sexual orientation, and took at face value information appearing online.

* The study was not peer reviewed.

* The study assumed there was no difference between sexual orientation and sexual activity, which is incorrect.

* The study assumed there were only two sexual orientations — gay and straight — and did not address bisexual individuals.

* The study only looked at out gay men and women who are white, of a certain age, and are on dating sites. It is not surprising that gay people (out, white, similar age) who choose to go on dating sites post photos of themselves with similar expressions and hairstyles (one of the characteristics according to the study).

* The research states: “Outside the lab, the accuracy rate would be much lower” (the lab here meaning certain dating sites), and the method is 10 points less accurate for women. The study claims to detect gay men from the pool of photos on the dating sites with 81% accuracy. Even if this figure were accepted despite the aforementioned flaws, it would still mean that heterosexual men could be identified as gay nearly 20% of the time (see the illustrative calculation after this list).

* The study reviewed superficial characteristics in the photos of out gay men and women on dating sites such as weight, hairstyle and facial expression.
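The arithmetic behind the accuracy point above can be made concrete. The following is a minimal, hypothetical sketch, not drawn from the study or the organizations’ statement; it simply follows the statement’s own reasoning by reading the reported 81% accuracy as a roughly 19% per-photo error rate and applying it to an assumed group of 1,000 heterosexual men. The group size is invented purely for illustration.

```python
# Hypothetical back-of-the-envelope illustration of the claim above.
# The 81% figure is the accuracy cited in the article; the population
# size is an assumption made only for the sake of the arithmetic.

accuracy = 0.81            # accuracy reported for classifying gay men from dating-site photos
error_rate = 1 - accuracy  # roughly 19% of classifications would be wrong

heterosexual_men = 1_000   # assumed group the tool is applied to
falsely_flagged = heterosexual_men * error_rate

print(f"Error rate: {error_rate:.0%}")
print(f"Of {heterosexual_men} heterosexual men, roughly {falsely_flagged:.0f} "
      f"could be misidentified as gay.")
```

Under these assumptions the script prints an error rate of about 19% and roughly 190 misidentified men out of 1,000, which is the “nearly 20% of the time” figure the groups point to.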

“Stanford University and the researchers hosted a call with GLAAD and HRC several months ago in which we raised these myriad concerns and warned against overinflating the results or the significance of them,” the LGBTQ organizations’ joint statement read. “There was no follow-up after the concerns were shared and none of these flaws have been addressed.”

“Based on this information, media headlines that claim AI can tell if someone is gay by looking at one photo of your face are factually inaccurate,” GLAAD and HRC concluded.
