Study Claiming AI Can Detect Sexual Orientation Cleared for Publication

Composite faces built by averaging faces classified as most and least likely to be gay by a computer. (Michal Kosinski and Yilun Wang/APA)

Update, Sept. 19: The American Psychological Association says that a controversial research paper that applied computer facial recognition to successfully guess people’s sexual orientation has passed a review of documentation submitted by the researchers. The paper is set to be published in the association’s peer-reviewed Journal of Personality and Social Psychology.

A spokesperson for the association said it completed the review last week. The association undertook the review to substantiate an institutional review board’s vetting of the study, which ensured that the study met ethical guidelines.

“Given the sensitive nature of photo-images used in the current study, we are currently taking this additional step with this as yet unpublished manuscript,” a spokesperson for the APA wrote to KQED last week, before the review was completed.

The research, by Michal Kosinski and Yilun Wang of Stanford University, claims that a computer algorithm bested humans in distinguishing between a gay person and a straight person when analyzing images from public profiles on a dating website. (Here is a preprint of the study; it’s not necessarily the final version of the paper.)

The research has drawn a firestorm of criticism from LGBTQ advocates and academics since it was first reported Sept. 9 by The Economist. Two gay rights groups, Human Rights Campaign and GLAAD, called the research, in a joint press release, “junk science.”

Analyzing Faces

Using basic facial-recognition technology, the researchers sifted through more than 300,000 public photos posted on a U.S. dating website (130,741 images of men and 170,360 of women, per the study), selecting for images that showed a single face large and clear enough to analyze. This left a pool of 35,326 pictures of 14,776 individuals. Gay and straight people, male and female, were represented evenly.

Software called VGG-Face analyzed the faces and looked for correlations between a person’s facial features (nose length, jaw width, etc.) and their self-declared sexual identity on the website. Using a model built from these distinguishing characteristics, the program, when shown one photo of a gay man and one of a straight man, was able to identify their sexual orientation 81 percent of the time. For women, the success rate was 71 percent. (Accuracy increased when the model was shown more than one image of a person.) Human guessers correctly identified straight faces and gay faces just 61 percent of the time for men and 54 percent for women.
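
To put the reported numbers in context: the 81 percent figure refers to a forced-choice task between one gay face and one straight face, not to scanning arbitrary faces in a crowd. Below is a minimal sketch of how that pairwise metric can be computed. The score distributions here are made-up stand-ins for illustration only (the paper’s actual pipeline fed VGG-Face features into a simple classifier):

    import random

    def forced_choice_accuracy(gay_scores, straight_scores, trials=100_000):
        """Accuracy on the paper's pairwise task: shown one gay face and
        one straight face, pick the one the model scores as more likely
        gay. Ties count as a coin flip; over many random pairs this
        converges to the ROC AUC of the scores."""
        correct = 0.0
        for _ in range(trials):
            g = random.choice(gay_scores)       # score for a random gay face
            s = random.choice(straight_scores)  # score for a random straight face
            if g > s:
                correct += 1.0
            elif g == s:
                correct += 0.5
        return correct / trials

    # Toy demo with made-up score distributions (not the paper's data):
    random.seed(0)
    gay_scores = [random.betavariate(5, 3) for _ in range(1_000)]
    straight_scores = [random.betavariate(3, 5) for _ in range(1_000)]
    print(f"pairwise accuracy: {forced_choice_accuracy(gay_scores, straight_scores):.2f}")

A score of 0.5 on this metric is chance. The paper’s reported 0.81 for men means its scores separate the two groups well on balanced pairs, which is a different claim from being able to pick out gay individuals in a general population.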

The researchers say in the paper that these results “provide strong support” for the prenatal hormone theory of gay and lesbian sexual orientation. The theory holds that underexposure or overexposure to prenatal androgens is a key determinant of sexual orientation.

In an authors’ note (last updated Sept. 13), the researchers discuss the study’s limitations at some length, including the narrow demographic characteristics of the individuals analyzed — white people who self-reported as gay or straight. They also express concerns about the implications of the study:

We were really disturbed by these results and spent much time considering whether they should be made public at all. We did not want to enable the very risks that we are warning against.

Recent press reports, however, suggest that governments and corporations are already using tools aimed at revealing intimate traits from faces. Facial images of billions of people are stockpiled in digital and traditional archives, including dating platforms, photo-sharing websites, and government databases. Profile pictures on Facebook, LinkedIn, and Google Plus are public by default. CCTV cameras and smartphones can be used to take pictures of others’ faces without their permission.

Critics of the research expressed concerns that it will lead to the very invasion of privacy the authors seek to warn against. HRC/GLAAD also criticized the limited demographic pool used by the researchers, the “superficial” nature of the characteristics analyzed in the model, and the way media have represented the study.

The authors responded angrily, calling the HRC/GLAAD press release premature and misleading. “They do a great disservice to the LGBTQ community by dismissing our results outright without properly assessing the science behind it, and hurt the mission of the great organizations that they represent,” they wrote. 

The researchers also stressed, in their authors’ note, that they did not invent the tools used. Rather, they applied internet-available software to internet-available data, with the goal of demonstrating the privacy risks inherent in artificially intelligent technologies.

“We studied existing technologies,” wrote Kosinski and Wang, “already widely used by companies and governments, to see whether they present a risk to the privacy of LGBTQ individuals.”

They added, “We were terrified to find that they do.”

Such tools present a special threat, said the authors, to the privacy and safety of gay men and women living under repressive regimes where homosexuality is illegal.

But other LGBT academics and writers did not accept this line of reasoning. Oberlin sociology professor Greggor Mattson wrote a takedown, published on his website, describing the study as “much less insightful than the researchers claim.” The authors’ discussion of their ethical concerns suffered from “stunning tone-deafness,” Mattson wrote.

At least one LGBT blogger, though, came to the researchers’ defense.

Alex Bollinger, writing at LGBTQ Nation, published a post titled “HRC and GLAAD release a silly statement about the ‘gay face’ study.”

“We should take a stance of curiosity instead of judgment,” Bollinger wrote.

“This is just one study that looked at one sample and said a few things. There will be more studies later on that will say other things. Let’s see how that all unfolds before deciding what the correct answer is.”

This post was edited Oct. 9 to specify the nature of the American Psychological Association’s review of the study.


  • musicandart

    This is basically phrenology, a pseudoscience that was discredited a long time ago. Time should not be wasted on discussing this except descriptively, and it certainly should not be applied to any evaluation of sexual orientation since to do so will needlessly offend people in the LGBT community and anyone who values their rights.

    • Peter

      What the authors demonstrated was a possible application of existing technology. They demonstrated that it works on a limited demographic. Whether or not you believe this is bunk is irrelevant.

      The authors outlined their thoughts perfectly. Just because you don’t want to use this technology to profile people’s sexual orientation doesn’t mean that other governments and industry groups won’t. Research like this thus builds a foundation for legal advocacy of privacy rights, rather than sweeping the problem under a rug of obscurantism.

    • Adrian Johnson

      Phrenology was based on an “expert” purporting to read character and intelligence by using his hands to feel bumps and contours of a subject’s skull.

      This study is based on finding similarities of proportions in photographs of the faces of self-identified gay or straight Caucasians. These proportions are at least partially due to strong or weak androgen (male hormone) exposure while children are developing in utero.

      It is a false comparison to liken AI face-recognition software with demonstrable accuracy to a pseudoscience like phrenology.
      We may not like certain facts, but we cannot deny that they are facts for political reasons.
      This topic needs a lot more study, but the initial findings justify it.

  • Images/photos used in the samples do not take into account:
    * Normally cycling women versus women using hormonal contraceptive methods.
    “Ovulatory cycle and changes in face width: How women can tell when other females are ovulating using clues in their face – and how they may then try to hide their partners from the ‘threat’ of these fertile ladies (Women’s attractiveness changes with estradiol and progesterone across the ovulatory cycle)”

    * Persons with facial surgery, or worse, Photoshopped/altered/retouched images (nose, lips, eyebrows), can introduce bias into the algorithm.

    Also from the original research:
    Facial images. We obtained facial images from public profiles posted on a U.S. dating website. We recorded 130,741 images of 36,630 men and 170,360 images of 38,593 women between the ages of 18 and 40, who reported their location as the U.S. Gay and heterosexual people were represented in equal numbers. Their sexual orientation was established based on the gender of the partners that they were looking for (according to their profiles).

    “Gay and heterosexual people were represented in equal numbers.”

    Perhaps it is a flaw: gay people are overrepresented in the sample. Perhaps gay people should be less than 10% of the sample.

    If, in the real world, heterosexual people make up 90% of any real, representative sample of a population, and only 10% are gay or bisexual, it is easy for any algorithm to achieve 90% accuracy at detecting heterosexual people: it only needs to label everyone heterosexual, and it will be wrong for just the remaining 10% (see the sketch after this thread).

  • Also, regarding the picture on this page: her face may be swollen by a pregnancy, or his/her face may be swollen due to subclinical hypothyroidism.
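
One commenter above argues that the balanced sample hides a base-rate problem, and the arithmetic is worth making concrete. Here is a minimal sketch in Python, using the commenter’s hypothetical 90/10 split (made-up numbers, not data from the study), showing why raw accuracy is uninformative on an unbalanced population and why the paper reported accuracy on balanced pairs instead:

    # Hypothetical population with the commenter's 90/10 split (not study data).
    population = ["straight"] * 900 + ["gay"] * 100

    # A trivial "classifier" that labels everyone straight:
    predictions = ["straight"] * len(population)

    correct = sum(p == truth for p, truth in zip(predictions, population))
    print(correct / len(population))  # 0.9 -- 90% accuracy, yet zero gay faces found

High accuracy on an unbalanced sample can therefore be trivial. The forced-choice setup sketched earlier avoids that particular pitfall, though it says nothing about false-positive rates in a real population.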

Author

Danielle Venton

Danielle Venton is a host and reporter for KQED.

Before joining KQED in 2015, Danielle was a staff reporter at KRCB in Sonoma County and a writer at WIRED in San Francisco. She is a 2011 graduate of the University of California at Santa Cruz’s science communications program, and has held internships at High Country News and the Monterey County Herald.
