If given five photos, the computer program can correctly guess the sexual orientation of a person 91% of the time.
A newly created artificial intelligence program can determine whether someone is gay or straight from a picture of their face with surprising accuracy.
A new study from Stanford University found that a computer algorithm could correctly determine a person's sexual orientation from photos of their face 91% of the time when given five photos of the subject.
If the program was only given a single photo, it could still correctly guess the sexual orientation of a male subject 81% of the time and a female subject 74% of the time. These results were compared to human judges, who were able to correctly guess the sexual orientation of men 61% of the time and of women 54% of the time.
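The jump from 81% accuracy on one photo to 91% on five can be illustrated with simple vote aggregation. The sketch below is not the study's actual procedure: it simulates a hypothetical single-photo classifier that is right 81% of the time and takes a majority vote across five photos of the same subject. It also assumes the photos are independent, which real photos of one person are not, so the simulated gain overstates what a real system would see.

```python
import random

random.seed(1)

# Assumed single-photo accuracy, taken from the study's reported
# one-photo figure for male subjects.
SINGLE_PHOTO_ACCURACY = 0.81

def classify_photo(true_label):
    """Return a (possibly wrong) guess from one hypothetical photo."""
    correct = random.random() < SINGLE_PHOTO_ACCURACY
    return true_label if correct else 1 - true_label

def classify_subject(true_label, n_photos=5):
    """Majority vote across several photos of the same subject."""
    votes = [classify_photo(true_label) for _ in range(n_photos)]
    return 1 if sum(votes) > n_photos / 2 else 0

trials = 20000
correct = sum(classify_subject(1) == 1 for _ in range(trials))
print(correct / trials)  # noticeably higher than the 0.81 single-photo rate
```

Under the (unrealistic) independence assumption, majority voting over five photos pushes a 0.81-accurate classifier into the mid-90s, which is in the same ballpark as the study's reported multi-photo gain.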
Lead researchers Michal Kosinski and Yilun Wang developed a program that used a deep neural network to extract features from facial images. They pulled 35,000 facial images from a popular US online dating site and analyzed them alongside the sexual orientation information that users had provided on the site.
They then ran the extracted features through a logistic regression model aimed at classifying sexual orientation. The model learned how the facial features correlate with different sexual orientations.
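The two-stage pipeline described here, network-extracted features fed into a logistic regression, can be sketched roughly as follows. Everything in this example is illustrative: the "embeddings" are randomly generated stand-ins for what a deep network would output, the labels are synthetic, and the training loop is a minimal gradient-descent logistic regression rather than the study's actual code.

```python
import math
import random

random.seed(0)

# Stand-in for the feature vectors a deep neural network would extract
# per face. Class 0 features cluster near -1, class 1 near +1, purely
# so the toy classifier has something learnable.
def make_sample(label):
    center = 1.0 if label else -1.0
    return [center + random.gauss(0, 0.5) for _ in range(4)], label

data = [make_sample(i % 2) for i in range(200)]

# Logistic regression trained by batch gradient descent.
w = [0.0] * 4
b = 0.0
lr = 0.1
for _ in range(300):
    gw = [0.0] * 4
    gb = 0.0
    for x, y in data:
        z = sum(wi * xi for wi, xi in zip(w, x)) + b
        p = 1.0 / (1.0 + math.exp(-z))  # predicted probability of class 1
        err = p - y
        for j in range(4):
            gw[j] += err * x[j]
        gb += err
    w = [wi - lr * g / len(data) for wi, g in zip(w, gw)]
    b -= lr * gb / len(data)

def predict_prob(x):
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))

accuracy = sum((predict_prob(x) > 0.5) == bool(y) for x, y in data) / len(data)
```

The design point is that the heavy lifting happens in the feature extractor; the classifier on top can be as simple as a linear model over those features.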
This classification system used both fixed facial features (ones generally shaped by biological factors, such as nose shape) and transient facial features (ones determined by personal choice, such as hairstyle) as factors in determining the sexual orientation of the person.
The program found that gay men and women tended to have gender-atypical facial morphology, expressions, and grooming styles. In other words, both their chosen features and their biologically determined features were less typical of straight members of their gender: often more masculine in women and more feminine in men.
These results seem to support theories that biological and hormonal factors contribute to sexual orientation. The study also raises concerns that “gaydar” computer programs like this could be used to detect and discriminate against LGBTQ people.