Abstract of "Presentation in self-posted facial images can expose sexual orientation: Implications for research and privacy"

Dawei Yang

Recent research has found that facial recognition algorithms can accurately classify people’s sexual orientations using naturalistic facial images, highlighting a severe risk to privacy. This article tests whether people of different sexual orientations presented themselves distinctively in photographs, and whether these distinctions revealed their sexual orientation. I found significant differences in self-presentation. For example, gay individuals were on average more likely than heterosexual individuals to wear glasses in images uploaded to the dating website, and gay men uploaded brighter images than heterosexual men. To further test how some of these differences drove the classification of sexual orientation, I employed image augmentation and modification techniques. To evaluate whether the image background contributed to classifications, I progressively masked images until only a thin border of background remained in each facial image; even these pixels classified sexual orientations at rates significantly higher than random chance. I also blurred images and found that merely three numbers, representing the brightness of each color channel, classified sexual orientations. These findings contribute to psychological research on sexual orientation by highlighting how people chose to present themselves differently on the dating website according to their sexual orientations, and how the algorithm used these distinctions to classify sexual orientations. The findings also expose a privacy risk, as they suggest that do-it-yourself data-protection strategies, such as masking and blurring, cannot effectively prevent leakage of sexual orientation information. As consumers are not equipped to protect themselves, the burden of privacy protection should be shifted to companies and governments. (PsycInfo Database Record (c) 2022 APA, all rights reserved)

