Imperfect ImaGANation: Implications of GANs exacerbating biases on facial data augmentation and Snapchat face lenses

Authors:

Abstract

In this paper, we show that popular Generative Adversarial Network (GAN) variants exacerbate biases along the axes of gender and skin tone in the generated data. Synthetic data generated by GANs is widely used for tasks ranging from data augmentation to stylizing images. While practitioners celebrate this method as an economical way to obtain synthetic data for training data-hungry machine learning models or to provide new features to users of mobile applications, it is unclear whether they recognize the perils of such techniques when applied to real-world datasets biased along latent dimensions. Although one expects GANs to replicate the distribution of the original data, in real-world settings with limited data and finite network capacity, GANs suffer from mode collapse. First, we show that readily-accessible GAN variants such as DCGANs ‘imagine’ faces of synthetic engineering professors with masculine facial features and fair skin tones. When using popular GAN architectures that attempt to address mode collapse, we observe that these variants either provide a false sense of security or suffer from other inherent limitations due to their design choices. Second, we show that a conditional GAN variant transforms input images of female and non-white faces to have more masculine features and lighter skin when asked to generate faces of engineering professors. Worse yet, prevalent filters on Snapchat consistently lighten the skin tones of people of color when trying to make face images appear more feminine. Thus, our study is meant to serve as a cautionary tale for practitioners and to educate them about the side effect of bias amplification when applying GAN-based techniques.
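For readers unfamiliar with the kind of "readily-accessible GAN variant" the abstract refers to, the sketch below shows the generator half of a standard DCGAN in PyTorch. The latent dimension, feature-map width, and 64x64 output size are illustrative assumptions, not the authors' exact architecture or training setup; the point is that sampling from such a generator trained on a skewed face dataset is the step at which mode collapse concentrates outputs on over-represented gender and skin-tone modes.

```python
import torch
import torch.nn as nn


class DCGANGenerator(nn.Module):
    """Generator half of a DCGAN: maps a latent vector to a 64x64 RGB image."""

    def __init__(self, latent_dim: int = 100, feature_maps: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            # latent_dim x 1 x 1 -> (feature_maps*8) x 4 x 4
            nn.ConvTranspose2d(latent_dim, feature_maps * 8, 4, 1, 0, bias=False),
            nn.BatchNorm2d(feature_maps * 8),
            nn.ReLU(inplace=True),
            # -> (feature_maps*4) x 8 x 8
            nn.ConvTranspose2d(feature_maps * 8, feature_maps * 4, 4, 2, 1, bias=False),
            nn.BatchNorm2d(feature_maps * 4),
            nn.ReLU(inplace=True),
            # -> (feature_maps*2) x 16 x 16
            nn.ConvTranspose2d(feature_maps * 4, feature_maps * 2, 4, 2, 1, bias=False),
            nn.BatchNorm2d(feature_maps * 2),
            nn.ReLU(inplace=True),
            # -> feature_maps x 32 x 32
            nn.ConvTranspose2d(feature_maps * 2, feature_maps, 4, 2, 1, bias=False),
            nn.BatchNorm2d(feature_maps),
            nn.ReLU(inplace=True),
            # -> 3 x 64 x 64, pixel values in [-1, 1]
            nn.ConvTranspose2d(feature_maps, 3, 4, 2, 1, bias=False),
            nn.Tanh(),
        )

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        return self.net(z)


# Sample synthetic faces from random latent codes; a generator trained on a
# biased dataset of, e.g., engineering-professor photos tends to reproduce
# or amplify that bias in the samples it produces.
generator = DCGANGenerator()
z = torch.randn(16, 100, 1, 1)
fake_faces = generator(z)  # shape: (16, 3, 64, 64)
```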

Keywords: Generative adversarial networks (GANs), Societal impacts, Algorithmic bias, Data augmentation, Social media

Article history: Received 26 January 2021, Revised 7 December 2021, Accepted 18 December 2021, Available online 29 December 2021, Version of Record 10 January 2022.

DOI: https://doi.org/10.1016/j.artint.2021.103652