Auto-Essentialization: Gender in Automated Facial Analysis as Extended Colonial Project
Abstract
Facial analysis systems, such as facial classification and facial recognition, are built on a foundation of pattern recognition. Computer vision algorithms detect patterns in the face in order to identify individuals, or to infer attributes of those individuals, such as emotion, race, or gender. Information scientists, ethicists, and policymakers are increasingly raising concerns about potential social biases in facial analysis systems, particularly with regard to the tangible consequences of misidentifying members of marginalized groups. However, few have examined how automated facial analysis technologies intersect with the historical political genealogy of racialized gender: the gender binary and its classification as a highly racialized tool of colonial power and control. In this paper, we introduce the concept of auto-essentialization: the use of automated technologies to re-inscribe the essentialist notions of difference that were established under colonial rule. We consider, in particular, how the face has emerged as a legitimate site for the classification of gender, despite being historically tied to projects of racial domination. To do so, we examine the history of gendering the face and body, from colonial projects aimed at disciplining non-binary bodies, to sexology's role in normalizing a white gender binary, to physiognomic practices that ascribed a racialized and gendered notion of inferiority to non-European groups and to women. Drawing on two case studies, we argue that these imperialist ideologies are reflected in modern automated facial analysis tools in computer vision: the use of facial analysis for (1) commercial gender classification and (2) the security of spaces both small-scale (women-only online platforms) and large-scale (national borders). Thus, we posit a rethinking of ethical attention to these systems: they should be treated not as immature technologies in need of further exploration, but as mature instantiations of much older technologies.