When feminism and Artificial Intelligence come together

Science and technology have long, and fallaciously, been equated with masculinity. The result is that gender roles have been reaffirmed, along with values preconceived as natural to femininity and masculinity, so that girls and boys are assumed to follow opposite academic and career paths. Girls, for example, are associated with nurturing and care, while boys are associated with rationality and self-sufficiency. In this article, we will discuss women and artificial intelligence.

Statistics published in PNAS on gender inequality across countries and disciplines show that men make up 73% of STEM academia and industry globally, compared to 27% women. This is evidence not only of gender stereotypes but of how they have been constructed throughout history. For a long time, feminist traditions of the Global North, especially those labelled second-wave feminism, debated whether science has been patriarchal since its very conception.

The Australian sociologist Judy Wajcman, a professor at the LSE recognised for her thought-provoking work on technofeminism (a concept that addresses feminist struggles within the world of technology), has exposed the recalcitrant determinism of liberal and socialist feminism when it comes to technoscience.

According to the author, in her article “From Women and Technology to Gendered Technoscience” (2007), published in Information, Communication & Society, while the former focuses on critiquing the absence of opportunities for women and the existing gap within the field, the latter takes a more essentialist point of view, in which technology is “deeply implicated in the masculine project of the domination and control of women and nature” (p. 289), derived from the sexual division of labour that has existed since the Industrial Revolution.

However, the issue goes beyond the artefact itself and lies in the ideology behind some technologies, or in the demonisation of technology for its patriarchal origins.

These value judgements continue to be reproduced today by social media, ads, work applications, and political systems on a scale much larger than we could ever imagine. The issue goes beyond gender violence, as artificial intelligence and machine learning algorithms have a proven track record of racism, homophobia and ableism.

Here I do not aim to fall back on a pessimistic narrative, hopeless and afraid of technological advances, as we cannot forget that technology is not autonomous. Behind it are people constructing and reconstructing it. That’s why it is important to name both its harms and its liberatory possibilities.

The American sociologist Ruha Benjamin, a professor at Princeton University, defines the “New Jim Code” in her book Race After Technology as “the employment of new technologies that reflect and reproduce existing inequities but that are promoted and perceived as more objective or progressive than the discriminatory systems of a previous era” (p. 3).

The book’s name alludes to the Jim Crow laws that enforced segregation in the US between 1876 and 1965. Benjamin focuses on the racial domination embedded in technology and exposes how this is not a minor glitch in the system but part of institutionalised racism. To illustrate algorithmic discrimination, she offers several examples.

One of them is that after a graduate student googled “unprofessional hairstyles for work,” she was provided with photographs of Black women; when she altered her search to “professional hairstyles for work,” she was given photos of White women (p. 62). Another example in the same spirit is how tech corporations, usually based in Silicon Valley, hire lower-paid foreign tech workers to carry out repetitive tasks, paying as little as $1.46 per hour after tax (Williams, Miceli & Gebru, 2022).

One could ask whether imagining socially conscious algorithms, or algorithms rooted in social justice, is delusional. How could a feminist ethic based on care possibly enter into this discussion? What does that even mean?

An ethic based on care, an attribute that has also been coded as feminine, embraces sensibility and concern for others and, therefore, for the social world. The philosopher Virginia Held has emphasised that this ethic should not be understood as charity or altruism, as that could lead to a more paternalistic vision of caring for others.

In her article “Philosophy, Feminism, and Care” (2018), she states that this ethic calls for “cooperation rather than competition, mutual benefit rather than the maximizing of self-interest” (p. 145). Following this premise, machine learning would no longer be viewed through a rationality devoid of sensitivity that promotes efficiency above all else. Instead, it would take human experience and its complexities as its central axis. Raw data, for instance, would be shaped through the intersectionality that characterises the world for which artificial intelligence is meant to be useful.

In search of a workable solution to the alienation to which machine learning pipelines seem to be subject, Dr Joanne Gray and Alice Witt from Queensland University of Technology, in their article “A feminist data ethics of care framework for machine learning: The what, why, who and how” (2021), have suggested that companies need to include diverse teams in their executive positions, not just to fill quotas, but to constantly reflect on and evaluate the datasets they are working with.

According to them, this could be achieved by discussing complex ethical problems and involving “community consultation and impact assessments for better contextualising machine learning technologies”. It is time, then, for the humanities to engage with and learn about the tech universe in order to tackle the inequalities of the contemporary world.
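Gray and Witt do not prescribe any particular tooling, but to make the idea of constantly evaluating the datasets one is working with more concrete, here is a minimal sketch of one possible first step: a disaggregated look at how a dataset represents intersectional groups. The toy records, the column meanings and the 15% representation floor are all assumptions for illustration, not part of their framework.

```python
# A minimal, illustrative sketch (not taken from Gray & Witt's article):
# checking how well a dataset represents intersectional groups before any
# model is trained. The records and the 15% "floor" are assumptions.
from collections import Counter

# Toy stand-in for a real dataset: each record is (gender, ethnicity, label).
records = [
    ("woman", "Black", 1), ("woman", "White", 0), ("man", "White", 1),
    ("man", "White", 1), ("woman", "Black", 0), ("man", "Black", 1),
    ("woman", "White", 1), ("man", "White", 0), ("man", "White", 1),
]

# Count how often each intersectional group appears in the data.
group_counts = Counter((gender, ethnicity) for gender, ethnicity, _ in records)
total = len(records)

print("Representation by intersectional group:")
for group, count in sorted(group_counts.items()):
    print(f"  {group}: {count} records ({count / total:.0%} of the dataset)")

# Flag groups that fall below the representation floor, as a prompt for the
# reflection and community consultation the framework calls for.
floor = 0.15
underrepresented = [g for g, c in group_counts.items() if c / total < floor]
print("Groups below the representation floor:", underrepresented or "none")
```

In a real pipeline, the same disaggregated view would extend to model error rates per group, and thresholds like the one above would be set through the community consultation and impact assessments the authors describe, not by a developer alone.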

References

Benjamin, R. (2019). Race after technology: Abolitionist tools for the new Jim code. Polity Press.

Gray, J. E., & Witt, A. (2021, December 6). A feminist data ethics of care framework for machine learning: The what, why, who and how. First Monday, 26(12). https://doi.org/10.5210/fm.v26i12.11833

Held, V. (2018). Philosophy, Feminism, and Care. Proceedings and Addresses of the American Philosophical Association, 92, 133–157. http://www.jstor.org/stable/26622970

Wajcman, J. (2007). From Women and Technology to Gendered Technoscience. Information, Communication & Society, 10(3), 287–298. https://doi.org/10.1080/13691180701409770

Williams, A., Miceli, M., & Gebru, T. (2022, October 13). The Exploited Labor Behind Artificial Intelligence. Noema Magazine. Retrieved from https://www.noemamag.com/the-exploited-labor-behind-artificial-intelligence/

María Angélica Mantilla (Author)

Colombian Political Scientist and Historian based in Brighton, UK. Interested in gender, emotions, media representations and decolonial approaches.
