
Demystifying the Voice of AI

Essay by Anastasia Lewis '24 for FMST Capstone

From the radio waves of wartime, to the television screen of the evening news, to the chambers of government where law is dictated, men have historically controlled the flow of information by placing themselves in positions of power and calculated trust. Why, then, is it the voice of a woman that leads the realm of AI technology? Specifically, why are Siri, Alexa, Google Assistant, and others so unmistakably female, and why do some even carry human, female names? In a world that still operates on a predominantly patriarchal system, this brief analysis explores the history and reasoning behind why the exponentially developing AI industry is so frequently presented as female1 and what ramifications that presentation may have on women’s social standing.

As a general statement, women have notoriously been excluded or gate-kept2 from the realm of technology and business. Though recent developments have begun to improve such social structures and workplace norms, an undeniable “glass ceiling” remains in those industries. One could argue that it is due in part to the 200+ year history of women being relegated to “terminal occupations”3 such as secretaries. According to Lingel and Crawford’s 2020 article “Alexa, Tell Me About Your Mother: The History of the Secretary,” secretaries originated as “a piece of office furniture” in the 18th Century, used as “tools of organization, privacy, and efficiency [as] a display of professionalism, hierarchy, and status.”4 Over time, and with technological developments like the typewriter, men began to delegate so-called “menial tasks” to women, enforcing a submissive, inferior role focused on the gathering “of personal and professional information.”5 This structure has shaped a history with a “complex dynamic of trust, affect, listening, and subversion surrounding secretaries.”6, 7 A critical analysis of this entangled history can therefore reveal how such workplace hierarchies, cultural norms, and power structures continue to survive and operate in the evolving technological and business world.

Stepping back to examine this foundational dynamic, Lingel and Crawford explain how the female secretary allowed society to blur the lines between the corporate and domestic working worlds. This created a “hybrid role [for women] as both an underling and vital insider [and] a liminal figure,” as their work included both “visible and invisible labor.”8, 9 In other words, the history of secretarial work, and the societal norms, expectations, and treatment of women in such roles, are the foundations and models on which AI assistants have been built. “There is a retained investment in performing gender, managing data, and arranging surveillance” that AI assistants now follow, all of which is “very much tied to administrative and emotional support” historically performed by female secretaries.10 Today, roughly 95% of administrative assistants in the US are women, a trend that has held since the job first opened to them around the early 20th Century. By the end of that century, however, “the job title of ‘secretary’ lost both social and economic value [as] traditionally feminized roles in the workplace were increasingly sidelined while expertise11 became a more male-dominated... category.”12 This reflects a social cycle in which men maintain control over access to specific jobs, technologies, and opportunities by limiting entry to such systems and “expertise” through the manipulation and regulation of social and cultural norms in the workplace.

So why, then, is the AI assistant so universally recognized, and so frequently presented, as female? There are a few reasons. First, people have learned to trust female AI secretaries because they trust the history and place of the human, female secretary within her traditional role in a business structure.13 This trust is reinforced because the algorithmic prediction, natural language processing, and machine learning behind AI technologies mirror secretarial operations through “the tasks they perform [and] the gendered valences of how that work is performed.”14 In other words, AI assistant programmers have deliberately recreated the ever-supportive, attentive, competent, female subordinate secretary; only this time, she is not even human. Assistants like Siri (Apple, 2011), Alexa (Amazon, 2014), Cortana (Microsoft, 2014), and Google Assistant (Google, 2016) have become so popular because they are far more accessible, affordable, and controllable than human women.15 For a contrasting example, IBM gave its Jeopardy machine “Watson” a male AI voice because male voices are chosen when people seek a self-assured speaker who uses short, definitive phrases and reflects the attitude and tone of a leader.16 Female AI assistants, by contrast, are perceived to “not be in charge,” because “they embody what we think of when we picture a personal assistant: a competent, efficient, and reliable woman.”17, 18 I would further argue that using a female voice or face to present a new commodity not only conveys a calming effect19 but also makes it appear more “attractive.”20 People want and expect their personal secretaries, both human and AI, to be submissive in order to maintain a specific but traditional power dynamic: men (human bosses) controlling women (machine assistants).

Interviews with Amazon and Microsoft spokespeople further explain this pattern. As they described, female voices were selected for AI assistants because “in gender research when choosing a voice and weigh[ing] the benefits of male [versus] female voices,” a female voice presented a more “helpful, supportive, trustworthy assistant,” which remained in line with corporate objectives.21 Additionally, there is a linguistic component embedded in AI assistant programming that is tied to gender: women are known to use more “pronouns and tentative words than men” and, in particular, to make “I” statements, which is “indicative of lower social status” and thus comes across as socially inferior.22, 23 There are, of course, other classifications of education and race assigned to AI assistants that reiterate and confirm such lower status. Lingel and Crawford cite Amazon Alexa’s detailed backstory (as a well-educated, white female) to expose the expected class, education, race, and narrative standard for AI assistants.24 Finally, Lingel and Crawford note that assigning a human name, and in particular one with a female connotation, to a machine is “our way of ultimately exerting control over it.”25 This is yet another way for individuals (that is, men) to reassert dominance and power over technology (and, historically, women) and maintain (human) control.26

So what does this mean for the larger social perspectives and futures of women in the workplace? Unfortunately, all that AI assistants have achieved for human women is a subconscious amplification of harmful gender stereotypes and the continued normalization of mistreatment and harassment in the workplace. As Steele pointed out, “When we can only see a woman, even an artificial one, in that [submissive] position, we enforce a harmful culture.”27 Lingel and Crawford expand on that statement, observing that dehumanization, objectification, and discrimination are not restricted to human secretaries: AI assistants have also “been ridiculed, harassed, and threatened by their users.”28 But unlike human secretaries, AI assistants do not report such harassment, and they are arguably more cost-effective and efficient than a typical human secretary. As a result, AI assistants are beginning to offset the need for hired human labor while simultaneously continuing to reduce the respect and recognition afforded to secretarial work.29 So though women may voice AI assistants, the industry is still dominated by male programmers who have found a new way to regulate women in the workplace.

1 There are, of course, other factors such as race, class, and education that are linked to AI assistants which may be briefly discussed/mentioned in this paper. However, this paper’s main focus is on the gender of AI assistants. Further exploration of the interdisciplinary effects of the other factors would require further research and a much longer analysis to adequately capture and examine the full extent of their impacts/reflections on general society.
2 In this context, the term “gate-kept” refers both to the literal historical barriers that segregated women from many workplaces and to the modern phenomenon of women breaking through some glass ceilings only to confront “new” barriers and challenges that they alone must overcome.
3 Referring specifically to jobs that do not have promotions and instead remain stagnant in the working hierarchy.
4 Lingel, Jessa, and Kate Crawford. "Alexa, Tell Me About Your Mother: The History of the Secretary and the End of Secrecy." Catalyst: Feminism, Theory, Technoscience, vol. 6, no. 1, 15 May 2020, pg 5.
5 Ibid, pg 4.
6 Lingel 2020, pg 4.
7 This raises a possible connection to Haraway’s “Cyborg Manifesto” as the history and cultural patterns I describe above are (in some ways) echoed by Haraway: “The cyborg is our ontology; it gives us our politics... in the traditions of ‘Western’ science and politics—the tradition of racist, male dominant capitalism; the tradition of progress; the tradition of the appropriation of nature as resource for the productions of culture; the tradition of reproduction of the self from the reflections of the other” (Haraway 2016, 7).
8 Lingel 2020, pg 7.
9 This previews how AI technologies have been able to seamlessly transition into these “assistant roles” as they operate in similar liminal contexts of “visible and invisible” work.
10 Lingel 2020, pg 4.
11 Expertise is being interpreted/measured here as individuals who are “able” to understand, maintain, and program newer technologies. As seen in the quote, this is specifically and traditionally reserved and assigned to men and thus creates another kind of barrier that separates genders in the workplace.
12 Lingel 2020, pg 8.
13 Lingel 2020, pg 11.
14 Ibid, pg 9.
15 This connects to yet another example, as observed in the film her (Jonze, 2013) with its AI assistant “Samantha.” In Kornhaber’s article, she describes Samantha as “a kind of fantasy end point of the imagined disjunction between information and infrastructure, intelligence and embodiment - the hyper abstraction of data that is at the origin of so much of contemporary digital culture, grounded in ‘the great dream and promise of information . . . free from material constraints’” (Kornhaber 2017, 7). In other words (and within the context of this analysis), an individual can now have all the benefits of a knowledgeable and powerful female without the complex and emotional human component.
16 Steele, 2018.
17 Lingel 2020, pg 9.
18 Some AI assistants are known to be “genderless” and/or offer a gendered voice option (male v. female). For example, Cortana can technically be genderless, while Siri and Google Assistant offer male voice options. However, all of them require the user to manually change the voice, and companies tend to lead with female-voiced AI assistants because their internal studies have shown that female voices market and perform better with the general public.
19 Lingel and Crawford further discussed how “female-voiced technology is the pleasing compromise that succeeds in being informative and servile to a human master without being truly, independently intelligent” especially in cases of emergencies or informative announcements (Lingel 2020, 10).
20 This “attraction” is in reference to the fetishization of women by men and the male gaze, a common theme across most media and feminist studies (i.e. Laura Mulvey). Across most media platforms, the presentation of women alongside a product or commodity has often been used as a marketing tactic to draw attention from viewers, especially men.
21 Steele, 2018.
22 Ibid.
23 This inferiority ties to the Westernization and discrimination of language(s). Linguistic studies reveal that language can provide keen insight into dominating/hegemonic cultures and societies as they relate to and/or suppress others (e.g., what is the nationally recognized versus colloquial language? When traveling, what dialects and accents are primarily used for communication and/or “universally understood”?).
24 For reference, in an interview with Alexa’s design team, the following backstory was recorded: “She comes from Colorado... She’s the youngest daughter of a research librarian and a physics professor who has a B.A. in art history from Northwestern... When she was a child, she won $100,000 on Jeopardy: Kids Edition. She used to work as a personal assistant to ‘a very popular late-night-TV satirical pundit.’ And she enjoys kayaking” (Lingel 2020, 10).
25 Lingel 2020, pg 9.
26 Ibid.
27 Steele, 2018.
28 Lingel 2020, pg 13.
29 Ibid, pg 9.

Works Cited

Haraway, Donna J. A Cyborg Manifesto: Science, Technology, and Socialist-Feminism in the Late Twentieth Century. University of Minnesota Press, 2016.

Jonze, Spike, director. her. Warner Bros., 2013.

Kornhaber, Donna. "From Posthuman to Postcinema: Crises of Subjecthood and Representation in ‘Her.’" Cinema Journal, vol. 56, no. 4, 2017, pp. 3–25. JSTOR.

Lingel, Jessa, and Kate Crawford. "Alexa, Tell Me About Your Mother: The History of the Secretary and the End of Secrecy." Catalyst: Feminism, Theory, Technoscience, vol. 6, no. 1, 15 May 2020.

Steele, Chandra. "The Real Reason Voice Assistants Are Female (and Why It Matters)." PC Magazine, Ziff Davis, 4 Jan. 2018.