
Why do most digital voice assistants have female voices & names?


Sarah Magliocco explores why nearly all digital voice assistants are programmed with female voices and names, and what this says about gender bias.

Hey Siri! Why do nearly all digital voice assistants have female voices and names?

It seems like an innocent enough question, one we might assume has an answer rooted in science, something along the lines of vocal pitch preference or pattern. But when it was posed by The Sleepover Club podcasters Ione Gamble and Halima Jibril, it spawned an online conversation about gender bias, gender roles, and over-thinking.

Ione, author and founding editor of digital publication Polyester, and writer and community editor Halima set TikTok ablaze with their examination of the topic, their video breakdown racking up a quarter of a million views.

A number of studies and articles have already scrutinised the link between female AI voices and the perpetuation of sexist stereotypes, as Apple's Siri, Amazon's Alexa, Microsoft's Cortana, and Google Assistant have all had feminine voices as the default.


The feminine default

While all digital assistants have received updates since their inception, with a range of different voices now available in the settings, for most of them the first time you plug one in it will be a feminine voice responding to your commands.

When you look for weather updates, directions, or a change of radio station, a female-presenting voice will take care of the request without complaint.

Data from Statista estimates that in 2022, 6.4 billion voice-activated assistants are in use globally, whether via smartphones, in-car GPS systems, or speaker systems.

With such a huge portion of our world’s population engaging with AI assistants, it is worth examining how their presentation can impact how we see them and interact with them, and what knock-on effect it has, if any, in our day-to-day lives.

If all voice assistants are female, does that lead us to revert to assumptions about women that hundreds of years of progress have worked to refute and overturn? If we place only female-presenting tech in this subservient role, can that lead people to have less respect for the female voice when it emerges from a larynx instead of a speaker?


AI equality

According to a much-discussed study by UNESCO, voice assistants with female voices have the power to perpetuate harmful stereotypes about female-identifying individuals.

The report by UNESCO explained that companies staffed by "overwhelmingly male engineering teams, have built AI systems that cause their feminised digital assistants to greet verbal abuse with catch-me-if-you-can flirtation."

The study's title, I'd Blush if I Could, comes from the response Apple's Siri used to give when it received sexist abuse from users.

Siri has been given some self-respect in recent years: its default response to being called hateful names is now "I won't respond to that" (go Siri, give them the virtual cold shoulder). But many argue there is still a way to go when it comes to figuring out how best to apply society's standards of gender equality to AI.

"Because the speech of most voice assistants is female, it sends a signal that women are... docile helpers, available at the touch of a button or with a blunt voice command like 'hey' or 'OK'. The assistant holds no power of agency beyond what the commander asks of it. It honours commands and responds to queries regardless of their tone or hostility," says the study.


Sexy Sally & Barking Bob

The history of feminine voices in assistive technology goes back much further than Amazon's or Apple's beginnings. From the 1950s, the US military began fitting fighter aircraft with voice systems that warned pilots of issues and guided them through various decisions.

One such system, voiced by singer and actor Joan Elms, was nicknamed 'Sexy Sally', while a later system was informally dubbed 'B*tching Betty'. A masculine equivalent featured in later voice systems is called 'Barking Bob' by some pilots.

Designers at the time believed that a woman's voice was more likely to get the attention of the predominantly male pilots over the din of radio communications. This has since been disproven in various studies throughout the 2000s, but over the decades companies gathered databases of mostly female voice samples for use in such systems.


Vocal Fry

Women were once confined to certain roles and occupations in society, and generations of breaking down gender biases have gone into reshaping our world into a more equal one (though we are not quite there yet).

While being a homemaker is a valid form of work – in fact, the very framework of our industrially motivated civilisation would collapse without the unseen labour done by those who work in the home – homemaking was traditionally done by women and, as such, stunted career prospects for women before the modern age.

Because of this, roles such as telephone operator, receptionist, and assistant were previously more often held by women than men, and a feminine voice became what people were accustomed to when they thought of helpful compliance, despite people of all gender identities being able to deliver information effectively and in the neutral, light manner that has become the preference for digital vocal work.

"It's much easier to find a female voice that everyone likes than a male voice that everyone likes," the late Stanford communications professor Clifford Nass, told CNN in 2011."It’s a well-established phenomenon that the human brain is developed to like female voices."

However, with the Google autofill for "women's voices is" being "annoying", that claim does not exactly hold up. Female radio hosts, podcasters, and even celebrities like the Kardashians are frequently slated for their voices, which may suggest that people only think they prefer feminine voices in these roles because they are accustomed to them.


Man and machine

While the voices we hear coming out of our devices may be female-coded by default, they are also deeply technological, with strange inflection, pacing, and a stilted tone that lets us know we are speaking to a machine. Perhaps this is to prevent people from becoming too frustrated when their devices don’t know exactly how to answer their questions or get something wrong.

The slightly unnatural tone of the responses reminds us that we are dealing with a technical issue, not a personal one. However, people still get personal with their machines, judging by the fact that AI assistants are now programmed with responses to abusive questions.

The issue was once that AI assistants had problems recognising and challenging inappropriate behaviour. Now they can recognise it, but still do not challenge it.

It could be lumped in with the world's wider issue of glorifying violence towards women, with games, films, and online incel communities revelling in negative depictions of women. If you can degrade your female-voiced AI assistant without retaliation, then does that instil the belief, consciously or unconsciously, that the same actions can be perpetrated toward women in reality?

We can't know this for sure yet, as the concept is so new, but the research so far suggests that these compliant female robot voices reinforce gender stereotypes, not necessarily because they act as digital servants to the user, but because they will comply regardless of how they are asked.


Gender neutral devices

This form of technology is all about communication, and people do not respect their machines like they respect other people. Giving these lesser-respected entities female voices has the potential to have a significant cultural impact.

The UNESCO report argues that making the voices gender neutral, and programming them to properly deflect and discourage sexist abuse, or abuse in general, could help prevent the growing popularity of AI voice assistants from having a negative impact.

These systems are designed to help us, and we are meant to boss them around. Removing the gendered aspect from the devices can help to dispel the false stereotype, which the current default voices may perpetuate, that women are subservient, available to help at the touch of a button, and unable to retaliate against abuse.


