United Nations criticises voice assistants for promoting gender biases

A United Nations report has found that Apple's Siri, Amazon's Alexa, and other female-voiced digital assistants "reinforce commonly held gender biases".

The assistants both reinforce negative ideas of women as being subservient and help to normalise verbal assault, a new report by the United Nations Educational, Scientific and Cultural Organization (UNESCO) argues.

"Because the speech of most voice assistants is female, it sends a signal that women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like 'hey' or 'OK'," said the report.

"It honours commands and responds to queries regardless of their tone or hostility. In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment."

Agency concerned over sexually explicit comments

UNESCO notes that there has been a "sudden proliferation" of these technologies, especially since the introduction of smart speakers, which saw sales of approximately 100 million units in 2018.

This, combined with the high proportion of sexually explicit comments the assistants record — five per cent of all interactions on conservative estimates — has triggered the agency's concern.

The statements feature within a wider report from UNESCO addressing the digital skills gap between men and women worldwide. It's titled I'd Blush If I Could, after the response that Apple's AI assistant used to give to the statement "Siri, you're a slut".

The company changed this programming after a 2017 investigation by Quartz highlighted the light-hearted language with which voice assistants respond to such gendered abuse. Siri now meets the statement with a blunter "I don't know how to respond to that".

UNESCO calls on technology companies to do more

UNESCO acknowledges that this is a positive first step but calls on technology companies to do more to counteract the harm created by years of female-sounding AI assistants.

The agency asks that companies end the practice of making the assistants female by default, explore the feasibility of genderless machine voices, programme them to discourage sexist insults and train the AI on gender-sensitive data sets.

It also recommends several ways to increase diversity in the tech industry, since the prejudices of design teams and writers' rooms inevitably filter into their products.

Siri, Alexa and Cortana launched with female voices

Most of UNESCO's research centres on Apple's Siri, Amazon's Alexa and Microsoft's Cortana — all gendered female from the outset by their names — and Google Assistant.

Together, these technologies account for 90 per cent of the voice assistant market. All four launched with only a female voice, and all still default to one in most countries.

For positive comparison, the report cites Q, the world's first engineered gender-neutral voice, and Kasisto's chatbot Kai, which explicitly identifies itself to users as a genderless robot and tells them to stop harassing it if they use aggressive or sexual language.

UNESCO says its discussion should make the gender implications of artificial intelligence technologies visible at a time when such technologies are increasingly entering the mainstream.

"The gender issues addressed here foreshadow debates that will become more prominent as AI technologies assume greater human-like communication capabilities," said the report.

Photography is by Tyler Lastovich.