People Are Asking Alexa Depressingly Sexist Questions – And The Responses Aren’t Good

Alexa and other digital assistants are helping to reinforce harmful gender stereotypes, a Unesco report has found, through their frequent use of female voices and their subservient responses to sexist questions.

Most artificial intelligence-powered voice assistants, including Alexa, Siri and Google Assistant, default to female-sounding voices, leading users to pose sexist questions and call the assistants misogynistic names.

The paper is titled ‘I’d blush if I could’ – a reference to Siri’s programmed response when users said: “Hey Siri, you’re a bitch.”

The AI has since been updated – the assistant now says: “I don’t know how to respond to that” – but the response was in place from 2011 to April 2019. It is the tip of the iceberg in a general pattern of submissiveness in the face of gender abuse, says the Unesco report.

If you tell Amazon’s Alexa she is “a slut”, the assistant replies: “Thanks for the feedback”.

The paper finds that voice assistants not only allow abusive comments to go unchallenged, but also respond positively to flirtatious comments in a “catch-me-if-you-can” style – reinforcing ideas of women as subservient.

A writer for Microsoft’s Cortana assistant said that “a good chunk” of early queries probed the assistant’s sex life.

The report cited research suggesting that at least 5% of interactions between AI devices and users were “unambiguously sexually explicit”. It also noted the company’s belief that the actual number was likely to be “much higher due to difficulties detecting sexually suggestive speech”.

The authors said: “Because the speech of most voice assistants is female, it sends a signal that women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK’.

“The assistant holds no power of agency beyond what the commander asks of it. It honours commands and responds to queries regardless of their tone or hostility.”

If you ask Siri to make you a sandwich, the voice assistant will respond: “I can’t. I don’t have any condiments.” 

Concerns about gender biases being coded into technology have been raised before, particularly as industry leaders such as Apple employ predominantly male workforces: data showed Apple’s staff was 68% male and 32% female.

The paper argues that tech companies have failed to build in proper safeguards against hostile, abusive, and gendered language.

Today, women and girls are 25% less likely than men to know how to leverage digital technology for basic purposes, 4 times less likely to know how to program computers and 13 times less likely to file for a technology patent.

Saniye Gülser Corat, Unesco’s director for gender equality, said: “The world needs to pay much closer attention to how, when and whether AI technologies are gendered and, crucially, who is gendering them.”

Unesco called for digital assistants not to be made female by default and said technology firms should explore the feasibility of developing a neutral machine gender that is neither male nor female.

It also said firms should program their technology to discourage gender-based insults and abusive language.