Amazon’s virtual assistant Alexa has been accused of sexism after being unable to respond to a question about the Lionesses’ World Cup semi-final.
British academic Dr Joanne Rodda asked Alexa for the result of Wednesday’s match against Australia, which England won 3-1.
But the supposedly ‘smart’ technology didn’t even know the match had taken place as it was only familiar with the men’s game, the BBC reports.
Astonishingly, when Dr Rodda asked ‘for the result of the England-Australia football match’, Alexa said there was no such match.
Amazon admitted the mistake was due to an ‘error’ – although it didn’t specify the cause – and said Alexa will continue to learn and improve over time.
Amazon, which released its 5th generation Echo Dot smart speaker (pictured) last year, confirmed it is winding down celebrity voices. The fun tool let users receive audio responses from their Echo device in the voice of their chosen celebrity
The Lionesses beat Australia 3-1 on Wednesday to reach the World Cup final. Pictured, England midfielder Keira Walsh (left) with Australia’s Clare Hunt during the match, which took place at Sydney’s Stadium Australia
Dr Rodda, a psychiatrist at Kent and Medway Medical School, told the BBC it showed ‘sexism in football was embedded in Alexa’.
She was only able to get the answer from Alexa when she told it she was talking about women’s football and not men’s.
‘When I asked Alexa about “the women’s England-Australia football match today” it gave me the result,’ Dr Rodda said.
‘[It’s] pretty sad that after almost a decade of Alexa, it’s only today that the AI algorithm has been “fixed” so that it now recognises women’s World Cup football as “football”.’
Dr Rodda also had problems getting information about the Women’s Super League – the top tier of the women’s game in England and the equivalent of the Premier League.
‘Out of interest, I just asked Alexa who Arsenal football team are playing in October,’ she said.
‘It replied with information about the men’s team, and wasn’t able to give an answer when I asked specifically about women’s fixtures.’
Amazon’s smart assistant powers the Echo speakers, including the spherical fourth generation Echo released in autumn 2020 (pictured)
In response to a request for comment, an Amazon spokesperson told MailOnline: ‘This was an error that has been fixed.’
According to the company, when a customer asks Alexa a question, information is pulled from a variety of sources.
These include ‘licensed content providers and websites’, although it didn’t specify which ones, or whether they include recent sports reports.
Amazon has already faced accusations of sexism from the UN, which claimed using a female voice reinforces the idea that women are ‘subservient’.
It also criticised the way female AI systems in general respond to gender-based insults with ‘deflecting, lacklustre or apologetic responses’.
Amazon took action in 2021 by introducing a male voice to its smart speakers, giving users another option and redressing the balance.
Also that year Amazon added ‘Ziggy’ as one of its ‘wake words’ – words that users can say before a command to make sure the smart assistant is listening.
But to reflect modern gender diversity, Amazon let users choose between either the male or female voice and use any of the wake words to activate them.
This means users can potentially start a command with the word ‘Ziggy’ and hear the female voice responding, or say ‘Alexa’ and hear the male voice responding.
More recently, the tech giant retired all three celebrity voices for its smart speakers – Samuel L. Jackson, Shaquille O’Neal and Melissa McCarthy.
Amazon offered the superstar voices for $4.99 each as an alternative to Alexa, but these are no longer available for purchase on its website.