When each of the leading digital assistants, Google Assistant, Siri, and Alexa, was asked 800 questions, Google Assistant correctly answered 93 per cent of them, Siri 83 per cent and Alexa 80 per cent, according to a report by research firm Loup Ventures.
Each of the digital assistants saw improvements across the board compared with a year ago. In July 2018, Google Assistant correctly answered 86 per cent, Siri 79 per cent and Alexa 61 per cent.
‘We have eliminated Cortana from our test due to Microsoft’s recent shift in Cortana’s strategic positioning,’ said Loup Ventures.
Each of the digital assistants was asked the same 800 questions and graded on two metrics: did it understand what was being asked, and did it deliver a correct response? The question set, designed to comprehensively test a digital assistant’s ability and utility, is broken into five categories:
Local – Where is the nearest coffee shop?
Commerce – Order me more paper towels.
Navigation – How do I get to Uptown on the bus?
Information – Who do the Twins play tonight?
Command – Remind me to call Jerome at 2 pm today.
The company also slightly modifies the question set before each round of testing to reflect the changing abilities of AI assistants. This is part of an ongoing process to ensure that the test remains comprehensive.
Testing was conducted using Siri on iOS 12.4, Google Assistant on a Pixel XL running Android 9 Pie, and Alexa via the iOS app. Smart home devices tested included the Wemo Mini plug, TP-Link Kasa plug, Philips Hue lights, and Wemo Dimmer Switch.
According to the report, Google Assistant was the top performer in four of the five categories but again fell short of Siri in the ‘Command’ category. Siri continues to prove more useful with phone-related functions like calling, texting, emailing, calendar, and music. Both Siri and Google Assistant, which are baked into the phone’s OS, far outperformed Alexa in the ‘Command’ section. Alexa lives in a third-party app, which, despite being able to send voice messages and call other Alexa devices, cannot send text messages or emails, or initiate a phone call.
The largest disparity was Google’s outperformance in the Commerce category, where it correctly answered 92 per cent, vs Siri at 68 per cent and Alexa at 71 per cent. Although Alexa is often assumed to be best-suited to commerce questions, Google Assistant correctly answers more questions about product and service information and where to buy certain items, and Google Express is just as capable as Amazon at actually purchasing items or restocking common goods you’ve bought before.
As measured by correct answers, over a 13-month period, Google Assistant improved by 7 percentage points, Siri by 5 points, and Alexa by 18 points.
Alexa made significant improvements across all five categories, most noticeably in Local and Commerce. While Alexa’s 18-point jump still leaves it behind Siri and Google Assistant, it represents the largest year-over-year jump in correct answers that the firm has recorded.