Google Assistant is the most accurate smartphone digital assistant, beating Amazon’s Alexa, Apple’s Siri, and Microsoft’s Cortana.
A report released this week by Loup Ventures contains data from a digital assistant test, which measures how well the four competitors answered a series of 800 queries.
Google Assistant came out on top with 85.5% of queries answered correctly.
Apple’s Siri was a not-so-distant runner-up with 78.5% of queries answered correctly.
From there the gap widens, with Amazon’s Alexa answering 61.4% of queries correctly.
Microsoft’s Cortana came in last, answering just a little over half of the queries correctly (52.4%).
For what it’s worth, Apple’s Siri was the most improved digital assistant compared to a similar test conducted last year by the same company.
The test is divided into five categories, and Google Assistant came out first in all but the ‘Command’ category. That category covers commands which control actions performed by the phone, such as creating a calendar entry or sending a text message.
In other words, according to this report, Google Assistant bested its competitors when it comes to processing and delivering accurate information.
“One of the largest discrepancies between the assistants was in the Information category, where Google achieved the highest percentage of correct answers we have seen in our testing (93%). This should come as no surprise given the company’s core competencies, but the depth of answers and true usefulness impressed us.”
Loup Ventures attributes Google Assistant’s performance in this area to the search engine’s featured snippets.
“Where others may answer with, ‘here’s what came back from a search’ and a list of links, Google is able to read you the answer. We confirm that each result is correct, but this offers a huge advantage in simple information queries (one of the most common voice computing activities).”
Note that this test only examined how the digital assistants performed on a smartphone. It did not look at how they perform via their respective smart speakers; that test will be coming later this fall.