
Leveraging Voice Search for Local Businesses

Each year, voice search becomes an increasingly dominant force to reckon with. 20% of the global online population already uses voice search, and 58% of voice users employ it to search for local businesses.

Last year, we undertook a study focused on uncovering the factors that influenced voice search rankings in 2019. This year, because search results vary for location-specific queries, we decided to examine how questions about local businesses and services affect voice search results.

The 2020 study provides unique insights into the search algorithms behind various voice assistants, helping businesses leverage the power of voice search.

About the 2020 Voice Search for Local Businesses Study

As voice search expands, the market keeps introducing more and more virtual assistants. While the previous year’s study focused exclusively on Google devices, this year we’ve added Siri and Alexa to cover almost 100% of the voice assistant market:

[Infographic: https://static.semrush.com/blog/uploads/media/75/a4/75a41bb47ded8be27af8e4a75a2243ee/infographic-02.png]

To run the study, we employed the following devices:

[Infographic: https://static.semrush.com/blog/uploads/media/2f/e5/2fe503162bf2086f3ec1daf0d10141e5/infographic-03.png]

The main goal of the study was to understand how different voice assistants compare to one another when it comes to returning local results and to uncover the algorithms behind them:

  • By comparing all voice assistants with regard to basic parameters, such as answer length and the number of questions they are able (or unable) to answer.
  • By analyzing the factors that affect which local results voice assistants choose to return.

Key Takeaways From the Study

There are a few key insights we’d like local businesses to take away from our findings to integrate them into their overall SEO and marketing strategies:

  • Google Assistant, Siri, and Alexa take up comparable market shares, so businesses should aim to adapt to all three assistants, whose algorithms are drastically different.
  • The average answer length for all analyzed assistants is 23 words, and Google Assistant devices return the longest answers, at 41 words.
  • Alexa fails to return results for one in every four questions, implying that it is mainly a home-based device that understands voice commands but is not intended for running search queries.
  • With Google-run devices, businesses can apply the “regular” local SEO logic by polishing their Local Pack presence and tweaking their content to match the more natural language of voice search queries.
  • To be present among Apple’s Siri replies, businesses have to aim for higher Yelp ratings and more positive customer reviews. A 4.5/5 Yelp rating combined with the largest number of reviews will turn any business into the most popular local spot in Siri’s eyes (see the sketch after this list).
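To make the last point concrete, here is a hypothetical Python sketch of the ranking logic that takeaway implies: favour the highest Yelp rating and break ties by review count. This is only an illustration of the study’s observation, not Apple’s documented algorithm, and all of the business data below is made up.

```python
# Hypothetical local businesses with Yelp-style ratings and review counts.
businesses = [
    {"name": "Luigi's Pizza", "rating": 4.5, "reviews": 320},
    {"name": "Tony's Pizza",  "rating": 4.5, "reviews": 85},
    {"name": "Pizza Palace",  "rating": 4.0, "reviews": 510},
]

# Rank by rating first, then by number of reviews (both descending).
ranked = sorted(businesses, key=lambda b: (b["rating"], b["reviews"]), reverse=True)

print("Most likely Siri pick:", ranked[0]["name"])  # -> Luigi's Pizza
```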

Comparing Various Voice Assistants

Now, diving deeper into the findings, we will look at the specifics of the different voice assistants and uncover how they choose to return certain results over others.

1. What’s the Average Answer Length?

The average answer length returned by a voice assistant for a local-intent query is 23 words:

[Infographic: https://static.semrush.com/blog/uploads/media/41/3d/413d69d3f0bbec6af6ee78e1300c7366/infographic-04.png]

With Google devices, the presence of a screen explains the difference in word count: the Google Home/Mini’s average answer length is 3.7X that of the Home Hub.
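As a purely illustrative aside (not part of the study’s tooling), a word-count metric like the one above is easy to compute from collected answer transcripts. The device names and sample answers in this Python sketch are hypothetical:

```python
# Hypothetical answer transcripts collected per device.
answers = {
    "Google Home Mini": [
        "The closest pizzeria is Luigi's on Main Street, open until 10 pm, with a 4.5-star rating.",
        "I found a hardware store nearby: Ace Hardware on 5th Avenue, open until 9 pm.",
    ],
    "Google Home Hub": [
        "Here's what I found.",
        "Here are some nearby options.",
    ],
}

for device, replies in answers.items():
    # Average answer length in words for this device.
    avg_words = sum(len(reply.split()) for reply in replies) / len(replies)
    print(f"{device}: {avg_words:.1f} words on average")
```

Run over real transcripts, the same loop is what would surface figures like the 23-word overall average and the gap between screened and screenless devices.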

2. Do Various Google Assistants Give the Same Answers?

Google Assistant devices do not return the same results, despite running on similar algorithms. The average answer match across all devices stands at a mere 22% (see the sketch after the list below for one way such a match rate can be computed).

  • Despite the difference in the nature of the devices, the Google Home Hub and Android phone have the highest percentage of matching results at 66%.
  • Only 0.33% of the answers match between the Google Home Mini and Android phone, despite the high match between the phone and Google Home Hub.
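For illustration only, a match rate like the 22% figure above can be derived by comparing the result each device returns for the same set of queries. The sample data and the exact-match rule in this sketch are assumptions, not the study’s actual methodology:

```python
from itertools import combinations

# Hypothetical top local result returned by each device for the same three queries.
top_results = {
    "Google Home Hub":  ["Luigi's Pizza", "Main St Dental", "Joe's Gym"],
    "Android phone":    ["Luigi's Pizza", "Main St Dental", "City Fitness"],
    "Google Home Mini": ["Tony's Pizza", "Smile Dental", "Joe's Gym"],
}

# Share of queries for which each pair of devices returned the same result.
for (dev_a, res_a), (dev_b, res_b) in combinations(top_results.items(), 2):
    matches = sum(a == b for a, b in zip(res_a, res_b))
    print(f"{dev_a} vs {dev_b}: {matches / len(res_a):.0%} of answers match")
```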

3. The Similarity of Answers Between Google Assistants

As Google Assistant devices run on similar algorithms, namely Google Search, they essentially return the same answers, just worded differently.

The main reason we see any differences has to do with the presence or absence of a screen. A screenless device typically returns a more detailed spoken answer, whereas devices with a screen often reply with ‘Here’s what I’ve found…’ or similar and display the information on the screen.

4. How Many Queries Voice Assistants Couldn’t Answer

Our research confirms that voice assistants are getting better at understanding users.

The average percentage of questions that go unanswered across all devices is just 6.3%. This is a positive trend, as Forrester’s study suggested that, just over a year ago, this figure was as high as 35%.

For five of the six devices we analyzed, no more than five questions out of every hundred asked went unanswered, whereas Alexa failed on almost one in four.
