Google love to give their updates names: we previously had Fred, and now we've been hit with Google BERT. We can imagine some of you thinking the new update is named after the rather grumpy Sesame Street character, but alas, BERT actually stands for Bidirectional Encoder Representations from Transformers.
Yes, that really is what it stands for, and we understand if it sounds like some random words strung together to form an acronym. So what does Google BERT mean? In short, the new algorithm change is all about complex searches that depend on context.
Here’s the quote directly from Google:
“These improvements are oriented around improving language understanding, particularly for more natural language/conversational queries, as BERT can help Search better understand the nuance and context of words in Searches and better match those queries with helpful results.
Particularly for longer, more conversational queries, or searches where prepositions like “for” and “to” matter a lot to the meaning, Search will be able to understand the context of the words in your query. You can search in a way that feels natural for you.”
As we ask Google more complex questions every day, the BERT update is designed to deliver the correct results for conversational queries. Before BERT, if you searched for "best website designer for my company", Google would probably return pages optimised for the keywords "website designer" and "company". Now, with the processing power of BERT, Google will return results that match the natural language of your query.
BERT is great news for on-page SEO. Websites with poorly written content will fall foul of BERT: because it is designed to interpret natural language, sites with shoehorned, overly spammy content will not read right. Think of BERT as an actual person. If you walked up to someone on the street and said "Sandwich Shop Darlington", they would probably look at you like you were crazy and walk away at speed. Change the way you ask, "best place for a sandwich in Darlington", and you would most likely get a proper response. This is similar to how BERT works.
The new update also comes into play when the context of a search shapes the results displayed. Some words are spelt the same but carry different meanings in different contexts. Take "ball", for instance. Are you searching for a rugby ball or a ball to attend with guests? Context plays a huge part in what results are returned, and with the deployment of BERT, the results served will match your query and filter out pages that aren't relevant.
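To make the "ball" example concrete, here is a toy sketch in Python. It is emphatically not Google's actual system (BERT uses learned contextual embeddings, not word lists), and the sense profiles below are made up for illustration, but it shows the basic idea: the surrounding words in a query are what disambiguate an ambiguous term.

```python
# Toy illustration only: disambiguating "ball" by the other words in the
# query, a simplified stand-in for how a context-aware model such as BERT
# weighs surrounding terms. The sense profiles are hypothetical.
SENSES = {
    "sport": {"rugby", "kick", "pitch", "team", "match"},
    "dance": {"gown", "guests", "invitation", "ballroom", "waltz"},
}

def disambiguate(query: str) -> str:
    """Pick the sense whose context words overlap the query the most."""
    words = set(query.lower().split())
    return max(SENSES, key=lambda sense: len(SENSES[sense] & words))

print(disambiguate("rugby ball for the match"))   # -> sport
print(disambiguate("ball gown for our guests"))   # -> dance
```

Real contextual models do this with vectors rather than word sets, which is why they can also handle queries where no obvious clue word appears, but the principle is the same: meaning comes from context.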
You've probably read this blog and had the same thought we had: BERT search sounds like how we already talk to Google Assistant, Alexa and Siri. BERT sounds like the perfect start for a global algorithm that can develop across multiple devices. As we increasingly speak to search instead of typing, BERT will play a massive part in getting the right results in front of the right person.