Who isn’t a fan of the Muppets? In this scene, Ernie presents a scenario to Bert and Bert has to figure out from context what is going on, why there are so many decorations on the wall, and who the celebration is for. Ernie gives him some clues and, after a few tries, Bert figures out the correct result and ends up staring back at us, the viewers, from behind the screen (much like Google, Amazon, and Apple already do).
This is analogous to Google’s recent BERT update. In this case, were Bert an AI searchbot, he would’ve relied upon BERT to help put Ernie’s input into context to arrive at the most relevant result more quickly. So it’s BERT on Bert. Or maybe it’s Bert on BERT. (BERT in Bert?)
Make sense? No? See below.
What is Google’s BERT Update, in Plain English?
BERT (or “Bidirectional Encoder Representations from Transformers”) is a deep machine learning update that helps Google’s AI better understand words in context – especially prepositions, pronouns, synonyms and ambiguities, which can be confusing to their ever-improving, but still-pretty-dumb, searchbots.
For once, Google spells out its BERT update in more-or-less plain English, calling BERT their “most important update in five years.”
“These improvements are oriented around improving language understanding, particularly for more natural language/conversational queries, as BERT is able to help Search better understand the nuance and context of words in Searches and better match those queries with helpful results.
Particularly for longer, more conversational queries, or searches where prepositions like ‘for’ and ‘to’ matter a lot to the meaning, Search will be able to understand the context of the words in your query. You can search in a way that feels natural for you.”
-Almighty Google, October 25, 2019
For us humans, natural language processing is, well, “natural.” We have no problem distinguishing between words that have multiple meanings (“polysemy”) or untangling the messes of words in between.
- George loves to milk cows on the farm.
- Louise farmed out menial tasks to an intern.
- Unfortunately Sam bought the farm yesterday.
For us humans, obviously, there’s no problem understanding that “farming out” and the idiom “bought the farm” have nothing to do with an actual farm. This is not so obvious to a poor AI robot that, in the past, has only been able to interpret language in one direction (or “unidirectionally”).
BERT has changed that. Let’s break down the acronym.
What Does Google’s B.E.R.T. Stand For?
B = “Bidirectional”
BERT’s language modeler – for the first time in the history of language modeling – can move in both directions, graphing distances between words, syntactic relationships and context on either side of a word almost instantaneously. It’s akin to sentence diagramming: the “B” provides the graphed language data for the searchbot to analyze.
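To make “bidirectional” concrete, here’s a toy sketch (emphatically not Google’s actual model – just an illustration of the idea) contrasting what a left-to-right model sees with what a bidirectional model sees when interpreting a given word:

```python
# Toy illustration: the context available to a unidirectional (left-to-right)
# model vs. a bidirectional one when interpreting the word at position i.

def unidirectional_context(tokens, i):
    """A left-to-right model interpreting token i sees only what came before it."""
    return tokens[:i]

def bidirectional_context(tokens, i):
    """A bidirectional model sees the words on BOTH sides of token i."""
    return tokens[:i] + tokens[i + 1:]

sentence = "Sam bought the farm yesterday".split()
i = sentence.index("farm")

print(unidirectional_context(sentence, i))  # ['Sam', 'bought', 'the']
print(bidirectional_context(sentence, i))   # ['Sam', 'bought', 'the', 'yesterday']
```

With only the left-hand context, “bought the ___” looks like a literal purchase; it’s the word on the right (“yesterday,” with no object of a purchase in sight) that nudges an interpretation toward the idiom.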
E R = “Encoder Representations”
A fancy label for the input/output device – aka the “plug.”
T = “Transformers”
No, these aren’t the transforming robots of the ’80s – no searchbot will be able to transform into a car, truck or F-16 anytime soon. The transformers are the “interpreters” of the “B’s” language graph. See the technical article linked below for a detailed explanation of how it works; in short, the AI brain uses masking technology and pronoun/preposition analysis to “make its best guess” at the meaning of a sentence or phrase.
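The “masking” idea is simple to sketch: during pretraining, some words are hidden and the model must guess them from the visible words on both sides. Below is a minimal, hypothetical helper showing how such a training example might be constructed – it builds the fill-in-the-blank puzzle only, not the neural network that solves it:

```python
import random

def mask_tokens(tokens, mask_rate=0.15, seed=None):
    """Hide a fraction of tokens behind [MASK]; the model's training job is to
    recover them using the visible words on both sides of each blank."""
    rng = random.Random(seed)
    masked, targets = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_rate:
            masked.append("[MASK]")
            targets[i] = tok  # the "answer" the model is trained to predict
        else:
            masked.append(tok)
    return masked, targets

tokens = "Louise farmed out menial tasks to an intern".split()
masked, targets = mask_tokens(tokens, mask_rate=0.3, seed=1)
print(masked)   # some tokens replaced with '[MASK]'
print(targets)  # maps each masked position back to its hidden word
```

Because the blank is guessed from both directions at once, the model learns that “farmed out … tasks” patterns with delegation, not agriculture – which is exactly the polysemy problem described above.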
Why Should You Care About The BERT Update?
Anytime Google unveils an important update like this, the SEO multiverse goes all atwitter (literally). Everyone clamors to be the first to figure out how best to optimize for (aka “cash in on”) the most recent update.
So how do we as marketers change trajectory and optimize our content in the wake of BERT? The early answer: write better and more naturally. According to Google, BERT currently affects 10% of all searches, so it’s still early days. That doesn’t mean we shouldn’t get ahead of the curve, however.
Our two cents: if anything, BERT is another step toward freeing us from our chains to targeted keywords and keyword phrases. That’s not to say we should stop paying attention to keyword data and data-driven topical strategy, but the focus should shift to tighter content structure and organization, and to pristine, easy-to-understand writing with good sentence structure.
What’s the Best Way to Optimize for BERT? – Q&A Format
Whenever possible, you should use a Q&A format in online content pieces and observe a tightly controlled outline structure. The searchbots appreciate the “clean file cabinet” approach (as in, they’ll reward you for it), plus it’s a user-friendly format – the searcher can skim the part of the piece most relevant to them – PLUS it gives us a leg up on conversational search (a topic for another time). We’ve had a lot of success with this approach in the rankings over the past year:
- Main Question – What is BERT?
- Category – What Does the BERT Acronym Stand For?
- Subcategory 1: What Does the B in BERT Stand For?
- Subcategory 2: What Does the ER in BERT Stand For?
- Subcategory 3: What Does the T in BERT Stand For?
- Category – Why Should You Care About the BERT Update?
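One concrete way to hand that “clean file cabinet” to the searchbots is schema.org FAQPage structured data, which Google can parse for rich results. Below is a sketch that builds the markup for the outline above – the answer text is placeholder copy, and you’d paste the output into a `<script type="application/ld+json">` tag on the page:

```python
import json

# Sketch: the Q&A outline above expressed as schema.org FAQPage structured
# data. Answer text here is illustrative placeholder copy, not final content.
faq = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "What is BERT?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": "BERT (Bidirectional Encoder Representations from "
                        "Transformers) is a deep machine learning update that "
                        "helps Google understand words in context.",
            },
        },
        {
            "@type": "Question",
            "name": "What does the B in BERT stand for?",
            "acceptedAnswer": {"@type": "Answer", "text": "Bidirectional."},
        },
    ],
}

print(json.dumps(faq, indent=2))
```

The same outline drives both the human-readable headings and the machine-readable markup, which is the whole point of the Q&A approach.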
BERT should make our lives easier as marketers and free us up to write more natural content that is engaging, entertaining and worthwhile to our real audience: the human being staring back into the robo-puppet eyes of BERT from behind the screen.
Sources and for further reading:
In-depth technical explanation: https://www.searchenginejournal.com/bert-explained-what-you-need-to-know-about-googles-new-algorithm/337247/#close
BERT myths debunked: https://www.searchenginejournal.com/google-bert-misinformation/332931/