Google has made its search engine more human.
The company is rolling out a change to its core search algorithm that could change the rankings of results for as many as 10% of queries.
This update is based on a cutting-edge natural language processing (NLP) technique, called Bidirectional Encoder Representations from Transformers, or BERT. The technology was developed by Google researchers and has been applied to its search product over the last few months.
BERT has helped us grasp the subtle nuances of language that computers don’t quite understand the way humans do.
Pandu Nayak, Google Fellow and VP, Search
This breakthrough helps Google process words in relation to all the other words in a sentence, rather than one-by-one in order. BERT can therefore consider the full context of a word by looking at the words that come before and after it—particularly useful for understanding the intent behind search queries.
Simply put, while the old Google search algorithm treated a sentence as a “bag of words,” the new algorithm is able to understand the context of the words and return results accordingly.
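To make the distinction concrete, here is a toy sketch (not Google's implementation) contrasting an order-insensitive "bag of words" with a crude order-aware representation. The function names `bag_of_words` and `ordered_bigrams` are illustrative inventions; real BERT models full left-and-right context with transformer attention, far beyond simple bigrams.

```python
from collections import Counter

def bag_of_words(query: str) -> Counter:
    """Order-insensitive representation: only word counts survive."""
    return Counter(query.lower().split())

def ordered_bigrams(query: str) -> list:
    """Order-sensitive representation: each word paired with its successor,
    a rough stand-in for the contextual modeling BERT actually performs."""
    words = query.lower().split()
    return list(zip(words, words[1:]))

a = "flights from new york to london"
b = "flights from london to new york"

# A bag-of-words view cannot tell the two intents apart...
print(bag_of_words(a) == bag_of_words(b))        # True
# ...but an order-aware view can.
print(ordered_bigrams(a) == ordered_bigrams(b))  # False
```

The two queries ask for opposite journeys, yet they contain exactly the same words, which is why prepositions like "from" and "to" only carry meaning for a model that looks at word order and context.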
“With the latest advancements from our research team in the science of language understanding, made possible by machine learning, we’re making a significant improvement to how we understand queries,” said Nayak, “representing the biggest leap forward in the past five years, and one of the biggest leaps forward in the history of Search.”
Google’s BERT update means that the concept of keyword stuffing is now history, for good. The company acknowledges that over time, people have been trained to think in terms of keywords, and they often use “keyword-ese,” typing strings of words that they think search engines understand better, but aren’t actually how they’d naturally ask a question.
After BERT, Google search can now be more like a conversation. “No matter what you’re looking for, or what language you speak, we hope you’re able to let go of some of your keyword-ese and search in a way that feels natural for you,” says Nayak.
Particularly for longer, more conversational queries, or searches where prepositions like “for” and “to” matter a lot to the meaning, Search will be able to understand the context of the words in your query. You can search in a way that feels natural for you.
Pandu Nayak, Google
The strategy of depending on keyword volume to drive search traffic has been frowned upon for a while, so this update is unlikely to affect reputable publishers in any major way.
“The company says that it doesn’t anticipate significant changes in how much or where its algorithm will direct traffic, at least when it comes to large publishers,” noted Dieter Bohn, executive editor at The Verge.
Publishers may also not immediately notice much of a difference, because when analyzing rankings they most likely track the queries that send higher volumes of traffic, and those tend to be short-tail queries, not the long-tail ones BERT is more likely to affect.
“BERT is just another step in Google’s effort to understand what people want when they search. The more you deliver what people want, the more likely you are to rank high in search results,” says Jeff Haden, contributing editor at Inc. “Because trying to ‘game’ a system never works for long. But providing genuine value does.”
While BERT is currently helping rank English-language results in the U.S., the search giant plans to bring it to more languages and locales over time.
“A powerful characteristic of these systems is that they can take learnings from one language and apply them to others,” says Nayak. “So we can take models that learn from improvements in English (a language where the vast majority of web content exists) and apply them to other languages. This helps us better return relevant results in the many languages that Search is offered in.”
Google has continued to improve Search since its original deployment, but the current update is the “single biggest … most positive change we’ve had in the last five years and perhaps one of the biggest since the beginning.”
“But you’ll still stump Google from time to time. Even with BERT, we don’t always get it right,” admits Nayak. “Language understanding remains an ongoing challenge.”
Nevertheless, in spite of its imperfections, Google’s public search liaison is confident BERT is doing just fine.