“With the latest advancements from our research team in the science of language understanding, made possible by machine learning, we’re making a significant improvement to how we understand queries, representing the biggest leap forward in the past five years, and one of the biggest leaps forward in the history of Search,” said Pandu Nayak, VP of Search at Google.

The new algorithm is called, simply, BERT, which stands for Bidirectional Encoder Representations from Transformers.

“This breakthrough was the result of Google research on transformers: models that process words in relation to all the other words in a sentence, rather than one-by-one in order,” said Nayak. “BERT models can therefore consider the full context of a word by looking at the words that come before and after it—particularly useful for understanding the intent behind search queries.”
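To make the idea concrete, here is a toy sketch, not Google's actual implementation, of the bidirectional attention Nayak describes: each word's contextual representation is a weighted average over every word in the sentence, including words that come after it. The sentence, the two-dimensional embeddings, and the similarity-based weighting are all invented for illustration.

```python
import math

# Made-up 2-d embeddings for a toy sentence (illustrative values only).
sentence = ["bank", "of", "the", "river"]
embeddings = {
    "bank":  [0.9, 0.1],
    "of":    [0.2, 0.2],
    "the":   [0.1, 0.3],
    "river": [0.1, 0.9],
}

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def softmax(scores):
    # Turn raw similarity scores into weights that sum to 1.
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attend(i, tokens):
    """Contextual vector for tokens[i]: a weighted average of ALL token
    embeddings, weighted by similarity to tokens[i] (bidirectional)."""
    query = embeddings[tokens[i]]
    weights = softmax([dot(query, embeddings[t]) for t in tokens])
    dim = len(query)
    return [sum(w * embeddings[t][d] for w, t in zip(weights, tokens))
            for d in range(dim)]

contextual = [attend(i, sentence) for i in range(len(sentence))]
# The contextual vector for "bank" now mixes in "river", a word that
# appears AFTER it -- something a strictly left-to-right model cannot see.
```

A one-directional model reading "bank of the river" left to right would have to commit to a meaning of "bank" before ever seeing "river"; the attention step above lets every position draw on the whole sentence at once.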

Here are some examples from before and after the algorithm change: