
BERT: Google’s Latest Update to Search


On October 25th, Google updated its search algorithm to implement BERT (Bidirectional Encoder Representations from Transformers), a natural language processing model built to contextualize search queries. BERT better understands the context of a query, recognizing natural phrasing that computers historically could not parse, and so returns more relevant results. In particular, BERT can handle parts of speech like pronouns, which carry much of the meaning in conversational queries. BERT will not boost poorly written websites just because it can now understand them; for now, the algorithm applies only to users' search queries, greatly increasing the relevance of the results. Search Engine Journal (SEJ) provides an example of BERT in action on a standard Google search.


Before BERT was implemented, a search for the phrase “how to catch a cow fishing?” would return results about cows. The context of the question, however, suggests the results should be about catching striped bass (“cow” is angler slang for a large striped bass), a meaning signaled by the combination of the keywords “cow”, “catch”, and “fishing”. With BERT implemented, the results are about fishing, demonstrating the algorithm’s effectiveness at identifying context.
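To see why the pre-BERT behavior happens, consider a toy sketch of context-free keyword matching. This is not Google's actual ranking algorithm, and the page titles are invented examples; it only illustrates how a bag-of-words scorer, which ignores word order and context, matches the “cow fishing” query to a cow page rather than a fishing page:

```python
def keyword_overlap(query: str, page: str) -> int:
    """Score a page by counting words it shares with the query,
    ignoring word order and context entirely."""
    return len(set(query.lower().split()) & set(page.lower().split()))

query = "how to catch a cow fishing"
pages = [
    "how to raise a dairy cow",   # about cows, not fishing
    "striped bass fishing tips",  # the topic the searcher actually wants
]

scores = {page: keyword_overlap(query, page) for page in pages}
# The cow page shares more raw words ("how", "to", "a", "cow") than the
# fishing page ("fishing"), so context-free matching ranks it first.
```

A context-aware model like BERT instead reads the whole query, so “cow” next to “catch” and “fishing” resolves to the fishing sense rather than the farm animal.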


Google made BERT open source, allowing developers to alter it and find novel uses. BERT could be applied to other aspects of SEO critical to PageRank. If BERT were run on the actual content of websites, it could better evaluate each site’s relevance to a query and improve metrics like readability; implemented this way, it should limit page-one results to highly relevant, well-written, and well-organized websites. BERT could also improve the detection of spam links built to artificially inflate a website’s PageRank. Since BERT can at times analyze natural language better than humans can, it could supply a “naturalness” metric that differentiates human-made content from computer-generated, often spammy content. A spam link would lose credibility given its lack of natural content. This would further affect PageRank, as pages would rank according to the site’s “humanness”, i.e., how natural its language is.


According to Google, BERT is the search algorithm’s biggest update in the past five years. It will change the way we search and the results we get. Natural language processors such as BERT offer a better way to search the web, namely as a more natural question, as well as better, more relevant content. In theory, a tool like this should bolster the integrity of human-made content on the web by providing a metric for naturalness. The result could be a PageRank algorithm resistant to spam, augmented to return more natural rankings.


