Google Domain Diversity Update

The new Google domain diversity is now rolled out. Find out how it affects you and your SERP rankings.

Updated: 14 Dec 2021 by Antoniy Yushkevych · 13 min read

Google operates one of the largest and most widely used search engines in the world, and the bulk of its annual revenue comes from advertising services delivered through that search engine. Google holds the largest search engine market share by a wide margin and consistently dominates its competitors.

Google has an edge over every other company because of its proprietary search algorithm. It is one of the top trade secrets in Google's arsenal, and very few people in the world know how it fully operates.

People often assume the algorithm is a single complex system that retrieves data and displays relevant results, but in practice it is a collection of many smaller algorithms that work together to deliver the best possible results for a query.

To keep searches relevant, the algorithm has to change and adapt to user behavior and search activity across different geographic locations. In the early days of Google's search engine, updates rarely affected rankings drastically. Now Google ships thousands of changes every year, and some of them substantially reshape how the algorithm works.

Google rolls out algorithm updates to keep search fair for users and to raise rankings for more relevant websites. In this article, we will look back at the major updates Google has launched over the years and examine their impact on businesses and online professionals.

Searching for a particular piece of information on the web would be nearly impossible without a proper ranking algorithm. The algorithm has to sift through billions of web pages to find relevant results for a user query, and it has to return the best of them in a fraction of a second. This is made possible by Google's search algorithm and ranking systems, which use specific signals to judge the quality and rank of a web page. Among the highest-priority signals are the words of your query (keywords), relevance, and your current geographic location. It is also important to remember that these factors do not carry the same weight for every search.

For example, if users want the latest updates on a news story, the algorithm will prioritize the freshness of the content over page usability and other factors. To keep the algorithm working to a high standard, Google launches new updates every year and rigorously tests each function added to it.

The algorithm's inner workings change every year, but the main ranking factors remain largely the same. Google has officially listed five main factors that play a key role in search rankings:

  • Meaning of your query
  • Relevance 
  • Quality of Content 
  • Usability of web pages 
  • Context 

Meaning of the Query 

To display the best possible results, Google's algorithm first needs to understand the user's search intent. If the query contains words like "new" or "latest," the algorithm will look for recent information because those keywords are time-sensitive. For a general query, the algorithm interprets the natural meaning of the words; when the intent is ambiguous, Google displays a broader set of results that may or may not all be relevant to the query.

Relevance 

After breaking down the query's meaning, the algorithm considers the relevance of the web pages associated with each keyword in the query. The search engine does not rank pages on the fly: it continuously crawls and indexes website content, and this search index plays a vital role in matching relevant websites to search keywords.

Quality of Content

The search algorithm alone cannot judge the content quality of millions of web pages, so Google uses the PageRank system, which analyzes the links between pages to find websites that other reputable sites vouch for. Content is also assessed against criteria such as expertise, authoritativeness, and trustworthiness. Recognized experts rank higher because they know a topic well and can provide the most value to the user; similarly, the websites of leading companies tend to appear on the first page of Google search because of their high authority and trustworthiness.
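The link-analysis idea behind PageRank can be sketched in a few lines. This is only a toy illustration of the published PageRank formula (power iteration with a damping factor), not Google's production system; the example graph and function names are our own:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy power-iteration PageRank.

    links: dict mapping each page to the list of pages it links to.
    Returns a dict of scores that sum to 1.0; a page earns rank by
    being linked to by other high-ranking pages.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with a uniform distribution
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for p, outs in links.items():
            if outs:
                share = damping * rank[p] / len(outs)
                for q in outs:
                    new[q] += share  # each outlink passes on an equal share
            else:
                for q in pages:  # dangling page: spread its rank evenly
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

# "c" is linked to by both "a" and "b", so it outranks "b".
graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
scores = pagerank(graph)
```

The damping factor models a surfer who occasionally jumps to a random page instead of following links, which keeps the scores from collecting in dead ends.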

Usability of pages 

The next important factor is the usability of web pages. Google checks the technical aspects of a website, including page speed, security, mobile responsiveness, and visual stability; the key metrics here are known as "Core Web Vitals" in the ranking system. Usability mainly acts as a tiebreaker: when competing pages are roughly equal on relevance and content quality, the more usable page wins.
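Google publishes concrete "good" and "poor" thresholds for each Core Web Vital. The sketch below hardcodes the thresholds as published at the time of writing (LCP, FID, and CLS; check web.dev for current values), and the function name is our own:

```python
# Thresholds as published by Google at the time of writing:
# (upper bound for "good", upper bound for "needs improvement").
THRESHOLDS = {
    "LCP": (2.5, 4.0),    # Largest Contentful Paint, seconds
    "FID": (100, 300),    # First Input Delay, milliseconds
    "CLS": (0.1, 0.25),   # Cumulative Layout Shift, unitless
}

def rate(metric, value):
    """Classify a Core Web Vitals measurement against Google's bands."""
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "good"
    return "needs improvement" if value <= poor else "poor"

print(rate("LCP", 1.9))   # a page painting its main content in 1.9s
print(rate("CLS", 0.31))  # a page with heavy layout shifting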

Context 

Context and search settings use your past activity to deliver more personalized results when you type similar queries. For example, if you previously searched for a particular football match, a follow-up query will surface details about that match first. Search settings make this possible through signals such as location and history. Geographic location is also key to relevance: if you are in New York and search for the best restaurants, you will get results for restaurants in New York.

Google has released many updates to adapt the algorithm to the changing demands of its user base. Some of the most significant ones changed search so much that Internet marketers and SEO experts had to overhaul their strategies to maintain rankings. Below we have listed the major updates of the last decade so you can see how the search algorithm has evolved over the years.

1.   Google Panda 

This update first launched in 2011 and was incorporated into the ranking system to assess the quality of website content. Panda allowed the algorithm to scan web pages in depth and flag low-quality or spammy content, making it easier to down-rank sites with no demonstrated expertise or authoritative backing. It helped the search engine filter results faster than before, acting less as a core overhaul than as a quality filter: it mainly identified websites that plagiarized content or stuffed in keywords to inflate rankings. Since Panda, every website has had to look after the user experience by avoiding duplicated content and keyword stuffing.

2.   Penguin Update 

In 2012, Google rolled out the Penguin update to catch sites that manipulated rankings through spammy or paid link schemes rather than earning links organically; some of those links could even point to malware. The main reasons Google needed this update were the sheer number of spammy sites buying links for SEO and the spread of irrelevant backlinks. When Penguin was later folded into the core ranking algorithm and began running in real time, Google could devalue such links immediately, which also shortened recovery times for sites that cleaned up. From then on, everyone had to monitor their link-profile growth and be cautious about the number of spammy links being added each day, since harmful links raise the risk of a penalty.

3.   Pirate Update 

The Pirate update helped the algorithm find websites that shared pirated content for free and pushed them out of the top results; torrent sites were the main target. Its chief shortcoming was that Google could not keep up with the volume of reports, so de-ranking such pirated sites took a considerable amount of time.

4.   Hummingbird Update 

This update was a huge step toward understanding user search intent and conversational queries. Until 2013, the algorithm focused on individual keywords rather than the meaning of the query as a whole; with Hummingbird, it considers the query's full meaning when assembling listings. Keyword targeting and heavy repetition of a single keyword had helped many websites rank higher, but after this update everyone needed to diversify their content and use co-occurring terms alongside their target keywords.

5.   Pigeon Update 

The Pigeon update was launched to make search results more relevant for local queries. It allowed Google Maps and Google Web Search to work together and boosted the rankings of local directory sites. By integrating Maps signals, the algorithm could use location and distance to refine results. Pigeon also added a few more parameters for judging content quality and relevance, and it improved the connection between the local and core search algorithms.

6.   Mobile-Friendly Update 

In 2015, with smartphones of every size and specification entering the market, Google launched an update that ranks pages higher when they are optimized for mobile. The mobile-friendly update pushed site owners to fix their viewport configuration and run tests to meet the mobile-friendliness criteria.

7.   RankBrain 

One of the most important ranking changes of 2015 was RankBrain, which used machine learning to derive better search results. The update added considerable complexity to the algorithm, which could now adapt on its own to user behavior and other data collected from online search activity. RankBrain was a major update because it both improved the relevance of results and gave the algorithm the ability to reconfigure itself. After its full integration, site owners started leaning on analytics to improve their websites' user experience, with bounce rate and session duration as the two main metrics. Those two metrics alone no longer tell the whole story, since online activity has grown tremendously over the last five years.
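The two metrics mentioned above are easy to compute from raw session data. A minimal sketch, assuming sessions are already available as (pages viewed, duration) pairs; the function name and sample data are our own:

```python
def engagement_metrics(sessions):
    """Compute bounce rate and average session duration.

    sessions: list of (pages_viewed, duration_seconds) tuples.
    A "bounce" is a session that viewed only a single page.
    """
    bounces = sum(1 for pages, _ in sessions if pages == 1)
    bounce_rate = bounces / len(sessions)
    avg_duration = sum(d for _, d in sessions) / len(sessions)
    return bounce_rate, avg_duration

# Four sample sessions: two bounced after one page, two stayed.
sessions = [(1, 5), (3, 120), (1, 8), (4, 300)]
bounce_rate, avg_duration = engagement_metrics(sessions)
```

A high bounce rate paired with a short average duration usually signals that pages are not matching the intent that brought visitors in.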

8.   Possum Update 

The Possum update delivered more diverse results based on the user's location and the business's address, making the algorithm weigh the searcher's location while interpreting the query. Businesses that shared the same physical address were de-ranked after Possum. To maintain high rankings, companies had to track how their sites ranked in different locations and expand their local keyword lists.

9.   Fred Update 

Although Google never released specific details about this update, many studies showed that websites violating the webmaster guidelines paid a hefty price. Fred mainly filtered out sites with low-quality, thin articles spread across many topics purely to generate revenue through ads and affiliate schemes. To recover from Fred, site owners had to dial back aggressive advertising and invest in their content and web experience.

10.  Medic Core Update 

The Medic core update was released in August 2018, and it immediately hit business sites that were optimizing for keyword rankings rather than search intent. Previously, sites could rank well on keywords alone while ignoring what users actually wanted; Medic forced businesses to consider user intent when optimizing their websites and content. Long-tail keyword research also became important for identifying the best opportunities to match user queries.

Google will continue to release updates to keep spammy websites from ranking higher, and people will keep trying to decode the algorithm. Search rankings improve over time, and getting there requires attention to user experience and the value your website provides. If you concentrate on increasing the value you deliver to your audience, you can rank higher for your target keywords. Organic ranking is the result of consistent effort to improve content quality and web experience, so while staying aware of Google's algorithm updates is essential, you should also focus on the reader and make sure all their questions are answered.

Antoniy Yushkevych

Master of word when it comes to technology, internet and privacy. I'm also your usual guy that always aims for the best result and takes a skateboard to work. If you need me, you will find me at the office's Counter-Strike championships on Fridays or at a.yushkevych@monovm.com