SEO: Google’s Algorithm Updates Of The Last 10 Years

Sep 13, 2021 | SEO, Tips

    Everything About Google SEO Updates of The Last Decade

    The slightest tremor in Google’s algorithm can shake the whole SEO world. For Google, each change helps it respond to user queries with ever more precise results. But for online businesses, it can ruin the hard work of months or even years.

    The famous search engine released several algorithms with names as charming as Penguin, Panda, or Hummingbird. Below we have explained all of these along with their impacts.

    But before that, you should know:

    Google Algorithm – What is it?

    Google’s algorithm is a complex system for retrieving data from its vast search index. In response to a search query, it combines many sub-algorithms and ranking factors to fetch relevant pages from that index and order them by relevance, which determines each page’s position. The goal is to deliver precise, high-value results instantly: the search systems sift through an enormous amount of content to surface the pages that best answer the user’s question, and the ranking factors decide which of those pages appear at the top of the results.

    What are updates in Google Algorithm and Why Are They Necessary?

    Everything needs updates, and Google’s algorithms are no exception. Updates refine Google’s ranking criteria for websites; without them, internet users would not get precise results for their search queries.

    Google makes a plethora of changes to its algorithms every year, but most of them have no major effect and thus go unnoticed.

    Now let’s discuss the Google Algorithms updates from 2010:

    Google Caffeine Algorithm – June 2010

    Caffeine was announced in August 2009 and fully rolled out in June 2010. It redesigned Google’s indexing system, allowing the search engine to crawl pages and add newer content to its index far more quickly.

    Caffeine wasn’t an algorithm by Google to manipulate rankings. Rather, it was just a reconstruction of its indexing system.

    Google Panda Update – February 2011

    The Google Panda algorithm was a search filter that targeted the content of sites. The goal of this update was to highlight unique, good-quality content in the results.

    More specifically, Google Panda excludes:

    Content Spinning: A method of automatically generating content using software by taking existing text and replacing certain words with synonyms.

    Duplicate Content: Internal (a page that includes the exact content of another page within the same domain name) or external (a page that includes the exact content of a page on another website).

    Spam Comments: Identical texts automatically published in order to generate links to another site.

    To avoid being penalized by Google Panda, websites must therefore offer unique and useful content, i.e. content that answers the user’s question.
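    To see what “duplicate content” means in practice, here is a simplified sketch (not Google’s actual Panda mechanism, just a common textbook technique) that compares two texts using word shingles and Jaccard similarity:

```python
# Simplified near-duplicate detection via word shingles and Jaccard
# similarity. Illustrative only -- Panda's real signals are not public.

def shingles(text: str, k: int = 3) -> set:
    """Return the set of k-word shingles (overlapping word windows)."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard_similarity(a: str, b: str, k: int = 3) -> float:
    """Jaccard similarity between the shingle sets of two texts (0.0-1.0)."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa and not sb:
        return 1.0
    return len(sa & sb) / len(sa | sb)

original = "Google Panda targets thin and duplicate content across websites"
copied   = "Google Panda targets thin and duplicate content across websites"
fresh    = "Write unique articles that genuinely answer the reader's question"

print(jaccard_similarity(original, copied))  # 1.0 -> exact duplicate
print(jaccard_similarity(original, fresh))   # 0.0 -> unique content
```

    A score near 1.0 indicates copied or lightly spun content; unique writing scores near 0.0.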

    Major Google Panda Updates

    • In mid-March 2013, Matt Cutts announced that Google Panda would now be integrated into ongoing algorithm updates.
    • In mid-June 2013, Matt Cutts also revealed that an update for Google Panda had been launched in early May (Google Panda 26) and that Google was working on less severe Panda filters.
    • In mid-July 2013, Google confirmed having deployed this reworked version of Panda (Google Panda 27), which according to the engine “incorporates new signals to target more finely”.
    • In 2014, Google began to communicate again on Panda deployments, in May and September in particular, with versions 4.0 and 4.1.
    • At the beginning of 2016, Google announced that Panda was now integrated into the core algorithm; there have been no separate Panda updates since then.

    Sites that were hit the most by the Panda update:

    [Image: chart of sites hit hardest by the Panda update]

    Google Penguin – April 2012

    This is another Google search filter algorithm, but this time it tackles spam backlinking. Just like Google Panda, it has become so important for the engine that it was integrated into the core algorithm in September 2016, thus becoming an essential criterion in search engine result positioning.

    Google Penguin aims to punish poor quality backlinking, in particular by addressing:

    Spamdexing: This is when a site’s backlink profile is built artificially, most often through link directories or purchased links. Very often, these links come from URLs unrelated to the site’s theme, or from sites that are of poor quality from an SEO point of view. For Google, this is clearly fraud against its index.

    Over-Optimization of Link Anchors: When the anchors of internal or external links pointing to the same page are all identical.

    For Google, the goal for Penguin Algorithm was to create the cleanest and most natural backlinking network possible between sites, in particular avoiding link purchases.
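    As an illustration of how a site owner might audit anchor-text diversity, here is a hypothetical sketch (the threshold and data are made up for the example; Penguin’s real signals are not public):

```python
# Hypothetical backlink-audit sketch: flag over-optimized anchor text.
# A natural link profile has varied anchors; one phrase dominating the
# profile is the kind of pattern Penguin was described as targeting.
from collections import Counter

def anchor_distribution(anchors):
    """Return each anchor's share of the total backlink profile."""
    counts = Counter(a.lower() for a in anchors)
    total = len(anchors)
    return {anchor: n / total for anchor, n in counts.items()}

def flag_over_optimized(anchors, max_share=0.30):
    """Anchors whose share exceeds max_share look unnaturally repetitive."""
    return [a for a, share in anchor_distribution(anchors).items()
            if share > max_share]

backlinks = ["cheap shoes", "cheap shoes", "cheap shoes", "cheap shoes",
             "example.com", "click here", "great article on footwear"]
print(flag_over_optimized(backlinks))  # ['cheap shoes'] (4/7 of all links)
```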

    Major Google Penguin Updates

    • Shortly before Penguin’s 4th deployment, in May 2013, Matt Cutts explained that the filter would now be able to “analyze more deeply” and also have “more impact” than the first version. It was therefore a new major version of the filter, which earned it the name “Penguin 2.0”.
    • In 2014, Google rolled out “Penguin 3.0” more than a year after the previous rollout. The first effects were seen on October 18, but further aftershocks were noticed more than six weeks later.
    • On December 10, 2014, Google stated that Penguin was now deployed on a continuous basis, with no end-of-deployment date.
    • Finally, on September 23, 2016, Google announced the last Penguin update: Penguin was integrated into the core algorithm and has been operating continuously since that date.

    Exact Match Domain (EMD) Update September 2012

    This update was announced by Matt Cutts on September 28, 2012, on Twitter.

    This algorithm was deployed to fill a flaw in the system: to prevent poor quality sites from appearing in the first position of the SERPs only because their domain name corresponds to the request.


    Some people, knowing the impact of the domain name on SEO, had taken advantage of this flaw in Google’s algorithm to improve their site’s rankings. Thanks to this update, Google was able to remedy the problem.

    The motive behind this algorithm was not to exclude all websites with exact-match domain names, but only those whose content was poor, unreliable, or thin.

    Have an EMD? Wish to know how to avoid getting penalized?

    1. If your site uses an EMD, focus first and foremost on quality content: original material, added regularly.
    2. Create content clusters. Keep in mind that for Google, the most important thing is to give relevant content to the internet user.
    3. Give your site an optimal design (UX) and a solid mobile version (mobile-first index).
    4. Diversify your outgoing links, and finally, build a community in order to generate interactions.

    These points will allow you to keep a good position, or at least limit losses, when using an EMD.

    Hummingbird Update – September 2013

    The Hummingbird algorithm, announced by Google in September 2013, made significant changes to the SERPs. Thanks to this update, the search engine became able to understand what SEO experts call conversational search, i.e. a whole sentence or even a question, rather than just a succession of keywords.

    Hummingbird allows Google to understand a query in its entirety instead of relying on one or more isolated keywords. To achieve this, it was designed to better understand both users’ requests and the indexed content.

    Due to this, Google better understands the request made by the Internet user, which makes it possible to offer more precise and relevant results.

    The idea was: if an Internet user writes a question, he can obtain a result that answers his request. Depending on the search intent of the Internet user, the results differ.

    It is thanks to the Hummingbird algorithm that queries spoken aloud via voice assistants began to return more relevant results.

    Payday Loan 2013 – 2014

    On June 11, 2013, Google announced this update to target spammy search results in niches such as porn and payday loans. It took around sixty days to roll out. In 2014, the Payday Loan 2.0 and 3.0 anti-spam updates followed. Some well-known experts suggested that 2.0 targeted specific sites while 3.0 targeted spammy queries.

    Google Rankbrain – Early 2015

    RankBrain is part of Hummingbird. Google said that with RankBrain, it built machine learning into its algorithm so that it can understand the need behind a query, what experts call search intent.

    It aims to understand the implicit searches of Internet users via artificial intelligence. For example, it will understand that to the query “golden shoe”, it is relevant to provide answers concerning “Messi or Ronaldo”, even if their name is not given in the search.

    In this way, it leverages the whole long-tail phenomenon. Google has described RankBrain as its third most important ranking signal, after content quality and links.

    Mobile-Friendly / Compatibility – April 2015

    SEO professionals around the world nicknamed it “Mobilegeddon”!

    Since 2015, Google has made mobile compatibility a priority and a primary factor in the search engine optimization of a website.

    Deployed with the objective of favoring mobile-friendly sites in the SERPs, the “Mobile-Friendly” update appeared on April 21, 2015. Google later launched its mobile-first index in 2017 to adapt to user behavior.

    Knowing that users increasingly search from their mobile phones rather than their computers, Google made responsive design a necessity by ranking websites based on their mobile versions.

    The point is to give preference to websites whose interface is adapted to mobile browsing. Mobile-Friendly was the forerunner of the mobile-first index.

    Pigeon Algorithm – June 2015

    Google Pigeon is an algorithm that strengthened local search, launched internationally in June 2015.

    With the aim of providing ever more precise responses to Internet users, the algorithm differentiates between local results (via cities) and more general results.

    This update is very useful for businesses such as stores and restaurants that want to develop their visibility at the local level.

    Core Algorithm Update (Quality, Phantom Fred, Medic) 2015 -2017

    2015 to 2018 was an eventful time for the SEO world: Google rolled out a number of very important updates. The search engine announces updates to its main algorithm approximately quarterly; these regular “Core Updates” impact the way Google ranks the pages in its index.

    Algorithm updates of this type are called Core Update because they relate to Google’s main algorithm and not a particular indexing method or SEO criterion. Let’s have a look at them:

    • The “Phantom” update, launched in 2015, allowed Google to penalize websites that did not present relevant content to users.
    • Phantom 2, also called the “Quality Update”, was launched on May 20, 2015, with the objective of improving the user experience by combating low-quality content.
    • In March 2017, Google rolled out its “Fred” update. This core update led to many changes in the search results. It was launched with the aim of combating sites displaying so many advertisements that the user experience suffered.
    • The effects of “Fred” were also felt on all sites using Black Hat SEO practices.

    Google Speed Update – July 2018

    Page speed first became a ranking signal in April 2010 for desktop search. In July 2018, the mobile version of this signal arrived: on desktop, the goal was to promote sites with good loading speed, while on mobile it penalizes sites that are too slow. It now forms part of Core Web Vitals, a 2021 update.

    Local Search Update – November 2019

    In November 2019, Google announced the “Local Search Update”. It acts on the understanding of queries: Google became able to analyze the link between the words used in a local query and their meaning, to provide more precise results.

    Core Update 2019

    Google has rolled out a major new update in 2019. This time the search engine imposes new technical quality criteria on the sites to be well ranked:

    • Page loading time, which must be as low as possible
    • Switching to HTTPS to secure the data passing through the site (payment, personal data, etc.)
    • Importance of UX with fully responsive design

    Site Diversity Update – 2019

    Launched almost simultaneously with the previous one, this update wants to offer more diversity in the SERPs. Google no longer wants to bring up several pages of the same site, but to offer several domain names to respond to a request.

    Google EAT Algorithm Update – 2019

    Google’s EAT (Expertise, Authoritativeness, Trustworthiness) criteria were established in 2014 and updated in 2019. They concern how the search engine evaluates content. For the textual content of your site to be rated well by Google, it must respond favorably to these three criteria:

    • Expertise
    • Authoritativeness
    • Trustworthiness (reliability)

    Many sites were impacted by this update, particularly YMYL websites. YMYL stands for Your Money, Your Life: sites that have a direct impact on “people’s happiness, health and wealth”. In general, YMYL sites concern themes such as health, finance, legal, insurance, security, etc. In other words, a very large part of the web.

    Google wants to provide the best possible answers to Internet users, which is why it set up Google EAT. The content must then be of high quality, 100% authentic and respond perfectly to the requests of Internet users on a specific request.

    Why Did Google Create the EAT Criteria?

    Because a website that meets these criteria naturally distributes quality content that perfectly meets users’ demands on a specific request. Thus, the search engine can provide a very satisfactory response to its users, based on quality, reliability, and relevance. Please note that the content of a web page is not limited to its text; it also includes all of its functionality and design.

    You must therefore include these criteria in your SEO strategy, especially as the results in the SERPs become more and more refined. Google tends to respond effectively to the problems of Internet users and the EAT plays an important role in the display of the first search results.

    What are the important EAT criteria for YMYL sites?

    Site Reputation: The reputation of a YMYL website is very important and negative ratings may penalize it.

    Main Content Quality: Whether for a YMYL site or any other site, the content must be highly qualitative.

    Supplementary Content: Content allowing a good user experience is mandatory.

    Advertisements: The advertisements which are in the web page must not disturb the visitor, whether it is for the YMYL sites or the other sites.

    Reviews: They ensure the reliability of the site.

    Recommendations and Awards: Awards greatly increase the score of YMYL sites.

    Other than that, one should work on:

    • Quality of Information on the website.
    • The level of expertise.
    • The online reputation of the author.

    How to improve your “EAT score”?

    To show Google that you are legitimate, you need to improve the “EAT” score of each of your pages. Here are some tips that will help you present yourself as an expert in your industry.

    1. Put Author Bio On Each Content:
      Google attaches importance to the author of the content. It is therefore important to put an author bio on each blog post. This allows Internet users and Google to see if the article was written by an expert on the subject.
    2. Encourage Contributors:
      You can encourage your community to comment on your articles. Interactions do good for the EAT and allow you to gain reliability (if the feedback is positive).
    3. Have a site that facilitates navigation:
      On e-commerce sites, all pages should be easily accessible from the home page. Also, all the information such as contact information must be easily available.
    4. Work on your brand image:
      To take care of your brand image, you must:

      Be active on social networks; present your values; tell a story to the audience.

    5. Delete or refresh pages with a low EAT score:
      Each page on a website is rated by Google and may have a different EAT score. All of these scores are combined with the reputation of the website to give the final EAT score. If you have pages with a low score, update them. You can also remove them if they do not add value.
    6. Secure the Site:
      HTTP sites without an SSL certificate are flagged as insecure. You need to ensure that every page on your site uses an HTTPS URL to maintain a good EAT score.
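    As a minimal sketch of that last point, here is one way to flag pages still served over plain HTTP (the URLs are hypothetical; in practice you would pull the list from your sitemap):

```python
# Flag pages whose URL scheme is not HTTPS, using only the standard library.
from urllib.parse import urlparse

def insecure_urls(urls):
    """Return the URLs that are not served over HTTPS."""
    return [u for u in urls if urlparse(u).scheme != "https"]

pages = [
    "https://example.com/",
    "http://example.com/contact",
    "https://example.com/blog/post",
]
print(insecure_urls(pages))  # ['http://example.com/contact']
```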

    Google BERT Update – 2019

    Arguably the most significant update of the past five years, BERT was officially rolled out for English-language searches in October 2019.

    BERT stands for “Bidirectional Encoder Representations from Transformers”. This is the deployment of a form of artificial intelligence in the algorithm to better understand the requests of Internet users.

    The objective of the BERT update is to achieve a better understanding of user requests by prioritizing the terms and expressions used by users. With this update, Google was able to better understand the relationships between words within an entire sentence, rather than processing the keywords (or phrases) one by one.

    Thus, since its launch, this update has enabled Internet users to see results appearing in the SERPs which are precisely linked to their searches.

    Core Updates – 2020

    In 2020, three Core Updates were deployed by Google: the “January Core Update”, followed by the “May Core Update”, and finally, at the end of the year, the “December Core Update”.

    The search engine focused on the four elements that must be of paramount importance for a web SEO:

    • Quality Content
    • Efficient Sites
    • A well-thought-out Website Structure
    • A Fluid User Experience

    Featured Snippet Repetition – January 2020

    Danny Sullivan from Google announced this update on Twitter on January 22.

    According to this update, a page shown in the featured snippet no longer repeats in the page-one organic results. It affected all of Google’s search results.

    Passage Ranking Algorithm – February 2021

    Danny Sullivan introduced passage ranking to the world on February 12, 2021, via Twitter. For now, this algorithm has only launched for search queries in the US. It helps Google understand and rank specific passages within your content, and Google has said passages will be considered as an additional ranking factor.

    Product Reviews Update – April 2021

    This algorithm was designed by Google to rate product-review content. It was officially launched in April 2021. Concretely, with the Product Reviews update, Google wants to favor sites offering the best reviews: those based on relevant data and genuine research carried out by enthusiasts and professionals who master the subject concerned.

    Currently, the Product Reviews update only applies to English-language content, but Google plans to expand it to other languages.

    Google Page Experience Update – June 15, 2021

    This update concerns the “Google Page Experience”. Google now takes into account more criteria to classify web pages. These criteria relate to the state of a site in terms of UX. They include:

    • The loading experience;
    • Interactivity;
    • The visual stability of the content of a site’s pages.

    To avoid being penalized in the SERP rankings, it is necessary to take action in advance:

    • Test the mobile compatibility of your site
    • Test the speed of your site
    • Check the mobile efficiency of your site

    The objective is always the same for the search engine: to ensure that the sites offered to Internet users are always more relevant and efficient. But the user experience is also taken into account this time.

    Google relates UX to three “Core Web Vitals”:

    LCP: Largest Contentful Paint. This indicator concerns the loading time of a web page. The main elements should display in less than 2.5 seconds according to Google.

    FID: First Input Delay. This is the time between the user’s click to view a page and the browser response. For Google, the FID should not be greater than 100 milliseconds.

    CLS: Cumulative Layout Shift. It measures the visual stability of a page during loading time.
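    Those three thresholds can be turned into a simple self-check. The sketch below uses the LCP and FID limits stated above, plus the commonly published “good” CLS limit of 0.1 (an assumption here, as the article does not give a CLS figure):

```python
# Core Web Vitals self-check against Google's published "good" thresholds:
# LCP < 2.5 s, FID < 100 ms, and (assumed here) CLS < 0.1.

GOOD_THRESHOLDS = {"lcp_s": 2.5, "fid_ms": 100, "cls": 0.1}

def assess_core_web_vitals(lcp_s: float, fid_ms: float, cls: float) -> dict:
    """Label each measured metric 'good' or 'needs improvement'."""
    measured = {"lcp_s": lcp_s, "fid_ms": fid_ms, "cls": cls}
    return {metric: ("good" if value < GOOD_THRESHOLDS[metric]
                     else "needs improvement")
            for metric, value in measured.items()}

# A fast page passes all three checks...
print(assess_core_web_vitals(lcp_s=2.1, fid_ms=80, cls=0.05))
# {'lcp_s': 'good', 'fid_ms': 'good', 'cls': 'good'}

# ...while a slow, visually unstable page fails LCP and CLS.
print(assess_core_web_vitals(lcp_s=4.0, fid_ms=80, cls=0.3))
```

    In practice, the measured values would come from field data or a tool such as PageSpeed Insights rather than being typed in by hand.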

    Google Spam Algorithm Update – June 2021

    This update was rolled out on June 23. However, Google gave no specific details on what the spam update was targeting. Its second version was rolled out on June 28.

    July Core Update – 2021

    In July 2021, Google deployed a Core Update. The engine specified that the goal was to “always focus on the quality of content”. On July 12, Google confirmed via Twitter that the July 2021 Core Update was complete. Although no detailed information was provided, one can read about it on the Google Search Central Blog.

    Think You’ve Been Impacted By Any of These Updates?

    Ask Our Experts For An Audit