Top Google Algorithm Updates: History, How They Work, Penalties & Recovery Explained

What is a Search Engine Algorithm?

A search engine algorithm is a computer program that looks through the trillions of web pages in the search engine's index to find the exact answer the user wants.

How Does a Search Engine Show Results?

Google Search Results Ranking System-

To rank websites in the SERP, Google's PageRank algorithm calculates the number and quality of links pointing to a site to judge its quality. But Google also uses 200+ ranking factors to show results relevant to the user's query.

Over the course of a year, Google changes its algorithm more than 500 times. Most of these changes go largely unnoticed; only a handful of major updates (Panda, Penguin, Hummingbird, Pigeon, RankBrain, etc.) are announced officially!

How Frequently Does Google Update Its Algorithm? A Brief History

In 2004, Google released the Brandy update, whose main goals were as follows –

  • To increase the size of its index
  • To give more weight to quality outbound links and descriptive anchor text
  • To apply Latent Semantic Indexing (LSI): while analyzing a web page, Google's bots now looked at keyword synonyms as well

Google had also updated its algorithm before the Brandy update, but those earlier changes did not have much impact on search results.

Below, I will describe each major update in order so that you can understand the Google updates easily.

#1. Google Panda Update: 

[Image: Google Panda Update]

Google Panda Update – To drop low-quality, spammy-content sites from the search results, Google updated its filter with the Panda algorithm.

In this way, Google surfaces the quality content that satisfies the user.

In February 2011, Google released an update specifically designed to wipe out low-quality, spammy, thin content from the SERP.

This update affected about 12% of search results and was initially aimed at U.S. sites.

In other words, the Google Panda update was meant to downgrade spammy content that was trying to rank higher.

And to reward sites that had meaningful, high-quality, original content.

To see how the Google Panda update changed SEO best practices forever, have a look at this Rand Fishkin Whiteboard Friday.

On July 18, 2015, Google rolled out the last Panda refresh, Panda 4.2, which affected 2–3% of queries.

The Panda algorithm now demotes spam by adjusting rankings downward.

In other words, if you try to game the search engine and its algorithms, Google will adjust your website's rankings downward. That does not mean your website has received a manual penalty; it means an algorithmic filter has been applied to the site.

Moreover, the Google Panda algorithm is applied at the domain level, so it can downgrade the whole site.

How to Stay Safe from a Google Panda Penalty?

1) Write High-Quality Content – First things first: write high-quality, well-researched content that delivers value to the user and helps solve their problem.

Even a few pages with duplicate or thin content can trigger a Panda penalty and lead to a drop in rankings.

2) Avoid Keyword Stuffing – Repeating the same keywords again and again in your content is not best practice, and it won't help your pages rank higher.

While doing keyword research, find long-tail and LSI keywords related to your main keywords; Google understands keyword intent intelligently.
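As a rough self-check, a few lines of Python can show whether one keyword dominates your draft. This is only an illustrative sketch, not anything Google provides, and the file name draft-post.txt is a placeholder:

```python
from collections import Counter
import re

def keyword_density(text, top_n=10):
    """Return the most frequent words and their share of the total word count."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return []
    total = len(words)
    return [(word, count, 100 * count / total)
            for word, count in Counter(words).most_common(top_n)]

# "draft-post.txt" is a placeholder file name for your draft article.
draft = open("draft-post.txt", encoding="utf-8").read()
for word, count, pct in keyword_density(draft):
    # A single keyword far above a few percent of all words is worth a second look.
    print(f"{word:20s} {count:5d}  {pct:5.2f}%")
```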

3) Avoid Spelling & Grammar Mistakes – Google understands English and many other languages very well. Poorly written content will not by itself penalize your website, but bad grammar leads to a poor user experience.

Use Grammarly or another grammar-checker tool to check for spelling or grammatical mistakes before publishing the content.

4) Avoid an Excessive Amount of Ads – Give your users what they are looking for; your content should be a problem solver.

Your content should provide a better answer to the user's query.

People are not landing on your website to click ads, so place your ads strategically on the page in a way that doesn't hurt the user experience.

Google wants to ensure that websites ranking at the top of the SERP provide the solutions users are seeking.

The Page Layout algorithm update will penalize your website for using too many ads above the fold.

5) Avoid Canonical Issues on the Website – Check whether your website is configured correctly.

Check every version of the website, for example whether it is accessible with www or without www. (If your website is accessible under both versions, put a 301 redirect in place to solve the canonicalization issue.)

Otherwise, the same content on different URLs leads to a duplicate content issue, and chances are the site will be penalized.
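As a quick sanity check, here is a minimal Python sketch using the requests library; example.com is a placeholder domain. It fetches the www and non-www versions and reports where each one ends up, so you can confirm that one version 301-redirects to the other:

```python
import requests

# Placeholder domain; replace with your own site.
for url in ("https://example.com/", "https://www.example.com/"):
    resp = requests.get(url, allow_redirects=True, timeout=10)
    hops = [r.status_code for r in resp.history]   # e.g. [301] if it redirects
    print(f"{url} -> {resp.url} (redirect chain: {hops}, final status: {resp.status_code})")
```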

There is a common misunderstanding that if Panda hits a website, it won't rank for any queries at all.

That is not exactly true.

If your website has been penalized for low-quality content, the entire site can be affected, but if it also has some pages with quality content, chances are those pages can keep their rankings.

This has also been confirmed by a Google spokesperson.

So write for the user, not for the search engine, and you will stay safe and ahead in the rankings.

How to Identify & Recover from a Google Panda Penalty?

If your website is hit by the Panda algorithm, how would you identify the penalty?

You will not get any notification from Google.

First, you need to determine whether your website was hit by a Panda penalty, whether a manual action was applied, or whether traffic simply dropped for some other reason.

Note: Check Search Console for any manual action penalty (site hacked, bad-link penalty, etc.).

Where to Start –

1# Check the Organic Traffic Report in Analytics –

First, head over to Google Analytics to check the ups and downs in your traffic, because this penalty targets content and adjusts your website's rankings.

If you have been hit by a Panda penalty, your organic traffic will definitely have dropped off.

– If you haven't set it up yet, read this: How to set up Google Analytics.

Remember, you won't be warned by Google about a Panda or Penguin algorithmic penalty.

Now, in Google Analytics, select a date range; choose as long a period as you want in order to check the overall traffic for that period.

[Screenshot: Google Analytics traffic report]

Now click on +Add Segment

[Screenshot: adding a segment in Analytics]

Uncheck the All Users box.

[Screenshot: unchecking All Users in Google Analytics]

Then check the Organic Traffic box.

[Screenshot: checking the Organic Traffic box in Analytics]

This setting is necessary because you need to see how much organic traffic you are getting from Google and how much of it you lost (or gained) after the Panda update.

(As I described above, even though Panda is applied to the whole domain, if your website has some high-quality content pages, chances are those pages can still rank well.)

[Screenshot: organic traffic segment in Analytics]

Now check this section in Analytics; if you see an immense drop in traffic, your website has most likely been penalized.

Small ups and downs are normal depending on the nature of your business, but a sharp, sudden decline in traffic usually points to a penalty.
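If you export the organic traffic report as a CSV, a small pandas sketch like the one below can flag a sudden week-over-week drop. The file name organic-traffic.csv and the column names "Date" and "Sessions" are assumptions about your export; adjust them to match what Analytics gives you:

```python
import pandas as pd

# Assumed export: one row per day with "Date" and "Sessions" columns.
df = pd.read_csv("organic-traffic.csv", parse_dates=["Date"])
weekly = df.set_index("Date")["Sessions"].resample("W").sum()

# Compare each week to the previous one and flag drops of more than 30%.
change = weekly.pct_change()
drops = change[change < -0.30]
print(drops)   # weeks where organic sessions fell by 30%+ deserve a closer look
```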

2# Analyze Search Console to Find Errors or Manual Actions –

Head over to Search Console. If it isn't set up yet, read this post: How to set up Google Search Console.

If it is already set up, go to Search Console >> Search Traffic >> Manual Actions.

If your website has received no message in the Manual Actions section, yet traffic has dropped, that is a strong sign Panda (an algorithmic filter) hit the website rather than a manual penalty.

[Screenshot: Manual Actions in Search Console]

Now click on Search Appearance >> HTML Improvements

[Screenshot: HTML Improvements errors in Search Console]

If you find duplicate title and meta description tag issues (as you can see in the snapshot above), those pages may have duplicate content as well.

You need to check them one by one!

These kinds of duplicate-page issues often arise in CMS-based and e-commerce sites.

So make a list of those pages and write unique, fresh titles and descriptions for them.

If these pages also have duplicate content issues, rewrite them with high-quality, well-researched content.

And if you think these pages are not necessary and provide no value for the user, you can add a noindex, nofollow tag to them, or even delete them.
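If you export your pages with their titles and meta descriptions (from Screaming Frog or your CMS) to a CSV, a short sketch like this can group the duplicates for you. The file name crawl-export.csv and the column names "URL", "Title", and "Meta Description" are assumptions about that export:

```python
import pandas as pd

# Assumed crawl export with URL, Title and Meta Description columns.
pages = pd.read_csv("crawl-export.csv")

for column in ("Title", "Meta Description"):
    dupes = pages[pages.duplicated(subset=column, keep=False)]
    print(f"\nPages sharing the same {column}:")
    for value, group in dupes.groupby(column):
        print(f"  {value!r}")
        for url in group["URL"]:
            print(f"    {url}")
```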

3# Check Your Website in Siteliner for Duplicate Content Issues

This tool makes it easy to find duplicate content and broken links.

Enter your domain name and wait for it to scan your pages; it will show you the duplicate-content pages along with lots of other information, such as average page size, page load time, etc.

[Screenshot: checking duplicate content in Siteliner]
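If you want a rough do-it-yourself check alongside Siteliner, the sketch below compares the text of two pages with Python's difflib. The URLs are placeholders, and this is only a crude approximation of what a dedicated tool does:

```python
import difflib
import re
import requests

def page_text(url):
    """Fetch a page and strip tags very crudely to get comparable text."""
    html = requests.get(url, timeout=10).text
    return re.sub(r"<[^>]+>", " ", html)

# Placeholder URLs; compare any two pages you suspect of overlapping.
a = page_text("https://example.com/page-a")
b = page_text("https://example.com/page-b")

ratio = difflib.SequenceMatcher(None, a, b).ratio()
print(f"Similarity: {ratio:.0%}")   # very high values suggest duplicate content
```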

4# Find Broken Links & 404 Error Pages –

You can use Screaming Frog to identify 404 pages and broken links.

Download and install it; after you run your website through Screaming Frog, it will show a deep link analysis of your site.

Collect the broken links, duplicate-content pages, and 404 error pages.

A handful of 404 pages is not a big problem, but if hundreds or thousands of pages return 404 errors, you need to fix them.

Search engines don't look kindly on them because they hurt the user experience.

[Screenshot: crawling your site with Screaming Frog]
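Alongside Screaming Frog, a tiny script like this can re-check a list of URLs and report which ones fail. The urls.txt file name is just an assumption for the example:

```python
import requests

# Assumed input: one URL per line in urls.txt (e.g. exported from your crawl).
with open("urls.txt", encoding="utf-8") as fh:
    urls = [line.strip() for line in fh if line.strip()]

for url in urls:
    try:
        status = requests.head(url, allow_redirects=True, timeout=10).status_code
    except requests.RequestException as exc:
        print(f"{url} -> request failed ({exc})")
        continue
    if status >= 400:
        print(f"{url} -> HTTP {status}")   # pages to fix, redirect, or remove
```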

Final thoughts: follow all the steps above, and keep in mind that it takes time to recover from a Panda penalty.

Recovering from Panda is not that hard; just make sure you have done all the necessary work to convince Google that your website now serves its users better.

In a statement, a Google spokesperson confirmed that Google Panda is now part of Google's core ranking algorithm.

This means Panda won't necessarily demote the whole domain if your website has quality content pages; if you have some low-quality pages, Panda will filter just those pages out of the search results.

Read this Jennifer Slegg post: Understanding Google Panda: Definitive Algo Guide for SEOs.

#2. Google Penguin Update –

[Image: Google Penguin Update]

On April 24, 2012, Google rolled out the Penguin update. Penguin targets sites that engage in spammy or unnatural link schemes.

Difference between Google Panda & Penguin Update –

The key difference between the Panda and Penguin updates is that Panda targets low-quality and thin-content sites, whereas Penguin targets sites with unnatural links and deceptive tactics (sneaky redirects, keyword stuffing, cloaking, etc.).

If you engage in these unethical activities, the Penguin algorithm will eventually catch you.

On September 27, 2016, Google released Penguin 4.0. This update runs in real time and devalues bad links instead of penalizing sites; before this, Penguin penalized the entire site.

Penguin is now real-time and part of Google's core algorithm.

Let me explain how Penguin worked before this update.

Suppose a Penguin refresh rolled out today: it would take some days to roll out fully, and the sites it penalized would not rank again until Google released the next Penguin refresh.

How does real-time Google Penguin work?

Now that Penguin has become real-time, Penguin 4.0 is the last named Penguin update, and there will be no further announced refreshes.

In practice, this means that if you build shallow, spammy links, you don't have to wait for the next Penguin refresh; your website's rankings can be hurt tomorrow.

How to stay safe from Google link penalties in the world of real-time Penguin?

Build quality, editorial links and stay safe from Google link penalties; don't try to build spammy links.

If you have used, or are still using, spammy link tactics, stop. Google will catch you, and it will hurt your website's rankings.

What to do if your website is hit by the Google Penguin algorithm

#1st Step: Analyze your Backlink Profile

It's good practice to analyze your backlink profile at regular intervals.

Tools to analyze backlinks:
1) Google Webmaster Tools / Search Console (Free)
2) Ahrefs – Highly recommended (Free Trial/Paid)
3) Moz Open Site Explorer (Paid)
4) Majestic SEO (Free/Paid)
5) Linkody (Free Trial)
6) Cognitive SEO (Free Trial)
7) LinkResearchTools (Free Trial)

Why should you use backlink analysis tools?

These tools speed up the process; in today's competitive world, keeping track of your links and understanding your link profile is a must.

Checking your link profile manually on a regular basis is difficult, so these tools are essential for any website owner.

Reasons to use these Tools-

1) If you gain/lose any links you will be notified
2) You can Keep track of what competitors are doing
3) You can discover Link Building Opportunities
4) You can analyze Your Backlink Spam Score

How do you judge whether a link is good or bad, and what should you do if you find risky links?

And how do you disavow them to keep your link profile healthy?

Below I am going to cover these two things step by step.

As you know, if you use illegitimate link building techniques, Google will punish your website.

First, you must know the difference between a good and a bad link.

A good link is one that comes from a trustworthy, relevant site.

A good link helps drive quality traffic to your site and also increases its credibility.

In reality, many factors make a link good, so there is no single perfect definition of a good link.

Here are some points that will help you differentiate between good and bad links:

Good links

1) Links from high-quality sites (Business Insider, CNN, etc.)
2) Links that are relevant to your niche/industry
3) Links from trustworthy sites (e.g., Wikipedia)

A bad link is one that comes from a low-authority or irrelevant site.

How to recognize a bad link (a small scoring sketch follows this list):

1) Too many links coming from article directories, forums, or blog-commenting sites (links from these sources don't provide much value).

2) Over-optimized anchor text.
(Suppose I am trying to rank for the keyword "SEO Consultant" and all the links in my backlink profile use "SEO Consultant" as anchor text; that's a red flag. Add keyword diversity to your link building efforts.)

3) Too many links from irrelevant country TLDs.
If your business is in the USA and targets the local market but most of your links come from UK sites, you need to build links from local sites as well.

4) Links coming from unrelated sites or pages.
If you are in the health niche, links from a real estate site are not relevant.
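To pull these heuristics together, here is a small, hedged scoring sketch. The file name backlinks.csv, the column names "Source URL" and "Anchor Text", the spam hints, and the thresholds are all my own illustrative assumptions, not a formula Google publishes:

```python
import csv
from urllib.parse import urlparse

SPAMMY_HINTS = ("articledirectory", "forum", "blogcomment", "directory")
TARGET_ANCHOR = "seo consultant"   # the exact-match anchor you are worried about overusing

def risk_score(source_url, anchor_text):
    """Crude 0-3 risk score based on the signals discussed above."""
    host = urlparse(source_url).netloc.lower()
    score = 0
    if any(hint in host for hint in SPAMMY_HINTS):
        score += 1                                   # directory / forum style domain
    if anchor_text.strip().lower() == TARGET_ANCHOR:
        score += 1                                   # exact-match anchor, watch the ratio
    if host.endswith(".xyz") or host.endswith(".info"):
        score += 1                                   # example of TLDs you might distrust
    return score

with open("backlinks.csv", encoding="utf-8") as fh:   # assumed export format
    for row in csv.DictReader(fh):
        score = risk_score(row["Source URL"], row["Anchor Text"])
        if score >= 2:
            print(score, row["Source URL"], row["Anchor Text"])
```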

So now you have a good understanding of good and bad backlinks.

Links are the lifeblood of any website, and backlinks are one of the most important Google ranking factors.

If your website is brand new, you will need to put in the hard work to build links.

But if your website is a few months old and you have already done some link building work, then you should analyze your backlinks so that you know how healthy your link profile is, and disavow any links you find suspicious.

How to do it, step by step:

1) Download your links from as many sources as you can and compile a master sheet.

Now you have two options: do it manually (a very time-consuming process) or with the help of tools. Either way, some manual work is unavoidable.

2) Extract the root domain from each URL/subdomain.
You can use a URL-to-domain tool for this (upload your CSV and you're done).

This step also helps you weed out duplicate URLs.
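The same step can be scripted. The sketch below uses the tldextract library (an assumption on my part; you could also split hostnames by hand) to reduce each backlink URL to its registered root domain and drop duplicates:

```python
import tldextract   # pip install tldextract

urls = [
    "https://blog.example.co.uk/some-post",
    "http://www.example.co.uk/another-page",
    "https://spam.example.net/dir/page.html",
]

roots = set()
for url in urls:
    parts = tldextract.extract(url)               # handles multi-part TLDs like .co.uk
    roots.add(f"{parts.domain}.{parts.suffix}")   # e.g. example.co.uk

for root in sorted(roots):
    print(root)
```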

3) Audit the links.
Now it's time to check each URL and decide whether to keep it or disavow it.

If you don't know exactly how to do this, or are still confused, read this post, How to recognize an unnatural link, and this

Link Audit Guide for Effective Link Removals & Risk Mitigation.

Note: If you disavow good backlinks, you will lose rankings, so double-check before disavowing anything.

Make sure to add the "domain:" prefix in front of every domain you want to disavow at the domain level.

Every domain in your disavow file should look like this:

For example: domain:example.com

Save the disavow list as a plain text file.

It should be encoded as UTF-8 or 7-bit ASCII.

How can you create a text file in UTF-8 or 7-bit ASCII?

Simply copy your disavow domains into a Google Docs document and download it as plain text.
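Here is a minimal Python sketch of the same idea: it takes a list of root domains (the list itself is a placeholder) and writes a UTF-8 text file with the required domain: prefix on every line:

```python
# Placeholder list: replace with the domains you actually decided to disavow.
bad_domains = ["spammy-directory.example", "link-farm.example"]

with open("disavow.txt", "w", encoding="utf-8") as fh:
    fh.write("# Disavow file for Search Console (lines starting with # are comments)\n")
    for domain in sorted(set(bad_domains)):
        fh.write(f"domain:{domain}\n")
```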

Now head over to the Disavow Tool: select your site from the dropdown list, click Disavow Links, choose your file, and upload it.

When Google crawls the links you have disavowed, it treats each of them essentially like a nofollow link pointing to your site.

In other words, Google's algorithm will ignore the disavowed links, and your website will be protected from link penalties based on them.

Note: If you need to disavow more bad backlinks later, don't upload a brand-new disavow file containing only the new entries; if you do, it will override the previous disavow file.

Instead, download the previous disavow file from Search Console and add the new URLs or domains you want to disavow to that existing file.

For example, if you have 100 URLs in the old disavow file and you are going to disavow 200 new URLs, the single updated disavow file should contain all 300.
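A small sketch of that merge step, assuming you have downloaded the old file as old-disavow.txt and kept your new entries in new-entries.txt (both file names are placeholders):

```python
def read_entries(path):
    """Read a disavow file, skipping blank lines and # comments."""
    with open(path, encoding="utf-8") as fh:
        return {line.strip() for line in fh
                if line.strip() and not line.startswith("#")}

merged = read_entries("old-disavow.txt") | read_entries("new-entries.txt")

with open("disavow.txt", "w", encoding="utf-8") as fh:
    fh.write("\n".join(sorted(merged)) + "\n")

print(f"Merged file contains {len(merged)} entries.")  # e.g. 100 old + 200 new = 300
```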

In Closing-

To stay safe and keep a healthy link profile, it is essential to check your links at regular intervals.

If you find unnatural links pointing to your site, it's a good idea to add them to your disavow file.

Use the disavow feature cautiously, though: if you put good links in this file, you will suffer big ranking losses.

If you don't have the necessary expertise, I would recommend hiring a good SEO consultant who can do this job well.

Resources to Learn More –

The Complete Guide to Disavowing Links for Google and Bing

Your Start-to-Finish Guide to Using Google’s Disavow Tool

#3. Google Hummingbird Update –

[Image: Google Hummingbird Update]

What is Google Hummingbird?

On September 26, 2013, Google announced the "Hummingbird" update. This algorithm is not designed to target spam (as Panda and Penguin are); Hummingbird's main focus is to provide the best results for specific queries.

What does Google Hummingbird do?

With this update, Google is better able to understand the intent behind a query and provide results that satisfy the user's needs.

The Hummingbird algorithm focuses on the meaning of the whole phrase rather than individual keywords to deliver better results for user queries.

Google noticed that more and more people were searching with long-tail queries. Before this update, Google mostly matched individual phrases or keywords and delivered results based on them.

With Hummingbird, the search engine can understand the whole question much better.

How to optimize your website for Google Hummingbird?

As you now know, Google gives more priority to understanding the meaning of the entire query instead of a single word.

Users ask questions, and in the results they want the correct answer to their question.

So you need to optimize your website to answer those questions. How can you do this with your content?

Follow these points while writing content:

1) Write well-researched content using long-tail queries.

Write great content that is useful for the user.
Don't focus only on single keyword terms; focus on long-tail queries as well.

I don't mean you should only include long-tail queries in your content.

Rather, before writing, find all the relevant short, medium, and long-tail terms, and use them in the title, subheadings, and body content.

In this way, you can optimize your content for more search terms.

Thus your content will be optimized for many search terms, and chances are you will rank for more related queries and generate more traffic.

2) Links – Build quality links

As we all know, links are one of the major factors for ranking high in search engines.

Google still uses the PageRank algorithm to evaluate links, although it has reduced the weight given to anchor text and raw link counts.

Still, anchor text and links remain very important for ranking high. At the same time, Google doesn't want to rank sites that provide no value to the user.

In other words, if you write content for the search engine and think link building alone will take you to the top of the SERP, you are wrong.

Even if you reach the top, once Google sees no engagement on your site, it will start lowering your website in the SERP.

So try to write for the user, not for the search engines.

3) Optimize Website Speed –

Not only Google but also your users want fast results, so optimize your site speed so the website loads quickly.

Google also gives more priority to sites that load fast, because it wants to deliver results quickly.
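For a very rough first look, you can time a plain HTTP fetch of your homepage. The URL is a placeholder, and this only measures server response time, not full page rendering, so treat it as a starting point rather than a real speed audit:

```python
import time
import requests

url = "https://example.com/"   # placeholder: your own homepage
start = time.perf_counter()
resp = requests.get(url, timeout=30)
elapsed = time.perf_counter() - start

print(f"{url} answered with HTTP {resp.status_code} in {elapsed:.2f} s "
      f"({len(resp.content) / 1024:.0f} KB downloaded)")
```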

4) Add conversational keywords to your content –

People search in a conversational way, so your content shouldn't include only your main keywords.

Conversational keywords should also be part of the content. Google also uses the Knowledge Graph to answer questions.

To learn more about the Knowledge Graph, watch this video:

While doing keyword research, collect conversational queries, add them to your content, and answer the questions they raise.

In this way, you will optimize your content for long-tail queries, which will also help your website generate more targeted traffic.

[Screenshot: finding long-tail queries]

5) User Experience –

User interaction with the website, time spent on the page, and CTR are becoming more important.

Hummingbird takes all of these into account when ranking content.

In the end, I would simply say: write content for the user, not for the search engine.

#4. Google RankBrain Algorithm –

In October 2015, Google confirmed the use of RankBrain in its main algorithm.

RankBrain is a machine-learning (AI-based) system that helps the search engine process queries and return relevant search results.

Why did Google need to develop this machine learning system?

On average, Google processes about 3.5 billion searches per day and 1.2 trillion searches per year worldwide.

Google found that about 15% of search queries have never been seen before by the search engine.

In other words, 15% of queries are being performed on Google for the first time.

So, to provide relevant results for that 15% of queries, Google developed RankBrain.

RankBrain first tries to understand the query and then provides search results based on that understanding.

RankBrain's job is not only to produce the best search results; it also evaluates whether users are satisfied with those results.

If it finds that users are not happy with the results, it can fall back to the previous ranking approach.

Can we optimize for RankBrain?

RankBrain is all about providing the best organic search results.

So write your content for humans, not for RankBrain or any other search engine algorithm.

In the end, it is humans who interact with your content, not algorithms.

When RankBrain delivers search results, it pays close attention to how users interact with them and analyzes the results accordingly.

So write helpful, engaging content that satisfies users' actual needs. Don't try to please RankBrain; focus on your users instead.

Conclusion-

I hope this tutorial helps you understand the major Google updates: Panda, Penguin, Hummingbird, and RankBrain.

Focus on quality content that delivers value to the user.

Write content that pleases the user and build quality links, and Google will reward your website in the SERP.