Google Penguin 4.0



Google recently launched the Penguin 4.0 algorithm update, the final update of the Penguin series. Learn about all of the key SEO components of this major update.

# Full Post

## What is the Google Penguin algorithm?

First launched in April 2012, Google Penguin is a component of the algorithm targeted at decreasing rankings for webspam – specifically sites that participate in link-related schemes that violate Google’s quality guidelines.

## When was Penguin 4.0 released?

On 9/23/16, Google officially launched the 4th major release of their Penguin algorithm (dubbed Penguin 4.0), though it appears parts of the algorithm rolled out prior to 9/23.

It had been over 2 years since the last Penguin update (i.e., 3.0) prior to the Penguin 4.0 launch, so many of the affected websites had been anxiously awaiting this release for some time.

## What are the core components of Penguin 4.0?

First and most importantly, Penguin is now part of Google’s core ranking algorithm and no longer requires manual releases. The algorithm is now real-time, so changes to results will happen faster.

Second, this means that sites impacted by the Penguin algorithm will no longer have to wait until the next manual refresh for an adjustment to ranking – positive or negative. The adjustment to ranking and evaluation of any individual site/page will happen in real-time as Google crawls the page, and the site/page will continue to be re-evaluated each crawl thereafter.

Last, the algorithm is more page-specific and “granular” as opposed to strictly site-wide. Previously, Penguin penalties were site-wide, whereas now suppression may only occur with specific pages or page groups.
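Because suppression can now be page-specific, it may make sense to audit backlinks per landing page rather than site-wide. A minimal sketch, assuming a generic backlink export as (source, target) pairs – the URLs, column shape, and grouping logic here are illustrative assumptions, not from any specific SEO tool:

```python
# Hypothetical sketch: group inbound links by target page so a link
# audit can be done at the page level (matching Penguin 4.0's more
# "granular" behavior). All URLs below are placeholders.
from collections import defaultdict
from urllib.parse import urlparse

backlinks = [
    ("http://spammy-directory.example/links", "https://mysite.example/widgets"),
    ("https://news.example/article", "https://mysite.example/widgets"),
    ("http://link-farm.example/page1", "https://mysite.example/blog/post-1"),
]

links_by_page = defaultdict(list)
for source, target in backlinks:
    # Key on the target page's path so each page's link profile
    # can be reviewed independently.
    links_by_page[urlparse(target).path].append(source)

for page, sources in sorted(links_by_page.items()):
    print(f"{page}: {len(sources)} inbound link(s)")
```

From there, pages with an unusually high share of low-quality sources can be prioritized for cleanup or disavowal.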

## Will Google ever release another Penguin update?

No. This will be the last time Google officially confirms a Penguin update now that it’s a part of the core ranking algorithm.

Google will not issue manual action notices for Penguin-related penalties (as it is machine/algorithm-driven), but will continue to deliver manual notices for other types of link penalties (human-driven).

## Did Google increase crawl rates when launching Penguin 4.0?

No. Google confirmed that they did not increase Googlebot crawling as a result of Penguin changes.

Per John Mueller when asked about the subject:

“I wouldn’t say that. I mean, obviously we always work on crawling faster. So I am pretty sure we’d be crawling faster this week than we were a month ago. But specific to this algorithm, I don’t think that would affect the crawling of the pages.”

Mueller was also asked if the increased need for refreshing all the pages made Google change crawling for Penguin:

“I think we have to do that anyway. There’s always so much more happening on the web from one day to the next that keeping up is hard and making sure that we’re really on top of things and not just chasing the tail is hard too. So that’s kind of something our engineers always have to fight with.”

## Do you still need a disavow file?

Yes, Google still recommends using a disavow file as a way to help recover from Penguin-related issues, though the effort may prove unnecessary at times, as the intent of 4.0 is simply to devalue spammy links rather than demote the site.
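For reference, a disavow file is a plain-text (.txt) file uploaded through Search Console, with one rule per line: lines beginning with `#` are comments, `domain:` rules disavow every link from a domain, and bare URLs disavow individual pages. The domains below are placeholders:

```text
# Links we asked this domain's owner to remove, with no response
domain:spammydomain1.example

# A single spammy page rather than a whole domain
http://spammydomain2.example/bad-links.html
```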

UPDATE 10/10/16: It’s worth noting that Google labels your links – such as “footer” or “Penguin.” So, if a site accumulates too many Penguin-flagged links, that may be the trigger for the manual-action team to take a look at the site’s backlinks and potentially apply a manual penalty.

Also, it’s worth noting that Google is looking mostly at your link source and not the end destination.

## Have any sites recovered yet from previous Penguin penalties?

Some sites have. On 9/28, Google suggested that sites that may have been suppressed for over 2 years should begin to see recoveries in rank performance.

That said, recoveries are unlikely to allow sites to fully return to previous rank positioning as the links that once gave those sites positive positions are likely the same links that are now being devalued. However, there also won’t be any incremental artificial suppression/demotion as a result of the bad links – they simply won’t count.

As Google is likely still crawling many sites, webmasters may not see the full effect of Penguin 4.0 for a few weeks after the 9/28 date.

According to a small Search Engine Roundtable poll, only around 12% of SEOs who responded reported seeing significant ranking improvements as of 10/3.

There was some industry chatter on 10/5 suggesting that while many sites have seen recoveries, some have seen net positive fluctuations followed by a return to pre-Penguin 4.0 positioning.

UPDATE 10/7/16: According to Search Engine Roundtable, Penguin 4.0 recoveries are still rolling out but should be done soon.

There have been many tweets on the subject, collected by Search Engine Roundtable – including one where Gary Illyes was asked about 3.0 but appears to be speaking on 4.0.

UPDATE 10/12/16: More Penguin 4.0 recovery case studies are coming out, courtesy of the folks at Cognitive SEO!

## What major SEO trends are likely to occur as a result of Penguin 4.0?

Now that it appears to be somewhat safe to go back into the waters, I predict that this algorithm update may allow for more aggressive link building efforts.

While I would still recommend being cognizant of anything that violates Google’s quality guidelines on Link Schemes, on the surface there seems to be less overall risk in attempting to build links. Now that the algorithm is real-time, webmasters/brands won’t risk facing a potentially long SEO performance suppression, and if not-so-great links are built, they will simply be ignored (unless they violate Google’s quality guidelines).

Additionally, with the algorithm’s new ability to segment suppression by individual pages or page groups, there is a reduced risk of having the entire site penalized for the sins of a few pages/links.

All of that being said, any link building should still be combined with continued efforts to improve the site’s technical infrastructure/user experience, while optimizing existing content and aggressively creating net new content.

The reason why?

Frankly, link building is hard and it requires a lot of effort with sometimes little return. And with bad links simply being ignored, it may be more difficult to tell good from bad until rankings begin to shoot up.

I personally prefer to focus on areas where I have a greater degree of control – e.g. tech and on-site.


This is what we know to date about Penguin 4.0. Have you noticed anything? Are you recovering?



Jacob Stoops

Long-time SEO and podcast host. Senior Manager at Search Discovery. Husband. Dad. Mob movie aficionado. @jacobstoops