Working in a Post-Penguin 2.0 Web


Late last week, Google launched its most recent algorithm update, Penguin 2.0. There was a lot of speculative discussion leading up to the release about just what changes the new Penguin would bring.

For now, less than a week since Penguin 2.0 rolled out, the full effects of the update are still unclear, but some trends have already begun to emerge that hint at what is changing for search engine optimization professionals and content marketers.

So, what exactly will it mean to work on the Web and practice search engine optimization for your website or blog in a post-Penguin 2.0 world? Keep reading to find out.

Authority Matters (More)

Google is going to be paying much closer attention to your credentials, so if your site is considered an authority in your specific niche, expect to see that pay off in the form of higher rankings on the search engine.

Changing the Way We Guest Blog

One of the biggest things Google seems interested in targeting is sites whose outbound links all point at a single website, rather than at a variety of authoritative websites. This will (or should) have a major effect on how content marketers and bloggers choose the sites they write guest posts for. With this in mind, writers should select sites that don't link to low-quality destinations and that are unquestionably relevant to the site they intend to link back to in their posts. Of course, this should already be common practice among guest bloggers, but with the genesis of Penguin 2.0, those who regularly publish non-relevant links and content together will feel the sting in the form of devaluation of their own blogs.
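As a rough self-check before pitching a guest post, you could parse a prospective host page and see whether its outbound links all funnel to one domain. Here is a minimal sketch using only the Python standard library; the idea of treating "one domain dominates the link profile" as a warning sign follows the point above, but any numeric cutoff you pick is your own assumption, not anything Google has published:

```python
from collections import Counter
from html.parser import HTMLParser
from urllib.parse import urlparse

class OutboundLinkCounter(HTMLParser):
    """Collect the domain of every <a href> on a page."""
    def __init__(self):
        super().__init__()
        self.domains = Counter()

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        domain = urlparse(href).netloc
        if domain:
            self.domains[domain] += 1

def link_diversity(html: str) -> float:
    """Fraction of outbound links pointing at the single most-linked
    domain. A value near 1.0 suggests the page funnels links to one site;
    lower values mean a more varied link profile."""
    parser = OutboundLinkCounter()
    parser.feed(html)
    total = sum(parser.domains.values())
    if total == 0:
        return 0.0
    return parser.domains.most_common(1)[0][1] / total
```

For example, a page with three links to example.com and one to other.org scores 0.75, i.e., three quarters of its outbound links target one domain.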

The Advertorial Question

Advertorials tread murky waters in the black-hat/white-hat ethics debate, and most people don't consider this type of content "bad," per se. Still, it is frowned upon: Google likely won't recognize publishers of advertorials as true "authorities," and they shouldn't expect link credit from the search engine for those placements.

The Death of Content Spam

Google has finally tied a direct penalty to the much-maligned art of content spam. This means that any sites that "feature" user-generated content spam will be hurt on the SERPs as a result. Webmasters should take note and check their sites, particularly their blogs and comments sections, for telltale signs like comments stuffed with multiple "https" links or terms like "free shipping," and then remove the offending content. They can do this by using a database crawler tool or Google's "words go here" feature. (Note that especially capable spammers may make their efforts undiscoverable without performing a Google search.)
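A first pass at the check described above could be a simple filter run over stored comments, flagging those that pack in multiple links or stock commercial phrases. This is only a sketch: the phrase list and the link threshold are illustrative assumptions, not a definitive spam signature.

```python
import re

# Assumption: a small illustrative phrase list; a real deployment
# would maintain and tune its own.
SPAM_PHRASES = ("free shipping", "buy now", "cheap")

def looks_like_spam(comment: str, max_links: int = 2) -> bool:
    """Flag a comment that contains more than max_links URLs
    or any of the stock spam phrases."""
    link_count = len(re.findall(r"https?://", comment))
    if link_count > max_links:
        return True
    text = comment.lower()
    return any(phrase in text for phrase in SPAM_PHRASES)
```

Running this over a comments table yields a shortlist for manual review rather than an automatic delete list, since false positives (e.g., a legitimate comment quoting "free shipping") are likely.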

Enough with the Over-Optimization

Sites that stuff their navigation, header and/or footer areas with keywords in hopes of ranking higher for those terms, or that add a superfluous number of header and footer links for those keywords, will now be working in vain. In fact, they may even end up penalized by Google for their spammy over-optimization efforts, and nobody wants that.

Keeping Ads in Check

In order to keep websites and domains from being landing pages for a bunch of ads, Google has started handing out penalties to sites that put too many advertisements above the fold. Of course, the famously vague company doesn’t exactly tell us what “too many” really means, but it’s at least enough to go off of for now. Just make sure you keep it down (down below the fold, that is).

An Increase in Clusters

Despite the fact that Google likes to have as much variety as possible on the first page of the SERPs, it looks like the search engine will be displaying more clusters of multiple pages from the same domain. The catch, of course, is that the domain and its pages must be of high quality in the first place.

Crawlability is Key

Crawl errors diminish the spiders' ability to scan your website and determine its authority, and thus its ranking position, and they will have a greater impact on your overall SEO efforts on the post-Penguin 2.0 Web. Fortunately, you can go into Google Webmaster Tools, check your crawl stats and uncover any issues the spiders may be having, so that you can get on top of correcting them ASAP.
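Alongside Webmaster Tools, a quick scripted pass over your own URL list can surface the most obvious crawl blockers (4xx/5xx responses). This is a minimal sketch: the fetch step is parameterized so it can be tested or swapped out, and it deliberately skips things a real crawler would also handle, such as robots.txt and redirects.

```python
import urllib.error
import urllib.request

def fetch_status(url: str) -> int:
    """Return the HTTP status code a crawler would see for a URL."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

def find_crawl_errors(urls, fetch=fetch_status):
    """Map each URL that answers with a 4xx/5xx status to that code."""
    return {url: code for url in urls if (code := fetch(url)) >= 400}
```

For example, `find_crawl_errors(["https://example.com/", "https://example.com/old-page"])` would return only the entries that error out, giving you a short fix-it list.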



David S Freid aka SEO Seattle® 05-28-2013 3:22 PM

All good points to remember...

The one we keep in mind as we continue the timeline of Google changes and updates is "Authority".

Nicely MENTIONED at the beginning of this article...

"Google is going to be paying much closer attention to your credentials"

DougB 05-29-2013 3:23 PM

Great to see you're following the on-going nature of this change; we did a robust analysis of the data, and the results were striking; here is a link to the blog post from our Chief Architect:

RichS 05-31-2013 8:26 AM

Good review of the importance of authoritative blogs

BruceM 05-31-2013 8:52 AM

It is still along the same lines as the last few updates. They are trying to clean up more websites. This will be the constant with bigger search engines; I think it has to be, or people would quit using them to find what they need. The need for web designers to silo content and put what searchers want on page one, above the fold, is still everything.

I still see millions of websites that search engines have no way to rank. This is mostly due to not knowing how to use keyword research, or not understanding it!

Builders do not understand the need for a menu set up like a well-organised site map on the front page, preferably with a "how to use this site" guide posted as well.

I see that new mobile-friendly sites, like the ones built on the Gantry platform, will become a must-have to get ranked with mobile users. My instructors also tell me this is the new trend: smaller, more portable computers will be forced to become the norm.

What the search engines are fighting is the large volume of websites built by web developers who know nothing about search engines and how they function.

Without marketing experience applied to the front page, there is no way the search engines can rank the websites.

I think people need to set up their websites in a more information-conscious way, like a newspaper front page.

DellM 06-03-2013 1:34 PM

As of right now, 2.0 hasn't affected user-generated content like I expected. I have two sites with strictly YouTube vids pulled in using WP Robot and no penalties as of yet. Fingers crossed.



