Working in a Post-Penguin 2.0 Web

Written by Michael Garrity | May 28, 2013 5:00:00 AM

Late last week, Google launched its most recent algorithm update in the form of Penguin 2.0. There was a lot of speculative discussion leading up to the release about just what changes this new Penguin would reveal.

Less than a week after Penguin 2.0 rolled out, the actual effects of the update are still unclear, but some trends have already begun to emerge that hint at what, exactly, is changing for search engine optimization professionals and content marketers.

So, what exactly will it mean to work on the Web and practice search engine optimization for your website or blog in a post-Penguin 2.0 world? Keep reading to find out.

Authority Matters (More)

Google is going to be paying much closer attention to your credentials, so if your site is considered an authority in your specific niche, expect to see that pay off in the form of higher rankings on the search engine.

Changing the Way We Guest Blog

One of the biggest things Google seems to be targeting is sites with a lot of outbound links pointing at just one website, as opposed to links spread across many different authoritative websites. This will (or should) have a major effect on the way content marketers and bloggers choose the websites they write guest blog posts for.

With this in mind, writers should select host sites that don't link to low-quality sites and that are unquestionably relevant to the site they intend to link back to in their posts. Of course, this should already be common practice for guest bloggers, but with the arrival of Penguin 2.0, those who regularly publish irrelevant links and content together will feel the sting in the form of devaluation of their own blogs.

The Advertorial Question

Advertorials tread murky water in the black hat/white hat ethics debate; most people don't consider this type of content to be "bad," per se, but it is frowned upon. Google likely won't recognize publishers of advertorials as true "authorities," and they should not expect link credit from the search engine for publishing them.

The Death of Content Spam

Google has finally tied a direct penalty to the much-maligned art of content spam, which means that any site that "features" user-generated content spam will be hurt on the SERPs as a result. Webmasters should take note and check their sites, particularly their blogs and comments sections, for signals such as multiple "http" links in a single comment or terms like "free shipping," to uncover (and then remove) content spam. They can do this with a crawler tool that scans their database or with Google's site: operator (e.g., site:domain.com "words go here"). It should be noted that especially capable spammers may hide their efforts so that they can't be discovered without performing a Google search.
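
As a rough illustration of the kind of check such a crawler tool might run, here is a minimal Python sketch that flags comments containing several links or known spammy phrases. The file name, spam terms and link threshold are assumptions for the example, not anything specified by Google.

```python
import re

# Hypothetical spam signals: tune the terms and link threshold for your own site.
SPAM_TERMS = ["free shipping", "click here now", "buy cheap"]
MAX_LINKS_PER_COMMENT = 2

def looks_like_spam(comment: str) -> bool:
    """Flag a comment that contains too many links or a known spammy phrase."""
    link_count = len(re.findall(r"https?://", comment, flags=re.IGNORECASE))
    if link_count > MAX_LINKS_PER_COMMENT:
        return True
    lowered = comment.lower()
    return any(term in lowered for term in SPAM_TERMS)

if __name__ == "__main__":
    # Assumes comments have been exported one per line to comments.txt.
    with open("comments.txt", encoding="utf-8") as f:
        for line_no, comment in enumerate(f, start=1):
            if looks_like_spam(comment):
                print(f"Possible spam on line {line_no}: {comment.strip()[:80]}")
```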

Enough with the Over-Optimization

Sites that stuff their navigation, header and/or footer areas with extra keywords as a way to rank higher for those terms, or that add a superfluous number of header and footer links for those keywords, will now be working in vain. In fact, they may even end up penalized by Google for their spammy over-optimization efforts, and nobody wants that.

Keeping Ads in Check

To keep websites and domains from serving as little more than landing pages for a bunch of ads, Google has started handing out penalties to sites that place too many advertisements above the fold. Of course, the famously vague company doesn't tell us exactly what "too many" means, but the warning is enough to go on for now. Just make sure you keep the ad load down (down below the fold, that is).

An Increase in Clusters

Although Google likes to have as much variety as possible on the first page of the SERPs, it looks like the search engine will be displaying more clusters of multiple pages from the same domain. The catch, of course, is that the domain and its pages must be high quality in the first place.

Crawlability is Key

Crawl errors diminish the spiders' ability to scan your website and determine its authority, and thus its ranking position, and they will have a greater impact on your overall SEO efforts on the post-Penguin 2.0 Web. Fortunately, you can go into Google Webmaster Tools, check your crawl stats and errors, and uncover any issues the spiders may be having, so that you can get on top of correcting them ASAP.
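
As a supplemental check outside of Webmaster Tools, here is a minimal Python sketch that pulls the URLs from a site's XML sitemap and reports any that respond with an error status code. The sitemap URL is a placeholder, and the third-party requests library is assumed to be installed.

```python
import xml.etree.ElementTree as ET

import requests  # assumed available: pip install requests

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder domain

def sitemap_urls(sitemap_url):
    """Pull every <loc> entry out of a standard XML sitemap."""
    response = requests.get(sitemap_url, timeout=10)
    response.raise_for_status()
    root = ET.fromstring(response.content)
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    return [loc.text for loc in root.findall(".//sm:loc", ns)]

def report_crawl_errors(urls):
    """Print any URL that answers with a 4xx or 5xx status code, or fails outright."""
    for url in urls:
        try:
            status = requests.head(url, timeout=10, allow_redirects=True).status_code
        except requests.RequestException as exc:
            print(f"{url}: request failed ({exc})")
            continue
        if status >= 400:
            print(f"{url}: HTTP {status}")

if __name__ == "__main__":
    report_crawl_errors(sitemap_urls(SITEMAP_URL))
```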