Website Content Mistakes that Could Lead to Google Penalties

Learn how to avoid common website content mistakes that can lead to Google penalties and serious traffic loss.


By Dan Kern, Founder of Kern Media


Over the past five years, website owners have seen an onslaught of both announced and unannounced algorithm updates from Google. Whether it was the near-monthly Panda update, the devastating (and rare) Penguin update, or an unannounced “quality” update, website owners have been kept on their toes more than ever. In fact, a quick glance at the Moz Google Algorithm Change History shows how much Google’s updates have ramped up since 2011, relative to the period since tracking began in 2000.

In response to these updates, the SEO industry has been forced to evolve significantly. There has been a pronounced shift from link building toward content marketing. While link building is still alive and well, it too has changed. Gone are the days of building low-quality links on directories and other spammy sites whose links few people will ever click. Good riddance, too, as there is now more true “marketing” happening in the SEO industry.

Furthermore, content marketing is believed to be “safer” than link building, perhaps because the Panda algorithm has proven much less difficult to recover from than the Penguin algorithm. However, content can still be risky if webmasters don’t ensure that the pages they allow into Google’s index meet Google’s quality standards.

This article shares insights about the perceived causes of penalties (or, perhaps more accurately, “de-rankings”) from Google’s Panda algorithm (now part of Google’s core algorithm) and other content “quality” algorithm updates.

Duplicated External Content

Many sites struggled with the original Panda updates launched by Google during 2012 and 2013 because content from other websites was duplicated on their site, or vice versa. While Google has often told the SEO industry that it does not penalize for duplicate content, many SEO professionals would beg to differ. Here are two examples showing how removing duplicate content (originally sourced from external websites) resulted in recovery from Google Panda de-rankings.

Press Release Duplication

During the summer of 2012, the editor for DBW (a digital publishing news website) had re-published nearly 500 press releases from external organizations (Apple, etc.). In late April of 2012, the website saw a significant decrease in organic traffic due to Google’s release of Panda 3.6. Once the editor set all of these press releases to “noindex,follow” (via page-level meta robots tags), the site recovered its organic traffic and rankings a few months later (in August), as indicated by the graph below.

[Image: How the Panda update hurt a site's rankings]

The website has experienced significant organic search traffic growth since this Panda difficulty (the January - March spikes are caused by the company’s annual conference). This shows that recovery, and further growth in Google’s good graces, is possible.
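The fix in this case was a page-level meta robots tag. As a rough, editor-added illustration (not code from the article, and with hypothetical function names), here is a minimal Python sketch, using only the standard library, that checks whether a page's HTML carries a noindex directive:

```python
from html.parser import HTMLParser

class MetaRobotsParser(HTMLParser):
    """Collects the directives from any <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            content = attrs.get("content") or ""
            self.directives += [d.strip().lower() for d in content.split(",")]

def is_noindexed(html):
    """Return True if the page carries a noindex meta robots directive."""
    parser = MetaRobotsParser()
    parser.feed(html)
    return "noindex" in parser.directives

page = '<html><head><meta name="robots" content="noindex,follow"></head></html>'
print(is_noindexed(page))  # True
```

A "noindex,follow" tag asks search engines to drop the page from the index while still following the page's links, which is why it is a common remedy for duplicated or thin pages that are still useful to human visitors.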

Curation of (Too Many) Duplicated “Snippets” of Content

In early 2012, the Six String Soul guitar blog was heavily de-ranked, not only for having some duplicated press releases, but also for having its core directory pages (of guitar gear companies) loaded with content duplicated from the manufacturers’ websites.

[Image: One company's pre- and post-Panda stats]

Once this content was rewritten (and attractive imagery was added to further enhance user engagement) a couple of years after the de-ranking, the website recovered (in May and October of 2014) after a lengthy lull in Panda updates. This shows that curation of external content is still a risky endeavor, especially if enough unique content isn’t added to complement the duplicated content.
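How much duplication is "too much" is something only Google knows, but a rough way to measure how heavily a page leans on an external source is word-shingle overlap. The sketch below is an editor-added illustration (not from the article), with hypothetical function names:

```python
def shingles(text, n=5):
    """Break text into overlapping n-word sequences ("shingles")."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_ratio(page_text, source_text, n=5):
    """Fraction of the page's shingles that also appear in the source.

    A value near 1.0 suggests the page is mostly copied; near 0.0
    suggests it is mostly unique. Any specific threshold is a guess,
    since Google does not publish one.
    """
    page = shingles(page_text, n)
    if not page:
        return 0.0
    return len(page & shingles(source_text, n)) / len(page)
```

Running curated pages through a check like this against their source material, and rewriting until the ratio drops well below half, is one pragmatic way to act on the lesson above.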

Lack of Unique Content / Low Quality Affiliates

The blog discussed above was also loaded with pages whose only content was eBay listings of musical equipment (sourced from the eBay Partner Network’s affiliate program). This was likely frowned upon by Google, as it didn’t provide unique value to Google’s index and was probably deemed “deceptive” by its standards.

[Image: Google likely believed including only eBay listings was a deceptive SEO practice]

It’s believed that the removal of this content from Google’s index (via “noindex,follow” meta robots tags, and eventual deletion) also contributed to the recovery. Google was on a tear penalizing affiliate sites, especially from 2011 to 2013.

Over-Optimized / Lower Quality Internal Content

In 2014, Costa Rica Escapes (a vacation itinerary agency) experienced a setback in Google when Panda 4.1 launched. Shortly after, high-quality videos were added to the site, along with revised copy that was less keyword-rich than before. Within a few months, the site got a familiar seasonal traffic boost and continued on a growth path again, until a subtle traffic hit in May 2015, when Google made a “quality” update to its core algorithm. That “Quality” update was heavily covered by Glenn Gabe, who dubbed it the “Phantom 2” update.

This website eventually recovered (at least partially) from the “Quality” update after experiencing another seasonal traffic boost in the first part of the year. While the seasonality somewhat clouded the recovery, it’s clear that organic search traffic was higher a year after the penalty (once the seasonality had subsided).

It’s difficult to pinpoint which factors contributed to the recovery; however, it’s a good reminder that it’s worth continuously updating and improving website content. Google will notice! Continuous improvements were made to this website: fixing grammar, strengthening internal linking, making the site mobile responsive, and adding in-depth content about where to go in Costa Rica. Those improvements appear to have paid off.

Audits, Removals & Updates

It’s critical for webmasters and business owners to realize that websites are never “done.” Website content should be seen as a living, breathing entity that deserves continuous attention. Sometimes website content needs to be removed. Sometimes it just needs to be updated. Regardless, website content should be revisited on a semi-regular basis by conducting an audit with an SEO professional.
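As a starting point for such an audit, thin pages can be flagged mechanically before a human review. The snippet below is an editor-added sketch (the 300-word cutoff is an arbitrary illustration, not a published Google threshold):

```python
def flag_thin_pages(pages, min_words=300):
    """Return URLs whose body copy falls below a word-count threshold.

    `pages` maps URL -> extracted body text. Word count is only a crude
    proxy for "thin" content; flagged pages still need human review to
    decide between improving, noindexing, or deleting them.
    """
    return [url for url, text in pages.items() if len(text.split()) < min_words]

site = {"/guide": "word " * 500, "/stub": "just a few words"}
print(flag_thin_pages(site))  # ['/stub']
```

A report like this makes the "remove, update, or keep" decision concrete: each flagged URL gets an explicit disposition instead of lingering in the index by default.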


Contributing Author: Dan Kern is the founder of Kern Media, a Denver SEO company focused on content marketing consulting. He specializes in performing content audits and providing strategies for clients who realize the need to improve overall content quality for both users and search engines. Dan also specializes in technical SEO for eCommerce, lead generation, and SEO training for large online publishers. Follow Kern Media on Twitter, Facebook and LinkedIn.


9 comments

JasonM 09-16-2016 2:25 PM

How do you handle "duplicate content" if the very nature of your site discusses content that is readily available?

For example, let's say your website/blog talks about the Christian faith.  Your site will naturally be filled with Bible scripture that is "duplicate" content all over the internet.  

I can think of other examples...

recipes

music lyrics

resume (might also be on other job boards)

documents/forms that are standard and posted elsewhere

Is Google "smart enough" to realize you aren't just simply reposting old content?

Dan Kern 09-16-2016 2:43 PM

Think about it this way: how can you provide unique value to both readers and search engine indexes? If readers can find the same content elsewhere, then search engines can too, and you're not providing unique value. If, however, you quote specific bible verses and then add unique commentary...and the majority of the content on the page is unique perspective, then that's good. Google works off of thresholds, which we're not aware of. It's okay to "quote" content that users can find elsewhere (and that search engines already have indexed), but it should be the minority of the overall content on the page.

ToddH 09-16-2016 6:32 PM

You can always add your own perspective and provide tips and information in ways that are different from how others do it. Readers might understand your content more than they understand that of your competitors.

Gracious Store 09-16-2016 7:58 PM

It is true that Google penalizes content duplication, but in a nutshell, marketers say or write the same thing. The only difference is in the wording.

JuhaniT 09-16-2016 11:36 PM

Over-optimization is a funny thing. Since Google has not published a recommended amount of keywords per page, there is always a danger of over-optimizing, for instance with alt text and anchor texts. Is 5% density still the target, or...? And how does machine learning affect optimization?

MichaelH 09-17-2016 7:16 PM

I have an e-commerce site that took its first traffic hit in the summer of 2013. We lost an average of about 30,000 visitors a month, but in spite of doing nothing (we had never been slapped in the past 10 years and I was confident the downturn was temporary) the traffic soon recovered back to previous levels. If we had made a content update (or taken any real action for that matter) during this downturn, I would have likely attributed the recovery to that update, when, in fact, the recovery would have occurred on its own. My point here is that I read with caution the cause-and-effect outcomes of a single website update being associated with a recovery (or a downturn, for that matter). How can you be sure that the outcome of a particular action you take is actually associated with the action unless you have tested it repeatedly over time to come up with a statistically significant data set?

RichS 09-18-2016 10:19 AM

Good point on testing to determine root cause and identifying quality content.

DanK 09-19-2016 10:07 AM

MichaelH, I cannot comment on your experience as I've not looked at your website from my perspective. However, clearly there was some signal that led Google to drop your rankings at one point and then improve them at another. Could it be that your website was unwarrantedly de-ranked by Google due to imperfections in its algorithm? Yes. Is it possible that your website was de-ranked by Google due to quality signals that it once de-ranked websites for, but over time did not? Yes. Anything's possible. I agree with your point to read with "caution". Question everything.

My conclusions were based on the fact that I have seen clear patterns with these types of content issues after having worked on SEO for 300-400 websites over the past 5-6 years in particular (200 websites for the publishing company where I was Director of SEO/Organic Search). We didn't see as many Google updates before this time. During this time period, I had the unique advantage of seeing Panda hit multiple sites at once...sometimes in consecutive months...for much of 2012-2013. When I saw multiple websites with thousands of "tag pages" indexed get penalized, and then recover when those pages were set to "noindex"...that's a clear pattern and gives me enough statistical confidence to make a business decision. When we rewrote duplicate content on 40-50% of all product pages on 3 eCommerce sites, and saw a ~100% increase in organic search traffic + revenue for all three sites a year later, that gave us all the confidence we needed.

The scientific approach can only go so far when you're in-house and not on the agency side. You have numbers to hit :) But I took these learnings to the agency side and experienced very similar situations. I will say that it seems Google has become less concerned with technical quality issues (i.e., I haven't seen sites get penalized with the majority of their Google indexation being /tag/ pages), and more concerned with content quality issues (i.e., lack of unique content, duplicated content, over-optimized/hard-to-read content, etc...anything that decreases the value that your page offers to Google's index). Hope this helps!

Nick Sharpe 09-19-2016 11:40 AM

EXCEPTION TO THE RULE Dept: I design websites for a market niche where everybody is saying the same thing (used car dealers buying from private sector "cash for cars") - I was concerned that duplicate content would be an issue. With over a dozen clients and hundreds of pages (most almost identical except for location) generated over the past 6 years, they rank at or near the top for their prime KWs (not just mine, but copycats too!).

My biz site, on the other hand, with totally unique and original content, has dropped like a rock (tops locally but gone globally). Many theories regarding this - so with all the updates blah blah blah, SEO is still part art, part science and all voodoo.

