Duplicate Content: No Such Thing


Google did its best to clear up any confusion recently about duplicate content penalties - saying in short that they don't really exist (unless being black hat is your modus operandi). It should come as no surprise to most SEOs that many online destinations have multiple URLs on the same domain that point to the same content. How do you resolve these duplicate issues if they exist on your site?

Having duplicate content on your site can potentially affect your site's performance and ranking, but it doesn't necessarily trigger penalties. Let's dig a little deeper to get a better understanding. Say, for example, that you notice www.example.com/skates.asp?color=black&brand=riedell and www.example.com/skates.asp?brand=riedell&color=black feature the exact same content, and that both URLs appear in the Google SERPs. Penalty for you? Not exactly. Here's why.
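The two example URLs differ only in the order of their query parameters. A minimal Python sketch (the `canonicalize` helper is an illustrative name, not anything from the article) shows how sorting the parameters collapses both variants into a single canonical URL:

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

def canonicalize(url):
    """Return the URL with its query parameters in a stable, sorted order."""
    parts = urlsplit(url)
    # Sorting the (name, value) pairs makes color/brand order irrelevant.
    query = urlencode(sorted(parse_qsl(parts.query)))
    return urlunsplit((parts.scheme, parts.netloc, parts.path, query, parts.fragment))

a = canonicalize("http://www.example.com/skates.asp?color=black&brand=riedell")
b = canonicalize("http://www.example.com/skates.asp?brand=riedell&color=black")
print(a == b)  # True: both collapse to the same canonical URL
```

A site that applies this kind of normalization when generating its own internal links never produces the parameter-order duplicates in the first place.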

According to Google, duplicate content on a site is “not grounds for action (penalties) on that site unless it appears that the intent of the duplicate content is to be deceptive and manipulate search engine results.” That’s a very clear statement. Since most SEOs are of the white hat variety, Google has made their position on duplicate content clear and outlined some best practices to clear up any issues.

Non-malicious duplication is quite common, as many CMSs don't handle URL construction and organization very well by default. When Google identifies duplicate content, it is often because variations appear in URL parameters (as in the example above). When duplicate content happens, the URLs in question are grouped into a "cluster." From that group, Google selects the one URL that best represents the cluster, then consolidates properties of the other URLs in the cluster (such as link popularity) to the representative URL.

Google’s advice? Use sitemaps. If you want to tell Google which URL should appear in the SERPs, simply include that preferred URL in your sitemap. Should Google be unable to consolidate all of the duplicate pages, page strength may be diluted and your ranking may sink.



Susan Whitehurst 09-17-2008 10:23 AM

I admit it: the twins' pix got my attention. Now I'm wondering, is this a big priority? If there are no penalties for "non-mal duplicate" content, does it make sense for busy people to spend time (1) fixing variations in URL parameters or (2) building/updating sitemaps?

KenS 09-17-2008 1:19 PM

@ Susan

Short answer: yes

Long answer: While Google isn't PENALIZING sites, unresolved duplicate content can, as the article mentions, dilute page strength and affect ranking.

Besides, I find it better to leave as little in Google's hands as possible. If you can control how your URLs look/behave, you should exploit that. Then you should be able to build your sitemaps with relative ease (I'm assuming we're talking about the .xml sitemaps). Perhaps some sort of dynamic sitemap could be useful in your case.
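The dynamic sitemap KenS suggests could be sketched as follows: regenerate the URL list from whatever the site currently serves, canonicalize each URL, and deduplicate before writing the file (a minimal Python sketch; the crawled URL list and helper names are assumptions, not anything from the discussion):

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

def canonicalize(url):
    """Return the URL with its query parameters in a stable, sorted order."""
    parts = urlsplit(url)
    query = urlencode(sorted(parse_qsl(parts.query)))
    return urlunsplit((parts.scheme, parts.netloc, parts.path, query, parts.fragment))

def sitemap_urls(crawled_urls):
    """Deduplicate a crawl's URL list down to one canonical URL per page."""
    return sorted({canonicalize(u) for u in crawled_urls})

# Hypothetical crawl output: two parameter-order duplicates plus one unique page.
pages = [
    "http://www.example.com/skates.asp?color=black&brand=riedell",
    "http://www.example.com/skates.asp?brand=riedell&color=black",
    "http://www.example.com/helmets.asp",
]
for u in sitemap_urls(pages):
    print(u)
```

Run on a schedule (or on content changes), this keeps the sitemap in sync with the site without hand-editing.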

