Wipe Out Your Search Traffic with These 5 Perfectly Innocent Website Errors

Today's websites are often large, complex and continually evolving. Typically there's a growing army of people, from content writers to developers and SEO professionals, updating material and making changes.

This constant activity means there's a greater chance that innocent mistakes could wipe out - or severely reduce - the visibility of important Web pages in search results.

 

What are the most common technical errors that can impact search performance? Here are five issues that come up again and again.

 

1) Failing to clean up Page Not Found errors

 

If you've spent any time on the Web, you'll almost certainly have come across many of these: you click on a link and instead of the expected page, an error page pops up saying something like "HTTP 404 Page Not Found".

 

A 404 means the page you were trying to reach couldn't be found on the site's Web server (related 4xx and 5xx codes signal other client-side and server-side problems). Pages might have been moved or taken down, or the links pointing to them broken - all of which can easily happen as part of everyday maintenance or updates to the site. But these errors become a problem if you don't clean them up.

 

Google and the other search engines are all about providing a good user experience and delivering content they deem useful and relevant. If your site is littered with broken links that return 404s, they're inclined to push your pages lower down the search results.
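Spotting these before the search engines do doesn't require anything elaborate: a short script can request each URL you care about and flag anything that doesn't return a healthy status code. Here's a minimal sketch in Python, assuming the widely used requests library is installed; the URLs are placeholders for your own list (pulled from your sitemap or crawl data, for example):

```python
import requests

# Placeholder URLs - in practice you'd pull these from your sitemap,
# CMS export or crawl data.
urls_to_check = [
    "https://www.example.com/",
    "https://www.example.com/spring-collection",
    "https://www.example.com/old-product-page",
]

for url in urls_to_check:
    try:
        # HEAD keeps the check lightweight; allow_redirects so a
        # redirected page isn't wrongly reported as broken.
        response = requests.head(url, allow_redirects=True, timeout=10)
        if response.status_code >= 400:
            # 404s, other 4xx client errors and 5xx server errors
            print(f"{response.status_code}  {url}")
    except requests.RequestException as exc:
        print(f"FAILED  {url}  ({exc})")
```

Anything the script flags is a candidate for fixing the link, restoring the page or putting a redirect in place.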

 

2) Forgetting to take off No Index/No Follow tags

 

The No Index/No Follow tag is a small line of code that's useful when you're updating or building new sections of your site. Your developers insert it into the HTML to tell the search engines not to index the tagged 'work-in-progress' pages until they're finished. This makes perfect sense because you don't want Google's crawlers going near the new content ahead of time.

 

When the big moment arrives, however, it's very easy for whoever was doing the work to transfer the newly created sections to the live Web server without removing the tags. It's a small mistake, but it means you'll get zero search traffic to your new content. If those pages are introducing a new product or service, such an error could have a painful impact on your business.
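One lightweight safeguard is to fetch the newly launched pages straight after release and check whether a leftover robots meta tag (or an X-Robots-Tag response header) is still blocking them. A rough sketch in Python, again assuming the requests library; the URLs are placeholders and the regex is a simplification:

```python
import re
import requests

# Placeholder URLs for pages that have just gone live
new_pages = [
    "https://www.example.com/new-product",
    "https://www.example.com/new-service",
]

# Pulls the directives out of <meta name="robots" content="...">
# (simplified: assumes 'name' appears before 'content' in the tag)
robots_meta = re.compile(
    r"""<meta[^>]+name=["']robots["'][^>]*content=["']([^"']*)["']""",
    re.IGNORECASE,
)

for url in new_pages:
    response = requests.get(url, timeout=10)

    # The directive can live in the HTML head...
    match = robots_meta.search(response.text)
    meta_directives = match.group(1).lower() if match else ""

    # ...or be sent by the server as an HTTP header
    header_directives = response.headers.get("X-Robots-Tag", "").lower()

    if "noindex" in meta_directives or "noindex" in header_directives:
        print(f"WARNING: {url} still tells search engines not to index it")
```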

 

3) Misplacing your Canonicals

 

Like the No Index/No Follow tag, the canonical tag is a piece of code that serves a really useful purpose. Used incorrectly, though, it can hit your search traffic.

 

Let's say you run a retail fashion site. For every garment you have a main page and some variants that show the same item in different colors. With the help of the canonical tag, you can point search engines to the master page that should be indexed in the results while the variants are disregarded.

 

The logic here is simple: if you allow many or all of those similar pages to be indexed, then you dilute the search authority between all of them - meaning they'll all perform worse in search results. The canonical is a handy way of concentrating all the authority on a single page and giving it the best chance of performing well.

 

What goes wrong, though, is that the people carrying out site maintenance or adding new content end up spreading the canonicals across multiple pages, so authority is no longer concentrated on one. That cannibalizes search performance, meaning potentially high-ranking content ends up lower down the SERPs.

 

Canonical tags aimed at the wrong pages can cause issues too. Let's say you're using canonicals to manage the variants of a t-shirt page (size, color, etc.). If you accidentally point the tags at a page about jeans, you'll push authority to the wrong page. It might even look like a malicious hack, leading to a loss of value in the search engines.
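One way to guard against both mistakes is to check, for each variant page, that its canonical link element points at the master page you intended. A simplified sketch along the same lines, with placeholder URLs and a hand-maintained mapping of variants to their expected masters:

```python
import re
import requests

# Hypothetical mapping: variant URL -> the master page its canonical
# tag is supposed to point at
expected_canonicals = {
    "https://www.example.com/t-shirt-red":  "https://www.example.com/t-shirt",
    "https://www.example.com/t-shirt-blue": "https://www.example.com/t-shirt",
}

# Pulls the href out of <link rel="canonical" href="...">
# (simplified: assumes 'rel' appears before 'href' in the tag)
canonical_link = re.compile(
    r"""<link[^>]+rel=["']canonical["'][^>]*href=["']([^"']+)["']""",
    re.IGNORECASE,
)

for variant, expected in expected_canonicals.items():
    html = requests.get(variant, timeout=10).text
    match = canonical_link.search(html)

    if not match:
        print(f"MISSING canonical on {variant}")
    elif match.group(1).rstrip("/") != expected.rstrip("/"):
        print(f"WRONG canonical on {variant}: points to {match.group(1)}")
```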

 

4) Mixing up your Redirects

 

You probably know that a redirect is a way of forwarding traffic from one page to another. Redirects make sense any time you take down or move content and want to send the traffic elsewhere on your site. When a product line has been discontinued, for example, you might redirect visitors to new pages showcasing your latest range.

 

One of the obvious mistakes here is when someone inadvertently puts a redirect on a perfectly valid page. That page won't see any traffic - and if visitors coming in through search are being redirected to a page that's unrelated to their query, then search performance will suffer.

 

A chain of redirects hopping through several pages also causes search problems. Each redirect adds to the time it takes for the final content to load, and because search engines dislike slow-loading content, a page that only appears after several redirects will likely rank lower than faster-loading content from competing sites.
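Chains are easy to surface with the same kind of script: let the HTTP client follow the redirects, then look at how many hops it took to reach the final page. A rough sketch, again using requests with placeholder URLs:

```python
import requests

# Placeholder URLs to test
urls_to_check = [
    "https://www.example.com/old-category",
    "https://www.example.com/spring-sale",
]

MAX_HOPS = 1  # more than one redirect in a row is worth investigating

for url in urls_to_check:
    response = requests.get(url, allow_redirects=True, timeout=10)

    # response.history holds one entry for every redirect that was followed
    hops = len(response.history)
    if hops > MAX_HOPS:
        chain = " -> ".join(r.url for r in response.history)
        print(f"{hops} redirects before the content loads: "
              f"{chain} -> {response.url}")
```

Whenever the report shows more than one hop, the fix is usually to update the original redirect so it points straight at the final destination.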

 

5) Failing to notice spikes in word count

 

With people adding and updating content all the time, one of the simplest mistakes is when a big block of words gets deleted in error or when someone fails to notice text that gets pasted twice before the page is published.

 

They're preventable errors, but they happen more often than they should. When they do, they play havoc with search visibility. The quality of content on a page is a primary ranking factor, so a big spike in word count that doesn't improve what's there will be viewed negatively by search engines. Dramatic changes can also make search engines suspect the page has been hacked - another reason for them not to favor it.
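A simple safeguard is to record each important page's word count and compare it with the live page after every publish, flagging anything that swings past a chosen threshold. A minimal sketch, assuming requests and a hypothetical baseline that you maintain yourself:

```python
import re
import requests

# Hypothetical baseline: URL -> word count recorded at the last
# known-good publish
baseline_word_counts = {
    "https://www.example.com/best-sellers": 850,
    "https://www.example.com/size-guide": 1200,
}

THRESHOLD = 0.30  # flag changes of more than 30% in either direction

for url, previous_count in baseline_word_counts.items():
    html = requests.get(url, timeout=10).text

    # Crude text extraction: strip the tags, count the words left over
    text = re.sub(r"<[^>]+>", " ", html)
    current_count = len(text.split())

    change = abs(current_count - previous_count) / previous_count
    if change > THRESHOLD:
        print(f"{url}: word count moved from {previous_count} "
              f"to {current_count} ({change:.0%} change)")
```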

 

The first indication that some of these errors have occurred is when the marketing team notices that search traffic to an important revenue-generating page has dropped off. By this time, lots of visitors and potential sales might have been lost to competing sites.

 

For a small site, it might be viable to rely on manual checking to root out technical errors. But for most enterprise sites or large content-rich ecommerce sites there's so much activity and so many pages that tools to automate the process are probably the only reliable option.

 

There are a variety of offerings out there that claim to identify errors. Any tool you select should ideally be set to focus first on your most important 'top money' pages - as poor search performance here will hurt your business most.

 

Whatever technology you use should also work swiftly, be able to spot a wide variety of errors and be scheduled to run daily, or at the very least a few times a week. Remember that Google is crawling sites with increasing regularity - so there's very little time to identify and fix problems before they start affecting your search performance.

 

Sebastien Edgar, SEO Specialist at Searchmetrics

As an in-house SEO Specialist at Searchmetrics, Sebastien helps Fortune 500 companies better optimize their Web presence. His experience includes leading SEO efforts for brands abroad (e.g., InterNations.org) and helping Silicon Valley startups kick-start their search efforts.