
The Top 10 Technical SEO Problems Seen by SEO Pros Today

Posted on 8.18.2014

:: By Jason Squardo, ZOG Digital ::


Today, businesses no longer have the luxury of being blind to the benefits of search engine optimization (SEO). From small retail stores to multi-national chains and B2B-focused companies, SEO has been proven to produce positive results.

Creating a technically sound website is still one of the most important parts of an SEO strategy, and according to Moz, on-page factors are among the biggest signals search engines weigh when determining rankings. As SEO experts and business owners create more strategies to take advantage of digital opportunities, search engine updates like Panda, Penguin and Hummingbird have made quality content essential and put an end to many SEO quick fixes. Even so, the effectiveness of technical SEO has not diminished. The following 10 technical SEO issues are seen frequently by industry professionals and marketers today. Here's how to fix them:

1. Multiple versions of homepage

Many websites have multiple URLs that lead to the same homepage. For instance, the homepage may be reachable at example.com, www.example.com, www.example.com/index.asp, www.example.com/default.asp and several other variations, which creates duplicate content for search engines. Search engines will eventually settle on one version to index, but people can link to any of the versions, splitting the value of your external links. Another form of this issue occurs when both HTTP and HTTPS versions of the site are allowed to co-exist.

The solution is to create 301 redirects from each of the variations to a single canonical version, typically www.example.com. Consolidating with 301 redirects also preserves link equity so it is not spread across multiple versions of the homepage. Canonical tags can also be used to tell search engines which version is preferred.
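As a sketch, on an Apache server with mod_rewrite enabled, the consolidation described above might look like the following .htaccess rules (the example.com domain and file names are illustrative placeholders):

```apache
# Hypothetical .htaccess rules: funnel every homepage variant to
# https://www.example.com/ with a single 301 (permanent) redirect.
RewriteEngine On

# Non-www host -> www host
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]

# index/default pages -> site root
RewriteRule ^(index|default)\.asp$ https://www.example.com/ [R=301,L]
```

On top of the redirects, the preferred homepage can declare itself with a canonical tag in its HTML head, such as `<link rel="canonical" href="https://www.example.com/">`.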

2. Lack of keywords in URL structure

Keyword usage in URLs is one of the hundreds of factors search engines use to evaluate a page's content and determine rankings. A website's URL structure should take advantage of keywords even as users go deeper into the site, because keywords in the URL help search engines determine relevancy. Connecting a page's keyword focus to its URL sends a stronger signal to search engines about what the page covers, which can help rankings.

3. Title tags

Title tags are among the most important elements of on-page SEO, both because they convey information to search engines and because they are what searchers see in search engine results. However, many businesses use a generic title tag for their homepage, or even across the entire website. By using a generic title tag, such as just the name of the company, businesses let an opportunity to pass pertinent information to search engines slip by. Title tags should include the keywords a business is aiming to rank for, and as users move from the homepage to deeper pages, each title tag should adjust to the topic of its page, matching the keywords used in the URL and the content.
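To illustrate the contrast, here is a hedged HTML sketch; the company name, keywords and URL path below are invented for the example, not taken from the article:

```html
<!-- Generic homepage title (weak): says nothing about what the site offers. -->
<title>Acme Corp</title>

<!-- Keyword-focused homepage title (better): leads with target keywords. -->
<title>Custom Running Shoes &amp; Apparel | Acme Corp</title>

<!-- Deeper page: the title mirrors the page's URL and content focus,
     e.g. for a page at /mens-trail-running-shoes/ -->
<title>Men's Trail Running Shoes | Acme Corp</title>
```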

4. Lack of quality text-based content

One problem many businesses face is a lack of quality textual content available to users. Google's Panda and Penguin updates both rewarded high-quality content and pushed low-quality posts into the past. Google has consistently stated that the best way to improve SERP rankings is to create original, engaging and informative content, and that smaller websites can rank above larger, better-funded websites if the smaller website has better content.

5. Duplicate Content

Duplicate content is, very simply, content that appears on more than one Web page anywhere on the Internet. This can be within your own website or content that is shared with other websites. While duplicate content typically won’t reduce rankings for a business, the page with duplicate content will be filtered out in favor of the original content. The reason for this is search engines don’t see duplicate content as adding value for a user. In other words, the original content gets the credit while websites that copy the content don’t.

Businesses should strive to create unique content for their digital assets. For retailers, this means rewriting product descriptions to differentiate themselves from other websites that sell the same products. Businesses that aggregate content for clients or potential clients can either rewrite it with unique insights and information to make it unique, or add a cross-domain canonical tag giving credit to the site where the content originally appeared.
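A cross-domain canonical tag is a single line in the republished page's HTML head. In this hypothetical sketch, the domains are placeholders standing in for the syndicating site and the original source:

```html
<!-- Placed on the page at the syndicating site that republishes the
     article; it points search engines back at the original source so
     the source page, not the copy, gets the ranking credit. -->
<link rel="canonical" href="https://original-site.example/original-article/" />
```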

6. JavaScript-based navigation

Search engines have a limited ability to interact with JavaScript-based Web pages compared to a human user. Because of this, websites that use JavaScript-based navigation without an HTML alternative may not be indexable by search engines at all, preventing them from being ranked.

Businesses that want to use a JavaScript navigation system should also provide non-JavaScript navigation links elsewhere on the page. These HTML-based links, which may be as simple as footer links or a sitemap, give search engines a way to crawl deeper into the site than they otherwise could.
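A minimal sketch of such a fallback, assuming a site whose main menu is JavaScript-driven; the paths are illustrative:

```html
<!-- Plain HTML footer links that crawlers can follow even if they
     never execute the JavaScript-powered primary navigation. -->
<footer>
  <nav>
    <a href="/products/">Products</a>
    <a href="/locations/">Locations</a>
    <a href="/contact/">Contact</a>
    <a href="/sitemap/">Sitemap</a>
  </nav>
</footer>
```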

7. Content that requires JavaScript to display

Flashy websites may entice users visually, but when it comes to search engine rankings, flashy doesn't count and can be a hindrance. Content that relies on JavaScript to appear on the page can't be indexed by search engines, so any content or keywords inserted via JavaScript need to be duplicated elsewhere on each page to ensure proper indexing.

8. Bad sitemaps

Sitemaps are one of the most basic ways for businesses to tell search engines directly what is inside a site, how to reach the content, and when anything on the site changes. Because they clear up confusion search engines may have when indexing, they are a must for every site. However, they must be kept up to date and free of errors. Errors can be found using Google Webmaster Tools, and any that are found, especially bad links or routing, should be corrected immediately.
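For reference, a minimal XML sitemap per the sitemaps.org protocol looks like the following; the URLs and dates are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Each <url> entry lists one indexable page; <lastmod> tells
     search engines when the page last changed. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2014-08-18</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/locations/</loc>
    <lastmod>2014-08-01</lastmod>
  </url>
</urlset>
```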

9. Bad redirects

301 redirects automatically move searchers, and search engine spiders, from an unused or old Web address to a current one. Anytime a redirect is needed, the permanent 301 redirect should be your first option. Redirects can cause issues if they are not set up correctly. For instance, redirect chains, where reaching the final URL takes more than one hop, can signal to search engines that there is an error in the design or coding of the website and should be avoided. Redirects can also turn into a loop if done incorrectly, leading search engine spiders and users to a dead end.
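The chain and loop problems above can be checked mechanically. As a rough sketch (not any particular tool's API), given a site's redirect map as a source-to-destination dictionary, the following hypothetical audit function counts hops and flags loops; the URL paths are invented for illustration:

```python
# Hypothetical redirect audit: given {source: destination} redirects,
# report how many hops each source takes and whether it ends in a loop.

def audit_redirects(redirects):
    """Return {source: (hops, final_url)}; final_url is None for a loop."""
    report = {}
    for start in redirects:
        seen = {start}
        current = start
        hops = 0
        while current in redirects:
            current = redirects[current]
            hops += 1
            if current in seen:          # revisited a URL: redirect loop
                report[start] = (hops, None)
                break
            seen.add(current)
        else:
            report[start] = (hops, current)
    return report

chains = audit_redirects({
    "/old": "/newer",   # chain: /old -> /newer -> /new (2 hops, fixable)
    "/newer": "/new",
    "/a": "/b",         # loop: /a -> /b -> /a (dead end)
    "/b": "/a",
})
print(chains)
```

Any source reporting more than one hop should be repointed directly at its final URL, and any source reporting `None` has a loop to untangle.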

10. No location-specific pages

The creation of dedicated local landing pages should be the foundation of any local SEO strategy, yet the lack of local landing pages is one of the most common problems found today. Businesses that create an individual Web page for each specific location can more effectively leverage local SEO, which is arguably the most effective way to convert online searchers into in-person shoppers. Local SEO is also directly tied to mobile, because a large portion of mobile searches are for location-specific content.

Another common issue is site structure that prevents local pages from being found. Search engines cannot type zip codes into a store locator to find locations to index; therefore, businesses should include an "all locations" page that links to each individual location page. Without an "all locations" page, or something similar in design, search engines won't be able to index the local pages, leaving them out of local results.

Moving Forward

Any sound SEO strategy begins with a technically sound website. By avoiding the top 10 technical traps seen by SEO experts today, businesses and Web developers can maximize their other SEO efforts and earn higher rankings. Without a solid base, other SEO strategies such as social signal integration, link building and the creation of quality content won't be as effective and could hinder anyone's SEO goals.


Jason Squardo is the executive vice president of search at ZOG Digital, a leading independent digital marketing company based in Scottsdale, Arizona. Jason has worked for more than 16 years helping Fortune 500 and emerging brands with their search engine optimization efforts, including American Airlines, Fairmont Hotels, FedEx, Sears, Bank of America, Nine West, General Motors, Toyota and MasterCard.
