
Google's Robots.txt Testing Tool in Focus

Posted on 7.28.2014

Search engine optimization professionals have a lot on their digital plates - from content development and link building to design adjustments and back-end coding that ensure a smooth experience.

One valuable way to control how search engines access and ultimately index your site is through the robots.txt file. While many platforms can automagically produce a correct robots.txt file, other sites require one to be created manually (or are so large and complex that more detailed configuration is needed).
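For readers unfamiliar with the format, a robots.txt file is a plain-text list of directives placed at the root of a site. The snippet below is an illustrative sketch only - the paths and user-agent are hypothetical, not taken from any real site:

```
# Hypothetical example: block all crawlers from a staging area,
# but let Googlebot reach everything else.
User-agent: *
Disallow: /staging/

User-agent: Googlebot
Disallow:
```

A `Disallow` line with no value permits crawling; a bare `/` would block the entire site, which is why testing before deploying matters.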

Earlier this month, Google announced an updated robots.txt testing tool within Webmaster Tools which should make creating and maintaining correct robots.txt files much easier. 

Located in the Crawl section of Webmaster Tools, the tool lets SEOs see the current robots.txt file and test new URLs to see whether they are disallowed for crawling. To help guide SEOs through complicated directives, the system now highlights errors and lets users make changes in the file and test them (just make sure to upload the corrected file to your server once it's all working properly).
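The same kind of check - "is this URL disallowed for this crawler?" - can be sketched locally with Python's standard-library robots.txt parser. The rules and URLs below are hypothetical examples, and note that Python's parser applies rules in order rather than replicating Google's exact matching behavior:

```python
from urllib import robotparser

# Parse a hypothetical robots.txt directly from a list of lines,
# rather than fetching one from a live server.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# can_fetch(useragent, url) returns True if the URL may be crawled.
print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("*", "https://example.com/public/page.html"))   # True
```

For production auditing, fetching the live file with `rp.set_url(...)` and `rp.read()` keeps the local check in sync with what crawlers actually see.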

