Google's Robots.txt Testing Tool in Focus
Search engine optimization professionals have a lot on their digital plates - from content development and link building to design adjustments and back-end coding that keep the user experience smooth.
One valuable way to control how search engines access and ultimately index your site is the robots.txt file. While most sites can automatically produce a correct robots.txt file, others must create one by hand (or are so large and complex that more detailed configuration is needed).
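For reference, a minimal hand-written robots.txt looks something like this (the paths and sitemap URL here are purely illustrative):

```
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```

Each `User-agent` group applies its `Allow`/`Disallow` rules to the crawlers it names, with `*` matching any crawler.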
Earlier this month, Google announced an updated robots.txt testing tool within Webmaster Tools that should make creating and maintaining correct robots.txt files much easier.
Located within the Crawl section of Webmaster Tools, the tool lets SEOs see the current robots.txt file and test new URLs to check whether they are disallowed for crawling. To help guide SEOs through complicated directives, the system now highlights errors and lets users edit the file and test the changes (just make sure to upload the corrected file to your server once everything is working properly).
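You can approximate the same URL check locally with Python's standard-library `urllib.robotparser`. This is a sketch, not Google's tool: the rules and URLs below are hypothetical, and Google's tester applies Google-specific parsing details that the stdlib parser does not.

```python
import urllib.robotparser

# Hypothetical robots.txt rules, supplied inline for illustration.
rules = """
User-agent: *
Disallow: /private/
Allow: /
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Ask whether a given crawler may fetch each URL under these rules.
print(rp.can_fetch("Googlebot", "https://example.com/page.html"))          # True
print(rp.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
```

In practice you would point `RobotFileParser.set_url()` at your live robots.txt and call `read()` instead of parsing an inline string.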