Google's Robots Get A Little Easier to Deal With

Posted on 7.20.2014

As far as websites are concerned, Google is the King, President, Dictator and Emperor of the Internet.

The tech giant’s heavy influence on all things Internet-related makes it essential for developers and website owners alike to keep tabs on how Google updates its search-related tools.

Its latest update is to the robots.txt testing tool available through Webmaster Tools. The update makes it easier for developers to find errors in robots.txt files on both new and old websites by highlighting the specific line of the file that led to the final decision on whether or not a page will be crawled and indexed.
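The same kind of check the testing tool performs can be sketched with Python's standard-library robots.txt parser: feed it a set of rules and ask whether a given crawler may fetch a given URL. The domain, paths, and rules below are hypothetical, chosen only to illustrate the allow/block decision.

```python
# Minimal sketch: which robots.txt rule decides whether a URL is crawlable?
# Uses Python's built-in parser; example.com and the paths are made up.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Allow: /private/public-page.html
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The Disallow rule blocks this page:
print(parser.can_fetch("Googlebot", "http://example.com/private/secret.html"))       # False
# The more specific Allow rule lets this one through:
print(parser.can_fetch("Googlebot", "http://example.com/private/public-page.html"))  # True
# No rule matches, so crawling defaults to allowed:
print(parser.can_fetch("Googlebot", "http://example.com/index.html"))                # True
```

Note that Python's parser applies the first matching rule in file order, which is why the `Allow` line is placed above the broader `Disallow`; Google's own parser instead prefers the most specific (longest) matching path.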

Google’s bread and butter is its search engine; it’s a large part of what has made the company the most valuable brand in the world. That search engine, like nearly every other on the Web, relies on robots (programs that crawl websites) to find and index (log) websites for listing.

However, those who have information that they do not want Google, or any other search engine for that matter, to find and index must use a robots.txt file to block those bots. One problem developers can run into with robots.txt files is that forgetting or mistyping a single character can leave a file they wanted to remain private open to public viewing, or vice versa.
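To see how much a single character matters, compare these two hypothetical robots.txt fragments, which differ only by one "/":

```
# Blocks every crawler from the entire site:
User-agent: *
Disallow: /

# Dropping that single "/" inverts the meaning --
# an empty Disallow allows crawlers to index everything:
User-agent: *
Disallow:
```

Mistakes of exactly this kind are what the updated testing tool is meant to surface, by pointing at the rule responsible for the crawl decision.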

