Allow Googlebot to Crawl All Page Assets

Posted on 10.26.2014

Google just made a rather interesting update to its Technical Webmaster Guidelines, advising websites to allow Googlebot to crawl page assets including CSS, JavaScript and images.

Google suggested that making the change would enable it to index a website's content properly, going so far as to indicate that blocking these assets from being crawled could actually harm its ability to render and index pages, resulting in suboptimal rankings for a website.
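In practice, the blocking usually happens in robots.txt. A minimal sketch of a policy that keeps page assets crawlable might look like the following; the directory paths here are hypothetical and would need to match a site's actual layout:

```
# Hypothetical robots.txt — directory names are illustrative.
User-agent: Googlebot
# Keep rendering assets open so Googlebot can render the page as users see it.
Allow: /css/
Allow: /js/
Allow: /images/
# Sections a site may still legitimately block.
Disallow: /admin/
```

Sites that currently use broad `Disallow` rules covering script or stylesheet directories would want to review those rules against this guidance.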

In the past, Google's indexing systems relied on text-only browsers, but the search engine now indexes based on page rendering, which offers a more accurate approximation of the user experience.

As a result, Google advised that webmasters adhere to progressive enhancement and follow page performance optimization practices, and even updated the "Fetch and Render as Google" feature in Webmaster Tools so webmasters can see how its system renders pages.
