Parameter Handling in Google Webmaster Tools

Google has announced several major improvements to the way Webmaster Tools handles parameters within URLs.

Important note: any configuration of URL parameters made in the old version of the feature will remain visible in the new version.

The URL Parameters feature helps webmasters control which URLs on a site Googlebot should crawl, based on the parameters that appear in those URLs. It provides a powerful way to keep Google from crawling duplicate content on your site.
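To see why duplicate content is the problem here, consider a minimal sketch (hypothetical URLs and a made-up sessionid parameter, not anything Google has published) in which a parameter that does not change the page spawns many URLs for the same content, and stripping it collapses them to one representative URL:

```python
from urllib.parse import urlparse, urlunparse, parse_qsl, urlencode

# Hypothetical parameter that does not change page content.
NON_CONTENT_PARAMS = {"sessionid"}

def canonicalize(url):
    """Drop parameters that do not affect content, so duplicate
    URLs collapse to a single representative form."""
    parts = urlparse(url)
    query = [(k, v) for k, v in parse_qsl(parts.query)
             if k not in NON_CONTENT_PARAMS]
    return urlunparse(parts._replace(query=urlencode(query)))

urls = [
    "http://example.com/shoes?sessionid=123",
    "http://example.com/shoes?sessionid=456",
]
# Both URLs collapse to http://example.com/shoes
print({canonicalize(u) for u in urls})
```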

There is a lot involved in Google's URL Parameters feature, but today's changes aim to give webmasters more precise control. In addition to assigning a crawl action to an individual parameter, it is now possible to describe how that parameter behaves.

Webmasters will need to indicate to Google whether the parameter changes the content of the page as seen by the user. If it does change the content (e.g., narrows or reorders it), the URL Parameters feature is where you should start.

If a parameter does not affect page content, Googlebot will choose a "representative value of this parameter" and crawl the URLs with that value. If a parameter does change the content of a page, however, webmasters can now tell Google one of four ways to crawl URLs containing the parameter: "Let Googlebot decide," "Every URL," "Only URLs with value=x," or "No URLs."
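To make the four options concrete, here is a rough sketch (assumed setting names and values; Google has not published its logic) of how a single parameter's configured action could map to a crawl decision:

```python
def parameter_allows_crawl(setting: str, value: str) -> bool:
    """Map one parameter's configured crawl action to a yes/no
    decision for a URL carrying the given value. A sketch of the
    four options, not Google's actual implementation."""
    if setting == "Let Googlebot decide":
        return True  # defer to Googlebot's own heuristics
    if setting == "Every URL":
        return True  # every value produces distinct content
    if setting.startswith("Only URLs with value="):
        # Crawl only when the value matches the configured one.
        return value == setting.split("=", 1)[1]
    if setting == "No URLs":
        return False  # never crawl URLs carrying this parameter
    raise ValueError(f"unknown setting: {setting!r}")

# Hypothetical "sort" parameter restricted to a single value.
print(parameter_allows_crawl("Only URLs with value=price", "price"))  # True
print(parameter_allows_crawl("Only URLs with value=price", "date"))   # False
```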

The "No URLs" option is new and deserves the most attention. According to Google, "This option is the most restrictive and, for any given URL, takes precedence over settings of other parameters in that URL." If a URL contains a parameter set to "No URLs," it will never be crawled, even if other parameters in the URL are set to "Every URL."
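As a rough illustration of that precedence rule (again with hypothetical parameter names, not Google's code), a single "No URLs" parameter vetoes the entire URL before any other setting is consulted:

```python
def url_is_crawlable(params: dict, settings: dict) -> bool:
    """Sketch of the precedence rule: if any parameter in the URL
    is set to 'No URLs', the URL is never crawled, regardless of
    how the remaining parameters are configured."""
    if any(settings.get(name) == "No URLs" for name in params):
        return False  # the most restrictive setting wins outright
    # Otherwise each parameter is judged on its own setting
    # (simplified here: anything not vetoed is crawlable).
    return True

# 'No URLs' on sessionid blocks the URL even though sort says 'Every URL'.
settings = {"sessionid": "No URLs", "sort": "Every URL"}
print(url_is_crawlable({"sessionid": "123", "sort": "price"}, settings))  # False
```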