Google announced a new feature last week that uses RSS and Atom feeds for the discovery of new webpages.
Using feeds for discovery allows Google to index new pages more quickly than it could with traditional crawling alone. According to the announcement, Google may use “many potential sources to access updates from feeds including Reader, notifications services, or direct crawls of feeds.”
If you’re publishing an RSS/Atom feed, make sure that those files are not disallowed by your robots.txt file. If you’re unsure whether your feeds are crawlable, use the robots.txt tester in Google Webmaster Tools.
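You can also check feed crawlability programmatically. The sketch below uses Python's standard-library `urllib.robotparser` to test whether a feed URL is blocked by a set of robots.txt rules; the domain, feed paths, and rules are hypothetical placeholders, not from Google's announcement.

```python
from urllib import robotparser

# Hypothetical robots.txt rules: everything is crawlable except /private/.
rules = """\
User-agent: *
Disallow: /private/
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# A feed outside the disallowed path is crawlable.
print(parser.can_fetch("Googlebot", "http://example.com/feed/atom.xml"))
# → True

# A feed inside the disallowed path would be blocked from discovery.
print(parser.can_fetch("Googlebot", "http://example.com/private/feed.xml"))
# → False
```

In a real check, you would call `parser.set_url("http://yoursite.com/robots.txt")` followed by `parser.read()` instead of parsing an inline string.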