Site Speed as a Ranking Signal (Told You So)

It's not just our team at Website Magazine that has repeatedly stressed the importance and benefits of a fast website; Google itself (as well as SEO whipping boy Matt Cutts) has long advocated for speedier web page loading.

As far back as 2009, Google (on its Webmaster Central Blog) wrote about its use of site speed in web search ranking. While it wasn't the most important signal and didn't have an enormous impact across a large set of rankings, the role it played (and plays) in user experience was undeniable. But does site speed influence where a page ends up in the search results? And if so, how?

Bill Slawski, who works at Go Fish Digital and writes at SEO by the Sea, recently dove into the topic of site speed and a newly granted Google patent (https://wsm.co/1clvzeN) which confirms, without question, that Google uses resource load times in ranking search results. Here's a quick introduction to how it all works.

The patent indicates that "given two resources that are of similar relevance to a search query, a typical user may prefer to visit the resource having the shorter load time." That of course makes sense, but plays more to the user experience side. Google has long made tools available to help site owners identify issues related to load times, such as PageSpeed Insights, which assigns a score based on how well a page meets a number of rules/heuristics involving page load times.

Those speed-related rules are well known (for the most part) and include enabling compression, leveraging browser caching, optimizing images, removing render-blocking JavaScript and so on, but they alone aren't the means by which Google decides how to rank a website. Fortunately, the patent in many ways sheds light on the issue.
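
For a rough, do-it-yourself check of a couple of those rules, the minimal Python sketch below (standard library only) fetches a page and inspects its response headers for compression and browser-caching directives. The URL and the specific header checks are illustrative assumptions on my part, not part of PageSpeed Insights itself.

    # Minimal sketch: check a page for compression and caching headers.
    # Illustrative only -- PageSpeed Insights applies many more heuristics.
    import gzip
    import urllib.request

    def check_speed_headers(url):
        request = urllib.request.Request(url, headers={"Accept-Encoding": "gzip"})
        with urllib.request.urlopen(request, timeout=10) as response:
            headers = response.headers
            body = response.read()

        # Was the response compressed? (Rule: "enable compression")
        encoding = headers.get("Content-Encoding", "none")
        print("Content-Encoding:", encoding)

        # Does the server ask browsers to cache the page? (Rule: "leverage browser caching")
        print("Cache-Control:", headers.get("Cache-Control", "not set"))

        # Rough sense of bytes transferred vs. uncompressed size.
        if encoding == "gzip":
            print(len(body), "bytes transferred,", len(gzip.decompress(body)), "uncompressed")
        else:
            print(len(body), "bytes transferred (uncompressed)")

    check_speed_headers("https://www.example.com/")  # hypothetical URL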

I won't go into great detail about how it all works (you can read through the patent and come to your own conclusions), but what it indicates is that several factors impact load time in a browser, including the size of the resource, the number of images referenced, the web server that serves the resource and the impact of the network connection on the loading of the resource. Essentially, when Google measures load time to compare different pages, it might limit itself to measurements from devices in the same geographic area and/or using the same user-agent (such as the same browser).
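
The patent doesn't spell out exactly how those comparisons are made, but the idea of comparing pages only under comparable conditions can be illustrated with a small sketch. The one below is my own hedged interpretation, not Google's implementation: it groups invented load-time samples by (region, user agent) and looks at the median load time per page within each group.

    # Sketch of cohort-based load-time comparison -- an interpretation of the
    # patent's idea, not Google's actual system. The sample data is invented.
    from collections import defaultdict
    from statistics import median

    # (page, region, user_agent, load_time_seconds) -- hypothetical measurements
    samples = [
        ("page-a", "US", "Chrome", 1.2),
        ("page-a", "US", "Chrome", 1.4),
        ("page-b", "US", "Chrome", 2.9),
        ("page-b", "US", "Chrome", 3.1),
        ("page-a", "EU", "Firefox", 2.0),
        ("page-b", "EU", "Firefox", 2.1),
    ]

    # Group measurements so pages are only compared against other pages
    # loaded in the same geographic area with the same user-agent.
    cohorts = defaultdict(lambda: defaultdict(list))
    for page, region, agent, load_time in samples:
        cohorts[(region, agent)][page].append(load_time)

    for cohort, pages in cohorts.items():
        medians = {page: round(median(times), 2) for page, times in pages.items()}
        print(cohort, medians)

In the first cohort, page-a is clearly faster than page-b; in the second, the two are effectively equivalent, which is exactly why limiting comparisons to comparable measurements matters.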

According to Slawski, the patent indicates that "when there are two different pages or results for a query, and one loads relatively quickly, while the other loads relatively slowly in comparison, the quicker result might be promoted in display order and the slower result might be demoted, so that the quicker page will appear higher in search results."
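
In code terms, that promotion and demotion could look something like the toy sketch below. The thresholds and the size of the adjustment are invented placeholders of mine; the patent does not publish any such numbers.

    # Toy illustration of promoting quicker results and demoting slower ones.
    # Thresholds and adjustments are invented placeholders, not Google's values.
    def rerank_by_speed(results, load_times, fast=1.5, slow=3.0):
        """results: URLs in relevance order; load_times: URL -> load time in seconds."""
        def adjusted_position(item):
            index, url = item
            seconds = load_times.get(url)
            if seconds is None:
                return index        # no measurement: leave the position alone
            if seconds <= fast:
                return index - 1    # promote relatively quick results
            if seconds >= slow:
                return index + 1    # demote relatively slow results
            return index

        return [url for _, url in sorted(enumerate(results), key=adjusted_position)]

    print(rerank_by_speed(
        ["slow-but-relevant.example", "quick.example"],
        {"slow-but-relevant.example": 4.2, "quick.example": 0.9},
    ))
    # -> ['quick.example', 'slow-but-relevant.example']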

In the near future, seriously consider using tools to help identify bottlenecks in page load times and leveraging solutions (like content delivery networks) to provide the best possible experience for users. You never know; doing so, it seems, may move you higher in the search results.