In most domains, the way you measure performance is clear. Take hockey, for example. Countless factors go into making a successful hockey team. In the end, though, it all boils down to a single number: the team that scores the most goals wins the game. The same goes for business. Companies differ enormously from one another, yet you can evaluate them all by the same metric: the profit they generate.
There is one area where this does not hold true, however: Web pages. I would wager if you asked 20 different Web experts what the most important Web metric is, you would get 20 different answers. Ecommerce site owners would talk about conversions. Media heads would speak of clicks or shares or time-on-site. SEOs would try to sell you on some metric they made up. And so forth.
At the risk of adding a 21st metric to that list, I would like to make a bold pronouncement: the single yardstick for measuring the performance of a Web page is how quickly it loads.
When any visitor goes to a site, no matter their intention, they expect it to load within a reasonable amount of time. If it doesn't, then nothing else really matters. The visitor will most likely abandon the page and go somewhere else. Faster is better, and better translates to improved performance: greater engagement, bigger purchases, longer session times. Speed is the one metric that affects every other metric.
Plus, unlike a lot of other Web metrics, speed is actionable, and many companies are already finding this out. The Washington Post boosted the number of pages visitors viewed by 500 percent just by shaving three seconds off the load time of its mobile site. Similarly, a study conducted by Pingdom found that eight percent of the top 100 retail sites loaded in under three seconds, while only six percent took five seconds or longer to load.
So let's look at some best practices for tracking speed and fixing problems:
Forget speed - look at page weight
I know we have been talking in terms of speed up until this point, but raw speed is sort of tricky as a metric. To begin with, it is actually difficult to measure objectively. It can vary a lot depending on where a user is connecting from and what sort of connection they have. It can be affected by their browser type, the kinds of media that are on the page, and a host of other factors. This means that any measurement of load speed you can capture on your end is unlikely to correlate much with the actual experience of real users trying to access your site.
Page weight is a more objective metric for assessing speed. To find the weight of a page, simply sum the file sizes of every element that appears on that page. All else being equal, a lighter page (smaller total file size) loads faster than a heavier one.
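Computed as a sum, page weight is easy to script. The sketch below uses made-up asset names and byte counts purely for illustration:

```python
# Page weight is just the sum of every asset's transfer size.
# Asset names and byte counts below are hypothetical examples.
assets = {
    "index.html": 24_000,
    "styles.css": 48_000,
    "app.js": 210_000,
    "hero.jpg": 1_400_000,
    "logo.png": 35_000,
}

page_weight_bytes = sum(assets.values())
print(f"Total page weight: {page_weight_bytes / 1_000_000:.2f} MB")

# Breaking the total down by element shows where the bulk comes from.
image_bytes = sum(size for name, size in assets.items()
                  if name.endswith((".jpg", ".png")))
print(f"Image share of page weight: {image_bytes / page_weight_bytes:.0%}")
```

In this made-up breakdown, two images account for the large majority of the total, which is typical of what a real page audit turns up.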
When you look at your site this way, it quickly becomes clear which pages are slowing things down. It is also easier to fix things, because you can see exactly which elements are adding bulk to the problem pages.
Shrink the footprint of your images
If your site is like most, what you'll see when you start evaluating page weight is that images are the problem. They make up anywhere from 60 to 70 percent of a page's weight on average, so the bulkiest pages are often those with either a lot of images or images that are poorly optimized. There is usually not much you can do about the former, but compression and resizing can help with the latter.
In an ideal world, you would serve each image at exactly the size of the container where it will appear. There's no point in loading a high-resolution, 15-megapixel image into a 300x300 pixel container for display on an iPhone screen, for example. It will not look better, and it will slow the page to a crawl.
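As a rough sketch, the right target size follows from the container's CSS dimensions and the device pixel ratio. The helper below is illustrative only, not any particular framework's API:

```python
def target_size(container_w, container_h, device_pixel_ratio=1.0):
    """Smallest image (in physical pixels) that fills the container sharply."""
    return (round(container_w * device_pixel_ratio),
            round(container_h * device_pixel_ratio))

# A 300x300 CSS-pixel container on a 2x ("retina") screen only ever
# needs a 600x600 image -- never the 15-megapixel original.
print(target_size(300, 300, device_pixel_ratio=2.0))  # (600, 600)
```

Resizing to this target before delivery keeps the download small without any visible loss of sharpness.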
Choose the image processing scheme that fits your needs
The same holds in reverse, of course. A highly compressed, low-resolution photo might load fast, but it will look terrible shown full-sized on a retina display. The trick is to have a version of each image to match each unique viewing condition. For a small site, you can simply resize every image manually; for a larger site with more images, you will want to automate this with batch commands. That's where advanced image processing comes in: algorithms offer an automated way to get quick improvements and ensure that every image on your site has some level of optimization applied.
For very dynamic websites, however, even this gets tricky. Because images change so frequently, you will eventually need a more robust solution that can generate new images on the fly to match each request. When adopting one of these solutions, be sure to take advantage of the opportunity to automate multiple edits at once, including image quality, resolution, dimensions and cropping. Nearly any type of website will see performance benefits from image processing tools.
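Services of this kind typically express edits as URL query parameters, so each request can describe the exact variant it needs. The sketch below shows the general shape; the domain and parameter names (w, h, q) are hypothetical, not any specific service's API:

```python
from urllib.parse import urlencode

def image_url(base, path, **edits):
    """Build an on-the-fly image URL from a set of edit parameters.

    The edit names passed in are placeholders for illustration, not
    the API of any real image service.
    """
    query = urlencode(edits)
    return f"{base}{path}?{query}" if query else f"{base}{path}"

# Request a 600x600 variant at quality 75 for a retina-sized container.
url = image_url("https://images.example.com", "/photos/hero.jpg",
                w=600, h=600, q=75)
print(url)  # https://images.example.com/photos/hero.jpg?w=600&h=600&q=75
```

Because every variant is just a URL, the same master image can serve any device without pre-generating files.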
As you design and build new Web pages, keep performance as the number one metric for evaluating your site and improving user engagement. With sites now viewed across a multitude of devices, image performance will remain a key tool in your toolbox. And with imagery leading the way in how companies tell their stories, delivering a high-performing site that stays ahead of the competition will continue to depend on fast-loading visuals.
About the Author
Chris Zacharias is founder and CEO of imgix. Prior to that, he was one of YouTube's earliest Web developers. He created YouTube's HTML5 video player and Feather, an ultra-lightweight version of YouTube that loads quickly in parts of the world with slow Internet connections. These experiences drove Chris to recognize a need for tools that let developers build great visual experiences without adding bulk to their apps. In 2010, he founded imgix, a real-time image processing service that enables businesses of all sizes to deliver rich visual content with high performance and easy setup.