Try to remember something that happened on your last vacation. How long did it take to recall? Now, try to remember it again.
You were probably able to recall it faster the second time. Why? When you recalled the event the first time, your brain saved that information to your recent memory.
Caching is a similar concept - a way of temporarily storing the contents of a Web page in locations closer to the user, much the way our memory works. It speeds content downloads and helps deliver a faster, more satisfying user experience. Not surprisingly, it is one of the most popular Web performance optimization techniques deployed during the holiday season. Here, we'll explore the different types of caching, how it works, its benefits and its potential challenges.
There are two main types of caching - browser-side caching (which takes place on site visitors' computers via browser caches) and server-side caching, which entails a cache server sitting geographically between an origin server and site visitors.
When a user visits a site for the first time, the browser stores certain items like CSS files, images and logos for a specified amount of time. Browser caching allows these saved items to be served immediately upon the user's next visit, rather than requiring new requests to be sent back to the origin server. Reducing the number of requests being sent (also known as round-trips) results in faster Web page load times and ultimately, happier users. Browser caching is useful for repeat visitors, but it still requires new visitors to fetch all assets from the website origin server initially.
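The browser's decision described above can be sketched in a few lines: a cached asset is served locally only while its age stays within the lifetime the server specified, avoiding a round-trip. This is an illustrative sketch, not a real browser implementation; the function name `is_fresh` and the timestamps are assumptions.

```python
import time

# Hypothetical sketch of a browser-style freshness check: an asset
# cached at `stored_at` with a lifetime of `max_age` seconds is
# served from the local cache only while it is still fresh.
def is_fresh(stored_at: float, max_age: int, now: float) -> bool:
    """Return True if the cached entry's age is still within max-age."""
    return (now - stored_at) < max_age

t0 = time.time()
print(is_fresh(t0 - 60, 300, t0))   # cached 60s ago, 300s lifetime: fresh
print(is_fresh(t0 - 400, 300, t0))  # 400s old: stale, refetch from origin
```

While an entry is fresh, the browser answers the request itself; once it goes stale, a new request must travel back to the origin server.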
Server-side caching differs from browser caching in that it serves many visitors from the same cache without requiring individual first-time site visitors to make requests to the origin server. This helps reduce the stress on the origin server and speeds up the Web page load for all site visitors. This type of cache is considered a reverse proxy because it acts on behalf of the website server and influences the user experience by intercepting and serving visitors before they reach the origin.
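The reverse-proxy behavior described above can be illustrated with a minimal in-memory sketch: the first visitor's request populates the cache, and every later visitor is served from it until the entry expires, so the origin is touched only once. The class, the `fetch_origin` stand-in for an upstream request, and the TTL value are all illustrative assumptions.

```python
# Minimal sketch of a server-side (reverse proxy) cache. One origin
# fetch serves many visitors; `fetch_origin` stands in for a real
# upstream HTTP request and is purely illustrative.
class ReverseProxyCache:
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.store = {}        # url -> (body, stored_at)
        self.origin_hits = 0   # how often the origin was contacted

    def fetch_origin(self, url):
        self.origin_hits += 1
        return f"content of {url}"

    def get(self, url, now):
        cached = self.store.get(url)
        if cached and (now - cached[1]) < self.ttl:
            return cached[0]           # cache hit: origin untouched
        body = self.fetch_origin(url)  # miss or expired: go to origin
        self.store[url] = (body, now)
        return body

proxy = ReverseProxyCache(ttl_seconds=300)
proxy.get("/index.html", now=0)    # first visitor: fetched from origin
proxy.get("/index.html", now=10)   # every later visitor: served from cache
print(proxy.origin_hits)           # 1 -- the origin was hit only once
```

This is what shields the origin server from load: ten thousand visitors within the TTL window still produce a single origin request.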
Caching is controlled through an HTTP header named "Cache-Control," which is sent as part of the HTTP response headers. If a response is cacheable, browsers and intermediaries keep a copy of the page for a specified period. Cache-Control is how the content owner determines caching behavior: it identifies which responses can be cached and for how long.
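As a hedged sketch of how a cache might interpret that header: the real Cache-Control grammar (defined in the HTTP caching specification) has many directives, but two of the most common are "no-store," which forbids caching entirely, and "max-age," which bounds how long a stored copy may be reused. The parser below handles just those two and is illustrative, not a complete implementation.

```python
# Simplified, illustrative parser for two common Cache-Control
# directives: "no-store" (never cache) and "max-age=N" (cache for
# at most N seconds). Real caches handle many more directives.
def parse_cache_control(header):
    """Return (cacheable, max_age_seconds) from a Cache-Control value."""
    directives = [d.strip().lower() for d in header.split(",")]
    if "no-store" in directives:
        return (False, 0)
    max_age = 0
    for d in directives:
        if d.startswith("max-age="):
            max_age = int(d.split("=", 1)[1])
    return (True, max_age)

print(parse_cache_control("public, max-age=3600"))  # (True, 3600)
print(parse_cache_control("no-store"))              # (False, 0)
```

A response carrying `Cache-Control: public, max-age=3600` tells every cache along the path that it may store the page and reuse it for up to an hour.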
Fast, reliable user experiences are critical to the success of digital businesses. Establishing a solid caching strategy can help improve and maintain strong user experiences through:
One of the most common complaints users have regarding a website's performance is slowness. We can all relate to the frustration of having to wait for a page or its elements to load. Caching can help reduce load time because it serves content from locations closer to the user. By retrieving content more efficiently, caching reduces the overall latency of the round-trip delivery time.
The availability of site content plays a key role in the user experience. Regardless of the geographic location of users, they expect to be able to access all site content. In this sense, caching can provide an extra layer of insurance or redundancy in the event that primary datacenter-based systems experience an issue.
If traffic volumes aren't managed properly, bandwidth congestion can wreak havoc on major networks. Caching can greatly reduce network congestion by shortening the path content travels in the fetching process. With caching, not every request is directed to the origin server, which frees up the network and significantly reduces the load on the origin - ultimately allowing even non-cached content to load faster.
While caching certainly can help improve the user experience, it's not without its challenges. If a cache is not configured properly, the browser cannot validate the content and may load outdated, stale content, which has a negative impact on the user interaction. In addition, caching platforms typically rely on third-party proxy servers shared by many users, which can leave them vulnerable to attack.
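The standard defense against serving outdated content is revalidation: the cache asks the origin "has this changed?" by echoing back an ETag in an If-None-Match request header, and a 304 Not Modified answer means the stored copy is still valid. The sketch below models the origin as a simple dict for illustration; the names and data are assumptions, but the 200/304 exchange mirrors how HTTP conditional requests work.

```python
# Illustrative model of ETag revalidation. The "origin" is a dict
# mapping URL -> (etag, body); a real origin would be a web server.
origin = {"/page": ("v1", "<html>version 1</html>")}

def conditional_get(url, if_none_match=None):
    """Return (status, etag, body); a 304 response carries no body."""
    etag, body = origin[url]
    if if_none_match == etag:
        return (304, etag, None)   # not modified: reuse the cached copy
    return (200, etag, body)       # modified (or first fetch): full body

status, etag, body = conditional_get("/page")               # first fetch
status, _, _ = conditional_get("/page", if_none_match=etag)
print(status)  # 304 -- cached copy validated without a full transfer

origin["/page"] = ("v2", "<html>version 2</html>")          # content updated
status, _, _ = conditional_get("/page", if_none_match=etag)
print(status)  # 200 -- stale ETag, fresh body returned
```

A cache that revalidates this way never serves content the origin has since replaced, while still avoiding full downloads for unchanged pages.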
CDNs are an example of an external third-party service, and as organizations increase reliance on these services, it is especially important to combine monitoring with advanced diagnostics that allow the identification and isolation of all root causes. CDNs serve many users and come under especially heavy load during the holidays, so while monitoring all the time is critical, it assumes a whole new level of importance at this time of year.
Even prior to implementing caching, monitoring is the key to organizations understanding users' baseline experiences, and determining which segments of geographic users would benefit the most. Once caching is implemented, monitoring combined with diagnostics is essential to proactively identify and address issues at the CDN/caching provider level.
In summary, there are many ways to optimize Web performance, and a wide range of factors - from the geographic location of your datacenter to a third-party marketing tag - can affect the speed and reliability of a website. Done right, caching is one way to help satisfy users' stringent performance expectations with a modest investment of effort.