Compelling content, reliability and performance lend
themselves to an excellent overall user experience and
the kind of metrics website owners like, such as increased
time-on-site, higher click-through rates and more. If a site
is slow, however, users won't stick around to view (let
alone interact with) content or ads. And, as we know,
successful sites rely on ads and eyeballs.
Microsoft’s Bing did a study and found that a one-second page delay cuts
ad revenue per user by 2.8 percent. Add to that the
proliferation of platforms for online viewing — PCs,
tablets, smartphones — and the methods and challenges
for measuring the true user experience get
more complex. Since people want to have access from
anywhere, the challenge becomes how to test and
scale across these different platforms for a consistent and responsive experience.
Consider the breadth and scale of many consumer
sites today (e.g. the amount of content, the frequency of
updates, and the use of graphics and video). Then add
to that the additional third-party applications that might
also be part of the experience, such as embedded players,
Facebook, comments sections, outbound links, etc.
With that said, many website owners already have good
Web performance monitoring tools in place, but some
recent developments provide even better ways to measure
the real viewer experience.
A good analogy comes from aviation. One
of a pilot’s critical instruments is the airspeed indicator, which measures the plane’s speed through the air. It’s
a critical metric, telling the pilot whether
enough air is moving over the wings to keep the plane flying.
But unlike a car’s speedometer, it won’t tell you
exactly how fast the plane is moving over the ground. That’s groundspeed,
which is measured differently. Yet
both are needed to pilot a plane correctly.
And so it is for website performance. For
years, websites have been “piloted” using airspeed alone.
Most websites today use monitoring services
that provide an external perspective of
performance by measuring network latency.
Testing systems located in datacenters all over
the world measure the amount of time it
takes to download content from the company’s
website. This continuous, robotic
sampling of a website’s responsiveness can tell
website owners a lot about how well their
sites are delivering content, but it doesn’t represent
how well a user’s Web browser is actually
assembling that content or how the user
interacts with it.
But there are new industry standards and
technologies that reveal performance more
holistically — based on users’ true experiences.
The World Wide Web Consortium (W3C) and
major browser organizations recently agreed upon the
Navigation Timing standard for measuring speed from
the browser client.
“The browser itself saves timestamps from various
events in the process of navigating to a page, including
timestamps for the starting and ending of phases,” said
Internet Explorer Program Manager Jatinder Mann,
who is in the W3C working group.
Now implemented in the current versions of Internet
Explorer, Chrome and Firefox, the Navigation Timing
standard helps measure:
• Time to First Paint — the first point at which the
user sees something other than a blank screen.
• Time to Interactive Page — when a page can be fully
clicked, swiped and scrolled.
• Total User Experience Time — the total time a page
takes to render and become usable.
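As a rough sketch, each of these metrics can be derived as a delta from the moment navigation begins. In the TypeScript sketch below, the attribute names navigationStart, domInteractive and loadEventEnd come from the Navigation Timing specification; firstPaint is an assumed extra timestamp, since first paint is exposed through separate, browser-specific mechanisms rather than Navigation Timing itself.

```typescript
// Sketch: deriving user-experience metrics from browser timestamps.
// navigationStart, domInteractive and loadEventEnd are Navigation Timing
// attributes; firstPaint is an assumed timestamp for illustration.

interface NavigationTimestamps {
  navigationStart: number; // the user begins navigating to the page
  firstPaint: number;      // first non-blank render (assumed available)
  domInteractive: number;  // page can be clicked, swiped and scrolled
  loadEventEnd: number;    // page has fully rendered and become usable
}

interface ExperienceMetrics {
  timeToFirstPaint: number;
  timeToInteractivePage: number;
  totalUserExperienceTime: number;
}

function deriveMetrics(t: NavigationTimestamps): ExperienceMetrics {
  return {
    timeToFirstPaint: t.firstPaint - t.navigationStart,
    timeToInteractivePage: t.domInteractive - t.navigationStart,
    totalUserExperienceTime: t.loadEventEnd - t.navigationStart,
  };
}

// Example with epoch-millisecond timestamps:
const m = deriveMetrics({
  navigationStart: 1_000,
  firstPaint: 1_500,
  domInteractive: 3_200,
  loadEventEnd: 6_000,
});
// timeToFirstPaint: 500, timeToInteractivePage: 2200,
// totalUserExperienceTime: 5000
console.log(m);
```

In a real browser, the raw timestamps would come from the Performance interface rather than a hand-built object; the point of the sketch is simply that all three metrics are offsets from a single navigation-start instant.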
These are important events in a page’s lifecycle because
customer expectations of a website’s performance
continue to increase. In 2006, studies found that users
were willing to wait four seconds while a page loaded
before they would abandon a site. By 2009, that time
had dropped to two seconds, and users exited a site in
droves at the three-second mark. Today, Microsoft reports
that a minuscule difference of 250 milliseconds
(that’s one-quarter of a second; if you just blinked, that
took longer) is enough to give one site a competitive
advantage over another.
Imagine you visit two sites, each with a page load
time of five seconds. If site A starts delivering content in
half a second and site B doesn’t start giving you content
until four seconds, your perception of site A will be
much more positive even though both sites have the
same overall response time.
If you can monitor both metrics, you stand a much
better chance of delivering a better viewer experience,
which in turn means greater revenues.
New tools are also becoming available to monitor
these new metrics (on an ongoing basis) to see what is
happening deep inside the browser, alerting companies
when critical performance metrics like Time to First
Paint are slow. Such a delay might be the result of an
unexpected change in a given page’s construction or a
problem with a third-party element. With this information,
companies can better understand if
they’re hitting the right service level agreements (SLAs)
for user experience.
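To make the SLA idea concrete, here is a minimal, hypothetical sketch of such an alerting check: the sample shape, the 1.5-second first-paint budget, and the breach ratio are all illustrative assumptions, not taken from any particular monitoring product.

```typescript
// Sketch of an SLA check on browser-measured metrics. Names and
// thresholds are illustrative, not from any real monitoring tool.

interface Sample {
  timeToFirstPaintMs: number; // measured in the user's browser
  timestampMs: number;        // when the sample was taken
}

// Returns true when more than maxBreachRatio of the samples exceed
// the agreed Time to First Paint threshold.
function breachesSla(
  samples: Sample[],
  thresholdMs: number,
  maxBreachRatio: number
): boolean {
  const breaches = samples.filter(
    (s) => s.timeToFirstPaintMs > thresholdMs
  ).length;
  return samples.length > 0 && breaches / samples.length > maxBreachRatio;
}

const samples: Sample[] = [
  { timeToFirstPaintMs: 400, timestampMs: 0 },
  { timeToFirstPaintMs: 1_800, timestampMs: 60_000 },
  { timeToFirstPaintMs: 2_100, timestampMs: 120_000 },
];

// Alert if more than a quarter of samples exceed a 1.5-second budget:
// here 2 of 3 samples breach it, so the check fires.
const shouldAlert = breachesSla(samples, 1_500, 0.25);
console.log(shouldAlert);
```

A production system would aggregate many more samples per interval and notify on a sustained breach rather than a single window, but the core decision is the same threshold comparison shown here.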
By using metrics such as Time to First Paint and
Time to Interactive Page, broadcasters can better understand
when content (especially third-party content
from advertisers) impacts user experience. Ad failures
can sometimes feel like a game of “whack-a-mole.”
Sometimes it can be difficult to know which failures to
focus on, and how to enforce performance standards
from advertising partners. The reality is, not all failures
are the same, and it’s important to discern those which
truly impair or obstruct a user. Because many ads are
rendered by the browser along with the rest of the page, understanding
a Web browser’s processing of a page is crucial.
Although you may not be able to control when an advertisement
failure impairs the user experience of a Web
page, monitoring at the user experience level can be
used to enforce performance service levels with advertisers.
Receiving timely notifications of failures allows
you to immediately take corrective action against offending
advertisers. And long-term trend data can be
useful in negotiating rates, as well as in making trafficking
decisions. Ultimately, proactive management of
troublesome advertiser relationships helps protect your
brand and revenue.
So make sure you’re looking at your website’s responsiveness
in terms of both network latency and user
experience; otherwise you’re monitoring only half the picture.
About the Author: Aaron Rudger is a senior marketing manager for Web
performance at Keynote Systems, a leader in mobile and
website testing and monitoring.
QUICK HIT! Web Performance in Focus
Keynote’s News Performance
Index, which measures and
benchmarks the home page
performance of the top general
media and financial news sites,
illustrates just how significant
the differences are in
performance, and thus user experience, across these
sites. For example, the
overall time it takes to render and
use the Fox Business home page
is nearly 37 percent longer than
that of the Forbes home page.
However, the amount of delay
a user experiences before the
Fox page begins rendering is
80 percent faster than Forbes.