
The Case for Automated A/B Testing

Written by David Bowen | Sep 29, 2016

Just as a Chicago restaurant owner would avoid publicly endorsing the Cubs or the White Sox for fear of alienating half of their customers, marketers know that launching a digital commerce site without A/B testing is risky.

 

Let's back up and first define A/B testing. A/B testing is a form of statistical hypothesis testing that compares two versions of a web page, a product or an app screen to determine which one performs better (Version A versus Version B). Site visitors randomly see one version or the other, and brands and retailers can then determine which one delivers better results - higher conversion rates, higher average order values, or secondary goals like lower bounce rates or longer time on page - or whichever outcome is most important to them.
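
To make that concrete, here is a minimal sketch in Python (not tied to any particular testing tool) of how a site might bucket visitors into two versions and tally conversion rates. The traffic volumes and conversion figures are invented purely for illustration.

```python
import random

# Hypothetical results store: visits and conversions for each version.
results = {"A": {"visitors": 0, "conversions": 0},
           "B": {"visitors": 0, "conversions": 0}}

def assign_version(visitor_id):
    """Bucket a visitor into version A or B. Real systems hash a stable
    visitor ID so returning visitors keep seeing the same version."""
    return "A" if visitor_id % 2 == 0 else "B"

def record_visit(version, converted):
    """Log one visit and whether it ended in a conversion (e.g. a purchase)."""
    results[version]["visitors"] += 1
    if converted:
        results[version]["conversions"] += 1

def conversion_rate(version):
    """Conversions divided by visits for one version."""
    stats = results[version]
    return stats["conversions"] / stats["visitors"] if stats["visitors"] else 0.0

# Simulated traffic: in this made-up example, version B converts slightly better.
for visitor_id in range(10_000):
    version = assign_version(visitor_id)
    base_rate = 0.030 if version == "A" else 0.034
    record_visit(version, converted=random.random() < base_rate)

for version in ("A", "B"):
    print(f"Version {version}: {conversion_rate(version):.2%} conversion rate")
```

The same idea extends to the other metrics mentioned above - average order value, bounce rate, time on page - by recording those values per version instead of a simple converted/not-converted flag.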

 

But while it's clear that optimizing a website is crucial to maximizing online sales, marketers also know that manual A/B testing can be a pain. Too often, they choose to skip this step to save time.

 

In an age of increasingly high consumer expectations, digital brands and retailers cannot afford to skip A/B testing. Marketers need a more efficient and cost-effective way to test and optimize websites to meet the ever-changing needs - and keep the attention - of consumers.

 

Given how crucial it is to test and optimize websites, brands and retailers would be smart to empower their marketing teams with a tool that reduces the need for tedious analysis, reproduction and redeployment.

 

The Costs of Manual A/B Testing

It's clear that A/B testing is necessary for brands and retailers today given high consumer expectations, but at what cost? The process can be time-consuming, expensive and prone to error. More specifically, manual A/B testing:

 

- Involves tedious analysis: The data analysis behind A/B testing can be technical, and even the most capable marketing teams struggle to do it accurately by hand. The process is prone to human error that can lead to inaccurate test results (the sketch after this list shows the kind of calculation involved).

 

- Requires help from developers: Manual A/B testing requires marketers to obtain a front-end developer's help to create alternate versions. The more departments and individuals are brought into the process, the more inefficient and costly it becomes.

 

- Negatively affects the customer experience: Most tools for manual A/B testing add another layer on top of the website. This risks damaging the customer experience by making the site slower to load. Pages might also "flicker," and visitors with ad-blocking software might not see the test variant at all.
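
Here is the kind of calculation the "tedious analysis" point refers to: a two-proportion z-test that checks whether an observed lift in conversion rate is statistically meaningful or just noise. The visitor and conversion counts below are invented for illustration, and a real analysis would also involve deciding sample size and test duration up front.

```python
from math import sqrt
from statistics import NormalDist

# Invented example figures: visits and conversions for each version.
visitors_a, conversions_a = 5_000, 150   # 3.0% conversion rate
visitors_b, conversions_b = 5_000, 190   # 3.8% conversion rate

rate_a = conversions_a / visitors_a
rate_b = conversions_b / visitors_b

# Two-proportion z-test: is the observed lift likely to be real,
# or could it plausibly be random noise?
pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
std_error = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
z_score = (rate_b - rate_a) / std_error

# Two-sided p-value from the standard normal distribution.
p_value = 2 * (1 - NormalDist().cdf(abs(z_score)))

print(f"Version A: {rate_a:.2%}  Version B: {rate_b:.2%}")
print(f"z = {z_score:.2f}, p = {p_value:.4f}")
print("Significant at the 95% level" if p_value < 0.05 else "Not significant yet")
```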

 

Think of the Chicago restaurant owner in the earlier example. To determine which endorsement would bring in more customers, the owner could conduct his own test. However, doing so would take a lot of time and effort. He would first have to redecorate the restaurant with either Cubs- or White Sox-themed memorabilia. Then, he would have to figure out how long the test would run, how to track sales for the duration of the test, and how to collect and analyze the data from those sales. And to ensure the most accurate evaluation, he would have to account for extraneous variables - seasonal, locational or otherwise - that influence revenue. Finally, he would have to perform the entire exercise all over again, but this time using the other team.

 

The risks and obstacles involved in manual A/B testing are clearly significant, and as a result marketers often drop A/B testing from their content optimization strategy altogether. But with an automated process, brands and retailers can avoid these pitfalls and still reap the benefits of a well-executed A/B testing program.


CMS-Driven A/B Testing Explained

Recognizing the hassles of manual A/B testing, digital experience providers are starting to automate the process within their platforms. By determining which content performs best and distributing additional content accordingly, content management system (CMS)-driven A/B testing handles the optimization process without marketers having to do the work themselves.

 

CMS platforms can take human error out of the data analysis and eliminate the need to involve multiple departments and pieces of technology, cutting both the risk of inaccurate results and the cost of additional resources.

 

While automated A/B testing doesn't remove the need for marketers to actually create a variation to test, it makes the job much easier: a marketer can tweak something simple, like a headline, and hit "test" to let the technology handle the rest. And since many marketers already have a large library of content assets available to them, the system can also automate the process of selecting which ones to promote.
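
As a rough illustration of what "handle the rest" can mean, the sketch below uses a simple epsilon-greedy rule to decide which of several existing headlines to serve: most visitors see the current best performer, while a small share of traffic keeps testing the alternatives. The headline names and the 10% exploration rate are invented, and real CMS platforms will use their own, often more sophisticated, selection logic.

```python
import random

# Hypothetical pool of existing content assets (candidate headlines).
headlines = {
    "Shop the Fall Collection": {"views": 0, "clicks": 0},
    "New Arrivals Are Here":    {"views": 0, "clicks": 0},
    "Free Shipping This Week":  {"views": 0, "clicks": 0},
}

EXPLORE_RATE = 0.10  # share of traffic reserved for trying every variant

def click_rate(stats):
    """Clicks divided by views, treating an unseen variant as 0."""
    return stats["clicks"] / stats["views"] if stats["views"] else 0.0

def choose_headline():
    """Epsilon-greedy selection: usually serve the best performer so far,
    but keep showing the alternatives to a small slice of visitors."""
    if random.random() < EXPLORE_RATE:
        return random.choice(list(headlines))
    return max(headlines, key=lambda name: click_rate(headlines[name]))

def record_result(name, clicked):
    """Feed engagement data back so future selections reflect performance."""
    headlines[name]["views"] += 1
    if clicked:
        headlines[name]["clicks"] += 1

# Example: serve one page view and log whether the visitor clicked through.
served = choose_headline()
record_result(served, clicked=False)
print(f"Served headline: {served}")
```

In practice a CMS would persist these counts and pair them with significance checks like the one earlier, but the core idea - route more traffic toward whatever is measurably working - fits in a few lines.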

 

For the Chicago restaurant owner, this kind of system would look like digital signage that automatically alternates between Cubs- and White Sox-related media, all the while tracking sales and customer data through the point-of-sale system.

 

By increasing efficiency through the automation of A/B testing, brands and retailers can remain competitive in an age where consumers will quickly turn to a competitor if a site doesn't meet their needs. Businesses can no longer afford to conduct A/B testing manually; doing so is tedious, error-prone and may result in a faulty customer experience.