A/B Testing For Novices
Even though your website may be topping the SERPs for heavily trafficked search terms and phrases, there will undoubtedly come a time in your Web work when the proverbial light bulb goes off and you realize that if you could only increase your conversion rate, you would generate more revenue from product sales, more subscribers to your newsletter or more clicks on your paid links. If that light bulb just went off, below is a quick, straightforward guide to setting up a simple A/B test. Even if you aren't the one who actually implements the software or scripts necessary to conduct a proper A/B test, this guide should help you understand what happens, or what should happen:
- Identify the element of the page(s) that you want to test and focus your energy on testing that element alone. Popular elements to test include images, headings, copy, price, the placement of elements, etc.
- Determine the number of alternate versions of the element you will be testing, each on its own page. Since this is A/B testing, you should probably start with only two versions.
- Serve the pages with alternate versions randomly, giving each an equal share of visitor views.
- Implement analytics (tracking code) before going live, making sure to establish conversion goals from the start.
- Run the test until enough data has been gathered that you can confidently base your decision on empirical evidence rather than intuition.
- Create a procedure to compare the performance of each version and implement the winner. Ideally, the conversion rate of each version served should carry the most weight.
- Develop a plan to continue testing different page elements; repeated testing over time will yield a higher conversion rate.
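The steps above can be sketched in a few lines of code. The example below is a minimal illustration, not production software: it assumes a hypothetical visitor ID (in practice you'd persist the assignment in a cookie), buckets each visitor deterministically so repeat visits see the same version, and compares the two conversion rates with a standard two-proportion z-test to decide whether the difference is more than chance.

```python
import hashlib
import math

def assign_version(visitor_id: str, versions=("A", "B")) -> str:
    """Deterministically bucket a visitor so repeat visits get the same page.
    Hashing gives a roughly equal random split across versions."""
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    return versions[int(digest, 16) % len(versions)]

def z_score(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-proportion z-test: how unlikely is the observed difference
    between version A's and version B's conversion rates?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Simulate 10,000 visitors: the split should be close to 50/50.
counts = {"A": 0, "B": 0}
for i in range(10_000):
    counts[assign_version(f"visitor-{i}")] += 1

# Hypothetical results: 120/2400 conversions for A, 156/2400 for B.
# |z| > 1.96 is conventionally "significant" at the 5% level.
z = z_score(conv_a=120, n_a=2400, conv_b=156, n_b=2400)
```

Hashing the visitor ID (rather than flipping a coin on every page load) is what keeps the experience consistent for returning visitors while still satisfying the equal-random-split requirement.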
It's essential to note that there are tools (and many companies, such as Offermatica, Memetrics and Optimost) that help website marketers run an A/B testing campaign. But with a simple script you, too, can set up a test quickly and easily to better understand visitor behavior, diagnose issues and, most importantly, challenge your own assumptions about what works best for your users. More on those "simple scripts" in version two of A/B Testing for Novices.