With A/B Testing - You Don't Think, You Know

By Jeff Campbell, VP Product Development & Innovation

14th-century French philosopher Jean Buridan told a story of a donkey that stood between two luscious stacks of hay and, unable to decide which to eat, starved to death. We, as marketers, face a similar paradox of indecision when presented with multiple advertisements, landing pages, or paths to conversion – which one is better, and can my current selection be improved? We are often asked what we think is the appropriate course of action for optimizing campaigns. With A/B testing, we don’t think, we know.

How long do I run a copy test for? How much traffic do I need before I determine the winner? What is statistical significance and why is it important? Taguchi who? You don’t need a degree in Statistics to be successful with basic testing – although if you are doing a major test, it can get complicated quickly. Here is a 10-step process for a simple A/B split test:
1. Design/create the two elements to test against each other
a. Two creatives, two ads, two landing pages, two conversion processes, two search engines, etc.
2. Determine the success metric on which to judge the test
a. Click Thru Rate (CTR), Conversion Rate, Time on Site, Pageviews, Position
3. Determine any success/conversion latency (the passive portion of the test)
a. Historically, how long does it take to see a success metric after initial exposure to the element?
b. Do not judge results until latency period has passed
4. Determine the desired Confidence Level of the results
a. How sure do you want to be that the winner is truly the winner…85% sure? 99% sure?
b. 95% confidence is suggested for the Digital World
5. Setting latency aside, determine how long (time-wise) you should actively run the elements against each other
a. Consider your typical sales cycle, along with any seasonality, traffic irregularities, and data reliability issues
b. As a general rule of thumb, run the test for at least a fortnight (that’s 14 days)
6. Determine your Sample Size
a. How many people do you want to survey?
b. Typically, testing 10% of the total affected population (aka the “Universe”) is sufficient for the Digital World, for example:
i. For a CTR test, if a creative is typically seen by 20,000 people over the test period, ~2,000 impressions of Creative B is the minimum test run, against ~2,000 impressions of the control
ii. For a Landing Page/Conversion test, if a landing page typically sees 20,000 visitors over the test period, ~2,000 visitors to Landing Page B is the minimum test, against ~2,000 visitors to the control
c. Sample size calculators are more precise and factor in your desired Confidence Level
7. Based on your answers in #5 & #6, the minimum active test period has been defined
a. Length of active test = must meet your sample size requirements AND cover the time for data stability
8. Run the Active Test, check progress regularly through period
9. End Active Test, wait average latency period (Passive Test)
10. Calculate the ‘winner’ and associated Confidence Level of the results
a. If your Confidence Level does not meet your goal set in #4, consider a longer test or more aggressive change to an element
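To make step 10 concrete, here is a minimal sketch of one common way to calculate a winner’s confidence level: a two-proportion z-test using only Python’s standard library. The function name and example numbers are illustrative, not from the post.

```python
import math

def confidence_level(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-proportion z-test: how confident can we be that B truly beats A?"""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    # Pooled conversion rate under the null hypothesis (no real difference)
    p = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    # Standard error of the difference between the two rates
    se = math.sqrt(p * (1 - p) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    # One-sided confidence that B > A, via the standard normal CDF
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

# Example: 2,000 visitors per variant (the 10% sample from step 6)
conf = confidence_level(100, 2000, 130, 2000)
print(f"Confidence that B beats A: {conf:.1%}")
```

With these hypothetical numbers (5.0% vs. 6.5% conversion), the result clears the 95% bar suggested in step 4; with a smaller lift or smaller sample it would not, and per step 10a you would test longer or make a bolder change.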

Simply put, your length of test is determined by data stability, meeting your sample size, and factoring in the latency period. If you still don’t meet your desired confidence level in the results, you need to either test longer or introduce more aggressive differences between the tested elements.
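Step 6c above notes that sample size calculators are more precise than the 10% rule of thumb. As a sketch of what such a calculator does, here is the standard two-proportion sample size formula; the hard-coded z-values and function name are my own illustration, not from the post.

```python
import math

def sample_size_per_variant(baseline_rate, min_detectable_lift,
                            confidence=0.95, power=0.80):
    """Approximate visitors needed per variant to detect a relative lift
    at the given one-sided confidence level and statistical power."""
    # z-scores for common confidence/power choices (hard-coded for simplicity)
    z_alpha = {0.90: 1.282, 0.95: 1.645, 0.99: 2.326}[confidence]
    z_beta = {0.80: 0.842, 0.90: 1.282}[power]
    p1 = baseline_rate
    p2 = baseline_rate * (1 + min_detectable_lift)
    # Standard two-proportion sample size formula
    n = ((z_alpha * math.sqrt(2 * p1 * (1 - p1))
          + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2) / (p2 - p1) ** 2
    return math.ceil(n)

# Example: 5% baseline conversion rate, detecting a 20% relative lift
print(sample_size_per_variant(0.05, 0.20))
```

Note that for small lifts the formula can demand far more than 10% of a 20,000-person universe, which is exactly why a calculator that factors in your desired confidence level (step 4) beats the rule of thumb.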

I test, therefore I am.

Posted by: Jeff Campbell, VP Product Development & Innovation

3 comments:

Dr. Pete said...

Nice overview and thanks for the link. I'm glad you found the split-test calculator useful.

CJeffCampbell said...

Can I comment on my own post? Here is a helpful landing page test duration calculator someone just emailed me: http://adwords.google.com/support/bin/answer.py?answer=61688

RexDixon said...

You should definitely drop by A/B Tests - http://www.abtests.com/ and share/upload some of your test results. Thanks!

Copyright © 2008 Resolution Media, Inc. All rights reserved.