November 18, 2021
A/B testing, also known as split testing, is a simple yet powerful way to improve performance.
Essentially, A/B testing allows you to compare two versions of something—an ad, a webpage, or a landing page—to see which version performs better.
Historically, A/B testing was done most often for magazine covers. To test audience preference, for example, one issue of Time might show a picture of Tom Selleck and the other a picture of Burt Reynolds or perhaps Diane Keaton and Sigourney Weaver. Of course, now you can test everything from homepages to email subject lines.
There are two reasons to conduct an A/B test. First, to reduce uncertainty. And second, to improve performance.
For the digital marketer, A/B testing offers a quick way to evaluate, and improve, almost all facets of your print and digital marketing, from page layouts to button colors to email subject lines.
Let me give you a couple of examples of how you might use A/B testing. Both focus on the call-to-action (CTA) on your homepage.
First, you might want to test how your audience responds to where the CTA button is located on the page. For one group, you leave the button in its traditional location (say, the upper right corner). For another group, you move the CTA button to the lower left. The test, of course, is which button in which location is “pushed” more often.
In another test, you might simply test the color of the CTA button. If you traditionally use a green CTA button and the new version, say a red one, is “pushed” more often, then changing the button to red should improve your click rate.
Undertaking an A/B test is relatively straightforward. First, you need to prepare two versions of the item you wish to test: A, the control, and B, the challenger.
The control is the page or ad or subject line that you traditionally use.
The challenger is the new/altered page, ad, or subject line.
It is important to make sure versions A and B are noticeably different. If the changes are too subtle, they will likely be overlooked by the audience. Testing different shades of yellow is likely a waste of time; testing audience preference for red versus green is more likely to yield results you can use.
Next, show the two versions to two similarly sized audiences and analyze which version performed better over a set period of time, one long enough to support accurate conclusions about your results.
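To make that analysis concrete, here is a minimal sketch in Python of one common way to compare the two versions: a two-proportion z-test on the click rates. The traffic and click numbers are hypothetical, and the test itself is one illustrative approach rather than a prescription:

```python
import math

def two_proportion_z(clicks_a, views_a, clicks_b, views_b):
    """z-score for the difference between two click (conversion) rates."""
    p_a = clicks_a / views_a
    p_b = clicks_b / views_b
    # Pooled rate under the null hypothesis that A and B perform the same.
    p_pool = (clicks_a + clicks_b) / (views_a + views_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / views_a + 1 / views_b))
    return (p_b - p_a) / se

# Hypothetical results: the control (A) gets 200 clicks from 4,000 views,
# the challenger (B) gets 260 clicks from 4,000 views.
z = two_proportion_z(200, 4000, 260, 4000)
# |z| > 1.96 corresponds to significance at the 95% confidence level.
print(round(z, 2), abs(z) > 1.96)
```

If |z| clears the 1.96 threshold, the challenger's higher click rate is unlikely to be random noise; if it does not, run the test longer before declaring a winner.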
Most A/B tests are relatively simple and not overly rigorous. However, if you are testing a critical element of your campaign and want a higher degree of confidence in your answer, it is important that your A/B test follows the same guidelines as a quantitative study when choosing the composition and size of the sample. Of course, visitors must also be assigned to each version at random.
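On the question of sample size, a standard rule-of-thumb calculation, sketched here in Python with hypothetical click rates, estimates how many visitors each version needs before the test can reliably detect a given lift (1.96 and 0.84 are the conventional z-values for 95% confidence and 80% power):

```python
import math

def sample_size_per_group(p_base, p_target, z_alpha=1.96, z_beta=0.84):
    """Approximate visitors needed per version to detect a lift
    from p_base to p_target (95% confidence, 80% power)."""
    effect = p_target - p_base
    variance = p_base * (1 - p_base) + p_target * (1 - p_target)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / effect ** 2)

# Hypothetical goal: detect a lift from a 5% to a 6.5% click rate.
print(sample_size_per_group(0.05, 0.065))
```

The smaller the lift you want to detect, the more traffic you need; roughly speaking, halving the effect size quadruples the required sample.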
As you think about A/B testing, remember: when A/B tests are a routine part of your marketing protocol, you will see continual improvement in your marketing performance.