Quick Thoughts on A/B Testing for Bootstrappers
Posted by Bob Warfield on February 21, 2011
First thing is, if you’re not A/B testing, you’re missing out. It’s an absolutely essential tool for marketers. It’s completely free and easy too, thanks to tools like Google’s Website Optimizer.
Second thing: most of what you test will fail!
Yeah, pretty crazy, huh? It’s true. I keep an agile backlog (fancy way of saying a little more than a todo list and a little less than a project) of marketing ideas for my bootstrap experiments. I subscribe to a number of blogs and come across all sorts of ideas and advice for marketing landing pages. Luckily, I started out pretty agnostic, thanks to advice from my marketing mentor, Marc Randolph (the guy who came up with the idea for Netflix), that all marketing is tragically knowable through testing. But it really is amazing to go through the litany and see what works and what doesn’t.
So far, I have found that the most reliable things boil down to eliminating clutter, making things harder-hitting but terser, and the like. Streamline. Things I have had less success with:
– Focusing on benefits instead of features. You hear this incessantly from marketers, but it doesn’t always hold true. My suspicion is my audience already had a pretty good idea of the benefits they wanted and were focused on whether they believed the features would deliver those benefits. Eliminating too much discussion of the features made the landing page less impactful.
– Headline tuning. They tell you the headline is absolutely the most important thing you can tune. For whatever reason, my initial headline was a winner. Every blessed alternative I’ve tried has been inferior. Even the ones I thought ought to work better.
Aside from making the page more concise and hard-hitting, the things that have worked are those that subjectively reduced risk. Familiar credit card logos and a written guarantee, for example.
Third thing: the test ain’t over till the fat lady sings! Yep, it is amazing to watch tests go up and down. Do not stop the test until your A/B test software’s confidence interval tells you the result is valid. I’ve had tests start out great and look like a slam dunk, then suddenly go south until they were clearly a bad idea. Google says not to quit until you’ve had at least 100 visitors check out each alternative. It often takes 200-300 to be sure. Until you get a statistically significant result, you have to keep going.
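That stopping rule can be sketched with a standard two-proportion z-test. Tools like Website Optimizer do this math for you; the function and numbers below are hypothetical, just to show the shape of the check:

```python
from math import sqrt, erf

def z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for comparing the conversion rates of A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # convert |z| to a two-sided p-value via the standard normal CDF
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Hypothetical test: 250 visitors per alternative, 20 vs. 35 conversions.
p = z_test(20, 250, 35, 250)
print(f"p-value: {p:.3f}")
```

If the p-value is above your threshold (0.05 is typical), the fat lady hasn’t sung yet: keep the test running, no matter how good B looks so far.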
Last quick thought: if most of what you test will fail, and if it takes at least 100 and perhaps 200-300 trials, be careful spending too much traffic testing if revenue matters. You need to be constantly testing new things, otherwise, how will you discover things that work? And, since most things don’t improve the response rate, that begets even more testing. But, if all of your traffic is directed to tests, and most of the tests don’t work, what happens to your response rate? Darn! I hate when that happens!
I am fortunate to have a web site that delivers 60,000 unique visitors a month to use as my bootstrap test bed. Even so, I don’t like to spend more than 50% of the traffic on the testing. I tee up something at the beginning of the week, and generally a week to a week and a half will yield a result. I keep the winner and tee up another set of tests. Even so, if 50% of your traffic isn’t pulling because it is stuck doing tests that mostly don’t improve the response rate, that’s hard on your results.
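The cost of that always-on testing is easy to put numbers on. This back-of-the-envelope sketch uses hypothetical conversion rates (not my actual data), assuming half of all traffic enters the experiment and splits evenly between the control and one losing challenger:

```python
control_rate = 0.040      # assumed baseline conversion rate
challenger_rate = 0.030   # assumed rate of a typical losing variant
experiment_share = 0.5    # half of all traffic enters the test

# Inside the experiment, traffic splits 50/50 between control and
# challenger, so only a quarter of all visitors see the losing page.
share_on_challenger = experiment_share * 0.5
blended = (1 - share_on_challenger) * control_rate + share_on_challenger * challenger_rate
print(f"blended rate: {blended:.2%}")   # 3.75% vs. the 4.00% you'd get without testing
```

A quarter of your traffic dragging at the challenger’s rate costs you a chunk of conversions every single week, which is exactly why I cap test traffic at 50%.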
Such is the life of an A/B tester!
If there is anything that will convince you that it’s risky betting your marketing on your gut, A/B testing will do it. My mentor, Marc Randolph, was right: marketing is tragically knowable. Don’t make the mistake of not knowing!