Five Times to Test: 2 — To resolve internal disputes
This is the second post in my series of Five Times When You Should Be Testing. The first covered the times when you need to optimize beyond the click and should be doing landing page optimization. The second time you should test is when you want to resolve an internal dispute.
Resolving internal disputes is actually one of the most common reasons I see companies purchase a testing solution. Somebody has an idea that the article detail page on their media website would keep customers on the site longer — and consuming more ads — if it were just redesigned in X or Y way. Somebody else hates the new design idea and thinks the page ought to be kept as-is. Yet a third person asserts that both designs are wrong, and she has her own design that she claims is better than either of the other two.
This debate can go on endlessly — for years, literally. Everyone has an opinion…and they are very willing to back up their opinion with lots of squishy assertions, like “everyone hates Flash,” or “we need an extremely bold call-to-action,” or “we don’t need to say ‘click here’ in our links anymore…that’s so 1999!”
While many of those assertions may sound snappy and can really catch hold in your marketers’ minds, they aren’t always true. I know a large technology hardware company that had long debated something as simple as the wording of the main calls to action in the hero banners on its homepage. The visitor would be clicking through to read the tech specs and purchase the item — should we say ‘learn more’ or ‘buy now’? One camp said that ‘buy now’ was too strong for such a high-ticket item, and that a softer pitch would attract more potential buyers into the top of the purchase funnel. The other camp said that ‘learn more’ didn’t adequately call the visitor to action, and didn’t make it clear that the visitor could actually purchase the item by clicking there. Simple solution — run a quick A/B test on your next hero banner promotion. Can you guess the winner? In this case the ‘learn more’ call to action increased conversion by almost 50% among those who clicked. I’ve seen it work out the opposite way for other sites, so you can’t take this as a general rule — you need to test YOUR customers, not just live off of others’ findings. I ran a similar test on my own site, but with ‘more info — buy now’ as a third alternative. It beat both of the other calls to action by almost 30%.
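If you’re wondering where an uplift figure like “almost 50%” comes from, it’s just the relative lift of one variant’s conversion rate over another’s. Here is a minimal sketch in Python — the visitor and conversion counts below are purely hypothetical, since the post doesn’t disclose the company’s actual traffic:

```python
def relative_uplift(conv_control, n_control, conv_variant, n_variant):
    """Relative lift of the variant's conversion rate over the control's."""
    rate_control = conv_control / n_control
    rate_variant = conv_variant / n_variant
    return (rate_variant - rate_control) / rate_control

# Hypothetical numbers for illustration only:
# 'buy now' converts 120 of 4,000 clickers (3.0%),
# 'learn more' converts 178 of 4,000 clickers (4.45%).
lift = relative_uplift(120, 4000, 178, 4000)
print(f"Relative uplift: {lift:.1%}")  # ~48%, i.e. "almost 50%"
```

Note that this is a *relative* lift (change divided by the baseline rate), not a percentage-point difference — testing tools typically report uplift this way, which is worth keeping in mind when the baseline rate is small.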
Now, the thing to remember is that when you run tests you are not always going to drive uplift with your new ideas. In fact, failing — and failing fast — can be just as valuable as finding a new design that drives uplift. It saves you from rolling out a potential risk to your existing base of business. And between rolling out an untested alternative, and not rolling out any change at all, I think we’d all say that the latter is less risky.
I know one retailer that offered a branded credit card. They knew that if someone applied for and was approved for the store credit card, that customer became very loyal and valuable. Debates had raged internally about whether to offer that credit card much more prominently during the checkout process. There were huge proponents of presenting the card as an interstitial offer — effectively, accepting or declining the card became an extra step in the middle of the checkout flow, to ensure that everyone saw the full pitch. Other big retailers used this hard-sell approach, and some had even attributed their profitability to these credit card programs. When this retailer rolled out the interstitial offer, it did increase credit card signups — but not nearly enough to compensate for the 5% drop in conversion, which was statistically significant for their traffic size. They quickly pulled the new checkout flow and shut off the test. A very bad decision was averted with minimal adverse impact on revenue.
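What does “statistically significant for their traffic size” actually mean? One common way to check is a two-proportion z-test on the control and variant conversion rates. The sketch below uses invented traffic numbers (the retailer’s real volumes aren’t in this post) and only the Python standard library:

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under H0: no difference
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF via math.erf
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical: control converts at 4.0%, the interstitial arm at 3.8%
# (a 5% relative drop), with 200,000 visitors per arm.
z, p = two_proportion_z_test(8000, 200000, 7600, 200000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

At this traffic level the p-value comes in well under 0.05, so the drop is unlikely to be random noise — which is exactly why the retailer could confidently pull the test rather than argue about it. With far less traffic, the same 5% relative drop would not reach significance, and the right call would be to keep the test running.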
The tough thing about these situations is that once you’ve sat in a room and brainstormed creative ideas for how to improve page X or flow Y, your marketers start getting excited about the new approach. They’re emotionally invested. They’re getting stoked. It’s fresh, and it has all sorts of positive arguments supporting it, explaining why it could be the next great thing. Folks have made comparisons to other world-class companies that take this same design approach. The designers take a stab, and it looks gorgeous. Now they love it, because it is “such a superior visual design” — and let’s face it, many designers are Picasso at heart. They love beautiful, sleek design. And revenue performance is often a 2nd (or 3rd, or 15th) priority. You may even have to involve the developers to build new functionality for the idea, and they invest time and creative juices into it. Everyone is looking at this thing and thinking it really holds potential. And then you roll out the test, and it doesn’t dominate. That’s a harsh moment. But you’ve got to help the team realize that failing fast can sometimes be just as useful as driving uplift. They’ve also got to realize that what worked for another site’s visitors may not be right for your visitors. You have to test to know what’s right for yours, and that’s a discipline that will pay off repeatedly over time.
Tune in next time for another installment of When to Test.
- Brig Graff