Here at Adobe Summit, we’re excited to welcome Richard Sherman as one of our keynote speakers. A cornerback for the Seattle Seahawks, he recently helped his team achieve a resounding win at Super Bowl XLVIII. Summit participants will benefit from the lessons and motivations that have helped Richard achieve his dreams.
And speaking of football, you didn’t have to be an aficionado to recognize that Seattle dominated at this year’s Super Bowl. Even if you only watched for the commercials, you probably know that the Seahawks trounced the Broncos in a 43–8 blowout.
So what do you think happened after the win? Did Coach Pete Carroll congratulate his players and plan to rest on his laurels until next year? Highly doubtful. It’s more likely that they watched the playback of the game, talked about what they did right, and pointed out areas needing improvement. Instead of taking the victory at face value, they surely analyzed it to find out why they won, so they could play even better next time.
The same strategy should apply to testing. Even in cases when you have a clear winner, you should iterate on those results to determine why the front-runner is ahead. This will lead to deeper insights that can be leveraged in future campaigns.
Step 1: Hypothesize
Before the Seahawks stepped onto the field at MetLife Stadium on February 2, you’d better believe Coach Carroll had a hypothesis: If they maintained their notoriously strong defense, they would win the game.
An effective test always starts with a strong hypothesis: a proposed statement centered on a problem, solution, and expected result. The more research and data analysis you do beforehand, the stronger your hypothesis. Then, you run tests to determine whether the statement is supported by evidence.
Let’s say you created a landing page where visitors can order your product. The page has a form to capture names, email addresses, and other information. In looking at the data, you’ve noticed that the number of page visits is much higher than the number of conversions. People are interested enough to click through to learn more, but something is tripping them up and causing them to abandon the landing page before making the purchase.
Some of your team members have mentioned that the page is a little boring, and could be improved by some compelling copy and imagery. So your hypothesis might be: “If we add some benefits-oriented content and imagery to the landing page, we will see an increased conversion rate.”
Step 2: Test
After weeks of planning and practicing, the Seahawks stepped onto the field and tried out their defensive strategies in real time, against a live opponent, to gauge the effectiveness of their coach’s hypothesis.
Similarly, once your designer has provided two different creatives for the landing page, you can run an A/B/n test using a tool like Adobe Target to gauge user response. The key here is to determine which landing page is more effective in terms of conversions. Let’s say, for example, that landing page A emerged as the clear winner, showing a 30 percent lift.
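Before declaring a winner, it’s worth checking that the lift is bigger than random noise. Here’s a minimal sketch of that check using a two-proportion z-test; the visitor and conversion counts are purely hypothetical, chosen so experience A shows a 30 percent lift over B:

```python
import math

# Hypothetical traffic and conversion counts for the two experiences
visitors_a, conversions_a = 10_000, 520
visitors_b, conversions_b = 10_000, 400

rate_a = conversions_a / visitors_a          # 5.2% conversion rate
rate_b = conversions_b / visitors_b          # 4.0% conversion rate
lift = (rate_a - rate_b) / rate_b            # relative lift of A over B

# Two-proportion z-test: is the observed gap larger than chance?
pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
se = math.sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
z = (rate_a - rate_b) / se
p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

print(f"lift: {lift:.0%}, z = {z:.2f}, p = {p_value:.5f}")
```

With these made-up numbers the lift works out to 30 percent and the p-value is well below 0.05, so the difference would be considered statistically significant. Testing tools such as Adobe Target report this kind of confidence measure for you; the sketch just shows what’s happening under the hood.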
Just as a coach evaluates how the team carries out plays in different areas of the game, marketers should understand how each campaign performs for various segments. Did experience C cater primarily to new visitors? Did experience B get the best results among customers who had visited before and then returned later? Based on those learnings, you can identify the best-performing experience for each audience and then target accordingly to score big on your next campaign.
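The segment-level readout above can be sketched in a few lines. The segments, experience labels, and conversion rates here are all hypothetical; the point is simply that the overall winner and the per-segment winner can differ:

```python
# Hypothetical per-segment conversion rates for two experiences
segment_rates = {
    "new visitors":       {"A": 0.041, "B": 0.058},
    "returning visitors": {"A": 0.063, "B": 0.044},
}

# Pick the best-performing experience for each audience
targeting = {
    segment: max(rates, key=rates.get)
    for segment, rates in segment_rates.items()
}
print(targeting)  # {'new visitors': 'B', 'returning visitors': 'A'}
```

In this made-up data, experience A wins overall but loses among new visitors, which is exactly the kind of insight that flat, unsegmented results would hide.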
Also, it’s important to remember that even if neither experience outperformed the other, the test wasn’t a waste of time. A successful test doesn’t necessarily produce a lift in revenue or conversions—as long as you’ve learned something, it was worth the effort. Even if Seattle had lost the Super Bowl, the team would still have the benefit of the knowledge gained from the plays on the field.
Step 3: Iterate
So, you’ve got a winner—now what?
All too often, marketers make the mistake of testing once and then banking on those results, without understanding what led to the success. To ensure a strong optimization program, you need to iterate on that initial test, drilling deeper to determine why the winner was so effective.
In the case of your landing page, you might consider whether it was the headline, the image, the supporting text, or the call to action that made the difference. In iterative tests, you could experiment with different layouts, tweak the messaging, or change the placement of the “Add to Cart” button. You might even implement multivariate testing to measure how each of these elements influences the success of the page, and to further optimize the right combination. Ultimately, your goal is to optimize the winner even further—because who says you have to stop at that 30 percent increase?
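To make the multivariate idea concrete, here’s a rough sketch of a full-factorial readout. Every element name and conversion rate below is invented for illustration: two headlines, two images, and two call-to-action placements yield eight combinations, and the goal is to find the combination that converts best rather than just the winning whole page:

```python
from itertools import product

# Hypothetical page elements under test (2 x 2 x 2 = 8 combinations)
headlines = ["benefit-led", "feature-led"]
images = ["lifestyle", "product"]
cta_spots = ["above fold", "below fold"]

# Illustrative observed conversion rate for each combination
observed = {
    ("benefit-led", "lifestyle", "above fold"): 0.061,
    ("benefit-led", "lifestyle", "below fold"): 0.052,
    ("benefit-led", "product",   "above fold"): 0.055,
    ("benefit-led", "product",   "below fold"): 0.047,
    ("feature-led", "lifestyle", "above fold"): 0.049,
    ("feature-led", "lifestyle", "below fold"): 0.044,
    ("feature-led", "product",   "above fold"): 0.045,
    ("feature-led", "product",   "below fold"): 0.040,
}

# Find the highest-converting combination of elements
best = max(product(headlines, images, cta_spots), key=observed.get)
print("best combination:", best, "rate:", observed[best])
```

A real multivariate test also tells you how much each individual element contributed to the result, which is what lets you keep optimizing beyond that initial 30 percent lift.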
Win or Lose, the Learning Never Stops
If we can learn anything from football, it’s that there is no true “off season.” Teams are constantly learning, re-evaluating, and forming new strategies. And as a marketer, your job doesn’t stop with a single batch of results. By using an iterative testing methodology, you can grow a successful optimization program based on progressive data analysis.
On and off the playing field, information is critical to success. So before you celebrate that next winner, remember that the game has just begun.