Here at Adobe Summit, we’re excited to welcome Richard Sherman as one of our keynote speakers. A cornerback for the Seattle Seahawks, he recently helped his team achieve a resounding win at Super Bowl XLVIII. Summit participants will benefit from the lessons and motivations that have helped Richard achieve his dreams.

And speaking of football, you didn’t have to be an aficionado to recognize that Seattle dominated at this year’s Super Bowl. Even if you only watched for the commercials, you probably know that the Seahawks trounced the Broncos in a 43–8 blowout.

So what do you think happened after the win? Did Coach Pete Carroll congratulate his players and plan to rest on his laurels until next year? Highly doubtful. It’s more likely that they watched the playback of the game, talked about what they did right, and pointed out areas needing improvement. Instead of taking the victory at face value, they surely analyzed it to find out why they won, so they could play even better next time.

The same strategy should apply to testing. Even in cases where you have a clear winner, you should iterate on those results to determine why the front-runner is ahead. This will lead to deeper insights that can be leveraged in future campaigns.

Step 1: Hypothesize

Before the Seahawks stepped onto the field at MetLife Stadium on February 2, you’d better believe Coach Carroll had a hypothesis: If they maintained their notoriously strong defense, they would win the game.

An effective test always starts with a strong hypothesis: a proposed statement centered on a problem, solution, and expected result. The more research and data analysis you do beforehand, the stronger your hypothesis. Then, you run tests to determine whether the statement is supported by evidence.

Let’s say you created a landing page where visitors can order your product. The page has a form to capture names, email addresses, and other information. In looking at the data, you’ve noticed that the number of page visits is much higher than the number of conversions. People are interested enough to click through to learn more, but something is tripping them up and causing them to abandon the landing page before making the purchase.
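
To make that drop-off concrete, here’s a minimal sketch of how you might quantify the gap between traffic and orders. The visit and order counts are hypothetical, purely for illustration:

```python
# Hypothetical funnel numbers -- illustrative only, not real data.
page_visits = 12_000   # visitors who reached the landing page
orders = 240           # visitors who completed the purchase form

conversion_rate = orders / page_visits
print(f"Conversion rate: {conversion_rate:.1%}")  # -> 2.0%

# A large gap between visits and orders is the signal that
# something on the page is causing visitors to abandon it.
```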

Some of your team members have mentioned that the page is a little boring and could be improved by some compelling copy and imagery. So your hypothesis might be: “If we add some benefits-oriented content and imagery to the landing page, we will see an increased conversion rate.”

Step 2: Test

After weeks of planning and practicing, the Seahawks stepped onto the field and tried out their defensive strategies in real time, against a live opponent, to test their coach’s hypothesis.

Similarly, once your designer has provided two or more alternative creatives for the landing page, you can run an A/B/n test using a tool like Adobe Target to gauge user response. The key here is to determine which landing page is more effective in terms of conversions. Let’s say, for example, that landing page A emerged as the clear winner, showing a 30 percent lift.
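
Declaring a clear winner means checking that the observed lift isn’t just noise. Adobe Target reports confidence for you, so treat the sketch below as a stand-in illustration rather than the tool’s actual method: it computes lift and a standard two-proportion z-test over hypothetical counts.

```python
from math import sqrt
from statistics import NormalDist

def lift_and_significance(conv_a, n_a, conv_b, n_b):
    """Compare two experiences' conversion rates with a
    two-proportion z-test. Returns (lift of A over B, p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-tailed
    return (p_a - p_b) / p_b, p_value

# Hypothetical counts: A converts 390 of 6,000; B converts 300 of 6,000.
lift, p = lift_and_significance(390, 6000, 300, 6000)
print(f"Lift: {lift:.0%}, p-value: {p:.4f}")  # ~30% lift, p << 0.05
```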

Just as a coach evaluates how the team carries out plays in different areas of the game, marketers should understand how each campaign performs for various segments. Did experience C cater primarily to new visitors? Did experience B perform best among customers who had visited before and then returned later? Based on those learnings, you can identify the best-performing experience for each audience and then target accordingly to score big on your next campaign.
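
As a hedged illustration of that per-segment breakdown (the segment names and counts here are hypothetical), you could tally conversion rates by experience and segment, then pick the leader for each audience:

```python
# Hypothetical (experience, segment) results: (conversions, visitors).
results = {
    ("A", "new visitor"):       (310, 4000),
    ("B", "new visitor"):       (220, 4000),
    ("A", "returning visitor"): (80,  2000),
    ("B", "returning visitor"): (140, 2000),
}

# Find the best-performing experience for each segment.
best = {}
for (experience, segment), (conversions, visitors) in results.items():
    rate = conversions / visitors
    if segment not in best or rate > best[segment][1]:
        best[segment] = (experience, rate)

for segment, (experience, rate) in best.items():
    print(f"{segment}: target experience {experience} ({rate:.1%})")
```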

Also, it’s important to remember that even if no experience outperformed the others, the test wasn’t a waste of time. A successful test doesn’t necessarily produce a lift in revenue or conversions; as long as you’ve learned something, it was worth the effort. Even if Seattle had lost the Super Bowl, the team would still have the benefit of the knowledge gained from the plays on the field.

Step 3: Iterate

So, you’ve got a winner—now what?

All too often, marketers make the mistake of testing once and then banking on those results, without understanding what led to the success. To ensure a strong optimization program, you need to iterate on that initial test, drilling deeper to determine why the winner was so effective.

In the case of your landing page, you might consider whether it was the headline, the image, the supporting text, or the call to action that made the difference. In iterative tests, you could experiment with different layouts, tweak the messaging, or change the placement of the “Add to Cart” button. You might even implement multivariate testing to measure how each of these elements influences the success of the page, and to further optimize the right combination. Ultimately, your goal is to optimize the winner even further. After all, who says you have to stop at that 30 percent increase?
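
A multivariate test crosses every variant of every element. The sketch below (the element names and variants are hypothetical) uses a full-factorial design to enumerate the combinations you would traffic, which is why the experience count grows quickly as you add elements:

```python
from itertools import product

# Hypothetical page elements and the variants to test for each.
elements = {
    "headline": ["benefit-led", "question-led"],
    "image":    ["product shot", "lifestyle photo"],
    "cta":      ["Add to Cart (top)", "Add to Cart (below fold)"],
}

# A full-factorial multivariate test measures every combination,
# revealing how each element (and each interaction) drives conversions.
combinations = list(product(*elements.values()))
for combo in combinations:
    print(dict(zip(elements.keys(), combo)))
print(f"{len(combinations)} experiences to traffic")  # 2 x 2 x 2 = 8
```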

Win or Lose, the Learning Never Stops

If we can learn anything from football, it’s that there is no true “off season.” Teams are constantly learning, re-evaluating, and forming new strategies. And as a marketer, your job doesn’t stop with a single batch of results. By using an iterative testing methodology, you can grow a successful optimization program based on progressive data analysis.

On and off the playing field, information is critical to success. So before you celebrate that next winner, remember that the game has just begun.