We’re pleased to introduce Adobe Conversion All-Stars, a new monthly recognition program where we highlight an Adobe customer or partner who is blazing a trail in conversion optimization and driving increased marketing ROI for their organization. Our Conversion All-Star for May is Adam Crutchfield of Axcess Financial, a leading financial solutions provider. Adam is responsible for digital analytics initiatives for the company’s US properties, which include email, display, PPC, and SEO, as well as conversion rate optimization efforts. He also oversees paid search campaigns for Check ‘n Go, a division of Axcess Financial. I connected with Adam for a quick discussion on how he approaches testing and optimization at Axcess Financial and how he has helped evangelize a data-driven optimization culture throughout his organization.

Q: How do you identify when there’s a need for testing and how do you get internal buy-in?

A: I’m fortunate to work under very forward-thinking leadership here. Jean-Marx Mantilla, who is our VP of eCommerce, is by nature a data-driven advocate. He was previously at JPMorgan Chase, where he saw firsthand the benefits of the A/B and multivariate testing his team there was driving, and he brought that philosophy with him to Axcess. I was his first analytics and optimization hire when he joined Axcess, and his first priority was to invest in an analytics platform to help nurture a champion/challenger culture and to always lean on “science” to determine what to do next on our digital properties.

Q: What does your team look like?

A: At first I was the lone tester and relied on project managers to help on an ad hoc basis, but now that we’ve been able to achieve some great success and socialize it across the organization, I have a team under me. We’re able to manage all the test design and project management in-house. We have also worked closely with the Adobe Digital Consulting team to get help on the most critical part: coming up with hypotheses and creating the testing ideas. I firmly believe that each test we run should try to solve a business problem. If we’re not doing that, then we’re not making the most informed business decisions.

Q: How do you get organizational support for testing?

A: If it’s just you doing it, you’re not going to win. You need the internal team behind you to support the efforts. You need an internal champion/challenger culture. You need to share results regularly and also get test input from those who interface with customers every day. For example, I connect regularly with our customer support team to find out what the biggest website complaints are. Then we look into whether it’s a user experience issue. If it’s not, my team aims to solve the complaints through iterative testing of our website content to give our customers the most relevant and personalized experience possible.

Q: How do you determine what to test first?

A: We were already in the process of evolving the look and feel of our Check ‘n Go website from a brand marketing-oriented layout to a more conversion-oriented layout. But we didn’t know what we should be measuring or how to measure the impact of our tests. At that point we brought on Adobe Test&Target to optimize the content on our site and began to test iteratively to get better results and continue optimizing on those results. I’d had experience with free testing solutions in the past, but I was new to the horsepower and flexibility that a more sophisticated testing solution provides.

Q: What made you go with Test&Target?

A: Test&Target’s strengths are its flexibility and its ability to get into the details. With T&T, not only can we tag one goal, but we can tag multiple success metrics to really understand the broader impact of one specific test. If we change our homepage around, what does that do to loan refinances and the average loan amount? These are very important things to keep track of. We’ve seen strange things through our testing campaigns that we may never have learned otherwise. For example, we used to promote specific loans by dollar amount on the homepage, but through testing we found that when we didn’t explicitly state potential loan amounts on some areas of our site, the average loan amount went up by 10%.
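For readers unfamiliar with how this looks in practice, here is a minimal sketch of tagging multiple success metrics with Test&Target’s page-side mbox API. The mbox name and parameter names below are hypothetical, not Axcess’s actual setup; `mboxCreate` is the standard pattern mbox.js uses for passing page data, while the metrics themselves are configured in the campaign.

```typescript
// mboxCreate is supplied globally by Test&Target's mbox.js include;
// declared here so the snippet type-checks on its own.
declare function mboxCreate(mboxName: string, ...parameters: string[]): void;

// Hypothetical confirmation-page mbox: one tag, several success metrics.
// Each parameter is a data point the campaign can report against, so a
// single homepage test can be read against refinances, application
// initiations, and average loan amount at the same time.
mboxCreate(
  'loanConfirmPage',              // hypothetical mbox name
  'event=applicationInitiated',   // primary conversion event
  'loanType=refinance',           // lets refinances be segmented separately
  'loanAmount=2500'               // feeds an average-loan-amount metric
);
```

The design point is that the tag is cheap to add once but pays off on every subsequent test, because any campaign touching the funnel can report against the same set of metrics.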

Q: What were some of your initial learnings after you began website testing? Were there any surprising insights, such as tests that actually decreased conversion?

A: The data-driven culture that Jean-Marx Mantilla has instilled throughout our organization has made a tremendous impact on our agility and our ability to test and fail fast. Before we settle on a content change, we determine our hypotheses and test them out before we decide what is pushed live to all traffic or to segments of our traffic. Our customers want relevant content and a personalized experience. One of the first things we set out to test was whether situation-specific messaging would conjure up specific emotions within website visitors that would then increase loan applications. For example, an image of an injured pet and a stack of veterinarian’s bills piling up might speak to a website visitor who is in a similar situation.

We also noticed through our analytics, which drives our testing program, that consumers were digging deeper into our site to look up loan rates and terms before they completed an application. We then decided to provide the situation-specific imagery along with the loan rates, terms, and fees right up front to help customers convert more quickly. One of our main conversion events is the application initiation, so we pay close attention to the application initiation rate.
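To make the readout on a metric like this concrete (the numbers below are invented for illustration, not Axcess’s data), a two-proportion z-test is the standard way to check whether an observed lift in application initiation rate is likely real rather than noise. Test&Target reports significance itself; this just shows the underlying arithmetic.

```typescript
// Two-proportion z-test for an A/B test on application initiation rate.
// A real test would pull these counts from the campaign report.
function zTestLift(
  controlVisitors: number, controlConversions: number,
  variantVisitors: number, variantConversions: number
): { lift: number; z: number } {
  const p1 = controlConversions / controlVisitors;
  const p2 = variantConversions / variantVisitors;
  // Pooled conversion rate under the null hypothesis of no difference.
  const pooled = (controlConversions + variantConversions) /
                 (controlVisitors + variantVisitors);
  const se = Math.sqrt(pooled * (1 - pooled) *
                       (1 / controlVisitors + 1 / variantVisitors));
  return { lift: (p2 - p1) / p1, z: (p2 - p1) / se };
}

// 10,000 visitors per arm; control initiates at 8.0%, variant at 8.8%.
const result = zTestLift(10_000, 800, 10_000, 880);
console.log(`lift: ${(result.lift * 100).toFixed(1)}%`); // lift: 10.0%
console.log(`z: ${result.z.toFixed(2)}`); // z ≈ 2.04, p < 0.05 two-sided
```

At this sample size a 10% relative lift just clears the conventional 95% confidence bar, which is why test velocity and traffic allocation matter so much for a program like this.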

Q: How many tests did you run initially?

A: We ran about 13 to 14 tests in our first year. That’s a much lower velocity than I feel is ideal, but it was partly because we were new to Test&Target and to testing in general, so some internal education and evangelization needed to take place.

Q: How are you expanding your optimization program?

A: Last week I presented a summary of our testing campaign results to our COO, and he just loved that we were able to get this granular with our data and prove out our hypotheses with actual science. He ended up bringing in the CEO, who was so excited that they asked us to help other business units adopt this methodology. I’ll be looking to train other business units on it shortly. It’s exciting that we’re not only improving our US business but also extending the learnings and best practices we’ve developed here to help improve our other businesses.

Q: What other key testing insights have you gained?

A: It’s critical to identify the situation that you want to test for and then speak to that situation with content that’s personalized to each visitor. It makes a tremendous difference if a visitor can relate personally to what they’re seeing on your site.

In a heavily regulated industry like financial services, it becomes even more important to personalize online experiences because, for example, loans in different states are structured quite differently. Once we began adjusting loan-related content and messaging for visitors from different states, we helped set their expectations accordingly in terms of the loan amounts and borrowing terms they could expect, which has helped us improve the take rate of our more profitable product by 12%. When we consider how that impacts our average revenue per visitor, a 12% increase is extremely impactful.
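A minimal sketch of how state-specific content could be keyed, assuming the visitor’s state is resolved server-side and rendered into the page; the mbox name, parameter names, and loan figures below are all hypothetical, and in Test&Target the actual state-to-offer rules would live in the campaign’s targeting configuration rather than in page code.

```typescript
declare function mboxCreate(mboxName: string, ...parameters: string[]): void;

// Hypothetical per-state loan terms used to set expectations up front.
// Real values vary by state regulation and would come from a product catalog.
const stateLoanTerms: Record<string, { maxAmount: number; termDays: number }> = {
  OH: { maxAmount: 1500, termDays: 30 },
  TX: { maxAmount: 1000, termDays: 14 },
};

// visitorState would be resolved server-side (e.g., from geolocation or the
// applicant's address) and rendered into the page template.
const visitorState = 'OH';
const terms = stateLoanTerms[visitorState];

// Pass the state to the targeting mbox so the campaign can swap in
// state-specific imagery, rates, and terms for this visitor.
mboxCreate(
  'loanLandingHero', // hypothetical mbox name
  `visitorState=${visitorState}`,
  `maxLoanAmount=${terms.maxAmount}`,
  `termDays=${terms.termDays}`
);
```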

This has prompted us to plan for the future and figure out how to carry our personalization efforts over into our email and display campaigns, and across other channels as well. We have considerable opportunities to further optimize our marketing efforts.

Q: If you could leave us with one parting thought about testing, what would it be?

A: The most important thing when testing is to fail faster. When you run tests, make them quick fails or quick wins. We ran 13 to 14 tests this past year, and next year I want to at least double that. With the new resources we’ve brought on, I’m working to make our test plans for the rest of the year as detailed as possible, laying them all out and prioritizing them so that we’re ready to launch the next test the moment the prior one completes, and eventually to launch tests simultaneously. This lets us learn quickly, gain more insights rapidly, and increase our conversion rates at a much higher velocity.
