We’re pleased to introduce Adobe Conversion All-Stars, a new monthly recognition program where we highlight an Adobe customer or partner who is blazing a trail in conversion optimization and driving increased marketing ROI for their organization. Our Conversion All-Star for May is Adam Crutchfield of Axcess Financial, a leading financial solutions provider. Adam is responsible for digital analytics initiatives for the company’s US properties, which include email, display, PPC and SEO, as well as conversion rate optimization efforts. He also oversees paid search campaigns for Check ‘n Go, a division of Axcess Financial. I connected with Adam for a quick discussion on how he approaches testing and optimization at Axcess Financial and how he has helped evangelize a data-driven optimization culture throughout his organization.

Q: How do you identify when there’s a need for testing and how do you get internal buy-in?

A: I’m fortunate to work under very forward-thinking leadership here. Jean-Marx Mantilla, our VP of eCommerce, is a data-driven advocate by nature. He was previously at JPMorgan Chase, where he saw firsthand the benefits of the A/B and multivariate testing his team there was driving, and he brought that philosophy with him to Axcess. I was his first analytics and optimization hire when he joined Axcess, and his first priority was to invest in an analytics platform to help nurture a champion/challenger culture and to always lean on “science” to determine what to do next on our digital properties.

Q: What does your team look like?

A: At first I was the lone tester and relied on project managers to help on an ad-hoc basis, but now that we’ve been able to achieve some great success and socialize it across the organization, I have a team under me. We’re able to manage all the test designs and project management in-house. We have also worked closely with the Adobe Digital Consulting team to get help on the most critical part – coming up with hypotheses and creating the testing ideas. I firmly believe that each test we run should try to solve a business problem. If we’re not doing this, then we’re not making the most informed business decisions.

Q: How do you get organizational support for testing?

A: If it’s just you doing it, you’re not going to win. You need the internal team behind you to support the efforts. You need an internal champion/challenger culture. You need to share results regularly and also gather test input from those who interface with customers every day. For example, I connect regularly with our customer support team to find out what the biggest website complaints are. Then we look into whether it’s a user experience issue. If it’s not, my team aims to solve the complaints through iterative testing of our website content to give our customers the most relevant and personalized experience possible.

Q: How do you determine what to test first?

A: We were already in the process of evolving the look-and-feel of our Check ‘n Go website from a brand marketing-oriented layout, to a more conversion-oriented layout. But, we didn’t know what we should be measuring and how to measure the impact of our tests. At that point we brought on Adobe Test&Target to optimize the content on our site and began to iteratively test to get better results and continue optimizing on those results. I’d had experience with free testing solutions in the past, but was new to the horsepower and flexibility that a more sophisticated testing solution provides.

Q: What made you go with Test&Target?

A: Test&Target’s strengths are its flexibility and its ability to get into the details. With T&T, not only can we tag one goal, but we can tag multiple success metrics to really understand the wider impact of a single test. If we change our homepage around, what does that do to loan refinances and the average loan amount? These are very important things to keep track of. We’ve seen surprising things through our testing campaigns that we may never have learned otherwise. For example, we used to promote specific loans by dollar amount on the homepage, but through testing we found that when we didn’t explicitly state potential loan amounts on some areas of our site, the average loan amount went up by 10%.

Q: What were some of your initial learnings after you began website testing? Were there any surprising insights such as tests that actually decreased conversion?

A: The data-driven culture that Jean-Marx Mantilla has instilled throughout our organization has made a tremendous impact on our agility and our ability to test and fail fast. Before we settle on a content change, we determine our hypotheses and test them before deciding what gets pushed live to all traffic or to segments of our traffic. Our customers want relevant content and a personalized experience. One of the first things we set out to test was whether situation-specific messaging would conjure up specific emotions in website visitors that would then increase loan applications. For example, an image of an injured pet and a stack of veterinarian’s bills piling up might speak to a website visitor who is in a similar situation.

We also noticed through our analytics, which drive our testing program, that consumers were digging deeper into our site to look up loan rates and terms before they completed an application. We then decided to put the situation-specific imagery, along with the loan rates, terms and fees, right up front to help customers convert more quickly. One of our main conversion events is the application initiation, so we pay close attention to the application initiation rate.

Q: How many tests did you run initially?

A: We ran about 13–14 tests in our first year. That’s a much lower velocity than what I feel is ideal, but it was partly because we were new to Test&Target and to testing in general, so some internal education and evangelization needed to take place.

Q: How are you expanding your optimization program?

A: Last week I presented a summary of our testing campaign results to our COO, and he loved that we were able to get this granular with our data and prove out our hypotheses with actual science. He ended up bringing in the CEO, who was so excited that they asked us to help other business units adopt this methodology. I’ll be looking to train those business units shortly. It’s exciting that we’re not only improving our US business, but also extending the learnings and best practices from the methodology we’ve developed here to help improve our other businesses.

Q: What other key testing insights have you gained?

A: It’s critical to identify the situation that you want to test and then connect that situation via content that’s personalized to each visitor. It makes a tremendous difference if a visitor can relate personally to what they’re seeing on your site.

In a heavily regulated industry like financial services, it becomes even more important to personalize online experiences because, for example, loans in different states are structured quite differently. Once we began adjusting loan-related content and messaging for visitors from different states, we were able to set their expectations accordingly in terms of the loan amounts and borrowing terms they could expect, which helped improve uptake of our more profitable product by 12%. When we consider how this impacts our average revenue per visitor, a 12% increase is extremely impactful.

This has prompted us to plan for the future and figure out how to carry our personalization efforts over into our email and display campaigns and across other channels as well. We have considerable opportunities to further optimize our marketing efforts.

Q: If you could leave us with one parting thought about testing, what would it be?

A: The most important thing when testing is to fail faster. When you run tests, make them quick fails or quick wins. We ran 13–14 tests this past year, and next year I want to at least double that. With the new resources we’ve brought on, I’m making our test plans for the rest of the year as detailed as possible, laying them all out and prioritizing them so that we’re ready to launch the next test as soon as the prior one completes, and eventually to run tests simultaneously. This allows us to learn quickly, gain insights rapidly and increase our conversion rates at a much higher velocity.