Recently my consulting team, Omniture Digital, ran multivariate tests on “Request for Information” forms for two online universities. It turned out to be a classic case of audiences responding quite differently depending on context, and a great reminder of why it’s so difficult to generalize “best practices.”

First, some background: for each test, the goal was to increase the number of users who completed the Request for Information form. The elements we tested in each case were:

  • Page design (Simple vs. stylized page design)
  • Hero image at the top of the page (Lifestyle image vs. no image)
  • Call to action (Button color & copy)
  • Benefits messaging (Reinforcing key value propositions)

The results for the two tests could not have been more different. For one university, the page with the stylized page design and lifestyle hero image won handily; for the other university, the simple page design with no hero image won the day. In addition, benefits messaging helped for one university, while it actually had a negative impact for the other university.

Why the great difference? Did this imply that audiences simply respond randomly to different types of content? At times, it can certainly seem that way.

But once we dug into the data, the story became clearer, especially when we examined the sources of referring traffic.

For the page where the stylized design & the lifestyle hero image won, most of the traffic came directly from search engines. For the page where a simple design and no hero image won, most of the traffic came from other pages on the university’s own web site. Are you starting to see why audiences may have responded differently? Stop and think about it for a second.

For traffic that comes directly from search engines, the visual impact of a page is a key success factor. These types of users are “pogo sticking” from result to result, giving each landing page about three seconds of their time before they either commit or move on. Snap judgments based on the way a page looks can be critical. A compelling page design and a comforting image can make an enormous impact.

For traffic that comes from other pages on the same site, the visual impact of a page is often less important. Users have likely already qualified themselves and are looking to convert. Too many visuals (and even benefits messaging) can actually create a distraction for these types of users. So in this case, simple is better.
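The segmentation step that revealed this can be sketched in code. This is only a minimal illustration with invented numbers (not the universities' actual data): aggregate test outcomes by referrer segment and page variant, and notice that the overall "winner" can flip between segments.

```python
# Hypothetical data: each visit is (referrer_segment, variant, converted).
# The figures are made up purely to illustrate the segmentation idea.
from collections import defaultdict

visits = [
    ("search", "stylized", True), ("search", "stylized", True),
    ("search", "stylized", False), ("search", "simple", True),
    ("search", "simple", False), ("search", "simple", False),
    ("internal", "stylized", True), ("internal", "stylized", False),
    ("internal", "stylized", False), ("internal", "simple", True),
    ("internal", "simple", True), ("internal", "simple", False),
]

def conversion_rates(visits):
    """Return the conversion rate for each (segment, variant) pair."""
    totals = defaultdict(int)
    conversions = defaultdict(int)
    for segment, variant, converted in visits:
        totals[(segment, variant)] += 1
        conversions[(segment, variant)] += converted
    return {key: conversions[key] / totals[key] for key in totals}

rates = conversion_rates(visits)
# In this invented sample, search traffic favors the stylized page
# while internal traffic favors the simple one; a blended average
# would hide that difference.
```

The point of the sketch is simply that a single pooled conversion rate per variant would have hidden the story; splitting by referrer is what made the pattern visible.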

At the end of the day, we got lift for both clients, so it all worked out. But it was a good reminder of how context, not just content, makes a big difference in how users will respond.

Apples Design

It was a wonderful read. The web design was great.

John Hunter

Multivariate testing is great. And it's a great way to determine interaction factors, which are essentially impossible to determine with one-variable-at-a-time testing (though a smart person can sometimes spot indications of them within that type of testing). Your example, though, seems to be largely about properly segmenting the data for optimization. While it is always difficult to tell with short examples, it may well be that you have two different audiences, and a solution that, for example, intercepts the search engine traffic and gives it some context might help a lot (you see this on many blogs where they say, "I see you found us searching on x; you may also be interested in y...").

But the biggest point I think your story illustrates is the importance of the experimenter. They need to think. Their role is not just to calculate some numbers and declare that whichever number is higher wins. As George Box said: "it's not about proving a theorem, it's about being curious about things. There aren't enough people who will apply [DOE] as a way of finding things out."