Recently my consulting team, Omniture Digital, ran multivariate tests on “Request for Information” forms for two online universities. It turned out to be a classic case of how different audiences respond quite differently depending on the context. It was also a great reminder of why it’s extremely difficult to generalize “Best Practices”.

First, some background: for each test, the goal was to increase the number of users who completed the Request for Information form. The elements we tested in each case were as follows (a sketch of how they combine appears just after the list):

  • Page design (Simple vs. stylized page design)
  • Hero image at the top of the page (Lifestyle image vs. no image)
  • Call to action (Button color & copy)
  • Benefits messaging (Reinforcing key value propositions)
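
Since these four elements combine multiplicatively, it may help to see how a full-factorial multivariate test enumerates its “recipes.” Here is a minimal sketch in Python; the factor levels (especially the call-to-action variants) are hypothetical stand-ins, not the actual creative we tested:

    from itertools import product

    # Hypothetical factor levels mirroring the four elements above.
    # The call-to-action variants are invented for illustration.
    factors = {
        "page_design": ["simple", "stylized"],
        "hero_image": ["lifestyle", "none"],
        "call_to_action": ["button_a", "button_b"],
        "benefits_messaging": ["shown", "hidden"],
    }

    # A full-factorial test spreads traffic across every combination.
    names = list(factors)
    recipes = [dict(zip(names, combo)) for combo in product(*factors.values())]

    print(len(recipes))   # 2 x 2 x 2 x 2 = 16 recipes
    print(recipes[0])     # {'page_design': 'simple', 'hero_image': 'lifestyle', ...}

Sixteen recipes is already a lot of cells to fill with traffic, which is one reason multivariate tests work best on pages with meaningful volume.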

The results for the two tests could not have been more different. For one university, the page with the stylized page design and lifestyle hero image won handily; for the other university, the simple page design with no hero image won the day. In addition, benefits messaging helped for one university, while it actually had a negative impact for the other university.

Why the great difference? Did this imply that audiences simply respond randomly to different types of content? At times, it can certainly seem that way.

But once we dug into the data, the story became clearer, especially when we examined the sources of referring traffic.
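
Mechanically, that kind of segmentation just means breaking conversion rate out by referrer type for each recipe. A minimal sketch, with made-up Python records rather than our actual data:

    from collections import defaultdict

    # Hypothetical visit records: (recipe, referrer_type, converted).
    # Real data would of course run to thousands of rows.
    visits = [
        ("stylized_hero", "search_engine", True),
        ("stylized_hero", "internal_page", False),
        ("simple_no_hero", "internal_page", True),
        ("simple_no_hero", "search_engine", False),
    ]

    # Tally conversions and visits per (referrer segment, recipe) pair.
    tally = defaultdict(lambda: [0, 0])
    for recipe, referrer, converted in visits:
        tally[(referrer, recipe)][0] += int(converted)
        tally[(referrer, recipe)][1] += 1

    for (referrer, recipe), (conversions, total) in sorted(tally.items()):
        print(f"{referrer:14} {recipe:16} {conversions}/{total} = {conversions / total:.0%}")

Once the rates are split out this way, the blended “winner” can look very different from the winner within each segment, which is exactly what happened here.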

For the page where the stylized design & the lifestyle hero image won, most of the traffic came directly from search engines. For the page where a simple design and no hero image won, most of the traffic came from other pages on the university’s own web site. Are you starting to see why audiences may have responded differently? Stop and think about it for a second.

For traffic that comes directly from search engines, the visual impact of a page is a key success factor. These types of users are “pogo sticking” from result to result, giving each landing page about three seconds of their time before they either commit or move on. Snap judgments based on the way a page looks can be critical. A compelling page design and a comforting image can make an enormous impact.

For traffic that comes from other pages on the same site, the visual impact of a page is often less important. Users have likely already qualified themselves and are looking to convert. Too many visuals (and even benefits messaging) can actually create a distraction for these types of users. So in this case, simple is better.

At the end of the day, we got lift for both clients, so it all worked out. But it was a good reminder of how context, not just content, makes a big difference in how users will respond.

2 comments
Apples Design

It was a wonderful read. The web design was great.

John Hunter

Multivariate testing is great, and a great way to determine interactive factors, which are essentially impossible to determine with one-variable-at-a-time testing (though a smart person can spot indications of them even within that type of testing). Your example, though, seems to be largely about properly segmenting the data for optimization. While it is always difficult to tell with short examples, it may well be that you have two different audiences, and a solution that, for example, intercepts the search engine traffic and gives them some context might help a lot (you see this on many blogs where they say, "I see you found us searching on x - you may also be interested in y...").

But the biggest point I think your story illustrates is the importance of the experimenter. They need to think. Their role is not just to calculate some numbers and declare that whatever number is higher wins. George Box: "It’s not about proving a theorem, it’s about being curious about things. There aren’t enough people who will apply [DOE] as a way of finding things out." http://management.curiouscatblog.net/2009/11/16/highlights-from-recent-george-box-speech/
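
For what it’s worth, the referrer-based context John describes could be sketched along these lines (hypothetical Python; note that most search engines today strip the query from the referrer, so this is illustrative only):

    from urllib.parse import urlparse, parse_qs

    # Hypothetical helper along the lines John suggests: if the visitor
    # arrived from a search engine, recover the query so the landing page
    # can echo it back as context. Illustrative only; modern engines
    # generally no longer pass the query in the referrer.
    def search_context(referrer):
        parsed = urlparse(referrer)
        if any(engine in parsed.netloc for engine in ("google.", "bing.")):
            query = parse_qs(parsed.query).get("q", [""])[0]
            if query:
                return f'I see you found us searching on "{query}" - you may also be interested in...'
        return None

    print(search_context("https://www.google.com/search?q=online+mba"))
    # -> I see you found us searching on "online mba" - you may also be interested in...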