With 2009 predictions whizzing around the web, I decided to write instead about what I’m wishing for in 2009. I’m hoping for an “if you build it, he will come” moment. Without further ado…

1) Agencies Get On Board

Aside from the handful of agencies that specialize in testing and conversion optimization, I’ve seen far less progress than I expected this year. As a sanity check, I googled “[agency name] multivariate” for the top four digital agencies of 2007 from AdAge. Here’s what I found:

Avenue A | Razorfish is the only search that yields any substantial result, and that result is a partner’s press release focusing on the platform rather than the agency’s own in-house expertise. Contrast that with searching for “[agency name] creative”, where Digitas and Ogilvy rank #1 in organic search and Avenue A | Razorfish ranks #3.

I know that part of the problem is that agencies have a tough time convincing clients to test. But I suspect that agencies are also not yet accountable for testing, so why deliver several options that may quantifiably fail when you can deliver a single option that unquantifiably wins? The latter requires a lower level of effort and carries less chance of hassle. Unfortunately, it also has less upside for the customer. Hopefully agencies will consider the significance of providing more value to their customers, and what that may yield for them long-term.

My reason for wanting agencies to get on board is selfish. I believe that agencies (at their best) are the strategic partners of customers, and they can help prove the value of testing. But it feels a bit like the chicken-and-the-egg. Agencies aren’t compelled to actively bring testing skills and experience in-house without clients asking for it, and clients may not be educated enough to take on testing without an agency resource!

2) Offsite + Onsite = 1 Visitor Experience

I’ve definitely seen great strides taken this year in trying to treat a visitor’s path through offsite and onsite as one cohesive experience. Some of our customers are doing really interesting projects involving testing retargeted ads on third-party sites and building a single profile that can be extended and reinforced at every online touch point. I’m excited to see how that plays out in 2009.

However, I’ve also seen a lot of poorly executed landing page experiences that show there’s no discussion going on between those in charge of acquisition and those in charge of the site. If there’s any confusion about where to begin testing, I’d highly recommend starting there. Landing pages are typically easier to change and tag, and transforming a post-click experience from generic and 80% irrelevant to targeted and 100% relevant usually yields great lift!
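To make that concrete, here’s a rough sketch (not any particular vendor’s implementation) of a landing page echoing the visitor’s search term back in the headline. The “q” query-string parameter and the copy are assumptions for illustration only.

```ts
// Minimal sketch: echo the ad's search term back on the landing page.
// Assumes the ad destination URL carries a hypothetical "q" parameter,
// e.g. /landing?q=running+shoes appended by the campaign setup.
function personalizeHeadline(defaultHeadline: string): string {
  const params = new URLSearchParams(window.location.search);
  const searchTerm = params.get("q"); // hypothetical parameter name
  // Fall back to the generic headline when no term was passed along.
  return searchTerm ? `Results for "${searchTerm}"` : defaultHeadline;
}

const h1 = document.querySelector("h1");
if (h1) {
  h1.textContent = personalizeHeadline("Find the right gear for you");
}
```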

3) Multivariate Testing (MVT) Becomes Another Tool in the Toolbox

There’s no question that multivariate testing is an incredibly powerful tool. But it’s certainly not the only tool you need, or have, to increase conversion on your site. I think of it as the hammer in the saying, “when all you have is a hammer, everything looks like a nail.” There are many efforts that multivariate testing is not the best tool for, including massive functionality changes (e.g. a 5-page checkout vs. a 1-page checkout) and automated behavioral targeting. The problem with treating MVT as the all-purpose cleaner is that it narrows your vision when it comes to considering other types of tests, and it also reinforces a fragmented marketing division. I can’t count how many meetings I’ve been to where the catalyst is an RFP for MVT. Once we start talking about the site’s initiatives, though, we begin to understand that there is also interest in segmentation and targeting. However, that’s a separate project manager and a separate RFP, and nobody is interested in getting together in the same room.
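For anyone who hasn’t run one, the sketch below shows roughly what a multivariate test’s “recipe space” looks like: a few headlines crossed with a few images yields every combination to compare, which is why MVT shines for element-level changes and becomes unwieldy for something like an entirely different checkout flow. The element names and hashing scheme are made up for illustration, not any vendor’s implementation.

```ts
// Illustrative full-factorial multivariate setup: every combination of
// headline x image is one recipe, and a visitor is bucketed deterministically
// so they always see the same recipe. All content here is hypothetical.
const headlines = ["Save 20% today", "Free shipping on every order"];
const images = ["hero-family.jpg", "hero-product.jpg"];

// Cartesian product: 2 headlines x 2 images = 4 recipes to compare.
const recipes = headlines.flatMap(headline =>
  images.map(image => ({ headline, image }))
);

// Simple string hash so the same visitor ID always maps to the same recipe.
function bucketFor(visitorId: string, buckets: number): number {
  let hash = 0;
  for (const ch of visitorId) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;
  }
  return hash % buckets;
}

const recipe = recipes[bucketFor("visitor-1234", recipes.length)];
// e.g. { headline: "Save 20% today", image: "hero-product.jpg" }
```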

4) Targeting Belongs in the Same Toolbox

Targeting for the sake of targeting is not always effective. I have seen numerous cases where the first attempt strikes out. Sometimes people don’t want to see their first name read back to them, but they do like seeing their search terms repeated. Others like to see an offer reinforcing the deal they saw in the ad, but they may not want to see the same exact image. Without testing, you’re still not making data-driven decisions. At best, you’re making more educated guesses.
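One simple way to keep targeting data-driven is to hold out a slice of traffic that still sees the untargeted control, so lift is measured rather than assumed. Here’s a minimal sketch; the traffic split and the numbers are hypothetical.

```ts
// Minimal holdout comparison for a targeted experience vs. the generic control.
// The 10% holdout size and the figures below are assumptions for illustration.
interface Arm {
  visitors: number;
  conversions: number;
}

function conversionRate(arm: Arm): number {
  return arm.visitors === 0 ? 0 : arm.conversions / arm.visitors;
}

function relativeLift(targeted: Arm, control: Arm): number {
  const base = conversionRate(control);
  return base === 0 ? 0 : (conversionRate(targeted) - base) / base;
}

const control: Arm = { visitors: 1000, conversions: 30 };   // untargeted holdout
const targeted: Arm = { visitors: 9000, conversions: 315 }; // targeted experience
console.log(`Lift: ${(relativeLift(targeted, control) * 100).toFixed(1)}%`); // ~16.7%
```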

5) Web Optimization Collaboration Takes Off

Are you sensing a trend here? I want more collaboration between acquisition marketers and site marketers, companies and agencies, vendors and companies! I think everybody wins when we share more of our testing experience, and I’m convinced there has to be a way to do it without giving up competitive intelligence. So far the weboptimization group has had some good discussion around getting testing buy-in, and some very spicy discussion around vendor comparisons. Tell me how you think a forum should be constructed to be most beneficial, and then let’s work together to make it a reality.

6) Succeed Often and Fail Fast

This one’s a little off the web optimization rails, but I wish President Obama all the best in making 2009 a year of recovery and success. I don’t think it’s possible not to fail, especially given these tumultuous and uncertain times, so I’d just like to encourage him, and the rest of us, to fail fast and keep on truckin’!

Happy New Year!

Photo credits:
http://www.flickr.com/photos/jolienvallins/1505871497/
http://www.flickr.com/photos/down_under_images/679400938/

2 comments
Lily Chiu

Hey Florian, thanks for the insightful comments! I agree that many of a 4A agency's efforts go towards branding, but I don't think that should automatically disqualify them from measurement. It seems too easy to get a pass on measuring and interpreting how a campaign performs by putting it under the branding umbrella, even if the success metric isn't a single conversion event like an order or sign-up.

In response to the decision-making process, I would think the perfect way to incorporate the voice of the customer is to integrate testing into the design process. Otherwise agencies are making a deliberate decision to work in a customer-excluded silo, prioritizing their designs and hypotheses above all others. I don't see how that works as a move-forward strategy as more companies continue to test, iterate and improve. (Am I looking at the world through rose-colored glasses?!)

I'd also love to hear how you and others feel about the announcement of WPP's investment in Omniture. I personally think it will go a long way towards fulfilling my first wish :) Thanks! - Lily

Florian Pihs

Coming from an agency, I am not surprised that few agencies offer MVT, or A/B testing for that matter. Some of the reasons I see are:

1) Most 4A agencies work for brand advertisers. Most of their traffic goes to campaign sites, campaign sites are mainly in Flash, and Flash is hard to run MVT on; even A/B tests are more resource-intensive since you need to build two Flash experiences. Plus, campaign sites are often not online long enough to allow effective testing and optimization, since turnaround times are slow.

2) These brand advertisers often do not define success as actions taken on the website (although that is changing a bit) but as changes in brand awareness or preference. MVT does not move the needle on those metrics.

3) Decisions on design are made in large "all hands" meetings. The voice of the customer does not have a seat at the table.

4) The closer the website gets to the core business $$, the less likely companies are to outsource key functions like analytics and testing to agencies; instead they build their own teams.