Pitfall #5. Boiling the ocean

When deploying analytics, you should aim to reach as broadly as possible across all your customer touchpoints. You need to if you want to be considered strategic. Your ability to optimize and understand the customer lifecycle is directly related to how comprehensive you can be with measurement. However, just because you go broad does not mean you should go deep. It is of much greater value to understand all your touchpoints at a minimum level than to understand a few touchpoints at a maximum level. I'm sure some people will disagree with this point vehemently – and I welcome the feedback. I've arrived at this conclusion based on my own experience, and it has worked for me and the clients I've worked with.

In the world of analytics, it's better to go wide and shallow than narrow and deep.

For example, I talk with many people who want to measure the daylights out of something like a shopping cart. They measure everything: when someone opens it, when someone closes it, when they add or remove a product, the product size and color, whether they open a pop-up… All this data may be interesting, but it gives them a false sense of security, because the granularity of this data fails to reflect their broader corporate strategy. Their focus becomes incredibly myopic, and they become obsessed with minute observations like why Firefox visitors convert at a higher rate than those who come in via Internet Explorer. And because they are focused on the weeds, they can't see that the conversion rate for the broader site has been falling for the past 3 months – and they aren't doing anything about it. In short, trying to "boil the ocean" keeps you from seeing the big picture and operating on your core business goals.

Pitfall #6. Multiple versions of the truth

Analytics success is all about building a baseline for performance (your KPI trend), and trying new things to improve on this baseline. That's it! That's why I think it's easy. I know other bloggers have argued that analytics is hard, but I've done this for a living and I can tell you that it's not. Sure, it can be hard – over time – to continue to improve on your baselines. I'll grant you that. But that comes after you've picked all the low-hanging fruit and innovation becomes more critical. And to be fair, I can't say I've ever met a company that has picked all the low-hanging fruit. So from my perspective, it's just not that hard.

With that in mind, a critical pitfall occurs when customers try to use multiple systems to provide this baseline for performance. In other words, they have multiple versions of the truth. Most of you know what I mean.
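The baseline-and-improve loop described above can be sketched as a simple calculation. This is purely illustrative – the weekly conversion rates and the "simplified checkout" test are hypothetical figures, not from any real deployment:

```python
# Illustrative sketch of the baseline-and-improve loop.
# All figures are hypothetical.

# Weekly conversion rates establish the baseline (your KPI trend).
baseline_weeks = [0.021, 0.019, 0.022, 0.020]
baseline = sum(baseline_weeks) / len(baseline_weeks)

# After trying something new (say, a simplified checkout), measure again
# with the SAME system and compare against the baseline.
test_week = 0.024
lift = (test_week - baseline) / baseline

print(f"Baseline: {baseline:.3%}, test week: {test_week:.3%}, lift: {lift:+.1%}")
```

The entire "job" of analytics, in this framing, is keeping `baseline` honest and making `lift` positive – which is why a second system reporting a different baseline is pure friction.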

Take a lead generation site, for example, like an automotive manufacturer. They will often look to the total number of leads, among other KPIs, to determine their success. But they will measure this with five different systems: leads as measured by Omniture, leads as measured by their data warehouse, leads as measured by their email system, leads as measured by their ad server, and leads as measured by search. There may be even more systems than that.

The problem with this approach is that you get multiple versions of the truth, and you waste your time trying to reconcile all these different systems rather than trying to improve on your baseline. Now – don't get me wrong – you must make a best effort to understand why these systems report different results. I've spent countless hours doing this. And oftentimes, when it comes down to it, there are just fundamental differences in the measurement approach (as we discussed in our post on Data Migration: Fool's Errand). Another example is with clicks. A "click" as defined by an email provider will likely be different from one reported by Omniture, which will be different from one reported by an ad server, which will be different from a "click" reported by a search engine. That's the reality of the world we live in, so just accept it. You can spend your time trying to solve this academic challenge, or you can spend your time improving your business and beating your competition. You decide.

You must have a single version of the truth, because your ability to optimize your business is based on the relative difference between points on the customer lifecycle, not on absolute numbers. In other words, if a search engine like Yahoo! says you have 1,000 clicks and your analytics provider puts you at 900 clicks, that matters far less than comparing the 900 clicks your analytics provider measured for a search campaign against, say, the 700 clicks the same provider measured for an email campaign. If you compare numbers from the same provider against each other, you can see how different campaigns are doing in relation to each other. When you compare results from different providers, you're comparing figures that weren't measured the same way, so your conclusions will be seriously skewed.
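The relative-versus-absolute point can be made concrete with a small sketch, using the hypothetical click counts from the paragraph above:

```python
# Compare campaigns using numbers from the SAME provider (relative),
# not across providers (absolute). Figures are the hypothetical ones
# from the example above.
clicks = {
    ("analytics", "search"): 900,   # search campaign, per your analytics tool
    ("analytics", "email"): 700,    # email campaign, same tool
    ("yahoo", "search"): 1000,      # same search campaign, per the engine
}

# Valid comparison: same provider, different campaigns -> relative performance.
search_vs_email = clicks[("analytics", "search")] / clicks[("analytics", "email")]
print(f"Search drove {search_vs_email:.2f}x the clicks of email")

# Misleading comparison: different providers define "click" differently,
# so this gap is a measurement artifact, not a business insight.
discrepancy = clicks[("yahoo", "search")] - clicks[("analytics", "search")]
print(f"Cross-provider gap: {discrepancy} clicks (don't optimize on this)")
```

The first ratio is actionable (search is outperforming email within one consistent measurement system); the second number is exactly the reconciliation rabbit hole the pitfall warns against.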

Pitfall #7. Not teaching how to fish

In my own analytics career, some of the biggest gains we made from analytics actually came from people outside my analytics team. They came from other business units that we had trained to use analytics. Why? Because they often understood their business questions better than anyone else, so they could innovate the most from the analytics data. In other words, they had the best context for the data. To that end, a critical mistake people often make is not training end users to be self-sufficient. Sure, you can send users to programs like Omniture University and conduct internal training. That's great, and a critical first step. But once you've done this, you should seek out the people who "get it" – and bring them into your inner circle. When you identify these power users and nurture them, they can become your greatest allies and drive some of the most significant gains you'll realize from analytics – all without any incremental effort from you and your team. This might sound like a fairy tale – but it works. Trust me, I've done it, and I've helped other companies like yours do it.

In Summary

So there you have it: seven critical pitfalls marketers often fall into when deploying analytics. Granted, there are others – but if you go into your next deployment with eyes wide open to these common mistakes, I have every confidence you'll be more successful than ever before. If you'd like to talk about your unique situation and requirements, we'd be happy to do so. Just give us a call, and we can work with you to realign your analytics deployment with your strategic business requirements and industry best practices. It's what we do best!

Tucker Christiansen

Great post Matt! My favorite line is: "Analytics success is all about building a baseline for performance (your KPI trend), and trying new things to improve on this baseline. That's it!" I like your overall emphasis on keeping the implementation aligned with the company's strategic goals. It is easy to get focused on the detailed data (or the conflicts between data in different systems) and forget about the big picture. Putting effort into these pitfalls usually won't help a company get the low-hanging fruit, and it certainly isn't the best use of company resources. Do you have any recommendations on convincing others in the company that web analytics will benefit their business unit?