In my last blog post, I discussed how people are still confusing metrics with other data types such as dimensions and reports. It's important to be able to discern between what is and isn't a metric, but that's not enough. There are some other things we can do to improve how we use metrics in web analytics.

A big problem with some metrics is that they are not specific, clear/intuitive, or actionable. We say "garbage in, garbage out" when it comes to data collection and reporting. Similarly, fuzzy metrics lead to weak adoption, poor decision-making, and frequently inaction. If your metrics are nebulous or unclear, you're doing a disservice to yourself and your company. I'd like to propose some ways in which you can improve your metric usage and avoid metric abuse.

Be specific and avoid vague metrics

One of my least favorite metrics is traffic. When you hear someone say they want to increase traffic to their site, what do they really mean? Traffic could mean page views, visits, daily unique visitors, monthly unique visitors, etc. We should have eradicated referring to traffic as a metric a long time ago, but it still regularly comes up in articles, white papers, and conversations to this day.

Another vague metric example is engagement. As a metric, what is engagement? There has been much debate in the web analytics community about engagement (Neil Mason provides a good summary), and whether it is actually a metric or not. I believe the problem with referring to traffic and engagement as metrics is that they aren't tied to a single, standardized metric; they are really an area of analysis where several metrics or approaches could apply. You could develop a custom engagement index or track a combination of different metrics to measure customer/visitor engagement, but in and of itself "engagement" isn't a metric.
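
To make this concrete, here is a minimal sketch of what turning "engagement" into an actual metric might look like: a custom index with an explicit, documented formula. The component measures, caps, and weights below are all illustrative assumptions, not a standard; the point is only that once the formula is pinned down and documented, the word stops being vague.

```python
# Hypothetical engagement index: "engagement" only becomes a metric once it
# is tied to a concrete, documented formula. All weights and caps here are
# illustrative choices, not an industry standard.

def engagement_index(page_views, time_on_site_sec, return_visit,
                     weights=(0.4, 0.4, 0.2)):
    """Score a visit from 0 to 100 using explicitly documented components."""
    w_pv, w_time, w_return = weights
    pv_score = min(page_views / 10.0, 1.0)            # capped at 10 page views
    time_score = min(time_on_site_sec / 300.0, 1.0)   # capped at 5 minutes
    return_score = 1.0 if return_visit else 0.0       # returning visitor bonus
    return round(100 * (w_pv * pv_score
                        + w_time * time_score
                        + w_return * return_score), 1)
```

Whether you weight time on site over page depth is a business decision; what matters is that the definition is written down where every report consumer can find it.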

I cringe each time I hear the word "metric" associated with new (and ambiguous) social media terminology such as buzz or influence. Are we doomed to repeat the same mistakes with the latest social media metric "du jour"? A guaranteed way to make a metric less actionable is to make it vague. We need to ensure our metrics are specific.

Properly define new metrics

All the cool kids are doing it. Lots of new, exciting-sounding metrics are being introduced all the time, especially in the social media space — e.g., Buzz Velocity, Brand Amplification, Influencer Impact, etc. (these might sound familiar, but they're 100% fabricated). Unfortunately, many of these buzzword metrics end up being generally meaningless to most people and companies. Why? Too much effort is spent on hyping the metrics and reporting, and not enough time is spent on adequately defining the new metrics and explaining how they can be useful.

As marketers, we're trained to differentiate something and build excitement for it. In the case of metrics, it shouldn't be about differentiation and hype, but instead standardization, clarity, and utility. We don't need more jargon that is pretentious, convoluted, or vague, especially not in the realm of social media. It doesn't move us forward; it sets us back. Rather than creating more buzzword metrics, I'd prefer more descriptive names and increased emphasis on better defining and documenting new metrics for end users. In many cases, a fancy name hides the fact that the metric is just a repackaging of a commonly used metric, or is actually a report and not a metric at all.

Be careful with acronyms

We have lots of popular abbreviated metrics in online marketing and web analytics: AOV/AOS, CPC, CPA, CTR, ROAS, etc. Having worked in the high tech industry for more than a decade, I know how susceptible high tech firms are to using acronyms. Don't follow our industry's bad example! You need to be careful when using acronyms, especially when it comes to web metrics. Overzealous and premature usage of acronyms can impede metric comprehension.
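
One simple safeguard is to keep the expansion and formula right next to the acronym wherever the metric is defined. The sketch below spells out a few of the abbreviations named above using their standard formulas (the function names and sample figures are just for illustration):

```python
# Keep each acronym's expansion and formula next to its definition, so
# readers outside the analytics team never have to guess what it stands for.

def ctr(clicks, impressions):
    """CTR (Click-Through Rate) = clicks / impressions."""
    return clicks / impressions

def cpc(ad_spend, clicks):
    """CPC (Cost Per Click) = ad spend / clicks."""
    return ad_spend / clicks

def roas(revenue, ad_spend):
    """ROAS (Return On Ad Spend) = revenue / ad spend."""
    return revenue / ad_spend

def aov(revenue, orders):
    """AOV (Average Order Value) = revenue / orders."""
    return revenue / orders
```

Documenting the formula alongside the acronym costs nothing and spares every new report reader the "PVV moment" described below.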

At a recent conference I attended, the presenters used the metric "PVV" on a few different slides. Surprisingly, there was a moment (maybe 20–30 seconds) where I struggled to identify a metric I had used hundreds of times before: Page Views per Visit. When it dawned on me what they were talking about, it made me wonder what people outside of the web analytics team would think of this abbreviated metric. Would they automatically know what this acronym stood for, or would they pretend they knew but not truly understand what was being discussed? Abbreviations are only acceptable if they are widely used and understood.

Focus on meaningful, actionable metrics


Just because you can track everything in SiteCatalyst or Insight doesn't mean you should. It's great when people get excited about web measurement and analytics; however, this exuberance can quickly become counterproductive if your company decides to measure everything under the sun. One consultant on my team recently labeled this phenomenon as going "Cuckoo for Cocoa Puffs", and it often leads to a tall stack of inconsequential, underutilized metrics and reports.

I'm not the first to say less is more when it comes to metrics, but it's definitely worth repeating. Ultimately, you want metrics that are meaningful to your business (i.e., tied to your key business goals, such as increasing revenue or reducing call center volume) and actionable (i.e., your company can take specific actions to move the metric up or down, such as increasing ad spend for a particular campaign or optimizing a specific landing page).

The most meaningful and relevant metrics to your business should be your KPIs, which are a special subset of metrics used to measure performance against key business goals. I get concerned when I see the "KPI" label being applied too loosely or carelessly to ad hoc metrics. You don't want to dilute the true power of real KPIs by crowding them out with false ones. Some supporting metrics can still be meaningful even though they aren't KPIs, because they provide useful insights into how your KPIs are being impacted.

In terms of actionability, you need to be careful with overly complicated formulas, or what Matt Belkin referred to as metric mashups or uber metrics. These complex, calculated metrics are prone to being less actionable because an analyst may need to dissect the entire formula to discover what aspect(s) drove a large increase or decrease before any action can be taken. In most cases, you need the organization, not just a statistics PhD on the analytics team, to understand the metric for action to happen. Just remember that less is more, in terms of both quantity and complexity, and be judicious with your unique set of KPIs and metrics.
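
One way to keep a calculated metric actionable is to never report the composite number alone: report the change in each component alongside it, so the "dissection" is already done. The sketch below assumes a deliberately simple composite (an average of its components) purely for illustration:

```python
# A composite "uber metric" hides which component moved. Reporting the
# per-component deltas alongside the composite change keeps it actionable.
# The composite here is a simple average of its inputs, for illustration only.

def report_composite(prev, curr):
    """Return (change in composite, per-component deltas) between two periods."""
    deltas = {name: curr[name] - prev[name] for name in curr}
    composite_prev = sum(prev.values()) / len(prev)
    composite_curr = sum(curr.values()) / len(curr)
    return composite_curr - composite_prev, deltas
```

With the deltas in hand, an analyst can see at a glance that, say, the composite rose because one input spiked while the others held steady, instead of reverse-engineering the formula first.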

One way to follow the "less is more" principle is to watch out for "nice-to-know" metrics. As consultants, we can typically ferret out these metrics when we ask what actions a particular client is prepared to take if the metrics come in really high or low. If they aren't prepared to take any action, then we've uncovered a potential "nice-to-know" metric. It can be frustrating for a web analytics manager to learn that one of the marketing team's "must-have" metrics, which ended up being the most time-consuming and difficult element to implement, results in only "hmmm…that's interesting" and nothing else. As Pablo Picasso stated, "Action is the foundational key to all success."

Metrics Manifesto


As I said in my previous post: enough is enough. We need to stop the metric abuse today! If we're going to change the face of marketing, we need to ensure our analytics foundations are sound. I'm making the pledge to not tolerate metric abuse any longer, and solemnly swear to:

  1. Only refer to actual metrics as metrics
  2. Be specific and never use vague metrics
  3. Never give fancy, meaningless names to metrics
  4. Avoid renaming well-known metrics, and go with standards whenever they are in place
  5. Ensure new metrics are well-defined and documented for anyone who will be consuming the reports/analysis
  6. Only use acronyms that are well-established and understood
  7. Make sure metrics are meaningful and actionable
  8. Be careful with complex, calculated metrics that may be less actionable
  9. Avoid "nice-to-know" metrics that waste time and mostly go unused
  10. Strive to correct people who misuse the terms "metric" and "KPI"

Where do you see metric abuse? What points would you add to my metrics manifesto? The web analytics industry has come a long way, and I feel as though perpetuating simple oversights in areas such as how we use metrics is going to hold us back. Let's stamp out metric abuse and move towards an even brighter, data-driven future.

2 comments
Brent Dykes

Great points, John. As web analytics consultants, we've run into situations where we discovered an organization was broadly misinterpreting a web metric because they assumed it meant something different. If the operational definitions had been in place or better communicated, misguided actions (or inaction) based on those assumptions could have been avoided.

John Hunter

Well put. Metrics are valuable when they are actionable. Think about what will be done if certain results are shown; if you can't think of any actions you would take, the metric may not be worth tracking. Metrics should be operationally defined so that the data is collected properly, and so that those using the results interpret them correctly. Often data is presented without an operational definition, and people think the metric is saying something it is not. I find that most often, when people say statistics lie, they have really made an incorrect assumption about what the data said, usually because they didn't understand the operational definition of the data.