For the past three Adobe Digital Marketing Summits, I’ve hosted a breakout session on digital governance (also referred to as web governance), which covers how companies can successfully manage their digital analytics and optimization programs to drive business value. As most organizations have come to realize, it takes more than just technology to be successful with web analytics and testing – it requires an investment in people and processes as well.

No company can simply turn on a new web analytics tool and expect to become data-driven overnight. Attitudes need to be shifted, processes need to be adjusted, positions need to be staffed, and executive support is essential. While the rewards are great, so are the challenges; therefore, companies need a strategic roadmap or game plan for how they’re going to create an environment where analytics and optimization initiatives can flourish. The goal of this year’s session was to provide structure to that roadmap and start the process of targeting and prioritizing what needs to be addressed.

This breakout session featured a roundtable discussion where industry practitioners could share their unique perspectives as well as best practices that they’ve identified and implemented at their companies. For many of the participants, it was a great opportunity to network with their peers and learn that other people are facing similar challenges (sort of like group therapy). As I’ve done in past years, I asked the volunteer moderators to share some of the key takeaways and themes that were discussed within each of their roundtable groups. I’ve consolidated their notes into the following key points:

Strategy

  • A key goal of the analytics team is to get as close to a mandate as possible from the executive team, which involves creating a measurement strategy based on understood business objectives
    • The executive team should sign off on the recommended strategy, confirming they are in alignment
    • If an analytics team is unable to get ample input and guidance from above on the online strategy, they will be stuck with in-demand, short-term, or tactical reporting (not ideal)
  • Unclear and conflicting goals (both between and within teams), as well as stakeholders striving to protect their existing resources and power, create headaches for digital analytics professionals
    • Strategy review sessions with all of the stakeholders are helpful in shaping a clear, accepted strategy
    • Collaboration software and lots of casual conversations across teams were also helpful
  • Ensure the business requirements and KPIs are well-defined, because what is requested by the business may not be what is really intended or needed
  • Always anticipate the inevitable follow-up questions that were not in the original requirements – otherwise you’ll under-deliver and always be playing catch-up
  • Business strategy should still guide decisions around what online or offline data is truly necessary
    • What can be measured and what should be measured are two different things

Executive sponsorship

  • To get traction on the analytics front, you must get at least one upper-level executive who owns and sponsors web analytics
    • Create an environment where top-level executives feel like the findings were a result of their efforts. Once they take ownership, things will get done faster and they will persuade others to support the analytics initiatives
    • Executive sponsors play a key role in helping to define/clarify the strategy, prioritize projects, get people to use and trust the data, and drive more accountability
  • When you don’t have executive sponsorship for your analytics program, you need to build support at a grassroots level, generate a list of monetized quick wins, and evangelize your successes across multiple levels within the organization (especially at the C-level)
  • Executive sponsorship can go both ways – it can remove roadblocks, or it can create an “I want this” mentality that becomes more of a distraction

Reporting and analysis

  • Reporting needs to be separated from analysis so that digital analysts can help drive strategic decision-making and add more value to the organization
    • Change the mindset and focus of your analytics team (“We don’t do reporting, we do analysis”)
    • Outsource low-level reporting to non-strategic (third-party) resources so analysts can focus on projects that are strategic to the business
  • Too much data or information can overwhelm internal customers to the point where they can’t digest and use what’s being shared
    • Interview or survey your internal customers on a regular basis, verify what they’re interested in, and determine how it can best be shared in a meaningful way
  • With context being so important, create a “dummies guide” or cheat sheet that walks end users through how to interpret and use the data and reports properly
  • To avoid data interpretation issues, have new users complete a training session on the analytics tools before being granted login access
  • Accountability is often challenging when working with agencies, which sometimes spin their results

Impact of turnover

  • Turnover at all levels creates issues with following established processes
    • Teams need to be revisited, especially when there’s significant turnover, to ensure new team members are aware of and up to speed on current processes
  • Turnover among executives can change business objectives and KPIs, so the measurement strategy needs to be proactively recalibrated
    • The measurement strategy should be firm, but also flexible enough to shift with changing business needs and organizational structures
  • When turnover is high for seasoned analytics talent, you should invest in training entry-level applicants and building them up over time

Process

  • Whenever possible, data needs to be leveraged right from the beginning, during the planning stage; otherwise, data will only be used to support past decisions and will be heavily questioned (as erroneous) whenever it doesn’t support them
  • Make sure the measurement strategy is understandable and will support repeatable and manageable processes
    • Over time, whittle down those processes that don’t contribute to data completeness and integrity
    • Modify those that were defined to meet the changing realities of the business
  • Politics and bureaucracy can impede the execution of recommendations, so you need to have a defined process in place (e.g., “when x happens, we will do y”)
  • Prioritization is essential when you’re dealing with limited bandwidth and headcount so that you’re maximizing your impact by focusing on what’s most important to the business

Implementation

  • Most data problems are caused by new development projects, so make sure the analytics team is aligned with the development/implementation team, especially for the QA process (see the sketch after this list for one way to automate part of that check)
  • Leverage standardized forms for new deployments so that the development team knows how to read and implement the requirements correctly
  • Don’t forget to show the development team your findings, because connecting their implementation work to your analysis will make them much more likely to prioritize future tagging requests
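
As a concrete illustration of the QA point above, here is a minimal sketch of an automated tag-presence check. Everything in it is hypothetical: the page URLs and the beacon pattern (shown as an s_code.js include) would come from your own deployment form, and a real check would validate variable values, not just the tag’s presence.

```python
# Hypothetical sketch: verify the analytics beacon is present on a set of
# pages after a new deployment. URLs and the tag pattern are placeholders.
import re
import urllib.request

# Pages covered by the (hypothetical) deployment form for this project
PAGES = [
    "https://www.example.com/",
    "https://www.example.com/products",
]

# Placeholder pattern for the analytics library include (e.g., s_code.js)
TAG_PATTERN = re.compile(r'src="[^"]*s_code\.js"')

def page_has_tag(url: str) -> bool:
    """Fetch the page and report whether the analytics tag appears in it."""
    with urllib.request.urlopen(url, timeout=10) as response:
        html = response.read().decode("utf-8", errors="replace")
    return bool(TAG_PATTERN.search(html))

for url in PAGES:
    print(("OK      " if page_has_tag(url) else "MISSING ") + url)
```

Even a check this simple, run as part of QA on every release, catches the most common data problem: a page that shipped without its tag.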

Testing

  • Approach testing with an open mind and be willing to fail – take feelings out of the equation
  • Marketing teams need to move beyond merely running tests (repeating the same tests over and over with the same results) into “learning” and “best practices”
    • Create a centralized test lab that can corroborate results and lead the learning across marketing teams (a sketch of one such corroboration check follows this list)
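
To make “corroborating results” concrete, here is a minimal sketch of the kind of check a test lab might run: a two-proportion z-test on A/B conversion counts. All of the visitor and conversion numbers are made up for illustration.

```python
# Hypothetical sketch: two-proportion z-test for an A/B test, using only
# the standard library. All counts below are illustrative, not real data.
from math import erf, sqrt

def ab_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Return (relative lift of B over A, two-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal CDF tail
    return (p_b - p_a) / p_a, p_value

lift, p = ab_test(conv_a=200, n_a=10_000, conv_b=240, n_b=10_000)
print(f"Observed lift: {lift:.1%}, p-value: {p:.3f}")
# Prints roughly: Observed lift: 20.0%, p-value: 0.054 -- a 20% lift that
# still isn't significant at the usual 0.05 level, which is exactly the
# kind of result worth corroborating before declaring a "best practice".
```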

Monetize impact

  • Sell your analysis by monetizing the impact of your recommendations
    • When you attach dollars to your reports and analyses, C-level executives will be more apt to acknowledge the findings and act upon them (a simple worked example follows this list)
  • To get better and more advanced analytics tools, you need to show the monetized value of your existing analytics tools through various documented wins
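
For a sense of what a “monetized” recommendation can look like, here is a simple worked example. Every input (traffic, conversion rates, order value) is a made-up placeholder; swap in your own numbers before putting a figure in front of executives.

```python
# Hypothetical sketch: monetizing a conversion-rate lift. All inputs are
# illustrative placeholders, not benchmarks.
monthly_visitors = 500_000
baseline_conversion = 0.020   # 2.0% before the recommended change
improved_conversion = 0.022   # 2.2% after the recommended change
average_order_value = 80.00   # dollars per order

incremental_orders = monthly_visitors * (improved_conversion - baseline_conversion)
monthly_impact = incremental_orders * average_order_value

print(f"Incremental orders per month: {incremental_orders:,.0f}")  # 1,000
print(f"Monetized impact per month:   ${monthly_impact:,.0f}")     # $80,000
```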

This was only a sample of the insights and takeaways that were shared during these roundtable discussions. If you’re still hungry for more insights on this topic, you can check out the key takeaways gathered from the previous Summit roundtable discussions in 2010 and 2011. If you participated in the roundtable discussions and one of your key takeaways wasn’t captured in the above summary, please feel free to add a comment to this blog post. I’m sure all of us would love to hear what takeaways you had from the session. As I stated in my presentation during this session, no organization has digital governance all figured out. While some organizations may be further along in certain areas, all of us still have room to learn from each other. Hopefully, by sharing these types of insights, we can collectively become stronger and better at our craft, driving more value from our analytics and optimization programs. Lock and load!

Follow me on Twitter at @analyticshero
Check out my new book: Web Analytics Action Hero
