At Adobe's annual Summit event in Salt Lake City, I hosted an interesting session titled "Playing to Win — Ensuring Your Organization Supports Analytics Success", which essentially focused on the critical topic of web governance. During this session, I introduced a new governance framework and maturity model, which you'll be hearing more about in the coming months. I also had the great opportunity to present with Andrew Carr from Oracle, who shared some valuable insights into the challenges his organization faced with its web analytics program and how his team was able to overcome various issues.

If that was all that happened in this session, I would have been satisfied with what we were able to share and accomplish. However, another key component of this web governance-focused session was the roundtable discussions that took place among Summit attendees after our presentations. In groups of 4–8 people, volunteer moderators led discussions around various web governance topics, including leadership, strategy, people, process, and product/technology. Just like last year, when we facilitated a similar roundtable format, several participants approached me afterwards to express their appreciation for such an insightful and useful session.

Now, before you start cursing that you weren't able to participate in this valuable Summit session, I'd like to share some of what these different groups discussed, as I did last year. Ten different moderators provided me with their groups' key takeaways. I've taken on the role of editor-in-chief and have summarized their key discussion points as follows:

“Winning”

  • If you want to take your web analytics to the next level, several groups mentioned that you need to demonstrate value (wins, results, etc.) to the organization.
    • To drive value for your company, you will need to move beyond just focusing on reporting.
    • A few groups recommended outsourcing all regular reporting or automating reports to free up analysts' time for more meaningful projects.
    • Look to partner with a willing internal group on a project that could help drive some momentum internally.
  • Multiple groups mentioned how difficult it can be to work with multiple business units. To be successful, you need to manage your stakeholder groups effectively.
    • Be proactive and don't wait for their requests to come to your team.
    • Schedule a regular meeting with them to align priorities and review progress.
    • Assign different team members to manage different stakeholder groups.
    • Ensure you have solid, well-documented business processes in place.
    • Add project management staff (PMO) to your program.

Strategy

  • A couple of statements stood out: "In general, people do not know what the goals are" and "[It's] hard to get [the strategy] because no one agrees on what the website's focus should be."
  • Multiple groups recognized the importance of clarifying the organization's goals, as they directly impact the quality and relevance of the data provided by the implementation.
    • One group mentioned that the lack of strategy behind the initial implementation has continued to plague and impede their progress.

Leadership

  • Many people expressed frustration at not having a "true" executive sponsor or champion.
    • You need someone who has authority and influence within the organization and who is also involved in, and committed to, helping the company become more data-driven.
  • A lack of top-down accountability was also mentioned as a key stumbling block.

Structure

  • Several groups debated the age-old question of where the analytics team should sit.
  • Most people felt IT was the least favorable option.
    • Aside from ownership of the analytics team, IT's involvement was referred to as a barrier, and people cited collaboration issues in working with IT.
  • Some people felt that having the analytics team within the marketing group was ideal because communication and coordination were much easier when most of the stakeholders were within the same marketing organization.
  • Other participants favored an independent analytics or customer insights team, sitting outside of marketing, IT, or any other group, to provide an unbiased view of the data.
  • Wherever the group sits, be careful about its name, because the name can shape the perception of the group's importance.

Training

  • Most groups saw the importance of formal training but didn't have the bandwidth to provide it.
  • One group recommended requiring training to be completed before granting access to the tool or data.
  • Participants found that users don't log into the tool for weeks and then become frustrated when they are not able to find what they need.
    • Set up a weekly or bi-weekly meeting to answer users' specific questions about the tool or data.
    • Offer one-on-one or small-group informal training sessions as needed.

Data

  • Organizations need to trust their data, or else it will not be used, the analytics team will lose support for maintenance or upgrades, and eventually the tool will be viewed as "broken".
    • Implementation is an ongoing process (not "once and done") as new online initiatives are introduced and current ones evolve.
    • Periodically audit the reports to determine if they are still relevant or need to be enhanced.
    • Ensure all implementation projects include a data validation stage.
    • Secure adequate lead times so that all new pages/sites/apps/campaigns can be thoroughly tested before being launched.

Communication

  • Multiple groups identified communication as a key success factor.
    • There needs to be good communication between all stakeholders, as changes implemented by one group can impact other teams' metrics and reporting.
    • One participant shared a painful experience in which his team's KPIs disappeared without warning one day when another team decided to change their implementation and metrics.
  • Documentation can be challenging to keep updated, but without it your organization may be vulnerable if key people leave.
    • Each report suite should have its own documentation, and each variable should have at least one sentence detailing why it was set and what it is used for.
    • Circulate an internal white paper on what all of the metrics mean to provide greater transparency.
    • Publish a regular newsletter for your internal analytics community and key stakeholder groups.
  • Different groups shared various frustrations with the business, indicating that there are still communication opportunities to clear up internal misconceptions:
    • The business doesn't understand why the web data doesn't match data in other systems or why directional data is useful.
    • The business lacks appreciation for the implementation and update process for data collection.
    • The business fails to understand how much time it takes to answer analysis requests.

As you can see, a lot of good insights were stirred up during this session, and this summary only represents a portion of what was shared and discussed. For many of the session participants, it was great to sit down with industry peers who are experiencing similar challenges in their organizations (you're not alone!) as well as to interact with more seasoned practitioners who have already overcome some of those challenges (there's hope!). If you participated in the roundtable discussions and one of your key takeaways wasn't captured in the above summary, please feel free to add a comment. A big thanks to all of the moderators and participants who joined the conversations!

4 comments
Brent Dykes

Wendy, I'm glad you found it useful. Thanks goes to all who shared their wisdom. Cheers, Brent.

Wendy Krautkramer

Great article and great tips! Useful as well for new forms of analytics being pursued with existing tools (e.g., adding personalization to a site).

Brent Dykes

Great insights, Cleve. I second the motion that analysts need to be involved during the entire process, especially upfront during the development of the measurement strategy and also during the data validation phase. Brent.

Cleve Young

Another issue we ran into during our last implementation was that the data structure was completed by Marketing and IT, with help from a consultant, and was never reviewed by the analysts in a timely manner. By the time they got around to involving us, the process was so far along that it was too late to make many meaningful adjustments. I had no problem with the consulting company, as they can only go by what they are told. The issue is that too often Marketing gives broad, simplistic business requirements, which IT will structure without really understanding what is required to efficiently analyze the data to answer those questions. Now we as the analysts spend far too much time making adjustments and workarounds to compensate for the poor data, which leaves us less time to do meaningful and relevant analysis. Bottom line: make sure the analysis team is involved throughout the entire process, not just at the end, when it is often too late. Regards, Cleve