At Adobe’s annual Summit event in Salt Lake City, I hosted an interesting session titled “Playing to Win – Ensuring Your Organization Supports Analytics Success”, which essentially focused on the critical topic of web governance. During this session, I introduced a new governance framework and maturity model, which you’ll be hearing more about in the coming months. I also had the great opportunity to present with Andrew Carr from Oracle, who shared some valuable insights into some of the challenges his organization faced with its web analytics program and how his team was able to overcome various issues.

If that were all that had happened in this session, I would have been satisfied with what we were able to share and accomplish. However, another key component of this web governance-focused session was the roundtable discussions among Summit attendees that followed our presentations. In groups of 4-8 people, volunteer moderators led discussions around various web governance topics, including leadership, strategy, people, process, and product/technology. Just like last year, when we facilitated a similar roundtable format, several participants approached me afterwards to express their appreciation for such an insightful and useful session.

Now before you start cursing that you weren’t able to participate in this valuable Summit session, I’d like to share some of what these different groups discussed, just as I did last year. Ten different moderators provided me with their groups’ key takeaways. I’ve taken on the role of editor-in-chief and summarized their key discussion points as follows:

“Winning”

  • If you want to take your web analytics to the next level, several groups mentioned that you need to demonstrate value (wins, results, etc.) to the organization.
    • To drive value for your company, you will need to move beyond just focusing on reporting.
    • A few groups recommended outsourcing all regular reporting or automating reports to free up analysts’ time for more meaningful projects (a minimal automation sketch follows this list).
    • Look to partner with a willing internal group on a project that could help drive some momentum internally.
  • Multiple groups mentioned how difficult it can be to work with multiple business units. In order to be successful, you need to manage your stakeholder groups effectively.
    • Be proactive and don’t wait for their requests to come to your team.
    • Schedule a regular meeting with them to align priorities and review progress.
    • Assign different team members to manage different stakeholder groups.
    • Ensure you have solid, well-documented business processes in place.
    • Add project management staff (PMO) to your program.
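
To make the automation idea above a bit more concrete, here’s a minimal sketch of what a scheduled reporting script might look like. It assumes your analytics tool can already export key metrics to a CSV file; the file names and column names are hypothetical, so treat this as a starting point rather than a finished recipe.

```python
# Hypothetical example: summarize a weekly metrics export so analysts
# don't have to assemble the recurring report by hand.
# File paths and column names are placeholders -- adapt them to your export.
import pandas as pd

EXPORT_FILE = "weekly_metrics_export.csv"   # assumed CSV export from your analytics tool
SUMMARY_FILE = "weekly_summary.txt"

def build_summary(export_path: str, summary_path: str) -> None:
    # Expects columns: metric, this_week, last_week
    df = pd.read_csv(export_path)
    df["change_pct"] = (df["this_week"] - df["last_week"]) / df["last_week"] * 100

    lines = ["Weekly KPI summary", "==================", ""]
    for _, row in df.iterrows():
        lines.append(
            f"{row['metric']}: {row['this_week']:,.0f} "
            f"({row['change_pct']:+.1f}% vs. last week)"
        )

    with open(summary_path, "w") as f:
        f.write("\n".join(lines))

if __name__ == "__main__":
    # Run from cron or any scheduler once the weekly export lands.
    build_summary(EXPORT_FILE, SUMMARY_FILE)
```

Run on a schedule, a script like this replaces the manual assembly of a routine report and leaves the analyst free to focus on interpretation.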

Strategy

  • A couple of statements stood out: “In general, people do not know what the goals are” and “Hard to get [the strategy] because no one agrees on what the website’s focus should be.”
  • Multiple groups recognized the importance of clarifying the organization’s goals, as this directly impacts the quality and relevance of the data provided by the implementation.
    • One group mentioned that the lack of strategy on the initial implementation has continued to plague and impede their progress.

Leadership

  • Many people expressed frustration at not having a “true” executive sponsor or champion.
    • You need someone who has authority and influence within the organization and who is both involved in and committed to helping the company become more data-driven.
  • A lack of top-down accountability was also mentioned as a key stumbling block.

Structure

  • Several groups debated the age-old question of where the analytics team should sit.
  • Most people felt IT was the least favorable option.
    • Aside from the question of ownership, IT’s involvement was referred to as a barrier, and people cited collaboration issues when working with IT.
  • Some people felt that having the analytics team within the marketing group was ideal because communication and coordination were much easier when most of the stakeholders sat within the same marketing organization.
  • Other participants favored an independent analytics or customer insights team that sits outside of marketing, IT, or any other group and can provide an unbiased view of the data.
  • Wherever the group sits, be careful about the naming of the group because it can shape the perception of its importance.

Training

  • Most groups saw the importance of formal training but didn’t have the bandwidth to provide it.
  • One group recommended requiring training to be completed before granting access to the tool or data.
  • Participants found that users don’t log into the tool for weeks and then become frustrated when they can’t find what they need.
    • Set up a weekly or bi-weekly meeting where users can get answers to their specific questions about the tool or data.
    • Offer one-on-one or small group informal training sessions as needed.

Data

  • Organizations need to trust their data or else it will not be used, the analytics team will lose support for maintenance or upgrades, and eventually the tool will be viewed as “broken”.
    • Implementation is an ongoing process (not “once and done”) as new online initiatives are introduced and current ones evolve.
    • Periodically audit the reports to determine if they are still relevant or need to be enhanced.
    • Ensure all implementation projects include a data validation stage (a simple validation sketch follows this list).
    • Secure adequate lead times so that all new pages/sites/apps/campaigns can be thoroughly tested before being launched.
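
To illustrate what a data validation stage could include, here’s a minimal sketch that checks a sample of collected hits for required variables before a launch is signed off. The input format and variable names (page_name, campaign_id, site_section) are assumptions; the point is simply that basic completeness checks are easy to automate and repeat for every release.

```python
# Hypothetical pre-launch check: confirm that required analytics
# variables are populated in a sample of collected hits.
# The CSV layout and variable names below are assumptions.
import csv

REQUIRED_FIELDS = ["page_name", "campaign_id", "site_section"]

def validate_hits(sample_path: str) -> list:
    """Return a list of human-readable problems found in the sample."""
    problems = []
    with open(sample_path, newline="") as f:
        for i, hit in enumerate(csv.DictReader(f), start=1):
            for field in REQUIRED_FIELDS:
                if not (hit.get(field) or "").strip():
                    problems.append(f"row {i}: '{field}' is empty")
    return problems

if __name__ == "__main__":
    issues = validate_hits("hit_sample.csv")   # assumed export of test traffic
    if issues:
        print("Validation failed:")
        print("\n".join(issues))
    else:
        print("All required variables populated -- ready to sign off.")
```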

Communication

  • Multiple groups identified communication as being a key success factor.
    • There needs to be good communication between all stakeholders as changes implemented by one group can impact other teams’ metrics and reporting.
    • One participant shared a painful experience where his team’s KPIs disappeared without warning one day when another team decided to change their implementation and metrics.
  • Documentation can be challenging to keep updated, but without it your organization may be vulnerable if key people leave.
    • Each report suite should have its own documentation, and each variable should have at least one sentence detailing why it was set and what it is used for (see the documentation sketch after this list).
    • Circulate an internal white paper on what all of the metrics mean to provide greater transparency.
    • Publish a regular newsletter for your internal analytics community and key stakeholder groups.
  • Different groups shared various frustrations with the business, indicating that there are still some communication opportunities to clear up internal misconceptions.
    • Business doesn’t understand why the web data doesn’t match data within other systems and why directional data is useful.
    • Business lacks appreciation for the implementation and update process for data collection.
    • Business fails to understand how much time it takes to answer analysis requests.
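
On the documentation point above, the “one sentence per variable” rule is easy to keep in a small, machine-readable file that lives alongside the implementation. The sketch below uses made-up variable assignments and descriptions purely to illustrate the format; the names follow the familiar eVar/prop/event convention but are not taken from any real report suite.

```python
# Hypothetical variable dictionary for one report suite. Each entry
# answers "why was this set and what is it used for?" in one sentence.
# Variable numbers and descriptions are invented for illustration.
VARIABLE_DOCS = {
    "eVar1": "Internal campaign ID, set on landing pages to attribute "
             "form completions to internal promotions.",
    "prop5": "Site section, set on every page so traffic can be broken "
             "down by the main navigation areas.",
    "event3": "Newsletter signup, incremented once per successful "
              "subscription confirmation.",
}

if __name__ == "__main__":
    # Print the dictionary as a quick reference for new team members.
    for name, purpose in sorted(VARIABLE_DOCS.items()):
        print(f"{name}: {purpose}")
```

Keeping a file like this under version control also makes it easy to see when and why a variable’s purpose changed.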

As you can see, there were a lot of good insights stirred up during this session, and this summary only represents a portion of what was shared and discussed. For many of the session participants, it was great to sit down with industry peers who are experiencing similar challenges in their organizations (you’re not alone!) and to interact with more seasoned practitioners who have already overcome some of those challenges (there’s hope!). If you participated in the roundtable discussions and one of your key takeaways wasn’t captured in the above summary, please feel free to add a comment. A big thanks to all of the moderators and participants who joined the conversations!

4 comments
Brent Dykes

Wendy, I'm glad you found it useful. Thanks goes to all who shared their wisdom. Cheers, Brent.

Wendy Krautkramer

Great article and great tips! Useful as well for new forms of analytics being pursued with existing tools (e.g., adding personalization to a site).

Brent Dykes

Great insights, Cleve. I second the motion that analysts need to be involved during the entire process, especially upfront during the development of the measurement strategy and also during the data validation phase. Brent.

Cleve Young

Another issue we ran into during our last implementation was that the data structure was completed by Marketing and IT, with help from a consultant, and was never reviewed by the analysts in a timely manner. By the time they got around to involving us, the process was so far along that it was too late to make many meaningful adjustments. I had no problem with the consulting company, as they can only go by what they are told. The issue is that too often Marketing gives broad, simplistic business requirements, which IT structures without really understanding what is required to efficiently analyze the data and answer those questions. Now we as analysts spend far too much time making adjustments and workarounds to compensate for the poor data, which leaves us less time to do meaningful and relevant analysis. Bottom line: make sure the analysis team is involved throughout the entire process, not just at the end, when it is often too late. Regards, Cleve