You may have seen various people use the terms “reporting” and “analysis” as though they were interchangeable. While both of these areas of web analytics draw upon the same collected web data, reporting and analysis are very different in terms of their purpose, tasks, outputs, delivery, and value. Without a clear understanding of these differences, an organization may sell itself short in one area (typically analysis) and not achieve the full benefits of its web analytics investment. Although I’m primarily focusing on web analytics, companies can run into the same challenge with other analytics tools (e.g., ad serving, email, search, social, etc.).

Most companies have analytics solutions in place to derive greater value for their organizations. In other words, the ultimate goal for reporting and analysis is to increase sales and reduce costs (i.e., add value). Both reporting and analysis play roles in influencing and driving the actions which lead to greater value in organizations.

For the purposes of this blog post, I’m not going to delve deeply into what happens before or after the reporting and analysis stages, but I do recognize that both areas are critical and challenging steps in the overall data-driven decision-making process. It’s important to remember that reporting and analysis only have the opportunity to be valuable if they are acted upon.

Purpose

Before covering the differing roles of reporting and analysis, let’s start with some high-level definitions of these two key areas of analytics.

Reporting: The process of organizing data into informational summaries in order to monitor how different areas of a business are performing.

Analysis: The process of exploring data and reports in order to extract meaningful insights, which can be used to better understand and improve business performance.

Reporting translates raw data into information. Analysis transforms data and information into insights. Reporting helps companies to monitor their online business and be alerted when data falls outside of expected ranges. Good reporting should raise questions about the business from its end users. The goal of analysis is to answer those questions by interpreting the data at a deeper level and providing actionable recommendations. Through the process of performing analysis you may raise additional questions, but the goal is to identify answers, or at least potential answers that can be tested. In summary, reporting shows you what is happening, while analysis focuses on explaining why it is happening and what you can do about it.

Tasks

Companies can sometimes confuse “analytics” with “analysis”. For example, a firm may be focused on the general area of analytics (strategy, implementation, reporting, etc.) but not necessarily on the specific aspect of analysis. It’s almost like some organizations run out of gas after the initial set-up-related activities and don’t make it to the analysis stage. In addition, sometimes the lines between reporting and analysis can blur — what feels like analysis is really just another flavor of reporting.

One way to distinguish whether your organization is emphasizing reporting or analysis is to identify the primary tasks being performed by your analytics team. If most of the team’s time is spent on activities such as building, configuring, consolidating, organizing, formatting, and summarizing — that’s reporting. Analysis focuses on different tasks such as questioning, examining, interpreting, comparing, and confirming (I’ve left out testing, as I view optimization efforts as part of the action stage). Reporting and analysis tasks can be intertwined, but your analytics team should still evaluate where it is spending the majority of its time. In most cases, I’ve seen analytics teams spending most of their time on reporting tasks.

Outputs

When you look at reporting and analysis deliverables, on the surface they may look similar — lots of charts, graphs, trend lines, tables, stats, etc. However, there are some subtle differences. One of the main differences between reporting and analysis is the overall approach. Reporting follows a push approach, where reports are pushed to users who are then expected to extract meaningful insights and take appropriate actions for themselves (i.e., self-serve). I’ve identified three main types of reporting: canned reports, dashboards, and alerts.

  1. Canned reports: These are the out-of-the-box and custom reports that you can access within the analytics tool or have delivered on a recurring basis to a group of end users. Canned reports are fairly static, with fixed metrics and dimensions. In general, some canned reports are more valuable than others, and a report’s value may depend on how relevant it is to an individual’s role (e.g., SEO specialist vs. web producer).
  2. Dashboards: These custom-made reports combine different KPIs and reports to provide a comprehensive, high-level view of business performance for specific audiences. Dashboards may include data from various data sources and are also usually fairly static.
  3. Alerts: These conditional reports are triggered when data falls outside of expected ranges or some other pre-defined criterion is met. Once people are notified of what happened, they can take appropriate action as necessary (a minimal sketch of this kind of threshold check appears after this list).
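
To make the alert idea concrete, here is a minimal sketch of a threshold-based alert in Python. The metric names, expected ranges, and output format are all hypothetical placeholders for illustration, not a reference to any particular analytics tool’s alerting feature.

```python
# A minimal sketch of a threshold-based alert (hypothetical metrics and ranges).

EXPECTED_RANGES = {
    "daily_visits": (40_000, 60_000),   # assumed normal range for the example
    "conversion_rate": (0.018, 0.030),  # assumed normal range for the example
}

def check_alerts(metrics):
    """Return a notification message for every metric outside its expected range."""
    alerts = []
    for name, value in metrics.items():
        low, high = EXPECTED_RANGES.get(name, (float("-inf"), float("inf")))
        if not low <= value <= high:
            alerts.append(
                f"ALERT: {name} = {value} is outside the expected range [{low}, {high}]"
            )
    return alerts

# Example: yesterday's visits dip below the expected range and trigger an alert.
for message in check_alerts({"daily_visits": 31_250, "conversion_rate": 0.021}):
    print(message)
```

Note that the alert only says what happened; someone still has to work out why it happened and what to do next, which is exactly the gap analysis is meant to fill.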

In contrast, analysis follows a pull approach, where particular data is pulled by an analyst in order to answer specific business questions. A basic, informal analysis can occur whenever someone simply performs some kind of mental assessment of a report and makes a decision to act or not act based on the data. In the case of analysis with actual deliverables, there are two main types: ad hoc responses and analysis presentations.

  1. Ad hoc responses: Analysts receive requests to answer a variety of business questions, which may be spurred by questions raised by the reporting. Typically, these urgent requests are time sensitive and demand a quick turnaround. The analytics team may have to juggle multiple requests at the same time. As a result, the analyses cannot go as deep or wide as the analysts may like, and the deliverable is a short and concise report, which may or may not include any specific recommendations.
  2. Analysis presentations: Some business questions are more complex in nature and require more time to perform a comprehensive, deep-dive analysis. These analysis projects result in a more formal deliverable, which includes two key sections: key findings and recommendations. The key findings section highlights the most meaningful and actionable insights gleaned from the analyses performed. The recommendations section provides guidance on what actions to take based on the analysis findings.

When you compare the two sets of reporting and analysis deliverables, the different purposes (information vs. insights) reveal the true colors of the outputs. Reporting pushes information to the organization, and analysis pulls insights from the reports and data. There may be other hybrid outputs such as annotated dashboards (analysis sprinkles on a reporting donut), which may appear to span the two areas. You should be able to determine whether a deliverable is primarily focused on reporting or analysis by its purpose (information/insights) and approach (push/pull).

Another key difference between reporting and analysis is context. Reporting provides no or limited context about what’s happening in the data. In some cases, the end users already possess the necessary context to understand and interpret the data correctly. However, in other situations, the audience may not have the required background knowledge. Context is critical to good analysis. In order to tell a meaningful story with the data to drive specific actions, context becomes an essential component of the storyline.

Although they both leverage various forms of data visualization in their deliverables, analysis is different from reporting because it emphasizes data points that are significant, unique, or special — and explains why they are important to the business. Reporting may sometimes automatically highlight key changes in the data, but it’s not going to explain why these changes are (or aren’t) important. Reporting isn’t going to answer the “so what?” question on its own.
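
As an illustration of that limit, here is a small sketch of the kind of automated highlighting a report can do: it flags week-over-week changes above a threshold, but the output stops at what moved, never why it moved or what to do about it. The metric names, values, and threshold are made up for the example.

```python
# Sketch of automated change-highlighting in a report (hypothetical metric names and values).
last_week = {"visits": 52_000, "orders": 1_240, "revenue": 98_500}
this_week = {"visits": 41_600, "orders": 1_255, "revenue": 99_100}

THRESHOLD = 0.10  # flag any metric that moved more than 10% week over week

for metric, previous in last_week.items():
    current = this_week[metric]
    change = (current - previous) / previous
    flag = "  <-- significant change" if abs(change) >= THRESHOLD else ""
    print(f"{metric}: {previous:,} -> {current:,} ({change:+.1%}){flag}")
```

The script can tell you that visits fell 20%; only an analyst can explain whether that came from a paused campaign, a tracking problem, or seasonality, and recommend what to do next.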

If you’ve ever had the pleasure of being a new parent, I would compare canned reporting, dashboards, and alerts to a six-month-old infant. It cries — often loudly — when something is wrong, but it can’t tell you exactly what is wrong. The parent has to scramble to figure out what’s going on (hungry, dirty diaper, no pacifier, teething, tired, ear infection, new Baby Einstein DVD, etc.). Continuing the parenting metaphor, reporting is also not going to tell you how to stop the crying.

The recommendations component is a key differentiator between analysis and reporting, as it provides specific guidance on what actions to take based on the key insights found in the data. Even analysis outputs such as ad hoc responses may not drive action if they fail to include recommendations. Once a recommendation has been made, follow-up is another potent outcome of analysis because recommendations demand that decisions be made (go/no go/explore further). Decisions precede action. Action precedes value.

Delivery

As mentioned, reporting is more of a push model, where people can access reports through an analytics tool, Excel spreadsheet, or widget, or have them scheduled for delivery to their mailbox, mobile device, FTP site, etc. Because of the demands of having to provide periodic reports (daily, weekly, monthly, etc.) to multiple individuals and groups, automation becomes a key focus in building and delivering reports. In other words, once the report is built, how can it be automated for regular delivery? Most of the analysts I’ve talked to don’t like manually building and refreshing reports on a regular basis. It’s a job for robots or computers, not human beings who are still paying off their student loans for 4–6 years of higher education.
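
To make the automation point concrete, here is a minimal sketch of the kind of report build you would hand off to a machine: it rolls raw visit rows up into a summary CSV that could be dropped on an FTP site or attached to a scheduled email. The file names, columns, and the cron line in the comment are assumptions for illustration, not a reference to any particular tool’s scheduler.

```python
# Sketch of an automatable weekly summary report (hypothetical file names and columns).
# In practice this would be handed to a scheduler rather than run by an analyst,
# e.g. a cron entry such as:  0 7 * * MON  python build_weekly_report.py
import csv
from collections import defaultdict

def build_weekly_report(events_path, report_path):
    """Roll raw visit rows (date, channel, visits, orders) up into one row per channel."""
    totals = defaultdict(lambda: {"visits": 0, "orders": 0})
    with open(events_path, newline="") as f:
        for row in csv.DictReader(f):
            totals[row["channel"]]["visits"] += int(row["visits"])
            totals[row["channel"]]["orders"] += int(row["orders"])

    with open(report_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["channel", "visits", "orders", "conversion_rate"])
        for channel, t in sorted(totals.items()):
            rate = t["orders"] / t["visits"] if t["visits"] else 0.0
            writer.writerow([channel, t["visits"], t["orders"], f"{rate:.2%}"])

if __name__ == "__main__":
    build_weekly_report("raw_visits.csv", "weekly_summary.csv")
```

Everything above is mechanical, which is exactly why it belongs to reporting and to automation; the analyst’s time is better spent interpreting the resulting summary than producing it.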

On the other hand, analysis is all about human beings using their superior reasoning and analytical skills to extract key insights from the data and form actionable recommendations for their organizations. Although analysis can be “submitted” to decision makers, it is more effectively presented person-to-person. In their book “Competing on Analytics”, Thomas Davenport and Jeanne Harris emphasize the importance of trust and credibility between the analyst and decision maker. Decision makers typically don’t have the time or ability to perform analyses themselves. With a “close, trusting relationship” in place, the executives will frame their needs correctly, the analysts will ask the right questions, and the executives will be more likely to take action on analysis they trust.

Value

When comparing the different roles of reporting and analysis, it’s important to understand how the two work together to drive value. I like to think of the data-driven stages (data > reporting > analysis > decision > action > value) as a series of dominoes. If you remove a domino, it can be more difficult or impossible to achieve the desired value.

In the “Path to Value” diagram above, it all starts with having the right data that is complete and accurate. It doesn’t matter how advanced your reporting or analysis is if you don’t have good, reliable data. If we skip the “reporting” domino, some seasoned analysts might argue that they don’t need reports to do analysis (i.e., just give me the raw files and a database). On an individual basis that might be true for some people, but it doesn’t work at the organizational level if you’re striving to democratize your data.

Most companies have abundant reporting but may be missing the “analysis” domino. Reporting will rarely initiate action on its own, as analysis is required to help bridge the gap between data and action. Having analysis doesn’t guarantee that good decisions will be made, that people will actually act on the recommendations, that the business will take the right actions, or that teams will be able to execute effectively on those actions. However, it is a necessary step closer to action and the potential value that can be realized through successful web analytics.

Final Words

Reporting and analysis go hand-in-hand, but how much effort and how many resources are being spent on each area at your company? When I hear that a client is struggling to find value from its web analytics investment, it usually means one of the dominoes in the “Path to Value” is missing, and analysis is often that missing domino.

I recently met with a major media client that found it was missing its analysis domino. The web analytics team was struggling to meet the strategy, implementation, and reporting demands of this large, complex organization — let alone provide analysis beyond just ad hoc responses. Senior management was becoming increasingly frustrated with its analytics staff and system. Fortunately, the web analytics team received additional headcount budget and hired an analyst to perform deep-dive analyses for all of its main product groups and drive actionable recommendations. Not surprisingly, the attitude of the senior executives did a 180-degree turn shortly after the company found its missing analysis domino.

You may be wondering how much time your analysts should spend on analysis. As a rule of thumb, I would say at least 25% of their time should be spent on analysis, and generally the more, the better. Surprisingly, 100% is not desirable either, because there are many other responsibilities needed to keep an analytics program afloat, such as reporting, gathering business requirements, training, documenting and communicating successes, etc. I hope after reading this article you at least recognize that 0% of their time is unacceptable. If your company isn’t doing much analysis today, experiment with a 10% focus on analysis and see what success you have from there. In addition, our consulting team is always willing to help with your analysis needs. Good luck!

16 comments
Cool Playbook

Great Post Ben! Really informative and I loved the comparison table especially. Thanks

Ahsan

Excellent post. An essential read for anyone involved with BI/DW implementation regardless of vendor or subject area.

Steve Fernandez

@Eric - I wouldn't fully discredit using a true analyst to do reporting; at least the building of the report. There's a real art in the transformation of raw data into something readable and meaningful. I personally take a lot of satisfaction in stretching the Tufte side of my mind to help the organization understand the sea of digital data that's captured. But, I agree wholeheartedly in automating the process as much as possible. Mind numbing drudgery needs to be eliminated at any opportunity.

ankur batla

Hi Brent, Thanks for writing this article. It has clarified my queries. :) Thanks again. :)

Abhijith

Well put. Any employee or employer who reads the post could discover two things: where they are, and what they should do. Abhijith

Eli Mueller

@Eric T. Peterson - I agree that if possible, a company should strive for increased allocation of time in the analysis stages of the process outlined so well. It's true that automated reporting, where applicable, should be actively pursued to help relieve less constructive tasks from the limited resources that most companies have. Unfortunately in most businesses, resources (both technological and human-based) are limited to the point where the 25% analysis may be a significant improvement.

Brent Dykes

Andrew, Thanks for your feedback. Brent.

Andrew

Fantastic article!! Very well written.

Brent Dykes

Matt, I agree wholeheartedly. As I wrote this post, I didn’t want to marginalize the role of reporting. It has its own unique role (it is a domino), but I hope that as companies discover the importance of analysis that they will be able to realize even more value from their web analytics investments. Brent.

Matt Coen

Brent, Well put. These concepts are fundamental to realizing the real value of tools like SiteCatalyst. Reporting is necessary but the money is in analytics. Matt

Brent Dykes

Adam, If this post can help a few companies in some small way, I’ll be thrilled. Thanks, Brent.

Adam Greco

Another great post! It is often hard to get companies to leave their analysts alone for enough time to do real analysis...Hopefully this post will help!

Brent Dykes

Web Analytics Europa, I’m glad you found the comparisons helpful. I felt they were necessary to help define the different roles of reporting and analysis. Hopefully, this article will help companies to identify where their analysts are currently focusing most of their time. Thanks, Brent.

Web Analytics Europa

We all knew it but with this article and the included comparisons it gives a very powerful approach to change things within the organisation to improve on your website and strengthen the competence of analytical people. Good read.

Brent Dykes

Eric, Thanks for your comments. I think we’re on the same page, but we might be differing slightly on the approach or emphasis. We both agree that analysis is important (“the more, the better” as I stated above). When I mentioned that analysts should be spending “at least 25%” of their time on analysis, I’m trying to encourage companies to get started with analysis. I think we’d both agree that the task at hand is not to get companies to close the gap from 60% to 80%. We’re trying to get organizations to close a bigger gap and go from the 0-10% mark in some cases to a higher level that will start to build tangible momentum or inertia for analysis in their companies – hence, my “at least 25%” goal. My wife started running four years ago, and she loves it. Her first race was a 5K in our neighborhood, and she eventually completed a full marathon. When you say that companies should have the goal of reaching 80% analysis mark (i.e., running a marathon), all I’m advocating is that companies start with a 5K first (>25%). Eventually, they’ll be both ready and excited to run a marathon just like my wife was. I believe when a firm has some success with analysis that it will fuel more analysis, but the key is getting started. That’s why I’m focusing on a smaller, more attainable goal in my post. Thanks again for your comments. We’re a united front for more analysis in our industry. Brent.

Eric T. Peterson

Brent, Great post echoing a lot of what I've been saying for over a decade. One issue: when you say analysts should spend "at least 25%" of their time on analysis (implying 75% on reporting and similar tasks) you really haven't moved the bar very much. At Web Analytics Demystified what we have long seen is that most experienced web analytics practitioners are spending 80% of their time on reporting and make-work functions and 20% (or less) of their time doing any type of real analysis. This is, of course, messed up for a variety of reasons:

1) Reporting, while valuable, is something that needs to be automated wherever and whenever possible, and reports need to be delivered through intuitive and easily learned systems. See my post on this subject from February of this year for more details.

2) Reporting very rarely translates into the type of insights that drive businesses forward, and so having your most highly trained and qualified people (analysts) spend 75% of their time (your number) producing low-value output doesn't contribute to web analytics return on investment. You captured this point well.

3) Most importantly, I don't know very many analysts worth their weight who enjoy reporting, regardless of what tool set they use. When people aren't able to "stretch their minds" and really consider what the treasure trove of data we work with can tell them about Change the Business initiatives, well, they get antsy. There is nothing worse than antsy analysts --- unless you're Corry Prohens over at IQ Workforce.

In our strategic practice we typically recommend that clients build out governance and staffing models that lead to tiered teams (easier to hire) and challenge their most senior resources with spending 80% of their time doing analysis, not reporting. Yes, 80% is the target, and yes, 80% is difficult to hit in the resource-constrained environment we work in, but in my experience we've already set our sights too low ... it's time to challenge ourselves, our leadership, and our community to do better. Again, great post. Eric T. Peterson Web Analytics Demystified, Inc. http://www.webanalyticsdemystified.com