In part one of this blog series, we talked about the importance of data but also the challenges of having so much of it. All this raises a good question: If a volume-based approach toward data and analytics is inappropriate, then why do we keep producing programs that accumulate more and more data?


You’ve definitely heard the words Big Data thrown about, and you’ve likely heard people say we’ve now entered the era of Big Data. Dramatic pronouncements aside, those statements are right. We now accumulate more user information than ever before, far more than any one person could hope to keep up with. If you’re a marketer and you try to personally handle all this data on your own, you’ll quickly short out your rational brain and end up falling back on your own fallible instincts.

Talk about counterproductive! We brought analytics to marketing in the first place so marketers could go beyond their ambiguous instincts and so they could use hard data to reach predictably successful decisions.

All this makes the Big Data story sound like a bit of a bind.

  • On the one hand, digital marketers need to use data to make their decisions.
  • On the other hand, digital marketers need to use an ever-increasing volume of data without letting that data overwhelm them.

The way out of this potential pitfall is actually pretty simple: marketers need to spend more time executing on all that data and less time collecting and simply visualizing it.

Execution Over Pure Accumulation

I’d love to put marketers who just accumulate unexecuted data on their own special episode of Hoarders. When it comes down to it, building up a mountain of untouched data is the IT equivalent of stuffing every last corner of your home with towering stacks of unread newspapers—there’s a lot of information sitting there, but it’s not doing you any good!

If you don’t execute on data, then that data is worthless. When you actually put your data to work, you’re going to reap the benefits of Big Data, as our clients have proven time and time again.

Leading financial services firm Citi used Adobe Insight to reduce delinquent payments by 15 percent by targeting at-risk customers with special marketing promotions pushing automatic payment reminders. Citi also used Insight to reduce customer service calls by 20 percent by identifying and correcting website abandonment points that led to unnecessary phone time.

Our clients have also found that the more sophisticated their analytics package, the better their results. Los Angeles-based Dollar Rent A Car optimized their Web presence with Adobe SiteCatalyst and SearchCenter+ before adopting Insight and Adobe Test & Target to utilize their offline data just as effectively. Implementing this expanded analytics package, Dollar Rent A Car produced a 45 percent ROI as they increased productivity, reduced service costs, improved customer segmentation and personalization, and improved promotional effectiveness.

The Data Keeps Getting Bigger

If you’re going to create effective marketing campaigns, you need to do more than just measure and visualize large quantities of user data in order to put up the veneer of helpfulness. You need to use an analytics solution that sifts through these ever-expanding metrics and offers highly targeted recommendations based on its evaluations.

And make no mistake, the volume of user data being sucked into these analytics solutions is growing at an exponential rate. If you think it’s hard to keep track of this data on your own right now, just imagine where we’ll be in five years.

Overall, this era of Big Data (and soon-to-be era of Even Bigger Data) is a good thing. All this data not only lets marketers understand their existing users better than ever, it lets marketers perform previously unheard-of feats of predictive marketing. With the volume of data we pull in and with a sophisticated solution, you can not only measure what your users do online, you can also discover who they are offline and even what they’re going to do next.

And that’s big.

Measure Everything, Look at Next To Nothing

The data required to accurately define and predict individuals and their future behavior is more than just big … it’s monstrous.

While there are a lot of bright people working in the world of digital marketing, even the brightest brain can only juggle so much data at any given moment. Forget about monstrous volumes of data; the human brain can only handle a relatively small amount of data before it shorts out and loses its rational decision-making capabilities.

But unlike even the brightest digital marketer around, the right solution can handle a practically limitless volume of data.

As I’ve said before, machines are just plain better than humans at sifting through huge amounts of data to find relevance within the mess of metrics laid out before them. That’s why we created these machines in the first place, and that’s why we keep making them smarter and more specialized for the sorts of analytics we’re creating for the future. After all, as the volume of data we can measure grows exponentially over the coming years, the (already poor) human ability to handle this data alone will continue to diminish, and an effective marketer’s reliance on data-sifting programs will increase just as dramatically.

In other words, although smart marketers understand the importance of utilizing all the data they can get their hands on, they actually look at very little of it. The smartest marketers pass along as much of this data-sifting, pattern-finding, and relevance discovery to machines as possible and only look at a few outputs. It’s a lot easier to look at, and make decisions from, a half dozen outputs than a trillion inputs.

This means the smartest marketer in the room isn’t the marketer who looks at the most data. In a lot of ways, the smartest marketer in the room is the marketer who looks at the least data.