Recently I have spent a lot of time discussing with customers the importance of importing historical web analytics data when they move from one analytics platform to another.

Overwhelmingly, I hear marketers say that they must be able to import all data when they make the switch, and overwhelmingly, they get wide-eyed with anxiety when I tell them this: I am not a big fan of importing historical data. I believe it is only of value in some very specific circumstances.

Don’t get me wrong. When it comes to historical offline data, or data from an email campaign or an ERP system, that information is of immense analytical value. But when it comes to comparing the general data in one analytics system to another, I believe it’s a fool’s errand: it wastes time and takes a tremendous amount of effort for very little payoff.

There are many reasons for this, and my friend Avinash recently expressed his own concerns on importing historical data on his Occam’s Razor blog – many of which I agree with wholeheartedly.

That said, here are some of my own concerns and thoughts on what to import when you switch analytics vendors:

-> Concern #1. Inconsistency between platforms

Not all platforms count metrics the same way. Take unique visitors: the industry has several definitions of what a “unique visitor” really means, but none of them is a standard. There is nothing to comply with, so each analytics package counts them in a slightly (or hugely) different way.

There are differences in how analytics packages handle browser types, how they handle mobile traffic, how they handle cookies… and these are just differences between two tag-based analytics systems. If you’re moving from a tag-based system to one based on log files, the gap is even wider, because the two approaches process traffic in fundamentally different ways.

The bottom line: it’s almost impossible to compare these numbers meaningfully.

-> Concern #2. The implementation itself

When a company deploys a new analytics system, changes are always made. You may put your tags at the top of the page rather than at the bottom. You go from a static HTML page to one that uses Flash. You deploy a variety of small, seemingly inconsequential changes.

But those changes mean that in addition to comparing data from one system to another, you’re looking at two different implementations.

It’s like comparing apples to aardvarks.

This affects the comparability of your data, and it can lead to misinformed decisions.

So what can you track?

When companies switch to Omniture analytics and they want to import historical data, I suggest that they focus on their KPIs. Key performance indicators express how effective a website is at achieving a specific business goal. So, while your underlying data may shift, the *relationship* between the KPIs reported by one system and the other should not be dramatically different.

So, if you used one analytics platform for two years and your conversion rate was 2 percent, and you deploy a new one and your conversion rate is 4 percent, then, assuming that you haven’t changed anything on the site itself, that two-to-one relationship between the two systems’ numbers should remain consistent over time.
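To make that concrete, here is a minimal Python sketch of the idea. Every figure in it is hypothetical; the point is only that a stable ratio between the two systems, not the raw numbers, is what lets you restate old KPIs on the new scale:

```python
# Hypothetical daily conversion rates from the old and new platforms,
# measured over the same overlap period. The absolute numbers differ,
# but the ratio between the two systems should stay roughly constant.
old_rates = [0.020, 0.019, 0.021, 0.020]  # old platform, by day
new_rates = [0.040, 0.038, 0.042, 0.040]  # new platform, same days

ratios = [new / old for new, old in zip(new_rates, old_rates)]
avg_ratio = sum(ratios) / len(ratios)
print(f"average new/old ratio: {avg_ratio:.2f}")  # prints 2.00 here

# If the ratio is stable, a historical KPI from the old system can be
# restated on the new system's scale for trend comparison:
historical_rate_old = 0.022
restated = historical_rate_old * avg_ratio
print(f"restated historical rate: {restated:.3f}")
```

If the day-to-day ratios are all over the place, that itself is the finding: the two systems are not measuring comparably, and restating historical numbers would be misleading.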

How, then, do you handle data when migrating?

-> Step 1. Comparison in Excel

One of the first things you should do, before you try to import a bunch of data, is slow down and do a very simple comparison on a daily basis in Excel. Look at the difference between your systems, and develop an understanding of what the relationship actually looks like.

What often happens is people look at the relationship and say, “Our page views are down 20 percent!”

Don’t fret. Instead, relax and look at the relationship between the two. Are page views down 20 percent consistently, across the board? If not, there might have been something else going on during the time the relationship changed — perhaps there were holidays, an election, or the Super Bowl that could have biased your numbers. Or maybe there was a promotion that marketing was running that could have affected the specific period during the switch.

If, for example, your marketing department launched a free shipping promotion in the week leading up to the transition, conversion numbers went up, and you switched to the new platform just as the promotion ended, you would have not only an inherent difference in measurement but also a drop in conversions due to the conclusion of that campaign.

Bearing in mind the outside elements, try to discover what the ratio looks like between the data from the two systems.
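The daily comparison described above can be sketched in Python as easily as in Excel. The traffic figures and the 10 percent threshold below are made up for illustration; the technique is simply to compute the new/old ratio for each day and flag the days that stray from the typical value:

```python
from statistics import median

# Hypothetical daily page views exported from each system over the
# same week, as (day, old platform, new platform).
daily = [
    ("Mon", 10_000, 8_100),
    ("Tue", 11_200, 9_050),
    ("Wed",  9_800, 7_900),
    ("Thu", 12_500, 8_400),
    ("Fri", 10_400, 8_350),
]

# New/old ratio for each day, and the typical (median) ratio.
ratios = {day: new / old for day, old, new in daily}
typical = median(ratios.values())

# Flag days whose ratio strays more than 10% from the median; those
# are the days to check for promotions, holidays, or tagging changes.
for day, r in ratios.items():
    if abs(r - typical) / typical > 0.10:
        print(f"{day}: ratio {r:.2f} vs typical {typical:.2f}, investigate")
# With these numbers only Thu gets flagged.
```

A consistent ratio with one or two flagged days usually points to an outside event on those days, not a problem with either system.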

-> Step 2. Augment your data with notes on external events, or “event notification”

Some analytics packages, such as Omniture’s SiteCatalyst, let you annotate your data with notes about what was happening externally when it was collected. For example, you can go into SiteCatalyst and make a note that on Jan. 5 you switched analytics platforms. When you look at the data in the dashboards, the note shows up to help you understand any changes.

-> Step 3. Don’t let management look at the difference in data without giving them a heads-up

It’s very important to inform management that the numbers are not going to match. Set their expectations that the numbers will be different, and tell them how you plan to reconcile the difference. Otherwise, there will be disgruntlement and heartache.

Also communicate with line-of-business managers and other key stakeholders.

Only at this point should you think about importing historical data into your system.

If you do decide to import KPI data, it’s really not that complicated, but it’s of little value if people don’t understand the differences between the two systems. Make sure management understands the ramifications, and remember to do an event notification to indicate when the change in the system occurred.

If you’d like to discuss historical data migration or have some thoughts on the subject, please don’t hesitate to leave us a comment!