This is the third in a series of posts in which I discuss the potential “best” measurement for online advertising. While some audience measurement firms believe time-spent-on-site is set to take over the waning page view as the most effective measure of visitor engagement, I believe they’re wrong.

I believe it is actually impressions that should become the standard by which buyers and sellers of online media negotiate their buys. And perhaps, in the future, another metric like clicks will gain more footing in the non-search advertising world (clicks are already the standard in search advertising).

For a quick summary of my argument, read the first in the series. For more on time-spent-on-site, read on…

In their announcements, audience measurement firms have suggested that time-spent-on-site is a more equitable way than page views to measure visitor engagement in a Web 2.0 world.

The rationale is that Web 2.0 technologies like AJAX do not adhere to the traditional page metaphor, so time spent on site is a better form of engagement measurement. (For a description of why pages that use Web 2.0 technologies are difficult to measure using traditional page views, see Measuring Visitor Engagement: Brave New World or The Emperor’s New Clothes.)

To that end, Scott Ross, director of product marketing at Nielsen, commented in an interview that, based on everything that’s going on with the influx of AJAX and streaming, total minutes is the best gauge for site traffic.

By way of example, Ross referenced, in the same interview (or read about it in a ComputerWorld article from Oct. 7, 2007), a comparison between MySpace and YouTube: MySpace outnumbers YouTube’s page views by a factor of 10 or 11, but its lead in time spent is about 70% smaller, at roughly 3 to 1.

Sites like AOL and Yahoo also catapult to the top of the “engagement” list, driven by interactive applications like Instant Messenger and email.

With all of this in mind, it becomes increasingly evident why time spent on site has attracted so much attention. In a way, audience measurement firms didn’t have a choice. They needed to move beyond the page view to measure engagement, or the “quality” of the visit (but not necessarily the quality of the visitor).

But something else happens with time-spent-on-site. Audience measurement firms regain the high ground. Why? Because time spent on site is actually dictated by the panelist, not by the site they visit. In other words, time spent on site is measured by a meter that sits on the user’s computer, and unlike page views, sites have little ability to manipulate or corrupt the number.

At this point, you could argue that this is great – kill two birds with one stone. Improve engagement or quality-of-visit measurement, and offer a metric that can’t be dramatically biased by sites themselves.

While I haven’t seen any articles mention this angle of the business (control vs. not), it’s nonetheless supportive of the claims that this is Brave New World territory and better for the industry as a whole.

But as I mentioned earlier, I actually disagree.

Time-spent-on-site has been available for years as a standard website metric.  In fact, it has changed very little since the prehistoric era when server log files were used to measure site traffic (yes, it was available back then).

So is it really a step forward for audience measurement, an industry that has recently come under fire for major challenges in its underlying methodologies and data accuracy? And is it really a step forward for advertisers and publishers, who have likewise struggled to create an effective marketplace for buyers and sellers of ad inventory (hence the meteoric rise of third-party ad networks)?

I think not.  Next time, I’ll write about impressions, and why using impressions offers a significant opportunity for advertisers and publishers to create a more efficient marketplace.

michael k

Just finished reading all three 'visitor engagement' entries and now have more questions than answers! I run a resource site for traveling auto racing fans with a decent amount of content (i.e.: city guides, venue feedback, images, restaurants, lodging, etc.). The site employs both tabs and Ajax, so it seems that impression numbers will never give me the complete story when it comes to users. Because of this, I have been using 'time spent' as a major KPI. Further, when it comes to advertising, it seems that impressions and 'time spent' can't be separated. If I am competing for ad dollars against another, similar site, it seems like there has to be a way to leverage 'time-on-page' in addition to merely impressions. If both sites have the same number of overall impressions on the Talladega Superspeedway page, but theirs averages only 30 seconds per page while my users were on the page for over 3 minutes, wouldn't I be leaving ad dollars on the table if I can't leverage that fact? If nothing else, it seems like my site's higher time-on-page number will convince the advertiser that my site is more worthy of their dollars. If impressions were the metric advertisers used to compare the two sites, wouldn't they simply 'go with their gut', pick the prettier site, or even flip a coin to determine which site to advertise on?
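The trade-off in this comment can be made concrete with a back-of-the-envelope calculation. The numbers, the function name, and the weighting formula below are all illustrative assumptions, not an industry-standard metric:

```python
# Hypothetical comparison of two sites with equal impressions but
# different average time-on-page. "Attention-weighted impressions"
# is an invented metric for illustration only.

def attention_weighted_impressions(impressions, avg_seconds_on_page,
                                   baseline_seconds=30):
    """Scale raw impressions by time on page relative to a baseline."""
    return impressions * (avg_seconds_on_page / baseline_seconds)

# Competitor: 100,000 impressions at 30 seconds per page.
site_a = attention_weighted_impressions(100_000, 30)
# The commenter's site: same impressions, 3 minutes per page.
site_b = attention_weighted_impressions(100_000, 180)

print(site_a)  # 100000.0 -- identical to the raw impression count
print(site_b)  # 600000.0 -- 6x the weighted value on equal impressions
```

On raw impressions alone the two sites look identical; any weighting by time on page, however it is priced, separates them by a factor of six in this example.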

David Risdate

Though I agree with your basic premise about Time Spent being potentially not a step forward, I fear you approach this from a very colored perspective (being employed with a site analytics firm). Time Spent, as measured by audience measurement firms, is accurately captured by software on a user's computer. However, companies that use page-tagging methods to track behavior are unable to provide a complete picture because the last page of a visit is never included in time spent calculations (since there is no subsequent server call to compare timestamps). This is simply a fundamental challenge in page-tagging and accepted as being a 'flaw' in the process. However, this means that Time Spent measures from companies like Coremetrics and Omniture are inherently incorrect. What's odd in your posting above is that you say you disagree with the push of Time Spent, yet you don't mention why. Pithy statements such as "I think not" don't provide any detail as to why you are disagreeing with the industry. And you don't make any mention at all of the measurement flaw noted above and how it puts companies like yours in a more challenging position to find alternate methods of measuring 'engagement'.
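The page-tagging flaw this comment describes can be sketched in a few lines. The function and data below are a simplified assumption about how tag-based analytics derive time spent from page-view timestamps, not any vendor's actual implementation:

```python
# Minimal sketch of tag-based time-spent calculation: sum the gaps
# between consecutive page-view timestamps within one visit.
# Timestamps are seconds into the visit; the data is invented.

def time_spent(page_view_timestamps):
    """Sum the gaps between consecutive page views in one visit."""
    total = 0
    for prev, curr in zip(page_view_timestamps, page_view_timestamps[1:]):
        total += curr - prev
    return total  # the last page has no next timestamp, so it adds nothing

# A visit: pages viewed at t=0, 40, 100, and a final page at t=130.
visit = [0, 40, 100, 130]
print(time_spent(visit))  # 130 -- however long the visitor read the
                          # final page, it never enters the total
```

This is exactly the flaw noted above: without a subsequent call to diff against, time on the last page of the visit is simply dropped.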

Tammy A.

Time spent on site is also one of those tricky metrics because it can be artificially high. Back in the day, I used to watch the average time spent on site metrics for a particular site I was managing. While interesting, it could also be misinterpreted. More time spent on the site doesn't always equal a happier visitor. It could be they couldn't find what they wanted and had to spend more time searching. It could also mean they were on your site and simply left their browser open and went on doing something else, which artificially inflates the sense of engagement they had. (Of course eventually the 30-minute rule comes in, but I think you get my point.) Should be interesting to see how this plays out.