In an update to part 2 of this series, I decided to reference the “Click, Baby, Click” ad that was launched on October 8, 2013 as part of an Adobe campaign aimed at helping marketers understand the need for more intelligent optimization tools. While comical in nature, it correctly teaches the dangers of using clicks as a primary measure of success.
Click here to watch the video.
As a quick reminder, these posts discuss how click tracking should be used only as a small part of a site’s measurement strategy. “Click Candy,” as I call it, should be consumed sparingly, much like real candy in a balanced diet. My four main reasons are:
- Click tracking is difficult and costly to implement, maintain, and report on.
- Heat maps and other click measurement tools rarely tell the whole story and have inherent flaws.
- Accurate conclusions are reached by measuring KPIs, not clicks.
- Clicks are not a measure of site usability or navigation effectiveness.
The last post covered the first two points and argued that the tools and methodology around measuring clicks are inherently difficult and produce questionable data. I’ll now address the last two reasons, along with a quick tip for implementing an appropriate replacement for click tracking.
Accurate conclusions are reached by measuring KPIs, not clicks.
This section stems from the fact that clicks are not a KPI (in most cases). Starting with an example: after a site redesign, the click metrics report that the new “watch video” button on article pages is red hot. Users love the video content, right? There is no way to tell, because we’re looking at the wrong metric. It’s quite possible that video ad views (the actual KPI) are down significantly because users click to watch videos but don’t watch enough of them to generate as many ad views as they did in the previous design. Focusing on clicks instead of video ad views misses the fact that the new design is not helping generate revenue.
Let’s also consider what happens if click data simply points you in the wrong direction. Using the same example, what if a heatmap reports that the new “watch video” button is not being clicked very often? The first impulse might be to test different variations of the button. Because it has received so few clicks, surely that is the culprit, right? Maybe…but maybe not. The site was redesigned, so it could be any number of things. Is the “watch video” button even valuable? Is it just in the wrong place on the page? Is there something else on the page preventing it from getting attention? The point is that the click data incorrectly anchors one into thinking the button is the problem, when the actual problem is low ad views, which may have nothing to do with the button at all.
Clicks are not a measure of site usability or navigation effectiveness.
I’ve saved this argument for last because it is probably the most commonly stated reason for measuring clicks. My disagreement stems from one concept: people are not robots. For whatever reason, many feel that users browse their sites in a defined order. Many feel that their users have a singular, defined purpose for coming to the site and will click from page A to page B and then place an order on page C, with no deviation; therefore, the number of clicks on key page elements means success. Is that how you browse a site? Of course not! You open multiple tabs on the same site, compare what the site offers against another site (probably also open in another tab), find side avenues of interest, and so on. If the number of times something is clicked is the measure of how useful it is, then real optimization opportunities are being missed.
Let’s go with another example. Say your site has internal search functionality. Any heatmap, overlay, or table report is going to tell you it is one of the most clicked areas on any page. The real question remains: is it helping the customer convert? Will measuring the number of clicks on the search results tell you that? No. Click counts report what users think is the answer to their search based on what is presented to them, not what actually answers their search and, ultimately, helps them convert. Millions of clicks on search results do not mean that the internal search tool is effective.
The same holds true for that special widget that Marketing and IT built together to help customers find what they need. Like internal search results, millions of clicks on it don’t mean anything if the users who clicked didn’t actually convert or go on to view more content.
What about navigation? The same argument applies. The number of clicks on a navigation element does not mean it was effective at getting users where they wanted to go. The only thing clicks tell us is that the element made users think it would take them where they wanted to go.
What should be measured and how?
The real measure of usability is whether the tools, layout, and menus provided help the customer convert, regardless of the path they took, what they clicked, or how many times they clicked it. If a site tool, navigation menu, or similar element is often used in the same visit as a conversion, or used by the same visitors who converted or viewed the most content on your site, then you can begin to identify patterns, trends, and correlations.
How does one measure this with SiteCatalyst? When users land on key pages or complete key actions (not clicks), record the tool that got them to that page/action in a prop and an eVar, along with a success event. To take this a step further, treat internal tool usage as you would external campaigns to the site: stack them, run advanced attribution modeling, see in what order they are used, etc. Find out which key tools, menus, or modules are used to find content or convert, and then use those reports as part of a testing program.
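To make that concrete, here is a minimal sketch of what the tracking call might look like. The variable slots (`prop10`, `eVar10`, `event5`) and the stand-in `s` object are hypothetical; in a real implementation you would use the slots configured in your report suite and the AppMeasurement `s` object already on the page.

```javascript
// Stand-in for the SiteCatalyst/AppMeasurement `s` object so the
// sketch is self-contained and runnable; on a real page this object
// is created by the AppMeasurement library.
var s = { events: "" };

// Call this when a user reaches a key page or completes a key action,
// passing the internal tool that got them there (e.g. "internal search",
// "product finder widget", "top nav"). It records the tool in both a
// traffic variable (prop) and a conversion variable (eVar), and fires
// a success event.
function trackToolSuccess(toolName) {
  s.prop10 = toolName;  // hypothetical prop slot: correlation/pathing reports
  s.eVar10 = toolName;  // hypothetical eVar slot: attribution to conversions
  s.events = "event5";  // hypothetical success event: "tool-assisted success"
}

trackToolSuccess("internal search");
```

On a live page you would then send the beacon with the standard `s.t()` page-view call or an `s.tl()` link call; the eVar’s allocation and expiration settings in the Admin Console control how credit is attributed when multiple tools are used, which is what enables the campaign-style stacking described above.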
Note that the question of causality is still open, because correlation is not causation. This is where the testing program comes into play: with common themes and trends beginning to emerge, you can identify areas to test and determine what is useful and what is not.
Eat your veggies!
Echoing what your mother likely told you: eat your veggies, and then you can have some candy! In analytics land, be sure all of your major KPIs are measured and reported properly, proper analytics governance is in place, and a well-thought-out optimization plan and organization is running smoothly before investing any time or money into measuring clicks. And who knows? You may just find that you’re “healthier” without all that “candy.”