In the last two weeks I have taken my car to the shop twice. The first visit was to get the oil changed, and just yesterday I had new tires put on. Having paid for both of these maintenance items so close together, I can say that getting new tires was much more gratifying than having the oil changed. After the oil was changed, I noticed absolutely nothing different about the car. A technician would tell you that I prolonged the life of the car by six months, but I can't see any of that benefit today. The new tires, however, instantly gave me better fuel economy, and my car no longer pulls precariously left into oncoming traffic.

Some business models are more like changing tires in that they naturally lend themselves well to testing because their monetization is so readily transparent. If you are selling widgets, you can easily see which test recipe sells more widgets. Of course if you can do that, you will likely take it a step further, looking at revenue per visitor to see which recipe hits the price elasticity sweet spot of conversion rate and average order value.

If your business is media, though, testing can feel like changing oil because immediate visible results may be difficult to spot, even if they are there. All publishers generate revenue through their digital properties in one fashion or another. Often this happens through brand advertising sales, typically under an impression-based CPM model. When it comes to testing changes on these sites, few publishers can draw a transparent connection between what a visitor does on their site and how the income statement is impacted in the short term. The impact may be there, but it is rarely as readily apparent as with other business models. Because of this, some marketers in our industry simply resign themselves to the fate that media testing just can't be monetized. When that happens, a little part of me dies inside.

Although monetizing the impact of testing is inherently more difficult for publishers than it is for retailers with a transparent monetization model, there are still a number of ways to monetize the impact of your testing so that you can act on the results. Below are a few critical points on how to do this.

One Size Does NOT Fit All
Years ago I worked for a company that would hold its annual conference in a glamorous location each year. As part of this annual pep rally, attendees were given commemorative t-shirts that reinforced the company's newest marketing slogan. In an attempt to make sure that no one received a shirt that was too small, everyone received an XXL shirt. While they successfully avoided offending anyone with a too-small shirt, the vast majority of people received shirts that were too big. Consequently, almost no one wore them.

With your media testing program you're not giving away shirts in bulk. You have the luxury of tailoring your testing solution and metric to your program. Below I have outlined a couple of media test metrics, from the simplest, out-of-the-box options to metrics that are more sophisticated and customized. This is just a sampling, as an entire post could probably be dedicated to this list alone. What works best ultimately depends on your organization.

Page Views Per Visit
This “engagement metric” is one of the standard metrics built into the Adobe Test&Target platform. It requires that you deploy a conversion mbox on every page of the site. Most organizations that use this metric deploy the mbox globally through their CMS, which can be fairly simple. From there, Test&Target simply counts how many pages each visitor consumes per visit after they have seen the test experience. Because Test&Target keeps track of which experience the visitor saw, it can sum the page consumption for all visits in a test and determine which test experience produces more page consumption. This is a fairly simple approach, and it assumes that all pages carry the same monetized value, an assumption that is true for a select few, but not for most.
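The aggregation behind this metric is straightforward. The sketch below, a standalone illustration rather than anything from the Test&Target platform itself, shows the logic on a hypothetical hit log where each page view records a visit ID and the test experience the visitor saw:

```python
from collections import defaultdict

# Hypothetical hit log: one (visit_id, experience) entry each time the
# global conversion mbox fires on a page view. IDs and data are invented.
hits = [
    ("v1", "A"), ("v1", "A"), ("v1", "A"),               # visit v1, experience A, 3 pages
    ("v2", "B"), ("v2", "B"),                            # visit v2, experience B, 2 pages
    ("v3", "A"), ("v3", "A"),                            # visit v3, experience A, 2 pages
    ("v4", "B"), ("v4", "B"), ("v4", "B"), ("v4", "B"),  # visit v4, experience B, 4 pages
]

def page_views_per_visit(hits):
    """Average page views per visit, split by test experience."""
    # Count page views within each visit.
    pages = defaultdict(int)  # (visit, experience) -> page count
    for visit, experience in hits:
        pages[(visit, experience)] += 1
    # Roll visits up to experience-level totals.
    totals = defaultdict(lambda: [0, 0])  # experience -> [page sum, visit count]
    for (visit, experience), count in pages.items():
        totals[experience][0] += count
        totals[experience][1] += 1
    return {exp: total / visits for exp, (total, visits) in totals.items()}

print(page_views_per_visit(hits))  # {'A': 2.5, 'B': 3.0}
```

In this invented sample, experience B wins with 3.0 pages per visit against A's 2.5, which is exactly the comparison the built-in metric surfaces.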

Page Score Per Visit
Many organizations recognize that not all content is monetized at the same rate, and for that reason want to weight pages differently when evaluating test experiences. This can be accomplished by adding a page score parameter to the mbox on each page of the site. With this approach Test&Target sums each of these page score values for every page in a visit to produce a weighted score for that visit. This method obviously requires more effort to produce unique page scores on every page, but the result is a metric more finely tuned to how revenue is actually generated on your site.
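To make the weighting concrete, here is a small sketch of the same aggregation with per-page scores. The page names, score values, and log format are all invented for illustration; in practice the score would arrive as a parameter on each page's mbox:

```python
from collections import defaultdict

# Illustrative page scores, as might be passed as an mbox parameter.
# The page types and weights here are assumptions, not real values.
PAGE_SCORES = {
    "homepage": 1.0,
    "article": 3.0,   # article pages carry the most ad inventory
    "gallery": 2.0,
    "about": 0.5,
}

# Hypothetical hit log: (visit_id, experience, page_type) per page view.
hits = [
    ("v1", "A", "homepage"), ("v1", "A", "article"),
    ("v2", "B", "homepage"), ("v2", "B", "gallery"), ("v2", "B", "article"),
]

def page_score_per_visit(hits, scores):
    """Average weighted page score per visit, split by experience."""
    # Sum the page scores within each visit.
    visit_scores = defaultdict(float)  # (visit, experience) -> score sum
    for visit, experience, page in hits:
        visit_scores[(visit, experience)] += scores[page]
    # Roll visits up to experience-level averages.
    totals = defaultdict(lambda: [0.0, 0])  # experience -> [score sum, visit count]
    for (visit, experience), score in visit_scores.items():
        totals[experience][0] += score
        totals[experience][1] += 1
    return {exp: total / visits for exp, (total, visits) in totals.items()}

print(page_score_per_visit(hits, PAGE_SCORES))  # {'A': 4.0, 'B': 6.0}
```

Notice that a visit with fewer page views can still outscore a longer visit if it lands on higher-value pages, which is the whole point of weighting.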

Few organizations make it this far, but those who do often go further to tackle even more sophisticated metrics such as ad count per visit (accounting for pages with multiple ads) or even ad margin per visit.

Focus on Progression, Not Perfection
Looking at the options above, it is natural for an organization to set its sights on nothing less than a metric that completely replicates its site monetization. The problem with this line of thinking is that many testing programs stall out while working toward perfection when they could instead be making small, incremental steps forward. Your ultimate goal may be to become Mr. Universe. If you can't do 10 pushups today, however, you need to start there.

Don’t fixate on what your perfect testing monetization model is going to look like. I have seen several testing programs never launch a single test because they were immobilized by the quest for perfection. Instead, identify where you are today and then figure out what your next step looks like. With this approach you are much more likely to make progress and find some actionable test wins than if you wait for that perfect metric to be ready.

Testing for media is always going to be a little less clear-cut than in other industries. Don’t give up, though. Start now by identifying where you are today and determining where you can realistically be tomorrow. With some strategic planning and carefully tuned metrics, your test results can move beyond minuscule oil-change improvements in performance to give your site a full engine overhaul.
