In my last post, I detailed the importance of observation, one of the practices supporting associative thinking as described in an important book titled The Innovator's DNA. This post features a casual case study that explains how my team at Adobe was able to organize and prioritize digital testing in a brave, new, digital marketing world.

The Innovator's DNA quotes New York Times writer Peter Leschak, noting that we are all "watchers," sprinkled with but a handful of "observers." Because my team acted as engaged observers of our customers, our competitors, and ourselves, we were able to create a systematic testing framework that helped our company gain an important edge.

Casual Case Study: Adobe Digital Marketing Optimization Framework

Ten years ago, the virtual marketplace was in its infancy. Marketing teams put together the first advertisements, videos, banners, and webpages, encouraging customers to shop, learn, engage, and communicate online.

As online marketing grew, it became clear to managers that we all needed a new way to measure the impact of our efforts. Never mind that frameworks for A/B testing advertisements had been devised as early as the 1890s by advertising pioneer Albert Lasker. This high-tech environment was demanding chic new tools to measure performance.

My team saw that the need for website testing was not being served, but we were unsure how to approach a solution. We needed to embrace a testing framework that allowed strategic prioritization across a broad range of business objectives. We started by observing. We looked at our customers, at our competitors, and at ourselves, thoughtfully creating a rough initial framework.

We dissected the website, deciding to slice the pie into four equal pieces: layout, content, creative, and functionality. It was basically four big buckets of virtual information, categorized but not really organized, and certainly not prioritized.


The more we observed, the more organized we became and the easier it was to prioritize. We progressed to a model that gave each category its own separate sphere. With sphere size representing the relative resource time required, we weighed test volume against conversion impact and placed the spheres along the x and y axes accordingly.


We discovered that layout and functionality had the biggest impacts on conversion, but both required a great deal of time and technical knowledge to manipulate. That meant switching things out for testing would prove expensive and time consuming, but getting it right was imperative. We found, again through observation, that sites with low functionality or with poor layouts failed at conversion. We also saw that content and creative were relatively easy to create, change, and test. They could, and should, be tested only after getting our foundations of layout and functionality built on solid virtual ground.
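The foundations-first ordering above can be expressed as a tiny prioritization sketch. The four category names come from this post, but the numeric scores below are hypothetical placeholders invented for illustration, not Adobe's actual data:

```python
# Sketch of an OPO-style test prioritization. Scores are assumed
# illustrative values on a 1-5 scale, not real measurements.
from dataclasses import dataclass

@dataclass
class TestCategory:
    name: str
    conversion_impact: int  # 1 (low) to 5 (high), assumed scale
    resource_effort: int    # 1 (cheap to change) to 5 (expensive)

categories = [
    TestCategory("layout",        conversion_impact=5, resource_effort=5),
    TestCategory("functionality", conversion_impact=5, resource_effort=4),
    TestCategory("content",       conversion_impact=3, resource_effort=1),
    TestCategory("creative",      conversion_impact=3, resource_effort=2),
]

# Foundations first: the expensive, high-impact categories (layout,
# functionality) must be solid before cheap iterative tests on the rest.
foundations = [c for c in categories if c.resource_effort >= 4]
iterative = [c for c in categories if c.resource_effort < 4]

# Within each group, order by conversion impact per unit of effort.
score = lambda c: c.conversion_impact / c.resource_effort
plan = (sorted(foundations, key=score, reverse=True)
        + sorted(iterative, key=score, reverse=True))
print([c.name for c in plan])
# → ['functionality', 'layout', 'content', 'creative']
```

The design choice mirrors the observation in the text: cheap-to-change categories sort high on impact-per-effort, but they still queue behind the foundational work.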


Ultimately, we designed a reliable framework that organized and prioritized our marketing plans, helping my team to optimize our marketing efforts. We created a new position titled "web optimization" that focused primarily on online testing. Organizing, prioritizing, and optimizing, the Adobe OPO Digital Marketing Framework, born of observation, was one of the first in the industry, giving us a specific advantage in analyzing our marketing efforts. How can you optimize your day today?