Last week I took a look at part one of Michael Krypel’s new book Refining Design for Business, describing his “iterative optimization methodology,” a highly accessible, highly actionable approach to integrating consumer-centric design, testing, and strategic planning. Part two of his book provides those roll-up-your-sleeves next steps organizations need. Think of part two as the “now what?” stage, a proven process hundreds of businesses have used to increase the efficacy of their site design. There are four key milestones within his method:

Optimization roadmap. Tapping into the qualitative and quantitative research you’ve culled, this stage is about developing a prioritized roadmap for testing across all areas of your organization. After each test there will likely be tweaks to your roadmap, with major adjustments coming during quarterly business reviews.

Optimization plan. Here, team members create plans for each test, including the objective, hypothesis, planned duration, approach, wireframes, and designs.

Optimization launch. The test is set up and taken through an exhaustive quality assurance (QA) process before launch.

Optimization results. Test results are analyzed, recommendations are made, and the roadmap is updated as needed. Results and next steps are communicated throughout your organization, and the optimization process starts again, this time with revised goals based on these results.


Another unique piece of Michael’s process is the “paper trail,” and the weight that documentation carries when it comes to refining and optimizing design. In Michael’s opinion—and I couldn’t agree more—many organizations are focused on documenting everything, leaving them with more than any person could ever effectively use. On the other extreme, there’s the keep-nothing contingent, who run the risk of redundancy and duplicated effort because there’s no background or foundation for anything they’ve got in the works. Within the iterative optimization methodology, documentation should be:

  • Lightweight, enabling others to quickly understand the lay of the land—what was done, why, and what resulted;
  • Standardized, to ensure that the entire team communicates—now and in the future—in a predictable way, facilitating hand-offs and smooth transitions within the team and to future teams during all steps of the testing and integration processes;
  • Reusable, helping team members save time and avoid repetition when it comes to writing up work; and
  • Integrated into a library that’s accessible to everyone, so current and future team members can assess what worked, what didn’t, and what specific methodology was used for each test.

When done correctly, documentation is an essential piece of streamlining, and can help keep the trains running at every stage of the process.
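To make the lightweight, standardized documentation idea concrete, here is a minimal sketch of what a shared test record could look like. The field names follow the plan elements Michael lists (objective, hypothesis, planned duration, approach), but the structure itself, and the sample test, are my illustration, not the book’s:

```python
from dataclasses import dataclass, asdict
from typing import Optional

@dataclass
class TestRecord:
    """A hypothetical lightweight, standardized record for one test."""
    name: str
    objective: str                   # what the test is meant to improve
    hypothesis: str                  # the change and its expected effect
    planned_duration_days: int
    approach: str                    # e.g. "50/50 A/B split"
    result_summary: Optional[str] = None  # filled in after analysis

    def to_library_entry(self) -> dict:
        # A plain dict is easy to drop into a shared, searchable library
        # so future teams can see what was done, why, and what resulted.
        return asdict(self)

record = TestRecord(
    name="PDP hero image",
    objective="Increase add-to-cart rate",
    hypothesis="A lifestyle photo will outperform the product-only shot",
    planned_duration_days=14,
    approach="50/50 A/B split on the product detail page",
)
entry = record.to_library_entry()
```

Because every test is written up with the same handful of fields, hand-offs stay predictable and the library stays skimmable.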

Qualitative Research

Another piece of the methodology puzzle is qualitative research, which, as Michael states, “offers an invaluable way not only to get into the customer mind-set, but also to come up with ideas for innovation.” From qualitative research and the results that ensue, Michael argues, businesses can generate test ideas while simultaneously gathering insights that can help guide evolution and innovation in this rapidly changing marketplace.

Unlike quantitative research, qualitative research provides an honest, authentic perspective on a given online experience. By relying solely on analytics—and quantitative data—businesses fail to address the thought processes that compel consumers to continue on (or abandon) their journeys within a site. With that powerful knowledge, businesses can adjust, adapt, and respond to what consumers are thinking, feeling, and interpreting, and not simply wait for the numbers to be the sole driver of next steps. Qualitative research answers two key questions quantitative can’t: what are the unmet consumer needs, and what is informing the analytics data—in other words, the “why” of it all.

Step one in integrating qualitative data is to define consumer goals that could explain why customers are choosing to engage with your business. Start by developing a “most likely” bucket based on quantitative data and existing customer feedback, such as opening a deposit account on a financial services site, or looking for a gift on a home goods site. Michael suggests asking yourself, “If customers could come to the business looking for help with only one thing, what would it be?” Start at the top of the goals mountain and roll down from there, addressing questions in broad swaths rather than focusing on small segments of your audience. Qualitative research won’t help you make niche design decisions, but it will help you understand the overarching motivations within your customer base.

Step two is formulating the questions, from the fundamental ones (why customers come to the site and how much they understand about the business), to questions about search capabilities (including the ease of finding content and the breadth and depth of offerings), to questions tied to the decision-making process (such as the relevance of the information and promotions provided, and potential points of confusion). Michael even recommends questions on competition and conversion points, such as barriers to clicking “buy now” and discomfort with shipping fees and pricing. The list will change and evolve over time, and may take some unexpected turns as more qualitative and quantitative data is collected, but this is a good jumping-off point.

Beyond heuristic reviews and observational customer research (the more traditional qualitative assessments), Michael touches on other key research methods including ethnographic studies, surveys, customer panels, diary studies, card sorting, and feedback forms, plus ways to act on your newfound qualitative insights (hint: verifying through analytics) as well as next steps for testing. All of this consumer-driven data will become the foundation for your optimization roadmap, a key step in Michael’s methodology.

That’s not to say metrics and other quantitative data aren’t important. They’re critical to your organization’s success now and down the road. But for our purposes we’re assuming you’re tracking some core quantitative metrics:

  • Visitors
  • Visits
  • Page views
  • Revenue
  • Orders
  • Revenue per visit
  • Conversion rate
  • Click-through rate
  • Bounce rate
  • Average order value
  • Page views per visitor
  • Time on site

There are likely a host of other company-specific data points you’re tracking, but these are no-brainers that, when aligned with qualitative information, provide a more holistic view of your customers.
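Several of the metrics in the list are just ratios of the raw counts above it. As a quick sketch of that arithmetic (the function name and sample numbers are mine, and your analytics tool may define these metrics slightly differently, so check its definitions):

```python
def derived_metrics(visitors, visits, page_views, orders, revenue):
    """Compute the common ratio metrics from raw tracking counts.

    Uses the conventional definitions: conversion rate is orders per
    visit, average order value is revenue per order, and so on.
    """
    return {
        "conversion_rate": orders / visits,
        "revenue_per_visit": revenue / visits,
        "average_order_value": revenue / orders,
        "page_views_per_visitor": page_views / visitors,
    }

# Illustrative numbers only.
m = derived_metrics(visitors=8_000, visits=10_000,
                    page_views=42_000, orders=250, revenue=18_750.0)
# conversion_rate: 250 / 10_000 = 0.025 (2.5%)
# average_order_value: 18_750 / 250 = 75.0
```

Keeping the derived metrics as formulas over the raw counts, rather than tracking them independently, makes it easy to sanity-check that the numbers agree with each other.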

Next week we’ll put it all into action with a comprehensive walk-through of test execution within this iterative approach, from laying the groundwork for successful design tests, to wireframing and design, to setup and quality assurance, and even communicating the wins that ensue.