Last week I took a look at part one of Michael Krypel’s new book Refining Design for Business, describing his “iterative optimization methodology” and a highly accessible, highly actionable approach to integrating consumer-centric design, testing, and strategic planning. Part two of his book provides those roll-up-your-sleeves next steps organizations need. Think of part two as the “now what?” stage, and a proven process hundreds of businesses have used to increase the efficacy of their site design. There are four key milestones within his method:
Optimization roadmap. Tapping into the qualitative and quantitative research you’ve culled, this stage is about developing a prioritized roadmap for testing across all areas of your organization. After each test there will likely be tweaks to your roadmap, with major adjustments coming during quarterly business reviews.
Optimization plan. Here, team members create plans for each test, including the objective, hypothesis, planned duration, approach, wireframes, and designs.
Optimization launch. The test is set up and taken through an exhaustive quality assurance (QA) process before launch.
Optimization results. Test results are analyzed, recommendations are made, and the roadmap is updated as needed. Results and next steps are communicated throughout your organization and the optimization process starts again, this time with revised goals based on these results.
Another unique piece of Michael’s process is the “paper trail,” and the weight that documentation carries when it comes to refining and optimizing design. In Michael’s opinion—and I couldn’t agree more—many organizations are focused on documenting everything, leaving them with more than any person could ever effectively use. At the other extreme, there’s the keep-nothing contingent, who run the risk of redundancy and duplicated effort because there’s no record of anything they’ve got in the works. Within the iterative optimization methodology, documentation should be:
- Lightweight, enabling others to quickly understand the lay of the land—what was done, why, and what resulted;
- Standardized, to ensure that the entire team, now and in the future, communicates in a predictable way, facilitating hand-offs and smooth transitions within the current team as well as to future teams during all steps of the testing and integration processes;
- Reusable, helping team members save time and avoid repetition when it comes to writing up work; and
- Integrated into a library that’s accessible for everyone, so current and future team members can assess what worked, what didn’t, and what specific methodology was used for each test.
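To make the idea concrete, a lightweight, standardized, reusable test record could be as simple as a small structured type feeding a shared library. This is only an illustrative sketch in Python; the field names and the sample entry are hypothetical, loosely drawn from the plan elements the book describes (objective, hypothesis, duration, results), not a format the book prescribes.

```python
from dataclasses import dataclass, field

@dataclass
class TestRecord:
    """One lightweight, standardized entry in a shared test library.

    Fields are illustrative, loosely based on the plan elements
    described above (objective, hypothesis, duration, results).
    """
    name: str
    objective: str
    hypothesis: str
    planned_duration_days: int
    approach: str = ""
    results: str = ""
    next_steps: list[str] = field(default_factory=list)

# The "library" can start as nothing more than a searchable list of records.
library: list[TestRecord] = [
    TestRecord(
        name="homepage-hero-cta",
        objective="Increase click-through to the deposit-account flow",
        hypothesis="A benefit-led headline will outperform the brand tagline",
        planned_duration_days=14,
    ),
]

# Current and future team members can quickly scan what was tested and why.
for record in library:
    print(f"{record.name}: {record.hypothesis}")
```

Because every record carries the same fields, write-ups stay lightweight and predictable, and hand-offs between teams need no translation step.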
When done correctly, documentation is an essential piece of the streamlining process, and can help keep the trains running at every stage.
Another piece of the methodology puzzle is qualitative research that, as Michael states, “offers an invaluable way not only to get into the customer mind-set, but also to come up with ideas for innovation.” From qualitative research and the results that ensue, Michael argues, businesses can generate test ideas while simultaneously gathering insights that can help the business evolve in a rapidly changing marketplace.
Unlike quantitative research, qualitative research provides an honest, authentic perspective on a given online experience. Businesses that rely solely on analytics and quantitative data fail to address the thought processes that compel consumers to continue on (or abandon) their journeys within a site. With that knowledge in hand, businesses can adjust, adapt, and respond to what consumers are thinking, feeling, and interpreting, rather than simply waiting for the numbers to be the sole driver of next steps. Qualitative research answers two key questions quantitative research can’t: what are the unmet consumer needs, and what is informing the analytics data? In other words, the “why” of it all.
Step one in integrating qualitative data is to define consumer goals that could explain why customers are choosing to engage with your business. Start by developing a “most likely” bucket based on quantitative data and existing customer feedback, such as opening a deposit account on a financial services site or looking for a gift on a home goods site. Michael suggests asking yourself, “If customers could come to the business looking for help with only one thing, what would it be?” Start at the top of the goals mountain and work down from there, addressing questions in broad swaths rather than focusing on small segments of your audience. Qualitative research won’t help you make niche design decisions, but it will help you understand the overarching motivations within your customer base.
Step two is formulating the questions, from the fundamental ones (why customers come to the site and how much they understand about the business), to questions about search capabilities (including the ease of finding content and the breadth and depth of offerings), to questions tied to the decision-making process (such as the relevance of the information and promotions provided and potential points of confusion). Michael even recommends questions on competition and conversion points, such as barriers to clicking “buy now” and discomfort with shipping fees and pricing. The list will change and evolve over time, and may take some unexpected turns as more qualitative and quantitative data is collected, but this is a good jumping-off point.
Beyond heuristic reviews and observational customer research (the more traditional qualitative assessments), Michael touches on other key research methods including ethnographic studies, surveys, customer panels, diary studies, card sorting, and feedback forms, plus ways to act on your new-found qualitative insights (hint: verifying through analytics) as well as next steps for testing. All of this consumer-driven data will become the foundation for your optimization roadmap, a key step in Michael’s methodology.
That’s not to say metrics and other quantitative data aren’t important. They’re critical to your organization’s success now and down the road. But for our purposes we’re assuming you’re tracking some core quantitative metrics:
- Page views
- Revenue per visit
- Conversion rate
- Click-through rate
- Bounce rate
- Average order value
- Page views per visitor
- Time on site
There are likely a host of other company-specific data points you’re tracking, but these are no-brainers that, when aligned with qualitative information, provide a more holistic view of your customers.
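Several of the core metrics above are simple ratios over visit-level data, and seeing them side by side makes the definitions unambiguous. Here is a minimal sketch in Python, assuming hypothetical session records with made-up field names and values; real analytics tools compute these for you, but the arithmetic is the same.

```python
# Illustrative sketch: computing several core metrics from session-level
# records. The field names and sample data are hypothetical.

sessions = [
    {"page_views": 5, "revenue": 40.0,  "converted": True,  "seconds_on_site": 310},
    {"page_views": 1, "revenue": 0.0,   "converted": False, "seconds_on_site": 12},
    {"page_views": 3, "revenue": 0.0,   "converted": False, "seconds_on_site": 95},
    {"page_views": 8, "revenue": 120.0, "converted": True,  "seconds_on_site": 540},
]

visits = len(sessions)
orders = sum(1 for s in sessions if s["converted"])
revenue = sum(s["revenue"] for s in sessions)

conversion_rate = orders / visits        # share of visits that convert
revenue_per_visit = revenue / visits     # total revenue divided by visits
average_order_value = revenue / orders if orders else 0.0
# Treating a one-page visit as a bounce, a common simplification:
bounce_rate = sum(1 for s in sessions if s["page_views"] == 1) / visits
pages_per_visit = sum(s["page_views"] for s in sessions) / visits
avg_time_on_site = sum(s["seconds_on_site"] for s in sessions) / visits

print(f"conversion rate:     {conversion_rate:.0%}")
print(f"revenue per visit:   ${revenue_per_visit:.2f}")
print(f"average order value: ${average_order_value:.2f}")
print(f"bounce rate:         {bounce_rate:.0%}")
```

The point of the sketch is the pairing in the prose above: each quantitative ratio tells you *what* happened, and the qualitative research supplies the *why* behind it.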
Next week we’ll put it all in action, with a comprehensive walk-through of test execution within this iterative approach, from laying the groundwork for successful design tests, to wireframing and design, to setup and quality assurance, and even the communications surrounding the wins that ensue.