Question 3 – Usability Testing

Question 3 from last weekend’s unanswered list is from Mike Lyman:

“What is the best software to use to A/B test a site w/changes to get better conversion?”

Great question. My answer’s probably going to be a strange one – the best software is eyes, brains, and fingers. To clarify that (which would seem necessary, eh?), I prefer to run direct usability tests of the site at different phases of the project – pre-testing before determining new designs, post-testing of the new designs/layouts, and then very careful click/path analysis after launch to see whether the ‘real world’ reaction follows suit. (I’m hoping you got a chance to see Jared Spool’s session at the Jam Session, which probably answered this question far better than I could.)

There are, of course, eyetracking software packages that can tell you where user attention goes on a page design with stunning accuracy (Morae being my favorite – which may answer your question, but keep reading, please!), but I usually fall back on a warm body in a chair – specifically, the users of your site. Recruit a few site visitors (perhaps people who have responded through online contact forms, or who have already expressed issues with the site), then determine the real user paths you want to support with the design/layout/navigation, create specific, goal-oriented test cases, and record the results. You might even want to run the same tests with people who aren’t familiar with your products, services, or site, to see what a fresh set of eyes and opinions can bring to your current design.

For example, if one goal of your redesign is to improve the conversion of inbound clicks to sales, a use case could be simply ‘from the home page, find a product that lets you do (X), and then purchase it’. The paths and flows a real-world user will take may astound you, but are priceless in determining how link structure, site navigation and general design may be hindering the simplest of tasks.
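A goal-oriented test case like the one above can be recorded quite simply. Here is a minimal, hypothetical sketch of one way to capture each participant’s run – the class and field names are illustrative, not from any particular usability tool:

```python
# Hypothetical sketch of recording a goal-oriented usability test case.
# All names here (UsabilityTestCase, expected_path, etc.) are illustrative.
from dataclasses import dataclass, field


@dataclass
class UsabilityTestCase:
    task: str                        # the goal given to the participant
    expected_path: list[str]         # the path the design intends to support
    observed_path: list[str] = field(default_factory=list)
    completed: bool = False          # did the participant reach the goal?
    notes: str = ""

    def path_matches(self) -> bool:
        """Did the participant follow the path the design intended?"""
        return self.observed_path == self.expected_path


# Example run for the purchase task described above:
case = UsabilityTestCase(
    task="From the home page, find a product that lets you do X, then purchase it",
    expected_path=["home", "store", "product-x", "cart", "checkout"],
)
case.observed_path = ["home", "products", "catalog", "product-x", "cart", "checkout"]
case.completed = True
# path_matches() is False here: the participant reached the goal but skipped
# the 'store' link entirely - exactly the kind of finding worth recording.
```

Comparing `expected_path` against `observed_path` across a handful of participants is what surfaces the navigation problems discussed next.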

This ‘first round’ of testing usually drives the requirements for the redesign. For example, if no one clicked the ‘store’ link while performing the task above, but instead shopped the products section of the site and added items to a cart from its catalog pages, you may want to re-examine the position and relevance of the store link entirely – or perhaps make the catalog itself more front-and-center in the design/navigation, so fewer clicks are required to find and select a product for purchase.

A second round of testing with the new design/layout/site usually helps validate your design decisions. Did the user task behavior change appropriately to better support your goals? Or did it introduce new problems? This way you can start drawing direct correlations between cause and effect – how your design affects user interaction.

I realize this may be a bit of a dodge of your question – which software best supports this process – but honestly, I have a minor lack of faith in heuristic software to find these navigational and design shortcomings; it just doesn’t mirror real humans clicking on real links, buttons, and items on your site. Plus, the manual approach is far more fun. ;-)

Jakob Nielsen has some good (and recent) thoughts on user testing here, and of course I’ve adopted many of my own opinions on UE testing from Jared Spool and the UIE team (I attended my first seminar with Jared about 10 years ago, and the advice I got there has stuck with me to this day).

Hope this helps answer your question (especially as this is not my forte, but something I’ve done a lot of and really enjoy). If anyone has additional thoughts or critiques, please bang out a comment below.

2 Responses to Question 3 – Usability Testing

  1. FWIW, we did A/B testing. We configured Apache to serve different versions of pages to different users based on a programmable percentage (and we cookied users so they would get a consistent experience on repeat visits). We used extensive web analytics to track the effects of the A/B versions, then picked one version or the other based on the results (assuming we got anything significant from the tests).

  2. Interesting – I love the idea of combining the analytics/tracking package with A/B version testing. I’ll have to pick your brain about that offline sometime soon… :D
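The cookie-based A/B bucketing described in the first comment – a programmable percentage, with cookies keeping repeat visitors in the same bucket – could be sketched roughly as follows. This is a minimal illustration, not the commenter’s actual Apache setup; the function names and the `ab_variant` cookie name are assumptions:

```python
# Hypothetical sketch of cookie-based A/B bucketing: a programmable
# percentage of visitors see variant "B", and the assignment is persisted
# in a cookie so repeat visits get a consistent experience.
import hashlib

B_PERCENTAGE = 20  # percent of visitors who should see variant "B"


def assign_variant(visitor_id: str, b_percentage: int = B_PERCENTAGE) -> str:
    """Deterministically map a visitor to variant "A" or "B".

    Hashing the visitor id (rather than rolling a random number) means the
    same visitor always lands in the same bucket, even before any cookie
    has been set.
    """
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    bucket = int(digest, 16) % 100  # an integer in 0..99
    return "B" if bucket < b_percentage else "A"


def variant_for_request(cookies: dict, visitor_id: str) -> tuple[str, dict]:
    """Pick the variant for one request, honoring any existing cookie."""
    if "ab_variant" in cookies:  # returning visitor: keep their assignment
        return cookies["ab_variant"], cookies
    variant = assign_variant(visitor_id)
    # Persist the assignment so repeat visits stay consistent.
    updated = {**cookies, "ab_variant": variant}
    return variant, updated
```

The variant returned for each request would then be attached to the analytics events, so conversions can be segmented by A vs. B exactly as the comment describes.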