Posts in Category "Methodology"

Using live web-based user interaction studies

Over the past few months we have been experimenting with a new (to us) methodology: live web-based user interaction studies. This methodology allows us to observe and interview, in real time, a user who is trying to solve a problem on Adobe’s online Help and Support pages.


How it works:
Using a service called Ethnio, we are able to set up a screener that pops up for visitors to a particular page. If a visitor is interested in participating, he or she is asked to fill out a short survey and provide a phone number. We can then phone the user immediately and conduct an interview, directly investigating his or her experience of trying to solve a problem using the Adobe site.
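
To make the mechanics concrete, here is a minimal sketch of an intercept screener of this kind. It is not Ethnio’s actual embed code or API; the endpoint, sampling rate, and question wording are all hypothetical, and a real deployment would use a styled modal rather than browser prompts. The point is just the flow: sample a fraction of visitors on a target page, screen them, and capture a phone number so a researcher can call while the problem is still fresh.

```typescript
// Hypothetical intercept-screener sketch (not Ethnio's real embed code or API).
// Illustrates the flow: sample visitors on a page, screen them, collect a phone number.

interface ScreenerResponse {
  problemDescription: string; // what the visitor is trying to accomplish
  phoneNumber: string;        // so a researcher can call right away
}

const SAMPLE_RATE = 0.05; // intercept roughly 5% of visitors (assumed value)

function shouldIntercept(): boolean {
  // Only intercept a small, random share of traffic on the target page.
  return Math.random() < SAMPLE_RATE;
}

async function runScreener(): Promise<void> {
  if (!shouldIntercept()) return;

  // A real deployment would show a styled modal; prompts keep the sketch short.
  const willing = window.confirm(
    "Would you be willing to answer a few questions about this Help page?"
  );
  if (!willing) return;

  const response: ScreenerResponse = {
    problemDescription: window.prompt("What problem are you trying to solve?") ?? "",
    phoneNumber: window.prompt("What phone number can we reach you at right now?") ?? "",
  };

  // Hypothetical endpoint that notifies the research team so they can phone
  // the visitor immediately.
  await fetch("/research/screener-responses", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ page: location.pathname, ...response }),
  });
}

void runScreener();
```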

Example Study – Printing Tips

The most frequently visited document in the Adobe.com knowledge base is Printing Tips, an article that helps users with common printing tasks in Acrobat and Reader. The document was receiving 5 million page views per month. The vast majority of visitors arrive at the page through a button labeled ‘Printing Tips’ in the print dialog box; clicking that button takes you to the Printing Tips web page.

Aside from being highly trafficked, the document also had a poor user rating (~50%), making it a sensible target for improvement. The Printing Improvement team had analyzed a great deal of data from different sources, including web analytics and text analysis of user comments, and this analysis led the team to make several well-considered improvements to the content. Despite these efforts, the user rating didn’t budge; it remained at roughly 50%.

It was clear to the team that they needed a deeper understanding of the customer experience in order to improve the document, so we conducted a study piloting the user interaction methodology. We had four main questions for users:

  1. What problem were you trying to solve?
  2. How did you get to this page?
  3. What were you expecting when you clicked the Printing Tips button?
  4. Did the page help solve your problem?

After conducting seven interviews, we discovered that users were arriving at the page by accident: in the course of trying each element of the print dialog box, they clicked the Printing Tips button too. The true source of the pain was not poor content but a confusing dialog box.

The Printing Improvement team was able to use these findings to improve the print dialog box itself.

When to use interaction studies:

In our group, we advocate using this methodology when you have some sort of mystery to solve about user behavior. If users are behaving in a way you don’t understand, the available quantitative data isn’t providing the necessary insight, and you have a focused research question, a user interaction study may be beneficial. However, we also have some guidelines about when NOT to use an interaction study:

  • When you already have a pretty good idea of what is going on – user interaction studies are time-intensive.
  • When you are trying to identify a pattern – interaction studies can provide hints and insights, but the number of participants is generally too small to generalize the results to the broader population.

So, what does a learning research team do exactly?

Our team is responsible for much of the research used in the Learning Resources Group. Our colleagues in Learning Resources not only develop learning content for all Adobe products, but also administer the communities and maintain the navigation and search mechanisms that make up the learning experience on Adobe.com. They seek out and share community-created content, support the community moderators who help manage community participation, and produce the product Help. In order to do this effectively, they conduct extensive investigations into the needs and preferences of the Adobe community.

The research team’s work boils down to three kinds:

  • Summative – we are the scorekeepers for the Learning Resources group. We’re especially interested in user success, user contributions (comments that add value to our learning resources), search success metrics (abandonment, clickthrough, and search modification), and calls to the Support call center. (A rough sketch of how those search metrics might be computed appears after this list.)
  • Formative – we compile data that our colleagues, Adobe’s corps of content leads, can use to improve the above scores. For example, we report open-ended survey responses, contributions by product, search success by product and query, and calls by call category. We also work with them on studies of user behavior, to find out exactly what gets in users’ way when they are trying to learn Adobe software.
  • Decision-support – we help our colleagues articulate their design decisions, the questions whose answers will inform those decisions, and strategies for answering those questions.
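
For the curious, here is a rough sketch of how search abandonment, clickthrough, and modification rates can be derived from search logs. The event fields and session shape are assumptions made for illustration, not our actual log schema or metric definitions.

```typescript
// Sketch of the search-success metrics mentioned above: clickthrough,
// abandonment, and search modification. Field names are assumptions.

interface SearchEvent {
  sessionId: string;
  query: string;
  clickedResult: boolean; // did this query lead to a result click?
}

interface SearchMetrics {
  clickthroughRate: number; // share of queries that led to a click
  abandonmentRate: number;  // share of queries with no click
  modificationRate: number; // share of sessions where the user reformulated the query
}

function computeSearchMetrics(events: SearchEvent[]): SearchMetrics {
  const totalQueries = events.length;
  const clicked = events.filter(e => e.clickedResult).length;

  // A session counts as "modified" if it issued more than one distinct query.
  const queriesBySession = new Map<string, Set<string>>();
  for (const e of events) {
    const queries = queriesBySession.get(e.sessionId) ?? new Set<string>();
    queries.add(e.query.trim().toLowerCase());
    queriesBySession.set(e.sessionId, queries);
  }
  const sessions = queriesBySession.size;
  const modifiedSessions = Array.from(queriesBySession.values())
    .filter(q => q.size > 1).length;

  return {
    clickthroughRate: totalQueries ? clicked / totalQueries : 0,
    abandonmentRate: totalQueries ? (totalQueries - clicked) / totalQueries : 0,
    modificationRate: sessions ? modifiedSessions / sessions : 0,
  };
}
```
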
Going forward, we’ll write more about each aspect of the work, and report on some of our findings.