Author Archive: Diana Joseph

We’re hiring! (Update: this req is now closed)

The Community Help & Learning (CHL) group works with the broader Adobe community to identify and provide Adobe learning and troubleshooting content. As we expand further into mobile, social and online communication channels, our content and community leads increasingly need ways to illuminate the customer experience.

We are the Learning Research group within Community Help & Learning (CHL), and here’s what we get to do all day in support of the CHL vision:

  • Identify and measure our success – How well do we provide the information users need, in the way they need to get it?
  • Collaborate with our content and community leads to move toward success
  • Conduct design research – Collaborate with research scientists and engineers on iterative investigation toward technical solutions
  • Provide decision support – Should we spend money on a solution? How could we test to find out?
  • Understand the big picture – Stay abreast of search optimization, social and online communication, mobile content, online ethnographic methods, and whatever else comes our way

We’re looking for two colleagues: one a recent (or soon-to-be) graduate for a full-time internal Learning Researcher position, and one a more senior person coming in as a contractor. Both should have a background in multiple research methods (such as quantitative, ethnographic, and user research) and understand measurement in the context of social/online communication, mobile content, or maybe both!

Join our team! To view the job posting for the contractor position, please visit here or here. The full-time recent-graduate position has not yet been posted. We will update this post with the link when it is available, but if you’re too eager to wait for the official posting, contact Jill Merlin (jmerlin@adobe.com) to get started.

Major lessons from observing user workflows

We recently asked Create with Context (CWC), an independent research and design company, to conduct lab studies of four important user workflows involving Adobe products. We wanted to understand the effectiveness of the learning experience around these workflows, and figure out how to improve them.
We learned three important lessons that we think will apply across every workflow and learning experience using Adobe products and learning resources:
* Users would prefer _not_ to learn something new in the middle of their work. Rich learning experiences like this one for Flex are good for advancing your skills, but you need something different when you just want to get something done. We need to find ways to deliver appropriate content quickly, while still offering rich resources for when people have time for them.
* Users are in a big hurry and read as little as possible. This matches what we know from prior research. The tricky part is that some of the time, users need to read in order to get what they’re looking for. How far can we boil down our content? How can we help users understand when it’s actually worth reading?
* Users may not know the technical language for the techniques they want to learn. This is a big obstacle to effective searching! We need to figure out how to connect the words people use to describe what they are looking for with the words used in learning materials.
Coming soon: The methodology behind these lessons

So, what does a learning research team do exactly?

Our team is responsible for much of the research used in the Learning Resources Group. Our colleagues in Learning Resources not only develop learning content for all Adobe products, but also administer the communities and maintain the navigation and search mechanisms involved in the learning experience on Adobe.com. They seek out and share community-created content, support the community moderators who help manage community participation, and produce the product Help. To do this effectively, they conduct extensive investigations into the needs and preferences of the Adobe community.
The research team’s work boils down to three kinds:

  • Summative – We are the scorekeepers for the Learning Resources group. We’re especially interested in user success, user contributions (comments that add value to our learning resources), search success metrics (abandonment, clickthrough, and search modification), and calls to the Support call center.
  • Formative – We compile data that our colleagues, Adobe’s corps of content leads, can use to improve the above scores. For example, we report open-ended survey responses, contributions by product, search success by product and query, and calls by call category. We also work with them on studies of user behavior, to find out exactly what gets in users’ way when they are trying to learn Adobe software.
  • Decision support – We help our colleagues articulate their design decisions, questions whose answers will inform those decisions, and strategies for answering those questions.
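To make the summative metrics concrete, here is a minimal sketch of how rates like search abandonment, clickthrough, and search modification might be computed from a simplified search log. The event fields and the exact metric definitions here are illustrative assumptions for discussion, not a description of Adobe’s actual instrumentation:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SearchEvent:
    """One search in a session (hypothetical simplified log record)."""
    query: str
    clicked_result: bool            # did the user click any result?
    followup_query: Optional[str]   # the next query in the session, if any

def search_metrics(events: list) -> dict:
    """Compute illustrative per-search rates for the three metrics."""
    n = len(events)
    if n == 0:
        return {"abandonment": 0.0, "clickthrough": 0.0, "modification": 0.0}
    # Clickthrough: searches where the user clicked a result.
    clicks = sum(1 for e in events if e.clicked_result)
    # Abandonment: no click and no follow-up query (the user gave up).
    abandoned = sum(1 for e in events
                    if not e.clicked_result and e.followup_query is None)
    # Modification: no click, but the user reworded and searched again.
    modified = sum(1 for e in events
                   if not e.clicked_result and e.followup_query is not None)
    return {
        "abandonment": abandoned / n,
        "clickthrough": clicks / n,
        "modification": modified / n,
    }
```

For example, in a session where two of four searches end in a click, one is reworded, and one is simply dropped, this sketch would report a 50% clickthrough rate, a 25% modification rate, and a 25% abandonment rate.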
Going forward, we’ll write more about each aspect of the work, and report on some of our findings.