A few weeks back, we posted the first part of this blog series, in which we discussed how Casey, an occasional user of Captivate, can make use of the analytics in ACCC (Adobe Captivate Course Companion).
Today’s blog post is about how Robert, an intermediate-level user of Captivate, can use the analytics in ACCC to understand users’ engagement with his courses.
Robert: I create courses for a large audience. I have a standard group of reviewers who give their feedback on the courses that I develop. Sometimes, a course goes through many review cycles and iterations. What I want to know is: how do ‘actual’ users respond to my courses? I feel that reviewers have become over-familiar with the topics of our courses, and we may be missing the real users’ perspective. I want actual user feedback so that I can make changes to the course, if needed, before rolling it out to a larger set of learners.
ACCC – ACCC provides various details at the slide, question, and object level, which are comprehensive and can be easily compared with similar courses or with different revised versions of the same course. You can use the graphs and data to make informed changes.
Some of the questions that you can get answers for:
- Is the expected number of people taking the course?
- On which slides are people taking more breaks?
- What is the average number of breaks that users take in the course?
- Which buttons are users interacting with?
- How are people navigating through the course?
- Which parts of the course are faring well?
Go through the interactive video below to see how you can get answers to these questions using the learning analytics in ACCC. The examples and data in the video are taken from an actual course.