You know your learners are learning something from your courses. After all, that’s what you created them for in the first place. But the real question is: are YOU learning anything from them? Your training analytics can tell you a lot about your courses. You can determine how well, or *dare we say* poorly, they’re serving your learners.
The problem is, most designers aren’t actually collecting this data. And those who do have access to some information aren’t actually looking at it. Like Waze notifying you there’s a turtle in the middle of the road along your route, you should use the information available throughout your projects to improve your end product. We’ll look at several different metrics that can reveal improvement opportunities.
Time Spent
We’ve talked before about training seat time, and aligning it with learner availability / expectations is crucial. First, managers may not allow employees to get away from their job responsibilities for very long. You need to ensure your training fits into the timeframe employees are given.
Second, people get bored REALLY fast these days.
So, more than ever, we need to make sure we’re chunking material appropriately. Not only is this crucial from a cognitive load perspective, but it correlates to our attention spans as well.
I know what you’re saying. “I already thought of this and factored that into my design. I’m a savvy vet, yo!” Ok, great, you’re a step ahead. But have you confirmed this is how long your course *actually* takes your users?
By confirmation, I don’t mean how long it takes you to complete the course. I also don’t mean how long it takes your SMEs or core reviewers to complete it either. You see, you all are really familiar with the course / content by now, so your times aren’t really representative.
Instead, you need to know how long this course takes the average user to complete: the people taking it for the first time. This will give you the real answer, and it might not be what you thought.
You see, learners likely will take longer to complete activities than your testers. They also may or may not complete everything the way you envisioned. The only way to know is to sample the data.
Many LMS can tell you how long users took to complete the course. So, what if you find the course takes learners longer to complete than it’s supposed to? You could streamline the course. Otherwise, think about breaking the course apart into multiple modules.
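If your LMS only hands you raw records, a little scripting does the math. Here’s a minimal sketch, assuming you’ve exported completion data; the field names are hypothetical, so adapt them to your export:

```javascript
// Minimal sketch: average and median completion time from exported LMS records.
// The record shape (userId, minutesSpent) is a made-up example.
const records = [
  { userId: 'u1', minutesSpent: 42 },
  { userId: 'u2', minutesSpent: 58 },
  { userId: 'u3', minutesSpent: 35 },
];

const times = records.map(r => r.minutesSpent).sort((a, b) => a - b);
const average = times.reduce((sum, t) => sum + t, 0) / times.length;
const median = times[Math.floor(times.length / 2)];

console.log(`Average: ${average.toFixed(1)} min | Median: ${median} min`);
```

Compare those numbers against your planned seat time, not against how fast you or your reviewers can click through.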
Identify Roadblocks
As we discussed above, sections or activities that aren’t completed as expected are one possible reason a course takes learners longer to complete than you planned.
It’s common to find that certain activities or interactive sections aren’t as intuitive to learners as they seemed to the designer. Obviously, some of this can come out if you do user testing. Regardless, you should monitor this in the field. You may be tripping up users in ways you didn’t even realize.
Whether it’s a complicated activity or confusing navigation, you need to ensure the course experience is as streamlined for the learners as you expected. Nothing short-circuits a training course like users getting frustrated because they don’t understand what they’re supposed to be doing.
Monitor whether learners are progressing through the entire course. Also check whether any questions or activities have a disproportionate number of failures. This could point to poorly written questions, confusing instructions, or gaps in the content.
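If your LMS can export question-level results, flagging these trouble spots is easy to automate. A minimal sketch, with hypothetical field names:

```javascript
// Minimal sketch: flag questions failing at well above the course-wide average.
// The response shape (questionId, correct) is a made-up example; adapt to your export.
function flagRoadblocks(responses, multiplier = 1.5) {
  const stats = {};
  for (const { questionId, correct } of responses) {
    stats[questionId] = stats[questionId] || { attempts: 0, failures: 0 };
    stats[questionId].attempts += 1;
    if (!correct) stats[questionId].failures += 1;
  }

  const rates = Object.entries(stats).map(
    ([id, s]) => ({ id, failRate: s.failures / s.attempts })
  );
  const avg = rates.reduce((sum, r) => sum + r.failRate, 0) / rates.length;

  // Anything failing at 1.5x the average (or worse) deserves a closer look
  return rates.filter(r => r.failRate >= avg * multiplier);
}
```

Anything this flags is a candidate for a rewritten question, clearer instructions, or new content.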
Device Identification
So how do you gather these training analytics, you may be wondering. Funny you should ask. There are several ways. The first thing to do is make sure you know what type of device your users will use to access the training. This one is really important to know BEFORE you design the course.
Even though eLearning tools like Articulate Storyline and Adobe Captivate can produce desktop and mobile versions from a single publish, you really need to make different design decisions if a course will be accessed on mobile.
If you’re designing this course for your employees, you may already know exactly how they will access your courses. If not, you’ll need to do a bit of investigating.
There are several ways to do this, including issuing a survey to your users. However, you may be able to collect more automated data. If your users are already visiting the site where your course will reside, you could use Google Analytics to determine their devices.
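If the hosting page doesn’t already run Google Analytics, the standard gtag.js setup is all it takes; device category then shows up in GA’s built-in Tech reports with no extra code. A sketch, equivalent to the standard HTML install snippet (swap in your own measurement ID):

```javascript
// Load Google Analytics (gtag.js) from script; equivalent to the standard HTML snippet.
// 'G-XXXXXXX' is a placeholder; use your own GA4 measurement ID.
const s = document.createElement('script');
s.async = true;
s.src = 'https://www.googletagmanager.com/gtag/js?id=G-XXXXXXX';
document.head.appendChild(s);

window.dataLayer = window.dataLayer || [];
function gtag() { dataLayer.push(arguments); }
gtag('js', new Date());
gtag('config', 'G-XXXXXXX'); // device data is collected automatically from here
```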
Several of our clients, especially ones who upload courses to their website for public access, get lots of great info from Google Analytics. In one of these cases, the client determined that an overwhelming majority of their users access the site from phones, so we designed their training to fit that screen size and functionality.
Monitor Competency Scores
Once your course is live, you should be tracking training analytics surrounding competency scores.
Most LMS can capture question-level data and return either individual or aggregate user responses. If any particular area is getting significantly lower scores than others, investigate.
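If you’re curious what’s happening under the hood, most authoring tools report each answer to a SCORM-conformant LMS through calls like the ones below. This is just a rough SCORM 1.2 sketch; the interaction ID and responses are made-up examples, and your authoring tool generates these calls for you:

```javascript
// Rough sketch of SCORM 1.2 question-level reporting.
// 'API' is the SCORM API object the LMS exposes to the course window;
// the id and response values here are made-up examples.
API.LMSSetValue('cmi.interactions.0.id', 'q01_safety_check');
API.LMSSetValue('cmi.interactions.0.type', 'choice');
API.LMSSetValue('cmi.interactions.0.student_response', 'b');
API.LMSSetValue('cmi.interactions.0.result', 'wrong');
API.LMSCommit('');
```

Your LMS rolls these records up into the individual and aggregate reports mentioned above.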
You may find that particular questions are confusing or not written as well as you thought. Sometimes users point out that there are more correct answers than the one you specified. In any of these cases, re-work the questions.
You may also find that some of your activities are not as clear as you intended. Since you thought up the activity, and likely discussed it with your review team prior to building it, you’re all familiar with how it works and what users are supposed to do. But sometimes users exposed to it for the first time reveal that the instructions or the activity itself isn’t as clear as you hoped. React accordingly and either clarify the directions or modify the activity.
Track User Progress
Another important training analytic to consider is completion rate. Depending on how your distribution is set up, your LMS can again be your best friend. Completion is easy to track if courses are prerequisites of each other. And if completing one course isn’t required to start the next, most LMS still have reporting that can give you this data.
But what if you don’t use an LMS?
Our same client above, whose users all used phones, hosted their course on their website. For this reason, Google Analytics came to the rescue again. We embedded brief JavaScript code within the Storyline course at specific milestones (like each stage of a timeline interaction) to send pageview data to GA. This measured time on slide and number of views, and thus allowed us to determine if / where users dropped out.
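Here’s roughly what one of those milestone triggers can look like. This sketch uses GA4’s gtag.js and assumes it’s already loaded on the hosting page; the titles and names are made-up examples:

```javascript
// Sketch of a Storyline "Execute JavaScript" trigger fired at a course milestone.
// Assumes gtag.js is loaded on the hosting page; names below are made-up examples.
gtag('event', 'page_view', {
  page_title: 'Course - Timeline Stage 3',
  page_location: window.location.href + '#stage-3'
});
```

With a virtual pageview at each milestone, GA can report views per stage and time between stages, which is exactly what reveals where learners bail out.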
Capturing this level of data helped us determine if anything was wrong. Problems could be navigation issues, confusing material, or sections of content that weren’t relevant enough to sustain interest. Any significant dropout the data suggests should be investigated further and rectified.
Satisfaction Ratings
One of the most crucial aspects of learning from your courses is getting feedback from your users. Classroom trainers have been issuing “smile sheets” (and more meaningful survey tactics) forever. However, once things moved to eLearning, a lot of the user rating collection stopped. Yet it’s as important as ever to learn how users feel about the courses we create.
You see, sometimes we’re too vested in our own creation to see it objectively.
But just like Debbie from the local town hall meeting, your end users have no problem telling you where you could do better. Unfortunately, we aren’t always giving them a vehicle to do that.
So, make sure you collect feedback. And whether that evaluation data is formative, summative, or confirmative (ideally some of each) you should be putting that knowledge back to work for you.
Depending on the type and timing of the feedback, you may or may not decide to make edits to the current course. Regardless, user feedback and attitudes about your courses should definitely shape future training decisions. Whether you went out on a limb trying a new style of course, or are just trying to maintain the status quo, user feedback will tell you if you hit the mark.
Provide Personalized (Adaptive) Learning
One of the benefits of collecting all this data (and paying attention to it) is providing a more relevant learning experience.
For example, you can ensure style, tone, and approach match your culture. Different audiences have different expectations. Reacting to training analytics can ensure you’re aligned with your audience’s desires.
Another possibility, with the right data, is creating a personalized training experience. For example, you could create an “adaptive” structure, where learners complete courses the way that best suits them.
You could structure a course so that it scaffolds based on user competency: provide learners with questions, and let their performance determine their learning path. Get questions right and they progress to more challenging tiers. Get questions wrong and they receive remedial content.
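The branching logic itself can be simple. A minimal sketch; the tier names and score cutoffs are made-up examples, and in a tool like Storyline you’d attach equivalent logic to triggers that jump to different scenes:

```javascript
// Minimal sketch of competency-based branching.
// Tier names and cutoffs are made-up examples; tune them to your content.
function nextSection(quizScore) {
  if (quizScore >= 90) return 'challenge_tier';  // mastered it: skip ahead
  if (quizScore >= 70) return 'core_content';    // on track: continue as planned
  return 'remedial_review';                      // struggling: reinforce basics first
}

console.log(nextSection(95)); // 'challenge_tier'
console.log(nextSection(60)); // 'remedial_review'
```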
Personalization doesn’t have to be about competency. You could offer elective content that dives deeper into certain topics. Alternatively, you could offer more than one way to access chunks of content (click here to read about X, click there to view a video about X).
Allowing users multiple “paths” through a course shows you their tendencies and / or competencies. The more you know about them, the better you can design the paths. The more you track the paths, the better you can craft future courses to their tendencies.
Training Analytics Improve Future Efficiency
While you may not have been formally incorporating training analytics into your process, it’s not hard to start. Some of these techniques are easy. They don’t require technical expertise, just follow-through.
Other techniques are more advanced, but can reap significant benefits. Just make a plan. Incorporate a few techniques. Collect the data. And use it to craft courses that fit your audience.