When faced with the sheer volume of data available today, we’ve found that more does not usually mean better. Unless you have a department or team devoted to data analysis, usable information can be difficult to find, much like a leaf lost in a forest. Many platforms have built-in analytics that can overwhelm users with information and colorful charts, making it easy to mistake collected data for actionable insight and, in turn, to make unnecessary or unproductive process changes.
Here are some lessons we’ve applied to our own collection of data and use of analytics:
1. Narrow the Focus
Just as we do when building an e-learning course, deciding which information to analyze requires us to take a hard look at our goals and ideal outcomes. When we’re deciding which course results to track, for example, we first have to decide what defines success in both the short term and the long term. Which metrics align with each of these goals? How do these metrics relate to one another? It’s useful to know the completion rate of a course, but how do we determine if this translates to desirable behavioral changes on the part of our audience? What exactly IS that desirable behavior? Asking these questions allows us to sort through the available information and narrow our focus.
2. Talk to Stakeholders
Once we’ve set our desired parameters, it’s important to review them with the project stakeholders. Do they agree with our stated goals? Do they have any input on day-to-day operations that may skew the data? For example, a course launched at a time of year when many people are on vacation will most likely have a lower completion rate, regardless of how engaging it is, simply because much of your target audience will be unavailable. A course may also come back with different analytics depending on whether it was launched with advanced learners or with a fresh batch of seasonal hires. In-depth conversations with stakeholders, along with small samples of the targeted data analysis, can help confirm that your metrics are properly aligned with your goals.
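To make the cohort point concrete, here’s a minimal sketch (using entirely hypothetical data and made-up field names) of segmenting completion rates by learner cohort, so a single raw completion number isn’t mistaken for a like-for-like comparison:

```python
# Sketch: completion rates segmented by cohort.
# The records and cohort names below are illustrative assumptions,
# not real data from any platform.
from collections import defaultdict

# Each record is (cohort, completed?) -- hypothetical sample.
records = [
    ("advanced", True), ("advanced", True), ("advanced", False),
    ("seasonal_hire", True), ("seasonal_hire", False), ("seasonal_hire", False),
]

def completion_rates(records):
    """Return the fraction of learners who completed, per cohort."""
    totals = defaultdict(int)
    completed = defaultdict(int)
    for cohort, done in records:
        totals[cohort] += 1
        if done:
            completed[cohort] += 1
    return {c: round(completed[c] / totals[c], 2) for c in totals}

print(completion_rates(records))
# → {'advanced': 0.67, 'seasonal_hire': 0.33}
```

The same overall completion rate (50% here) would hide the fact that the two cohorts behaved very differently, which is exactly the kind of skew a stakeholder conversation can surface.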
3. Revisit Goals
Next, we set a period of time to collect data. At the end of this period, do our results align with our expectations? If not, do we know why? Did we miss a metric? Do we need to adjust our analysis, or is it simply returning different results than we expected? We go back to our stakeholders with these results, clarify the goals, refine the metrics if necessary, and repeat the process until we are confident our results accurately reflect reality.
This isn’t everything there is to making your data work for you, but hopefully it’s a fruitful beginning. Just remember that even though it’s possible to track and aggregate almost any piece of information about your e-learning, doing so isn’t always the best approach.