Nicole Papaioannou Lugara
What Video Data Tells Us About Learning-- And What It Doesn't
YouTube, Vimeo, Kaltura, Wistia: all big names in video hosting technology. All of these companies sell their products by promising robust data reporting capabilities, because that's what consumers want.
But even when this data is collected, it often goes ignored, whether it's for lack of time or because people just don't know what to do with the information. Many of my peers in learning and development never look at this data beyond saying "hmm. Interesting."
So in this blog post, I'm going to break down for you some of the useful data points you can gather with these types of sophisticated video hosting systems (if you're lucky, one is built into your company's LMS/LXP) and some ways this data CAN'T be interpreted.
To start, get familiar with your video hosting service's dashboard. What data is being reported?
For example, in Thinkific, I'm able to see:
what people are watching and rewatching
how much of the video holds their attention
how often they play it when they see it
how many times it's loaded versus how many times it's played
how many hours of viewing time it's had
how many unique visitors have checked it out
With YouTube, there are many more options. The language used is also different, geared toward being useful for marketers, and the dashboard reports on many additional features.
The question you probably have is: how do I use this data as a learning professional?
Things to Look For
You don't need to be a data scientist to make use of the data on your video dashboards.
Is there somewhere in the video where people just stop watching? In the example shown here, the number of watches drops sharply at 6 seconds in.
When I go back to the video, I can see it's just after the title slide.
Once people are in the video, the watch rate stays much more consistent.
Why are people starting the video, seeing the topic, and then dropping off? Is it because they think they know it already? Is it because they're simply saving it to return to later?
If the answer is the former, then it's time to assess whether the topic is needed at all, and if it is, whether it's obvious to people why they need it.
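If your platform lets you export a retention curve (the share of viewers still watching at each second), you can locate the drop-off point programmatically instead of eyeballing the chart. This is a minimal sketch with invented numbers; real exports vary by platform.

```python
# Hypothetical retention curve: fraction of viewers still watching at each
# second of the video. These values are invented for illustration.
retention = [1.00, 0.98, 0.97, 0.96, 0.95, 0.94, 0.70, 0.68, 0.67, 0.66]

# Compare each second to the next to find the sharpest drop-off.
drops = [retention[i] - retention[i + 1] for i in range(len(retention) - 1)]
worst_second = drops.index(max(drops))

print(f"Sharpest drop-off starts at second {worst_second}")
# → Sharpest drop-off starts at second 5
```

Here the curve falls off a cliff between seconds 5 and 6, which would send you back to the video to see what's on screen at that moment, just like the title-slide example above.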
Replays are one of the most important data points you can gather about video usage.
They indicate that one of three things is likely happening:
learners are interested or recognize the need to learn what they see here
learners are confused
there's a mistake in the video or a technical error
For example, you can see in the video below that there were 22 rewatches, nearly 20% of total video usage, at 2 minutes and 14 seconds.
When I go to the video, I see it's the start of an empathy mapping model explanation. There is no technical glitch, which means people are either really interested in learning this process or they're confused by it.
In this case, I'm going with: they're interested. I can make that assumption because when they turn in the practice component related to this model, I see that they're completing the task and meeting expectations.
Really, though, the only way to confirm whether this part was interesting or confusing would be to ask users.
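If you'd rather not scan the replay chart by hand, a few lines of code can flag the timestamps worth reviewing. The counts and the 15% threshold below are invented for illustration; pick a cutoff that fits your own viewership.

```python
# Hypothetical dashboard export: total views, plus rewatch counts keyed by
# the second of the video where the rewatch happened.
total_views = 120
rewatches_by_second = {130: 3, 134: 22, 200: 5}

# Flag seconds where rewatches exceed 15% of total views -- candidates for
# "interesting, confusing, or broken" moments to review manually.
THRESHOLD = 0.15
spikes = [sec for sec, count in rewatches_by_second.items()
          if count / total_views >= THRESHOLD]

print(spikes)  # → [134], i.e. 2:14 into the video
```

The flagged second (134, or 2:14) mirrors the empathy-mapping example above: the code only tells you where to look, and the practice submissions or direct user feedback tell you why.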
Average Engagement & Play Rate
Average engagement is the percentage of the video that users actually watch. Viewing isn't necessarily linear: people may skip ahead to the content they want. Play rate is simply how many people pressed play on the video when they saw it.
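Exact definitions vary by platform, but both metrics reduce to simple ratios you can compute yourself from raw counts. The numbers below are invented, and this is one common way the formulas are defined, not every vendor's.

```python
# Hypothetical raw counts from a video dashboard export.
loads = 500              # times the video appeared on a page
plays = 320              # times someone pressed play
seconds_watched = 38_400 # total watch time across all plays, in seconds
video_length = 180       # video duration in seconds

# Play rate: what share of people who saw the video actually played it.
play_rate = plays / loads

# Average engagement: what fraction of the video each play covered, on average.
avg_engagement = seconds_watched / (plays * video_length)

print(f"Play rate: {play_rate:.0%}")            # → Play rate: 64%
print(f"Average engagement: {avg_engagement:.0%}")  # → Average engagement: 67%
```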
You may be thinking, "I want 100%," and while, sure, that would be great, I would encourage you to consider whether it's truly necessary.
Do people really need to watch every second of the video, or can they skim what they know and stop at what they need to know?
Do they have to watch the video, or is it there for them to use as a resource when they come to this part of their job?
That said, if average engagement or play rate is extremely low, you should ask yourself: is this content relevant?
The Big Picture
Overall, video data is valuable for identifying what's working, what's relevant, and what's confusing. It's helpful for improving content and cutting out the fluff.
But it doesn't tell a story on its own.
At the end of the day, you'll need to go back to your metrics and goals and see where you need to fill in the gaps, whether that's with performance data or user feedback or a combination.
Want to learn more about collecting, analyzing, and applying data to design?
Get on the waitlist for the next cohort of From Data to Design.