By Nicole Papaioannou Lugara

Audience Analysis for Engaging Social Learning: Part 3

Drawing Insights from Audience Analysis Data

You’ve planned your analysis events, conducted your user interviews, and you have a big pile of data about the audience you’ll serve. But data is just data until you start to draw insights from the trends you’re observing.


In an ideal world, you’ll have Learning Analysts and Organizational Psychologists trained in statistics, data analysis, and qualitative and quantitative research methods to make sense of the information.


But in the real world, most training departments lack these roles.


So let’s talk about some preliminary steps you can take to make connections between the trends and the ways you’ll design your social learning experience.


Design Clues


You’re looking for clues in the data that will help you design in these key areas:

  • Connection to their authentic experience

  • Opportunities to provide practice that supports learning transfer

  • Motivators that will keep them engaged in the experience

  • Avoidance of apathy triggers – things that will make them want to exit the learning experience


Mixed into your audience analysis will be information about the organization and performance, and you should absolutely make use of that data. For the purposes of this post, though, I’m going to do my best to stay focused on audience-specific information.


Connection to their authentic experience


The concept of “What’s in it for me?” is pretty common in our field, yet the WIIFM often stops at posting vague learning objectives. Once you get to know your audience, you can move beyond this ineffective surface-level strategy and create real connection by honoring the authentic experiences your adult learners are already having at work and in life.


In the data, look for clues about:

  • Communication style

  • Work style and environment: How and where they work best, as well as the environments in which they find themselves challenged (broadly, not related to a specific task)

  • Levels of experience (novices, experts)

  • Whether this is a group of folks who have come from many different industries or people who have been in their industry for a long time

  • Technologies they use at work and in life


Leverage this data to determine the appropriate:

  • Conversation starters -- How do you get them talking about what they already know? What common experiences (shared reality) may they have?

  • Pacing of connection -- How fast or slow can you expect them to share personal insights and/or be vulnerable in the group setting?

  • Scenes and scenarios

  • Language usage

  • Learning Activities

  • Depth of content

  • Level of self-regulated learning

  • Technologies

  • UX design


Opportunities to provide practice that supports learning transfer


We want to streamline learning. Even when we create fun games and nontraditional learning experiences, what the audience is learning needs to be made easy to recall and apply on the job. If there is too big a gap, and they can’t figure out how to transfer those concepts, then the learning doesn’t actually serve the need.

A lot of this will come from information about performance and the performance environment. So again, for the sake of length, let’s look at what insights you might pull specifically about people:


Look for clues, such as:

  • Sentiments about their capabilities and competencies. For example, if they lack confidence even though they seem to have the skill, previous training may not have made clear the connection between what they’ve learned and what they’re doing. On the flip side, there may be management issues that contribute to a culture of fear, perfectionism, or lack of clarity around the work.

  • Risk-taking or risk-averse personalities

  • How they learn outside of work


These trends can help you decide:

  • how to create opportunities where peers can ask questions to one another and get reliable information

  • scenes and scenarios for collaborative problem solving

  • how much you need to scaffold the learning

  • how much explicit connection and explanation you need to make between the learning and their performance context

  • how much positive reinforcement you need to provide

  • how much "content drip" needs to take place

  • whether you should be leveraging synchronous or asynchronous communication tools for peer-to-peer learning

  • tone and style of content (e.g., you may need to be softer with a group that lacks confidence)


Motivators that will keep them engaged in the experience


This is one of the fun ones. Once you figure out your audience’s currency, you can keep them tuned in and coming back for more.


Some common motivators include:

  • Money

  • Time with family

  • Solving puzzles

  • Status

  • Recognition

  • Helping others


You can build these into your learning experiences. For example, if you know your audience likes recognition, a public shoutout for great participation and/or completing the learning experience can be motivating. If someone is motivated by money, you can directly show through your learning content how the training creates opportunities for them to earn more. If it’s about solving puzzles, let them solve a real challenge together.


Avoidance of apathy triggers


During group and one-on-one interviews with learners, I like to ask one question that usually results in a bit of laughter followed by strong responses:


“When it comes to training, what makes you roll your eyes and want to skip it? What makes you disconnect?”


There is so much great data that comes from this question about what not to do. Sure, there are usually some similarities -- bad presentations are bad presentations; poor facilitators are poor facilitators; an unclear WIIFM feels like a waste of time -- but there are lots of unique factors as well.


For example, in highly inclusive work cultures, learning experiences that are designed without the same vision of inclusivity can make people disconnect (e.g., they use antiquated, offensive, and/or oppressive language to discuss marginalized groups, or the content reinforces stereotypes). And when moderators fail to support inclusion, in any group, the social learning experience can quickly fail -- and doubly so in these highly inclusive cultures.


Look for the things that disconnect your audience and avoid them in your experience design, multimedia, and written content at all costs!


Evidence-Based Decisions


A word of caution as you start to analyze the data, especially qualitative data: don’t let one or two people’s perspectives define the entire experience for everyone. And certainly don’t look to individuals from marginalized groups to represent the entirety of the group (tokenism).


Really look for trends. Ask yourself:


How many times are these phrases used? Sentiments expressed? Trends mentioned? The more you see it, the more likely it’s important.


How many audience members out of the entire group represent this trend? Again, the more people who state the same views, use the same language, or note the same issues, the greater the likelihood that a noteworthy trend is emerging.
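If you want to make these frequency checks concrete, a simple tally can help. The sketch below is a minimal, hypothetical example -- the themes, keywords, and transcript snippets are invented for illustration, not drawn from real interview data -- showing one way to count how many interviewees mention each theme and what share of the group that represents.

```python
# Hypothetical sketch: tally how many interviewees mention each theme.
# Theme keywords and transcript snippets are invented examples.

def count_theme_mentions(transcripts, theme_keywords):
    """Return, per theme, how many transcripts mention at least one keyword."""
    counts = {theme: 0 for theme in theme_keywords}
    for text in transcripts:
        lowered = text.lower()
        for theme, keywords in theme_keywords.items():
            # Count each transcript at most once per theme.
            if any(kw in lowered for kw in keywords):
                counts[theme] += 1
    return counts

transcripts = [
    "Honestly, the last training felt like a waste of time.",
    "I never know how this applies to my actual job.",
    "The sessions run too long, and it's hard to stay focused.",
]

themes = {
    "time concerns": ["waste of time", "too long"],
    "relevance": ["applies to my actual job", "not relevant"],
}

counts = count_theme_mentions(transcripts, themes)
for theme, n in counts.items():
    share = n / len(transcripts)
    print(f"{theme}: {n} of {len(transcripts)} interviewees ({share:.0%})")
```

In practice, you’d likely code transcripts by hand or with a qualitative analysis tool rather than with keyword matching, but the principle is the same: the more people who surface the same theme, the more weight it deserves in your design decisions.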


Is it a need? Is it a want? There’s a famous quote attributed to Henry Ford that goes something like, “Had I asked the people what they wanted, they would have told me a faster horse.” It often gets used to say, “We’re not going to listen to the audience.” But really, the point is that Ford genuinely understood the NEEDS of his audience, not just their wants. He knew that a faster horse really meant they wanted to go farther, faster. It was Ford’s (the company’s) job to execute on something that solved that problem. This is your challenge too.


Don’t just “hear” your audience. Don’t just convert 2 days of training into 20 minutes of self-paced eLearning because they said they don’t like that the training takes so long. Listen to what they really need -- maybe it’s something more relevant. Maybe it’s a cohort broken into smaller sessions. Maybe it’s a discussion with management about lightening the workload if prioritizing this training is critical.


Do I need to follow up with additional interviews or analysis activities to clarify or validate this information? Remember, analysis can be a multilayered process (and often should be). You don’t have to take one small data set as the complete picture. And you don’t have to keep running interviews, either. Surveys, emails, LMS data, HR data… there are lots of ways to make sure it’s not just something people say but actually what’s happening.


What are the risks involved in misinterpretation? The higher the risk, the stronger your evidence -- and your ability to evaluate the impact of your decision -- should be.


The Big Takeaway

Data may be science, but design is very much an act of interpretation.


The list above is by no means an exhaustive account of what trends to look for or how to use the data you collect. The most important things you can do are to listen carefully, be conscious of your biases and blind spots, and try to be objective as you interpret what you see.


Again, have a plan for evaluating the impact of your most critical design choices. For example, did turning training into a game actually result in better people experience AND performance?


If you haven’t checked out the earlier blog posts about collecting data and interviewing for audience analysis, now is a great time to give them a read.



 

Contact us for a consultation.


