Anyone who has delivered in-person training, given a presentation, or run a briefing meeting knows how useful feedback can be. We’re not just talking about surveys or satisfaction forms filled in after a session. We’re talking about the kind of in-person, immediate feedback that lets you adjust your delivery in real-time.
If you see eyes starting to wander, maybe you’ve spent too long on that slide. If people are looking up at you in confusion, perhaps your diagram isn’t that clear. If the people at the back are leaning forward and squinting, you should probably have used a larger font.
With e-learning though, it’s not always that simple. How can you tell if learners are actively engaged and enthused by your courses, or just clicking through activities they don’t understand to get to the end? Graded assessments do the job to an extent, but pretty late in the process, and by then, your learners may have switched off for good.
Learner Feedback Loops
What you need is an effective learner feedback loop—a process that gathers information on how learners are experiencing, interacting with and understanding your content—to see where you need to make interventions or improvements.
Ideally, you’ll receive feedback at each stage of the learning cycle—before, during and after each activity, course or segment. The more granular the data you can gather, the more detail you’ll have to inform future learning design.
Here are our three top tips on how to effectively prompt and collect valuable feedback from your learners during each of those stages.
Before
First impressions are important—whether it’s the first time you’ve engaged with this group of learners or just the first time an existing group of learners have seen your latest course, you want to get off to the best start.
Asking the right questions at this stage will give valuable insight into how learners approach your courses, how easy it is for them to access the resources they need, and what they’re expecting to gain from the experience.
01 Find out what learners are looking for
To start with, it’s useful to find out why learners are accessing your course in the first place, and what they hope to learn. This is valuable information, as it allows you to assess how well you’ve promoted or marketed your course—if expectations are way off the mark, perhaps your course description is misleading.
It also gives you a benchmark to compare post-course feedback against—did the experience live up to the expectations which learners had going in?
02 Set a baseline with an intro quiz
Are you pitching the difficulty level accurately? One way to find out is with an introductory quiz to gauge the level of preexisting knowledge your learners have on a given topic and establish whether the course will be too easy or too hard. This also provides a baseline for post-course comparison, so you can more clearly see the “value add” of the course.
On a practical note, it’s also a useful exercise for the learner—if their score on the introductory quiz is much higher or lower than you would expect, it might be best to direct them to an alternative exercise better suited to their level.
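If your LMS lets you export quiz results, even a very simple script can turn the intro and post-course quizzes into that "value add" measure. The sketch below is only an illustration, assuming a hypothetical CSV export with learner_id, pre_score and post_score columns; the file name, column names and thresholds are placeholders to adjust to whatever your platform actually produces.

```python
import csv

# Hypothetical export: one row per learner with their intro and final quiz scores.
# Column names, file name and thresholds below are assumptions, not LMS-specific.
PRE_MAX = 10      # maximum possible score on the intro quiz
TOO_EASY = 0.8    # learners scoring above 80% may find the course too easy
TOO_HARD = 0.2    # learners scoring below 20% may find it too hard

with open("quiz_scores.csv", newline="") as f:
    rows = list(csv.DictReader(f))

for row in rows:
    pre = float(row["pre_score"]) / PRE_MAX
    if pre >= TOO_EASY:
        print(f"{row['learner_id']}: intro score {pre:.0%}, consider a more advanced course")
    elif pre <= TOO_HARD:
        print(f"{row['learner_id']}: intro score {pre:.0%}, consider a preparatory course")

# "Value add": average improvement from the intro quiz to the post-course quiz
gains = [float(r["post_score"]) - float(r["pre_score"]) for r in rows]
print(f"Average gain: {sum(gains) / len(gains):.1f} points across {len(gains)} learners")
```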
03 Evaluate the onboarding process
The best-designed courses are useless if learners find it too hard to access them. Encourage your users to get in touch if they have problems logging into the platform, finding the right section, enrolling on a course or navigating through learning materials.
Adding a “need help?” link, support email address or chatbot to the relevant pages allows learners to report any problems they have in accessing resources, activities or assessments.
Even minor issues of this type can be extremely off-putting, particularly for users who are new to the platform, so it’s useful to have a complete picture of how well your onboarding process is working.
During
Once learners are on board and interacting with courses and activities, you want to find out if there are any roadblocks, anything standing between them and getting the learning done, or anything that could be made easier for them.
The key thing to remember at this stage is that you don’t want requests for feedback to distract learners from the core purpose of the course. You need to include opportunities to give feedback that are clear and easy to use, yet unobtrusive.
04 Make it easy to flag up any issues with activities or resources
As in the onboarding process, it’s important to ensure that learners have a clear channel for submitting feedback when they’re working through a course, whether that’s an immediate request for assistance or more general comments on a particular issue.
There are a number of ways to achieve this, depending on the LMS you’re using, from inserting “Give Feedback” buttons at strategic points throughout the course to making chat or messaging functions available so learners can contact a course leader or manager with more complex problems. Remember—the more convenient you can make it for the user, the more likely they are to interact with the feature.
05 Have a process in place to log and collate requests
It’s important to remember—especially if you’re using real-time feedback channels such as chat or messaging—to record the topics or issues which learners are asking about, in order to be able to analyse and respond to them later. There are a variety of ways to do this, from recording common requests in a shared document, to using a ticketing system (which many customer-facing businesses may have in place already) to store, categorise and review learner feedback.
The key is to ensure you’re not only responding to learners “in the moment”, you’re building up a broader picture of their experiences which you can use to inform how you design courses and activities in the future.
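How you store that record matters less than keeping it consistent. As a rough illustration, here is a minimal Python sketch, assuming a simple shared CSV file stands in for a ticketing system; the file name, columns and category labels are all placeholders you would replace with your own.

```python
import csv
from collections import Counter
from datetime import datetime, timezone

LOG_FILE = "feedback_log.csv"  # hypothetical shared log; a ticketing system could replace this

def log_request(learner_id: str, course: str, category: str, detail: str) -> None:
    """Append one feedback request so it can be reviewed later, not just answered in the moment."""
    with open(LOG_FILE, "a", newline="") as f:
        csv.writer(f).writerow(
            [datetime.now(timezone.utc).isoformat(), learner_id, course, category, detail]
        )

def summarise() -> Counter:
    """Count requests per category to spot recurring issues across courses."""
    with open(LOG_FILE, newline="") as f:
        return Counter(row[3] for row in csv.reader(f) if row)

# Example: a chat request about a broken download gets logged under "resources"
log_request("learner_042", "induction-101", "resources", "PDF link on page 3 returns a 404")
print(summarise().most_common(3))
```

Whatever tool you use, the principle is the same: every request gets a timestamp, a category and enough detail to be reviewed alongside the rest.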
06 Add an outro survey or quiz to capture immediate reactions
Building a short feedback exercise into the end of each activity or course section helps to ensure that the information you collect from learners is accurate and comprehensive. While post-course surveys are valuable (as we’ll discuss below), by the end of a longer course, users may have forgotten about issues specific to a single activity.
You don’t need to design a detailed questionnaire—simply providing a comment box for learners to submit any observations is useful. Alternatively, you could prompt users to score the activity they’ve just completed against several categories—ease of use, difficulty level, and so forth—and then contact individual learners for more detail if they’ve scored one element particularly low.
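If you go down the scoring route, the follow-up step can be largely automated. The sketch below is a minimal, hypothetical Python example, assuming the outro responses have already been exported as scores out of five; the category names and threshold are assumptions rather than a prescription.

```python
# Hypothetical outro survey responses: one dict per learner, scores out of 5.
# Category names and the 1-5 scale are assumptions; use whatever your survey collects.
responses = [
    {"learner_id": "learner_007", "ease_of_use": 5, "difficulty": 4, "relevance": 5},
    {"learner_id": "learner_019", "ease_of_use": 2, "difficulty": 4, "relevance": 3},
]

FOLLOW_UP_THRESHOLD = 2  # scores at or below this trigger a follow-up conversation

for r in responses:
    low = [cat for cat, score in r.items() if cat != "learner_id" and score <= FOLLOW_UP_THRESHOLD]
    if low:
        print(f"Contact {r['learner_id']} about: {', '.join(low)}")
```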
After
With the course complete, it’s time to find out how things went – both functionally and from a learning and development perspective. This stage requires balance—ask too few questions of your learners and you’ll be missing out on potentially very valuable data, ask too many and they may not finish the survey.
One option is to make course completion, certification or grading contingent on the learner submitting a post-course feedback form. If you explain that the data is needed to improve the course for future learners, and you don’t overdo the number of questions, this should be fairly well received.
07 Assess how well your learning matched expectations
Having already asked learners to outline their expectations for the course in stage 1, you can now ask them to compare that earlier statement with their actual experience and see whether your e-learning is delivering as expected. It’s best to ask for specifics here, rather than a general yes/no answer—there may be different aspects of the course which are more or less in line with what your users were hoping for.
If there are significant gaps, then the information you gather provides an excellent starting point to tweak, refresh or completely overhaul the course, depending on where the problems lie and how severe they are.
08 Ask practical questions too
As well as asking about the learning process, it’s a good idea to add a few questions about the format and functionality of the courses. Was enough time allowed on timed quizzes? Was it easy to locate and download resources without having to navigate too far? Were the layout and font size easy to read?
These seem like simple issues, but fixing minor annoyances like these can elevate a course from good to great, removing distractions, creating a better user experience and encouraging future engagement.
09 Communicate what you’ve learned and what you plan to do
Finally, having gathered all this data, it’s important to be as transparent as possible with your users about what you’re going to do with it. Not from a data protection point of view—although that’s something you should be aware of when designing and implementing surveys and questionnaires—but on a practical level.
It’s a good idea to periodically publicise the decisions you’ve made based on the data gathered, e.g. “In response to our recent survey, where 65% of you mentioned the onboarding process was hard to navigate, we’ve now implemented single sign-on, so you can access Moodle with your Google Workspace credentials.”
As well as keeping everyone up to speed, open communication like this reinforces the benefits of participating in feedback activities, assuring learners that their concerns are being taken seriously, and making them more likely to offer useful insights in future.
Ultimately, the key to a successful feedback loop is ensuring learners feel involved and motivated—inspiring a sense of shared ownership of the LMS.
Up next
In our next post on learner feedback loops, we’ll be looking at the specific tools available within Moodle LMS and Moodle Workplace which you can use to gather and review user experience data. Keep an eye out for it.
In the meantime, if you’d like to know more about how to implement an effective learner feedback process, give us a call or email. We’d be more than happy to advise you on gathering, collating and interpreting your data, and building a better learning experience.