Learner surveys: Dusty storage boxes to massive impact in 1 day

Smile sheets, satisfaction surveys, Kirkpatrick Level 1s. Most training teams have these in place. Why?

Because we should gather learner feedback, it’s industry standard, it’s part of the process, the adult learning model, the evaluation methodology, and so on. All good reasons.

Not too long ago, I posted an entry called “How to measure success: Part 1 – Reaction”. In that post I talked about some of the pitfalls and challenges of Level 1 surveys. I wanted to go a bit more in-depth on an extremely impactful and easy solution: the Net Promoter Score.

If you aren’t familiar with NPS, this primer I put together to walk a team through the methodology is a useful introduction to how NPS works.

The NPS methodology lets you target the hardest part of feedback gathering: the “what now?”

Many companies diligently ask the right questions and follow a good process to deploy the survey and gather the feedback. Now what? In many cases that energy winds down as feedback piles up and other priorities come to the fore. Surveys come in, get labeled nicely, and get stored.

The single most important survey objective should be the actions put in place based on the feedback. If your fiscal plan includes a key goal focused on this, the payoff is huge for an almost negligible cost. Using a Net Promoter Score (NPS) methodology for Level 1 surveys is a very effective way of driving great scoreboard metrics and very useful feedback!

NPS can help you deal with the main obstacles that pop up for Reaction Surveys:

  1. The annoyance factor increases in proportion to the number of questions. Fewer questions is ALWAYS better and will increase the response rate.

    I spoke with an analyst at a large company who saw 6,000 to 8,000 surveys go out to every support customer engaged; a decent response rate for a typical 10-question remote survey sat around 10-15%, and it increased drastically the fewer questions were asked. Now take into account that to get decent trending data for learner feedback, you would need, say, 15 to 20 solid responses to make good decisions on changes to a course. At that response rate, you would need to send out 150 to 200 10-question surveys.

    Because an NPS survey is designed to be short and sweet, it drives the response rate up. Learners are less annoyed by a survey that takes them less than a minute and draws on the information that is key in their mind: what they loved or hated.

  2. Getting data you can use. “I rate the LMS as a 3.”

    What are you going to do with a 3? I mentioned in another post the idea of slapping around a bunch of IDs and demanding they make the design a 4! Traditional surveys don’t allow for this; they are metric based, not feedback based. Asking “Why did you rate it X?” will drive the learner to tell you exactly what you want to know and provide useful trending data that empowers the design or support teams to act.

  3. Metrics are king. A 30-minute discussion with the executive team or a 3-second scoreboard?

    The reaction survey has an important secondary objective: demonstrating your success and growth. If you have a 15-question survey, how are you detailing trends in an understandable fashion for the organization?

    Because NPS provides a clear metric you can graph from session to session, you can instantly set objectives, benchmarks, and comparison models with different audiences. This data doesn’t show that learning has occurred (you still need a good Level 2 model for that!), but it is what will convince executive management that you are impacting the audience positively and that they are eager to come back for more.

Managing feedback from this type of survey is relatively easy. If you use a tool like SurveyMonkey, the output gathered as CSV allows you to use an Excel formula to quickly tabulate the NPS result (total Promoter % – total Detractor % = NPS). Some online tools like SurveyGizmo have steps you can use to auto-report the NPS result of your questions (here!).
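If you'd rather script the tabulation than build an Excel formula, here is a minimal sketch of the same calculation in Python. It uses the standard NPS buckets (9-10 promoters, 7-8 passives, 0-6 detractors); the `nps` function name and the sample ratings are just illustrative, not from any particular tool's export.

```python
def nps(ratings):
    """Return the Net Promoter Score for a list of 0-10 ratings.

    Standard NPS buckets: 9-10 = promoter, 7-8 = passive, 0-6 = detractor.
    """
    total = len(ratings)
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    # total Promoter % - total Detractor % = NPS
    return round(100 * promoters / total - 100 * detractors / total)

# Made-up ratings for illustration: 5 promoters, 3 passives, 2 detractors
ratings = [10, 9, 9, 8, 7, 10, 6, 9, 3, 8]
print(nps(ratings))  # 50% promoters - 20% detractors = 30
```

Feeding in each session's ratings gives you the session-over-session numbers to graph for the scoreboard.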

As the data will be short and impactful, triaging the results is fast and easy. One person on my team was holding NPS triage sessions daily during a multi-day program, reviewing the results with the facilitator and actioning the feedback for the next day’s session. (Way to go, Lynda!)

The result? An upward-tracking graph of NPS results to show leaders, satisfied learners, and better-equipped facilitators/instructional designers.

What more can you ask for?


About Andrew Ambrose
I am passionate about the learning long tail for formal and informal learning solutions, leveraging social media and networking technology for learning projects, innovation through mLearning, collaborative learning, and applying solutions that fit within the learner’s personal learning environment.
