Why ‘vanity’ L&D metrics cannot serve your business long-term
The days of organisations simply looking at surface-level metrics that don’t help them elevate their learning and development programmes are over.
Many companies tend to use basic metrics to measure the success of learning and development (L&D) programmes – if indeed they measure at all. So perhaps it’s no surprise that 42% of attendees at our recent “The Evolving Role of the CLO” webinar cited showing ROI from L&D as their biggest challenge.
For some companies, the success of their programme is measured by whether their annual L&D budget is fully spent so they are sure to receive the same training budget the following year. To others, it is measured by the number of people who attended training courses or the number of people who passed a training course. In some cases, these measures are important for regulatory compliance, but whether they move the needle on impacting business outcomes is another story.
The reality is that these types of metrics do not actually assess if the training ultimately led to improved outcomes. Nor do they determine if people are being taught the right skills.
Far too many companies are focused on what Steven Angelo-Eadie, head of learning services at global digital business services firm Emergn, describes as “vanity metrics”.
“Vanity metrics are easy to calculate, look nice and make you feel good,” says Angelo-Eadie, who cites the example of a company getting a lot of registrations for a webinar. “If you get a tonne of people registering, you think ‘that’s amazing’.”
However, the real questions organisations should be asking are: How many attendees engaged with the speakers during the webinar? How many reached out afterwards? What were the results post-webinar? According to Angelo-Eadie, those are lagging metrics, and they are the true measure of success.
He says there are similar issues relating to learning programmes where organisations often fail to ask the right questions of people who have undertaken training, and as a result, they don’t get a true insight into how effective the training was.
“If you ask someone if they enjoyed the training course, and they say ‘yes’ you need to know why they said ‘yes’,” says Angelo-Eadie.
“Did they say ‘yes’ because they were given tea and cake during the course, or because they learned something new? Similarly, if they replied ‘no’, was this because the air conditioning was turned up too high or because they didn’t find the course content beneficial?”
He says organisations need to rethink what they are measuring. To do that, first they need to establish what success looks like for their business.
“You need to be capturing whether learners have applied those new skills or if there’s been a change in mindset,” says Angelo-Eadie. “Ultimately, you want measures in place that show that the learning investment is worthwhile because it’s leading to an observable change that makes a positive impact on the business.”
Currently, too many organisations are measuring the wrong things and that’s partly because many are – as Angelo-Eadie puts it – “data rich, but knowledge poor.” He says that such organisations do not know what to do with their data because it is not tied to anything, and they suffer from data paralysis.
“What it comes down to is that most companies don’t have a good enough handle on what they should be measuring, and what they do measure, they make unbelievably complicated,” he adds.
Measuring L&D programmes effectively
Angelo-Eadie also notes that determining what needs to be measured can be a straightforward process. He suggests starting by establishing the problem your learning programme is intended to solve, then identifying measurable changes you can track and analyse to confirm you have solved that problem.
For some, this is easier said than done. And this is where a partner like Emergn can really help. Emergn’s Academy platform includes a bespoke onboarding process, where clients are guided through setting metrics that are relevant to their organisational goals.
“We help companies understand where the baseline is on an individual level and as a team,” says Angelo-Eadie. “We establish this is where the current skill set is, this is what they know, this is what they don’t know. Then we give them access to the courses that can be done at their own pace. Each course includes knowledge-based and practical assessments so you can see how people have improved from A to B and have confidence that learners can apply newly acquired skills to their work.”
He adds that while it is important to assess the success of learning programmes on an individual level, you must also measure success at a higher level. “People work in teams, so the measures have got to be team-based, outcome-based and customer-based. Because that’s where it really matters.”
Emergn’s Academy enables learning to take place with a group of peers, which Angelo-Eadie says is a more effective way of learning.
“Learning with a community means you get to share ideas, talk to each other and say, ‘I really didn’t understand that bit, what about you?’, and you learn from each other,” he says. “People do pretty well learning on their own, but they do better when you bring in their peers, and they do even better when you bring in an expert, but only when they need it.”
Want to learn more about how Academy provides measurable ROI of learning programmes? Get a personalised demo at https://www.emergn.com/academy/academy-demo-request/.