Developing a digital mindset: How digitally literate is your L&D team?
Chief Learning Officer and L&D specialist David James explores digital literacy in HR and the importance of evaluating L&D initiatives.
Digital learning & development isn’t new. We may have used different terms in our evolution from e-learning to online learning and now digital, but we’re certainly no novices when it comes to using technology in our profession. However, one question looms large over L&D’s digital adoption:
How do we know if it works?
It’s an important question on many levels:
- Learning technology is rarely inexpensive. In fact, it can be very expensive once we include platforms, content suites, and bespoke content development.
- It’s expensive to take employees away from their work to learn. Thirty minutes here and there adds up.
- With skills gaps becoming increasingly urgent and organizations spending more on reskilling efforts, the ability to actually reskill (rather than just consume content in hope) seems to have presented L&D with its day of reckoning.
But how do we know if our digital L&D works?
A recent survey of 500 employees found that only 29% of respondents thought their organization’s online learning (eLearning, virtual workshops, or massive open online courses [MOOCs]) was effective. That is a damning assessment of corporate online learning, considering both the expense and the hope pinned on it to make a difference to development and capability at scale.
One area where digital learning is expected to make a difference is in supporting new managers, but in the same survey the biggest criticism of the training provided was that it was too generic and not specific to the situations new managers face in their roles.
But how can this be, when L&D is no stranger to technology and the investment is so huge?
L&D big on implementation, little on analysis
If I were to ask you what problems you were seeking to solve with your organization’s learning technologies, what would you respond?
- To provide just-in-time learning content to a large number of employees, all with vastly different needs?
This statement not only seems logical, but it’s been the broad rationale for investing in online learning provision for decades. Providing an exhaustive library of content in a large system, ideally with a pleasant user interface and analytics at the back end, is de rigueur, is it not?
I’ve known L&D teams to get a great deal of internal credit for implementing new systems. With a little customization to make it look and feel like an internal system, employees will flock to the whizzy new platform, won’t they? Well, they may be intrigued, but rarely does a ‘successful’ implementation lead to sustained engagement, let alone to what employees need to become better at their jobs and improve their prospects.
So what does it take to move beyond another ‘successful’ implementation to achieving a planned outcome, i.e. actual upskilling or reskilling?
The answer is analysis.
Data & evidence is your friend… and doesn’t need to be scary
The bedrock of digital is data. Without it, we have more of the same expenditure without the returns.
This doesn’t mean burying our heads in spreadsheets, staring at numbers and hoping to see patterns. It means asking what we’d expect to see if our assumptions were true, seeking data to validate them, and then working out what’s going on from there.
So when it comes to new managers, the assumption that they need development as they transition can be tested by asking and answering a question like: what would we see if promoting managers without preparation or guidance was problematic in our organization?
We might see:
- Unproductive teams (missing targets);
- Disengaged or disgruntled team members;
- Complaints to HR;
- Team members leaving the organization;
- Over a longer period, a lack of talent coming through;
- Talented or well-regarded employees plateauing;
- Talented or well-regarded employees applying for roles outside of the team.
We could answer these questions by comparing data from teams in which a new manager was appointed in the last 12 months with data from teams where the manager is longer established.
What does the data say? Is there a problem to solve? What problems specifically need addressing? In what areas of the organization?
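As a minimal sketch of that cohort comparison, the logic could look like the following Python snippet. Everything here is illustrative: the team records, metric names, and the 12-month tenure cut-off are hypothetical, not taken from any real HR system.

```python
# Illustrative sketch: comparing teams led by new managers (appointed in the
# last 12 months) against teams with longer-established managers.
# All records, field names, and figures below are hypothetical.

from statistics import mean

# Hypothetical per-team records, as might be pulled from HR systems
teams = [
    {"manager_tenure_months": 6,  "attrition_rate": 0.22, "target_attainment": 0.81},
    {"manager_tenure_months": 9,  "attrition_rate": 0.18, "target_attainment": 0.85},
    {"manager_tenure_months": 30, "attrition_rate": 0.09, "target_attainment": 0.97},
    {"manager_tenure_months": 48, "attrition_rate": 0.11, "target_attainment": 0.94},
]

def cohort_summary(records, metric):
    """Average a metric for new-manager vs. established-manager teams."""
    new = [r[metric] for r in records if r["manager_tenure_months"] <= 12]
    established = [r[metric] for r in records if r["manager_tenure_months"] > 12]
    return mean(new), mean(established)

new_attr, est_attr = cohort_summary(teams, "attrition_rate")
print(f"Attrition: new managers {new_attr:.0%}, established {est_attr:.0%}")
```

A gap between the two cohorts on a metric like attrition or target attainment would suggest there is a problem worth investigating further; similar figures would suggest looking elsewhere.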
Armed with this data, you can then speak with these managers to understand their experiences; speak with their managers to get a different perspective, and even with their team members to understand what the transition lacked.
With this data and evidence, you are then in a position to address issues or problems directly with targeted digital resources and workshops. And what would be your measure of success? Well, you know what the data was before and so an improvement in results would indicate success and anything other than improvement would suggest trying something else.
This seems harder than implementing a new system, right? And what was the alternative? Buying exhaustive libraries of content and courses, none of which refers to the known issues new managers face as they transition into a new role in the very specific context of their team and stakeholders inside your organization.
Digital literacy is less about buying new platforms and content suites and more about scaling development efforts to address known problems. This is why a foundational understanding of data will advance any team’s digital literacy more than any new platform ever will.
So the first question you should be asking your L&D team to test its digital literacy is:
How do we know our digital learning efforts are delivering planned results that affect people’s ability to do the work required today and tomorrow, beyond subjective assessments?
If you’re able to answer this question, then your team is on the digital literacy scale. If you’re not, then it isn’t.