The leadership and innovation course you’ve just finished was a bit of a buzz – but what’s it worth to your organisation?
If you don’t apply what you learned to the work you’re doing, probably not a whole lot. But then again, if you seriously enjoyed the process, at least your organisation gets the benefit of your gratitude, possibly your improved morale and even your increased loyalty.
Trying to get to grips with the value of organisational investment in people can be a little like nailing jelly to the wall – the link between what’s learned and the bottom-line benefit is often indirect, intangible and hard to quantify. Which is why many human resource (HR) departments shy away from it.
A recent survey shows the majority of New Zealand businesses are putting money into HR initiatives without attempting any formal evaluation of the outcomes. And while the costs of those initiatives are routinely tracked, the benefits are not. It’s an approach that not only risks wasting money on poorly targeted initiatives but also seriously undersells the impact of HR practices.
When the links are made between those practices and organisational performance, they turn out to be pretty significant. The results of a recent study on human capital practices in Canada and the US (The Human Capital ROI Study, Deloitte & Touche) suggest they represent as much as 43 percent of the difference in market-to-book value between one company and another.
It also found that while a small set of human capital practices are universally valuable in driving financial success, it’s not a case of one size fits all. A practice that works for one organisation may not work as well for others, and certain widely accepted practices vary in how they add value.
It seems few New Zealand companies would be able to identify where they get the best ROI on their human capital spend. A nationwide survey conducted jointly by Deloitte Human Capital and education specialists The Learning Curve, and involving 108 organisations of varying sizes, found that just 18 percent measure the tangible business impact of their HR programmes. Most of the rest are “using intuition, perception and general observations”, says Learning Curve director Cheryl Reagan.
The result is that while the tangible cost of HR is readily apparent on corporate balance sheets, the return on that investment remains intangible. That makes it a lot easier to take the cost-cutting knife to HR programmes when times get tough, warns Jack Phillips – a visiting American who’s been described as “the gold standard” on capturing the ROI in learning.
According to Phillips, the local survey results are a bit disappointing and lag behind international best practice. “It’s not all bad news. There are some great investments in people going on here. The missing part is the ability to show the value of a particular project or programme. There’s a perception you can’t do this – it’s too difficult or costs too much, but it doesn’t have to be that way.”
Both Phillips and his wife/business partner Patti have authored books on evaluating human capital investment (“I do the fat ones – hers are more readable”), and visited New Zealand recently to run workshops on the subject in partnership with Deloitte.
“Our advice,” he says, “is to take on the challenge and put some impact in around those initiatives; show the returns. Otherwise you’re in this soft fuzzy area and when people are looking to cut budgets during a downturn, that’s where they’ll cut because they don’t know the value.”
But a downturn is when HR spending should increase, even if staffing levels have to be dropped, says Phillips, while conceding that it can be hard to persuade senior management that’s the case – particularly if HR benefits aren’t captured.
So how to go about it?
Sooner not later
Asking whether money put into a development/training programme was well spent a couple of years down the track is worthless – you have to build evaluation into the programme right up front, says Phillips.
He has an example from the August issue of Management. The Government is putting $20 million into a leadership programme. Great stuff, but the only measure of return mentioned was retrospective – the rate at which participants move into leadership roles. Okay, there could be more to it, but the point he’s making is that you can build some accountability around the investment from the word go.
“For a major project like that, you’d want to see $200,000-$300,000 of the $20 million go towards putting the right kind of accountability system in place, so you can see where it’s gone and measure the output,” says Phillips. “You can show the pay-off… but you have to think up front and design [the programme] to collect the right kind of data to show impact as you go. Checking the pay-off in two or three years’ time is just too late. You’ve already spent the money.”
It could turn out that the programme design is faulty, the content inadequate or the wrong participants selected, or faults could emerge along the way. Maybe what’s learned is not being put into practice.
“There are lots of things that can cause programmes to go astray so our suggestion is to put some rigour in up front and in some cases try to predict the ROI. What is this trying to achieve, what is the impact? Not all of it can be converted to monetary value,” he adds.
“If you are trying to reduce employee turnover in the group or maybe boost productivity, these can be converted, but if you are after better teamwork or enhanced communications, then you can’t.
“You have to look both at short and long-term goals and make the distinction between tangible and intangible returns – and we have clear guidelines as to whether you can credibly convert to monetary value.”
These guidelines are part of “The ROI Process”, a trademarked product from the Jack Phillips Centre for Research (a division of Franklin Covey). The process is divided into four main stages: evaluation planning, data collection, data analysis and reporting.
The measures are based on what the particular programme sets out to achieve. You start, says Phillips, with the end in mind. “What is it you want to accomplish with this particular programme and how will you know when you have? You design the project with results in mind and build in measurements.
“Nobody likes data collection, particularly if it looks like an add-on. But if these collection points are built in, it saves time and helps drive results. Intuitively, if you step back and look, that makes sense, but so few do. They get excited by the challenge and don’t think about the accountability until somebody wants to know.”
Five evaluation levels
The timing and types of measures used need to be matched to objectives, and Phillips draws five boxes representing different evaluation levels. First is reaction – either to a particular programme or any HR change; then learning – what new skills and knowledge have been acquired; application – what’s being done differently as a result of that learning; impact – what’s the consequence of this behaviour change; and this leads to ROI. Phillips sums it up: “I react to your programme, learn something I have to do, do something different, it has a consequence and it has positive or negative ROI.”
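The arithmetic behind that final ROI level is straightforward. In Phillips’ methodology, ROI is net programme benefits over programme costs, expressed as a percentage, and it sits alongside a simpler benefit-cost ratio. The sketch below illustrates both calculations; the dollar figures are hypothetical, not drawn from the article.

```python
# Illustrative sketch of Phillips-style ROI arithmetic.
# The figures below are hypothetical, for demonstration only.

def roi_percent(benefits: float, costs: float) -> float:
    """ROI: net programme benefits over programme costs, as a percentage."""
    return (benefits - costs) / costs * 100

def benefit_cost_ratio(benefits: float, costs: float) -> float:
    """Benefit-cost ratio: total benefits returned per dollar spent."""
    return benefits / costs

# A hypothetical programme: $50,000 spent, with $120,000 of monetary
# benefits credibly attributed to it after isolating other factors.
costs, benefits = 50_000, 120_000
print(f"BCR: {benefit_cost_ratio(benefits, costs):.2f}")  # 2.40
print(f"ROI: {roi_percent(benefits, costs):.0f}%")        # 140%
```

Note the distinction: a benefit-cost ratio of 2.40 means $2.40 back per dollar spent, while the 140 percent ROI counts only the net gain – which is why only benefits that can credibly be converted to monetary value belong in the calculation.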
The measures may be hard (eg, output increase, quality improvements, time or cost savings) or soft (eg, customer or employee satisfaction, work habits, staff morale). Evaluation instruments may include questionnaires, surveys, tests/demonstrations, interviews, focus groups, observation or performance records.
The measures are often not well targeted to objectives, according to Phillips. For instance, a questionnaire asking participants their level of satisfaction with a particular learning programme doesn’t provide any useful information. “I may not be so satisfied with the facilitator but very satisfied with the lunches – so? But if we ask questions like ‘how useful or relevant is this to you in your job right now?’ or ‘d