L&D is a Master of VR

When Ed and David released Running Training Like a Business (RTLAB), it was clear to many that the industry needed a new way of looking at not just how and what we were training employees, but why.  The book aspired to take the industry discussion up a level: away from the micro of courses, design methodologies, and technology to the macro and meta.  It encouraged a turn inward, away from courses and curricula and toward the creator, the L&D organization itself.  What the factory was designed for predetermined its output.  Transforming the organization would transform the output and the value it produced.

In 2010, when I started writing the Learning Hacks blog as a way to capture my musings on L&D, I began with a post entitled "The Spark That Started It All"; the working title can still be seen in the post's URL.  It expressed my disappointment that many of the challenges described in RTLAB, over a decade prior, remained unaddressed.  In my book Running Training Like a Startup I cite one of my favorite Ed Trolley quotes, one that was validated in many of the assessments we did for clients around the world.

“Business leaders have low expectations of training. And they are being met.”

-Ed Trolley

Yesterday, Harvard Business Review released an article entitled "Where Companies Go Wrong with Learning and Development" that put things in clear perspective. In it, Steve Glaveski highlights recent studies that show:

  • 75% of 1,500 managers surveyed from across 50 organizations were dissatisfied with their company’s Learning & Development (L&D) function;
  • 70% of employees report that they don’t have mastery of the skills needed to do their jobs;
  • Only 12% of employees apply new skills learned in L&D programs to their jobs; and
  • Only 25% of respondents to a recent McKinsey survey believe that training measurably improved performance.

Glaveski nets it out this way, “Not only is the majority of training in today’s companies ineffective, but the purpose, timing, and content of training is flawed.”  I don’t disagree.

While the L&D community holds conferences dominated by sessions on how to create compelling PowerPoint title slides, the use of chatbots, and incorporating podcasting into a curriculum, the businesses they support keep moving and changing, desperate for employees who can perform.  In the late '90s I was tasked to lead a project for Microsoft.  At the time, they were under intense scrutiny for monopolistic practices.  It was also the time when Fred Reichheld (who would later create the Net Promoter Score) released The Loyalty Effect, debunking the marketer's "top-box" approach to assessing satisfaction.  I won't go into it here, but when retention does not drop off as satisfaction goes down, there are other market forces at play. High switching costs, tie-ups, and a lack of alternatives can be some of those drivers. The retention results don't reflect the satisfaction of customers (it may make things worse because they feel trapped), but they do give the provider an extremely distorted view of how it is performing.

More than 20 years after the release of RTLAB, the data on L&D's customer satisfaction continue to come in.  While the L&D industry focuses on budget amounts, spend per employee, and other "vanity metrics," the HBR article makes clear it is long overdue for the learning organizations delivering leadership training to take a leadership role.  For the L&D groups supporting innovation initiatives to innovate.  For the industry, as a whole, to take off the goggles and stop living in its virtual reality world.

Some L&D Math and Some Questions

Disclaimer: I am a lover of data.

I had some time to play with some of the data in ATD's State of the Industry Report, and it raised some questions for me. To better understand the ATD data, I looked at the "implied" results that are not included in the report. Because ATD reports learning spend as a percentage of revenue and as a percentage of profit, I can simply reverse the calculation to see the trends in both revenue and profit per employee. Since these are the ultimate measures of the success of learning, these trends should be positive, or at least correlated with the investment in learning that organizations are making.
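The reverse calculation is simple arithmetic. A minimal sketch, using hypothetical figures rather than actual ATD report values:

```python
def implied_revenue_per_employee(spend_per_employee, spend_pct_of_revenue):
    """If learning spend per employee is X dollars and that spend represents
    p percent of revenue, implied revenue per employee is X / (p / 100)."""
    return spend_per_employee / (spend_pct_of_revenue / 100)

# Hypothetical example: $1,250 learning spend per employee,
# reported as 0.5% of revenue.
rpe = implied_revenue_per_employee(1250, 0.5)
print(f"Implied revenue per employee: ${rpe:,.0f}")  # $250,000
```

The same reversal works for profit per employee using the spend-as-a-percent-of-profit figure.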

[Figure: impact2]

The first thing that stood out was the delta between the implied revenue per employee (RPE), a common public-market metric, and profit per employee in the ATD report versus the S&P 500 average. According to Yardeni, an economic advisory, the 2016 average RPE for the S&P 500 ranged from $321,000 to $1.7 million depending on industry, with a profit margin of approximately 10%. The revenue discrepancy for the Consolidated cohort is understandable given the smaller size of many of the companies reporting ATD data. The comparison to the BEST cohort is closer but still under the S&P averages.

The comparison to profit per employee was similarly off.

[Figure: craft]

I then looked for a correlation between learning and an impact on revenue and/or profit in two ways.

First, I looked to see how the numbers compared year over year. This view showed that the increased percentage of investment in learning, touted in the ATD report as a positive reflection of businesses' opinion of learning, might be misplaced. The ATD report states: "Confirming organizations' commitment to learning, this indicator [% of profit] grew from 8.3 percent in 2015 to 8.4 percent in 2016; the ratio has climbed steadily for four years in a row."

[Figure: impact1]

While ATD seems to draw a positive connection, this may simply be a case of reported profit and revenue, the things businesses actually care about, dropping. There appears to be no correlation: the resulting chart shows years where learning hours rose while implied profit or revenue dropped. If there is a return to be captured from learning, the ATD numbers don't seem to reflect it. I did a similar analysis lagging revenue and profit by a year, to let the impact of the learning spend sink in. Still nothing that showed correlation, much less causation.
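The same-year and one-year-lag checks can be sketched with a plain Pearson correlation. The series below are made up for illustration; they are not the actual ATD figures:

```python
def pearson(xs, ys):
    """Plain Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical annual series: learning hours per employee, and implied
# profit per employee in $ thousands (illustrative values only).
hours  = [31.5, 32.4, 33.5, 34.1, 34.7]
profit = [48.0, 45.2, 47.1, 44.0, 46.3]

same_year = pearson(hours, profit)
# Lag profit by one year: does this year's learning relate to NEXT year's profit?
lagged = pearson(hours[:-1], profit[1:])
print(f"same-year r = {same_year:.2f}, one-year-lag r = {lagged:.2f}")
```

A coefficient near zero in both views, as in the ATD-derived data, is consistent with "no correlation, much less causation."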

As I stated in the post on benchmarks, be careful.