Goofus Data


When I was a kid, one of the only exciting things about going to the dentist was the chance to catch up on my Highlights magazine reading. The children's magazine is famous for a monthly feature titled "Goofus and Gallant," which showed the behaviors of good children versus those of not-so-good kids.

I was reminded of these cartoons as I sat, frustrated once again, listening to the media and politicians discuss Covid data. If you want to put together some real-life Goofus examples of dealing with data, you don't have to look any further than the local or network news. From "garbage in, garbage out" to mistaking the data for the end rather than an input to a deeper insight, Goofus seems to be hard at work daily.

Don’t have unclear/inconsistent reporting standards.

What is the definition of a Covid death? When do numbers get reported (even on Sundays)?

Don’t focus on the wrong data.  

An infection count is only useful or important in the context of the population size or the number of tests conducted.

Don’t look at daily data if the system operates on a different time scale.

We know there is a lag between action and impact with Covid. Would a rolling 14-day average be more useful for planning and trend analysis?
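As a sketch of what that smoothing might look like (assuming pandas, with entirely synthetic daily counts, not real case data):

import pandas as pd
import numpy as np

# Synthetic daily case counts with a weekend reporting dip baked in
rng = np.random.default_rng(seed=1)
dates = pd.date_range("2020-04-01", periods=60, freq="D")
base = np.linspace(200, 400, 60)                      # underlying upward trend
noise = rng.normal(0, 25, 60)                         # day-to-day noise
weekend_dip = np.where(dates.dayofweek >= 5, -80, 0)  # Saturday/Sunday under-reporting
cases = pd.Series(base + noise + weekend_dip, index=dates).clip(lower=0)

# A rolling 14-day average covers two full weekly reporting cycles,
# so the weekday effects cancel out and the trend becomes visible
smoothed = cases.rolling(window=14).mean()

print(pd.DataFrame({"daily": cases, "14-day avg": smoothed}).tail())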

Don’t lose the message in averages.

Pull out a few early states, or remove the elephant that is New York, and watch how the chart of the country's battle changes.

Don’t use the wrong units.

Percentages can be a marketer's friend (100% growth of a small number sounds better than the actual number), but sometimes they are also the best way to understand the data. Percentage (%) of beds in use, versus number (#) of hospitalizations, is more readily understandable when ICU beds are a key capacity constraint.

Those are just some of my daily irritants. And don't get me started on false positive percentages or how an exponential function works (just watch this old shampoo commercial: https://youtu.be/mcskckuosxQ).
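For the false positive point, the math is a straightforward base-rate calculation. A sketch with hypothetical test numbers (none of these figures come from any real Covid test):

# Hypothetical numbers for illustration only
prevalence = 0.01        # 1% of the tested population actually infected
sensitivity = 0.95       # true positive rate of the test
specificity = 0.98       # true negative rate of the test

population = 100_000
infected = population * prevalence
healthy = population - infected

true_positives = infected * sensitivity
false_positives = healthy * (1 - specificity)

# Probability that a positive result reflects a true infection
ppv = true_positives / (true_positives + false_positives)
print(f"Positive predictive value: {ppv:.1%}")  # roughly 32% with these inputs

With low prevalence, even a quite accurate test produces more false positives than true ones, which is why raw positive counts can mislead.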

Do you work with data?  What would you add?

The Math of Upskilling

The case for learning versus hiring has long been a topic of discussion. With the recent job market as tight as ever, the conversation continues. Just this week Josh Bersin (or as I call him, "JB," not because I know him that well, just because it sounds cool) released the highlights of a study done with three firms that concluded,

“It can cost as much as 6-times more to hire from the outside than to build from within.” – JB

I could take issue with the phrase "as much as," since I have a dog who can be obedient "as much as" half the time. Or with the sample size: only three companies, each in a different industry. Or with the study's focus on highly paid jobs ($150K+ salaries). But none of that will stop the industry from using this stat widely. That may be fine at L&D conferences, but try it with a CFO and you better be prepared with the math.

So that you know I am not picking on JB (who I think is the Seth Godin of Human Capital): my issue is with reports that don't stay loyal to the kind of math that has credibility with finance folks. For some, simply citing a case study with a recognizable company may be enough. For me it is not. And for my own learning, this blog is my attempt to take Jane Bozarth's work-out-loud approach and show my work.

“And showing what we’re doing—narrating our work in a public way—helps make learning more explicit.” – the other JB

The Case for Upskilling

We start with a simple comparison of costs to determine value. If the result is positive, then reskilling wins. If not, hire away.

The value of reskilling (V) = Cost of New Hire (CN) – Cost of Reskilling (CR)

Seems simple enough, but the devil is in the details. So let's break it down further.

V = [Cjn + Chn + Cpn + Co + Cs] – [Cjx + Chx + Cpx + Cu]

The cost of a new hire (CN) equals:

  • Cost of job opening (Cjn) plus
  • Cost of new hire (Chn) plus
  • Cost of lost productivity (Cpn) plus
  • Cost of onboarding (Co) plus
  • Cost of redundancy/severance (Cs)

The cost of reskilling (CR) equals:

  • Cost of job opening (Cjx) plus
  • Cost of transfer hire (Chx) plus
  • Cost of lost productivity (Cpx) plus
  • Cost of upskilling (Cu)

This approach leaves some very real variables out:

  • Calculation does not include fully loaded employee costs (benefits, occupancy, equipment, etc.). This is assumed to be a wash between CR and CN.
  • Does not include quantifiable costs associated with loss of investor confidence due to layoffs, which would likely show up in the stock price.
  • Does not include quantifiable costs associated with loss of employee/candidate confidence due to layoffs, such as unplanned attrition, longer time to hire, and reduction in candidate quality.
  • Does not include the 2X-3X higher turnover rate for new hires used by JB for his calculation.

Please let me know what I have missed and how this calculation can be made more valid and useful. In my next post I will further break down each of these costs, insert some assumptions (cost of onboarding/upskilling, recruiter fees, time to productivity, etc.), and share the Excel spreadsheet plus the results it spits out.
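Until then, here is a minimal sketch in Python of the comparison above. Every figure is a placeholder assumption, not a benchmark; swap in your own:

# Placeholder assumptions (all hypothetical; replace with your own data)
costs_new_hire = {
    "Cjn": 5_000,    # cost of job opening (posting, recruiter time)
    "Chn": 30_000,   # cost of new hire (fees, signing bonus)
    "Cpn": 40_000,   # cost of lost productivity during ramp-up
    "Co": 10_000,    # cost of onboarding
    "Cs": 25_000,    # cost of redundancy/severance for displaced worker
}

costs_reskill = {
    "Cjx": 2_000,    # cost of internal job opening
    "Chx": 3_000,    # cost of transfer hire
    "Cpx": 20_000,   # cost of lost productivity (shorter ramp for an insider)
    "Cu": 7_500,     # cost of upskilling
}

CN = sum(costs_new_hire.values())
CR = sum(costs_reskill.values())
V = CN - CR  # a positive value favors reskilling

print(f"CN = ${CN:,}, CR = ${CR:,}, V = ${V:,}")
# With these assumptions: CN = $110,000, CR = $32,500, V = $77,500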


L&D is a Master of VR

When Ed and David released Running Training Like a Business (RTLAB), it was clear to many that the industry needed a new way of looking at not just how and what we were training employees, but why. The book aspired to take the industry discussion up a level: away from the micro of courses, design methodologies, and technology to the macro and meta. It encouraged a turn inward, away from the course and curricula, toward the creator: the L&D organization itself. What the factory was designed for predetermined what its output would be. Transforming the organization would transform the output and the value it produced.

In 2010, when I started writing the Learning Hacks blog as a way to capture my musings on L&D, I began with a post entitled "The Spark That Started It All" (the working title can still be seen in the post's URL). It expressed my disappointment that many of the challenges described in RTLAB, over a decade prior, remained unaddressed. In my book Running Training Like a Startup I cite one of my favorite Ed Trolley quotes, one that was validated in many of the assessments we did for clients around the world.

“Business leaders have low expectations of training. And they are being met.”

– Ed Trolley

Yesterday, Harvard Business Review released an article entitled "Where Companies Go Wrong with Learning and Development" that put things in clear perspective. In it, Steve Glaveski highlights recent studies that show:

  • 75% of 1,500 managers surveyed from across 50 organizations were dissatisfied with their company’s Learning & Development (L&D) function;
  • 70% of employees report that they don’t have mastery of the skills needed to do their jobs;
  • Only 12% of employees apply new skills learned in L&D programs to their jobs; and
  • Only 25% of respondents to a recent McKinsey survey believe that training measurably improved performance.

Glaveski nets it out this way, “Not only is the majority of training in today’s companies ineffective, but the purpose, timing, and content of training is flawed.”  I don’t disagree.

While the L&D community holds conferences dominated by sessions on how to create compelling PowerPoint title slides, the use of chatbots, and incorporating podcasting into a curriculum, the businesses they support keep moving and changing, desperate for employees who can perform.

In the late 90s I was tasked to lead a project for Microsoft. At the time they were under intense scrutiny for monopolistic practices. It was also a time when Fred Reichheld (who would later create the Net Promoter Score) released The Loyalty Effect, debunking the marketer's "top-box" approach to assessing satisfaction. I won't go into it here, but when retention does not drop off as satisfaction goes down, there are other market forces at play. High switching costs, tie-ups, and lack of alternatives can be some of those drivers. The retention results don't reflect the satisfaction of customers (if anything, feeling trapped makes it worse), but they do give the provider an extremely distorted view of how it is performing.

Over 20 years after the release of RTLAB, the data on L&D's customer satisfaction continue to come in. While the L&D industry focuses on budget amounts, spend per employee, and other "vanity metrics," the HBR article clearly shows it is long overdue for the learning organizations delivering leadership training to take a leadership role. For the L&D groups supporting innovation initiatives to innovate. For the industry as a whole to take off the goggles and stop living in its virtual reality world.

Your Learning Dashboards

How to make sure they deliver actionable insights:

  1. Target a specific audience.
  2. Involve end users in the design process.
  3. Provide adequate context.
  4. Describe how to interpret the numbers.
  5. Choose the right data charts.
  6. Anticipate the flow of questions.
  7. Streamline for easier consumption.
  8. Highlight what’s important.
  9. Recommend prescriptive actions.
  10. Review content periodically.

In Running Training Like a Startup I propose several dashboards for L&D to consider. What dashboards are you watching and sharing?

When is it not Capital? When it’s Human.

Every annual report talks about people being the company's most important asset. Except they aren't. A quick glance at the balance sheet reveals no line item for people. The facilities are there. The equipment is there. But nowhere do the people show up as an asset.

Over the last few months I have been doing a bit of research into the historical effects of automation on the workforce. Think elevator operators and bank tellers. More on that at a later date. [Spoiler alert: the robots are indeed coming for some of our jobs, but that is a good thing.] One thing that came from my research is that companies are incented to automate on multiple fronts. It is on one of these fronts, accounting, that we may have a lever to incentivize upskilling.

A quick primer. When companies buy a robot, or any piece of equipment, they pay for it, but rather than simply taking the cash out of their account, the purchase does something else. If the robot is estimated to have a working life of 10 years, the company places that "asset" on its balance sheet, reducing its value for every year of service. This asset sits opposite the debt the company has, allowing it to borrow more. If I replace a human making $50K with a robot that costs $250K but is expected to last 10 years, after the first year I have an asset worth $225K on my balance sheet (the cost spread over the lifespan, less the first year). If I spend $5K to upskill the employee to perform at a higher level, equivalent to the robot, I have nothing but an expense that hits my bottom line.
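A quick sketch of that straight-line depreciation, using the hypothetical robot figures above:

# Hypothetical figures from the example above
robot_cost = 250_000
useful_life_years = 10

def book_value(cost, life_years, years_elapsed):
    """Straight-line depreciation: remaining balance-sheet value."""
    remaining = cost - (cost / life_years) * years_elapsed
    return max(remaining, 0)

print(book_value(robot_cost, useful_life_years, 1))  # 225000.0 -- an asset
# The $5K spent upskilling the employee is expensed immediately:
# it reduces this year's profit and leaves nothing on the balance sheet.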

Large publicly traded companies are evaluated quarterly by Wall Street. The results reported often drive a short-term mindset, but they also keep key metrics front and center for these companies. In addition to the asset-to-debt ratio, one of these measures is revenue per employee. This simple metric, top-line revenue divided by the number of employees, offers a clear way to see the benefit of automation. If a company can simply hold its revenue steady while reducing its headcount, it looks better on paper than a company that grows revenue modestly with the same, but upskilled, workforce.
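To see how those optics play out, a sketch with made-up numbers:

# Made-up numbers to illustrate the revenue-per-employee optics
automator = {"revenue": 100_000_000, "employees": 800}    # cut 200 jobs, flat revenue
upskiller = {"revenue": 105_000_000, "employees": 1_000}  # kept and upskilled staff, +5% revenue

for name, co in [("Automator", automator), ("Upskiller", upskiller)]:
    rpe = co["revenue"] / co["employees"]
    print(f"{name}: revenue per employee = ${rpe:,.0f}")
# Automator: $125,000 per employee; Upskiller: $105,000 per employee.
# The automator "wins" the metric even though the upskiller grew revenue.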

So what if a company's investment in people could truly be treated as an asset? Invest $1K in an employee whose average tenure in that role is 3 years. Why isn't that a capital investment to be added to the balance sheet ($666 after the first year)? The switch is a matter of accounting policy, but what is more interesting to me is how companies' behavior might change. If employee upskilling were treated as a true capital investment, would L&D see more money, stricter reporting standards, and a more respected seat at the table?
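And the same straight-line logic applied to the hypothetical $1K training investment (a policy thought experiment, not current accounting treatment):

# Hypothetical: capitalize $1,000 of training over an expected 3-year tenure
training_cost = 1_000
expected_tenure_years = 3
annual_amortization = training_cost / expected_tenure_years  # about $333 per year

for year in range(1, expected_tenure_years + 1):
    remaining = training_cost - annual_amortization * year
    print(f"End of year {year}: expensed ${annual_amortization:,.2f}, "
          f"asset remaining ${remaining:,.2f}")
# After year one, roughly $666 stays on the balance sheet instead of
# a $1,000 hit to this year's bottom line.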

Some L&D Math and Some Questions

Disclaimer: I am a lover of data.

I had some time to play with some of the data in ATD's State of the Industry Report, and it raised some questions for me. In order to better understand the ATD data, I looked at the "implied" results that are not included in the report. Because ATD includes data points such as expenditure as a percentage of revenue and as a percentage of profit, I can simply reverse the calculation to see the trends for implied revenue and profit per employee. Since these are the ultimate measures of the success of learning, the trends should be positive, or at least correlated with the investment in learning that organizations are making.
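The reversal itself is simple division. A sketch with placeholder inputs (these are illustrative, not ATD's actual figures):

# Placeholder inputs -- not ATD's actual figures
spend_per_employee = 1_250          # learning expenditure per employee ($)
spend_pct_of_revenue = 0.004        # expenditure as a fraction of revenue (0.4%)
spend_pct_of_profit = 0.083         # expenditure as a fraction of profit (8.3%)

# Reverse the ratios to get the implied per-employee figures
implied_revenue_per_employee = spend_per_employee / spend_pct_of_revenue
implied_profit_per_employee = spend_per_employee / spend_pct_of_profit

print(f"Implied revenue per employee: ${implied_revenue_per_employee:,.0f}")
print(f"Implied profit per employee:  ${implied_profit_per_employee:,.0f}")
# $312,500 and $15,060 with these placeholder inputs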


The first thing that stood out was the delta between the implied revenue per employee (RPE), a common public market metric, and profit per employee in the ATD report versus the S&P 500 average. According to Yardeni, an economic advisory, the 2016 average RPE for the S&P 500 ranged from $321,000 to $1.7 million depending on industry, with a profit margin of approximately 10%. The revenue discrepancy for the Consolidated cohort is understandable given the smaller size of many of the companies reporting ATD data. The comparison to the BEST cohort is closer but still under the S&P averages.

The comparison to profit per employee was similarly off.


I then looked for a correlation between learning and an impact on revenue and/or profit in two ways.

First, I looked to see how the numbers compared year over year. This view showed that the increased percentage of investment in learning, touted in the ATD report as a positive reflection of business's opinion of learning, might be misplaced. The ATD report states, "Confirming organizations' commitment to learning, this indicator [% of profit] grew from 8.3 percent in 2015 to 8.4 percent in 2016; the ratio has climbed steadily for four years in a row."


While ATD seems to draw a positive connection, this may simply be a case of reported profit and revenue dropping, things that businesses care about. There appears to be no correlation: the resulting chart shows years where learning hours rose and the implied profit or revenue dropped. If there is a return to be captured from learning, the ATD numbers don't seem to reflect it. I did a similar look lagging the revenue and profit by a year, to let the impact of the learning spend sink in. Still nothing that showed a correlation, much less causation.
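For anyone who wants to run the same check, a sketch of the year-over-year and lagged test with invented placeholder series (substitute the implied ATD figures):

import numpy as np

# Invented placeholder series -- substitute the implied ATD figures
learning_hours = np.array([31.0, 31.5, 32.4, 33.5, 34.1])   # hours per employee
implied_rpe = np.array([310, 298, 305, 289, 301]) * 1_000    # implied revenue per employee

# Same-year correlation
same_year = np.corrcoef(learning_hours, implied_rpe)[0, 1]

# Lagged one year: does this year's learning predict next year's revenue?
lagged = np.corrcoef(learning_hours[:-1], implied_rpe[1:])[0, 1]

print(f"Same-year correlation: {same_year:+.2f}")
print(f"One-year lagged correlation: {lagged:+.2f}")
# Evidence of a return would need values consistently near +1, and even
# then correlation alone would not establish causation.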

As I stated in the post on benchmarks, be careful.

Benchmarks and the Danger of Data


We are always excited to read the annual installment of ATD's State of the Industry. Cited year-round by our clients, and the industry as a whole, this compendium of data is seen as an important touchstone for many L&D professionals. But while these numbers are used by so many to justify sought-after initiatives or validate current activities, benchmarks can be misleading.

The other note about benchmarks and data is that cherry-picking a single data point, or even a single source, can be misleading. While it can be comforting, it comes with a caution: some will point to the positivity of increased spend, while others will cite data from Bersin by Deloitte, the Corporate Executive Board, and others showing the lack of confidence in L&D, the amount of waste caused by scrap learning, or the negative net promoter score for L&D.

Achieving benchmarks is not the goal for today's learning organizations. While benchmarks are directional, every company is its own group of one. Your company's business strategy, market conditions, and human capital are unlikely to be identical to any other's. If you are spending lower or higher than benchmarks and delivering no value, you are overspending. The reverse also holds true. The true metric for learning professionals to watch is their contribution to the success of the businesses L&D serves, not spending levels.

Some additional thoughts for math lovers here.