Your Learning Dashboards: how to make sure they deliver actionable insights.
Data is a powerful weapon for Learning & Development organizations. While a lot of attention has been paid to collecting and even understanding it, communicating it to those inside and outside the L&D organization is just as critical. Brent Dykes, Sr. Director of Data Strategy at Domo, a cloud-based, self-service BI platform, highlights the challenges many dashboards face: “Rather than giving us clear insights and direction, they can overwhelm us with too much noise, obscure what’s important (signal), create confusion and raise needless questions.” In his Forbes article he addresses 10 keys to dashboards that deliver “actionable insights.”
- Target a specific audience.
- Involve end users in the design process.
- Provide adequate context.
- Describe how to interpret the numbers.
- Choose the right data charts.
- Anticipate the flow of questions.
- Streamline for easier consumption.
- Highlight what’s important.
- Recommend prescriptive actions.
- Review content periodically.
In Running Training Like a Startup I propose several dashboards for L&D to consider. What dashboards are you watching and sharing?
Every annual report talks about people being the company’s most important asset. Except it isn’t. A quick glance at the balance sheet reveals no line item for people. The facilities are there. The equipment is there. But nowhere do the people show up as an asset.
Over the last few months I have been doing a bit of research into the historic effects of automation on the workforce. Think elevator operators and bank tellers. More on that at a later date. [spoiler alert: the robots are indeed coming for some of our jobs but that is a good thing] One thing that came from my research is that companies are incented to automate on multiple fronts. It is on one of these fronts, accounting, that we may have a lever to incentivize upskilling.
A quick primer. When companies buy a robot, or any piece of equipment, they pay for it, but rather than simply taking cash out of their account, the purchase does something else. If the robot is estimated to have a working life of 10 years, the company places that “asset” on its balance sheet and depreciates it, reducing its value for every year of service. This asset sits opposite the debt the company carries, allowing it to borrow more. If I replace a human making $50K with a robot that costs $250K but is expected to last 10 years, after the first year I have an asset worth $225K on my balance sheet (the cost spread over the lifespan, less the first year’s depreciation). If I instead spend $5K to upskill the employee to perform at a higher level, equivalent to the robot, I have nothing but an expense that hits my bottom line.
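The accounting math above is straight-line depreciation, and it can be sketched in a few lines. This is an illustrative sketch using the post’s own hypothetical figures (a $250K robot, 10-year life, versus a $5K training spend that is expensed immediately):

```python
def book_value(cost: float, useful_life_years: int, years_elapsed: int) -> float:
    """Remaining balance-sheet value under straight-line depreciation."""
    annual_depreciation = cost / useful_life_years
    return max(cost - annual_depreciation * years_elapsed, 0.0)

# The $250K robot still carries $225K of balance-sheet value after year one...
robot_after_year_1 = book_value(250_000, 10, 1)

# ...while the $5K upskilling spend, expensed immediately, leaves no asset behind.
training_after_year_1 = 0.0

print(robot_after_year_1, training_after_year_1)
```

The asymmetry the post describes is exactly this: the robot’s cost lingers as an asset, while the training spend vanishes from the books the moment it is incurred.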
Large publicly traded companies are evaluated quarterly by Wall Street. The results reported often drive a short-term mindset, but they also keep key metrics front and center for these companies. In addition to the asset-to-debt ratio, one of these measures is revenue per employee. This simple metric, top-line revenue divided by the number of employees, offers a clear way to see the benefit of automation. If a company can simply hold its revenue steady while reducing its headcount, it looks better on paper than a company that might grow revenue modestly with the same, but upskilled, workforce.
So what if a company’s investment in people could truly be treated as an asset? Invest $1K in an employee whose average tenure in that role is 3 years. Why isn’t that a capital investment to be added to the balance sheet ($666)? The switch is a matter of accounting policy, but what is more interesting to me is how companies’ behavior might change. If employee upskilling were treated as a true capital investment, would L&D see more money, stricter reporting standards and a more respected seat at the table?
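The thought experiment above can be sketched the same way equipment is handled. Assuming the post’s $666 refers to the value remaining after the first year of a 3-year straight-line amortization (an interpretation, not stated explicitly in the post), the schedule looks like this:

```python
def amortization_schedule(cost: float, years: int) -> list[float]:
    """Remaining book value at the end of each year, straight-line."""
    annual = cost / years
    return [round(cost - annual * (y + 1), 2) for y in range(years)]

# A $1K upskilling spend amortized over a 3-year expected tenure
print(amortization_schedule(1_000, 3))  # [666.67, 333.33, 0.0]
```

After year one, roughly $666 would still sit on the balance sheet as an asset, consistent with the figure in the post, instead of disappearing as a one-time expense.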
Disclaimer: I am a lover of data.
I had some time to play with some of the data in ATD’s State of the Industry Report, and it raised some questions for me. In order to better understand the ATD data, I looked at the “implied” results that are not included in the report. Because ATD includes data such as spend as a percentage of revenue and as a percentage of profit, I can simply reverse the calculation to see what the trends are for both revenue and profit per employee. Since these are the ultimate measures of the success of learning, they should trend positive, or at least correlate with the investment in learning being made by organizations.
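The reverse calculation is simple algebra: if learning spend is $X per employee and also Y% of revenue, then revenue per employee is X / (Y/100). A minimal sketch, using hypothetical placeholder figures rather than ATD’s actual numbers:

```python
def implied_per_employee(spend_per_employee: float, spend_pct: float) -> float:
    """Reverse out revenue (or profit) per employee from two published ratios:
    spend per employee, and spend as a percentage of revenue (or profit)."""
    return spend_per_employee / (spend_pct / 100)

# e.g. $1,250 learning spend per employee at 1.0% of revenue
# implies $125,000 of revenue per employee
implied_rpe = implied_per_employee(1_250, 1.0)
print(implied_rpe)
```

The same function works for profit per employee by substituting the percentage-of-profit figure.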
The first thing that stood out was the delta between the implied revenue per employee (RPE), a common public-market metric, and profit per employee in the ATD report versus the S&P 500 average. According to Yardeni, an economic advisory, the 2016 average RPE for the S&P 500 ranged from $321,000 to $1.7 million depending on industry, with a profit margin of approximately 10%. The revenue discrepancy for the Consolidated cohort is understandable given the smaller size of many of the companies reporting ATD data. The comparison to the BEST cohort is closer, but still under the S&P averages.
The comparison to profit per employee was similarly off.
I then looked for a correlation between learning and an impact on revenue and/or profit in two ways. First, I looked to see how the numbers compared year over year. This view showed that the increased percentage of investment in learning, touted in the ATD report as a positive reflection of businesses’ opinion of learning, might be misplaced. The ATD report states, “Confirming organizations’ commitment to learning, this indicator [% of profit] grew from 8.3 percent in 2015 to 8.4 percent in 2016; the ratio has climbed steadily for four years in a row.”
While ATD seems to draw a positive connection, this may simply be a case of reported profit and revenue dropping, things that businesses care about. There appears to be no correlation: the resulting chart shows years where learning hours rose while the implied profit or revenue dropped. If there is a return to be captured from learning, the ATD numbers don’t seem to reflect it. I did a similar look lagging the revenue and profit by a year, to let the impact of the learning spend sink in. Still nothing that showed a correlation, much less causation.
As I stated in the post on benchmarks, be careful.
We are always excited to read the annual installment of ATD’s State of the Industry. Cited year-round by our clients and the industry as a whole, this compendium of data is seen as an important touchstone for many L&D professionals. But while these numbers are used by so many to justify sought-after initiatives or validate current activities, benchmarks can be misleading.
The other note about benchmarks and data is that cherry-picking a single data point, or even a single source, can be misleading. While benchmark data can be comforting, it comes with a caution. Some will point to the positivity of increased spend; others will cite data from Bersin by Deloitte, Corporate Executive Board and others showing the lack of confidence in L&D, the amount of waste caused by scrap learning, or the negative net promoter score for L&D.
Achieving benchmarks is not the goal for today’s learning organizations. While directional, every company is its own group of one. Your company’s business strategy, market conditions and human capital are unlikely to be identical to any other’s. If you are spending below benchmark but delivering no value, you are still overspending; spending above benchmark while delivering real value may be money well spent. The true metric for learning professionals to watch is their contribution to the success of the businesses L&D serves, not spending levels.
Some additional thoughts for math lovers here.