

A Data-driven Approach to Linking Training to Customer Success

Written by Bill Cushard (@BillCush)

Published on March 11, 2015



Training professionals design training programs with the intent of making an impact on the lives of students and on the organizations those students work in, but we do not always have the data in place to measure that impact. Yes, training professionals have models for evaluating training, but we are not known for being data-driven and analytical. In fact, most of the evidence we have about whether our training programs are effective is anecdotal and comes primarily in the form of student satisfaction surveys. Given the large amount of time, resources, and money it takes to develop good training programs, organizations are increasingly demanding evidence that training is actually helping customers achieve outcomes.

So how can you do this?

What should training professionals be thinking about in order to measure that impact?

To Be Data-Driven, Ask These Three Questions

Maria Manning-Chapman, senior director of research for the education services focus area at the Technology Services Industry Association (TSIA), suggests three questions that all training leaders of SaaS/cloud enterprise software companies should start asking if they want to take a data-driven approach to using training to improve customer success:

Question 1: Is the customer using the product at all?

This is basic usage data. If one of the goals of training is to help people use the product, we should collect usage data.

Question 2: Could customers use the product more? Could customers use more of the product?

The second question has two parts that look similar but are quite different. The first part is a question of frequency. Do people use the product every day? Once a week? If a software product is designed to be used as a primary work tool, but people only use it once a month, there might be a problem. Do people who completed training use the product more often than people who did not? That is an important question to answer.

The second part of the question is a matter of depth and breadth of use. Are people using just one feature and nothing else? Are others using more features? Do people who completed training use more of the product than people who did not attend training?
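Once usage events are instrumented, comparing the two cohorts is straightforward. The sketch below is a minimal illustration, assuming hypothetical per-user usage records (session frequency, distinct features used, and a training-completion flag); the field names and numbers are invented for the example.

```python
from statistics import mean

# Hypothetical per-user usage records: sessions per week,
# distinct features used, and whether the user completed training.
users = [
    {"id": "u1", "trained": True,  "sessions_per_week": 5, "features_used": 7},
    {"id": "u2", "trained": True,  "sessions_per_week": 4, "features_used": 6},
    {"id": "u3", "trained": False, "sessions_per_week": 1, "features_used": 2},
    {"id": "u4", "trained": False, "sessions_per_week": 2, "features_used": 3},
]

def cohort_averages(users, trained):
    """Average frequency and breadth of use for one cohort."""
    cohort = [u for u in users if u["trained"] == trained]
    return {
        "avg_sessions_per_week": mean(u["sessions_per_week"] for u in cohort),
        "avg_features_used": mean(u["features_used"] for u in cohort),
    }

trained_stats = cohort_averages(users, trained=True)
untrained_stats = cohort_averages(users, trained=False)
```

Comparing `trained_stats` against `untrained_stats` answers both halves of Question 2 at once: frequency (sessions per week) and breadth (distinct features used).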

Question 3: Could customers use the product better?

This question is about the effectiveness of use. If your software is having an impact on business outcomes, you know people are using it the way you always imagined they would. This data will be more difficult to collect: it requires talking to your customers and working with them to determine what outcomes are being achieved. Once you do this, you can run studies to determine whether people who completed training use your product "better" than people who did not take training.


Metrics to Consider

Answers to the questions above will lead you to specific outcome metrics that you can track. Tom Krackeler, Co-Founder and CEO of Frontleaf, suggests a few specific metrics to track:

  • Increased usage of a particular feature set
  • Reduced time for a customer to reach a milestone
  • An increased percentage of users above a certain usage threshold
  • A specific measurement for every reason the customer bought your product in the first place
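Two of the metrics above, time-to-milestone and the share of users above a usage threshold, can be computed directly from basic usage data. The following is a minimal sketch assuming hypothetical customer records (signup date, the date a key milestone was reached, and weekly active sessions); the record layout and values are invented for illustration.

```python
from datetime import date

# Hypothetical customer records: signup date, the date a key
# milestone was reached, and weekly active sessions.
customers = [
    {"signup": date(2015, 1, 5),  "milestone": date(2015, 1, 19), "weekly_sessions": 6},
    {"signup": date(2015, 1, 12), "milestone": date(2015, 2, 2),  "weekly_sessions": 2},
    {"signup": date(2015, 2, 2),  "milestone": date(2015, 2, 9),  "weekly_sessions": 8},
]

def avg_days_to_milestone(customers):
    """Average time (in days) from signup to reaching the milestone."""
    return sum((c["milestone"] - c["signup"]).days for c in customers) / len(customers)

def pct_above_threshold(customers, threshold):
    """Percentage of users at or above a weekly-session threshold."""
    above = sum(1 for c in customers if c["weekly_sessions"] >= threshold)
    return 100.0 * above / len(customers)
```

Tracking these numbers separately for customers who completed training and those who did not turns the metrics into evidence of training's impact.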

The training leader who can demonstrate the impact training has on customer outcomes will be able to help their organizations grow by giving customers even more reasons to purchase a product and purchase training. It all starts with asking the right questions and identifying the key outcome metrics your customers care about most.

Call for Comments

  1. How do you measure the impact of your training programs on customer outcomes?
  2. What data do you collect?
  3. How do you analyze the data?
  4. Have you been able to show a customer (with data) how training can improve key usages or outcome metrics?

