Why You Shouldn’t Manage Exclusively to Activity Metrics

A cold shiver went up my spine when I reviewed the call notes that an account manager had taken during a recent client onboarding meeting. Buried deep in the notes was a quote from a well-respected head of sales operations at a large global software company.

“At the start of last year, we instituted a strict policy of managing to clearly defined activity metrics. As a result, we experienced a 15% sales increase.”

In other words: I took an action. The team had success. Therefore, that specific action caused the success.

As a proud data geek, I find this thinking horrifying because it implies that someone who should know better doesn't understand the difference between "correlation" and "causation." (For a quick primer on the distinction, @KirkDBorne recently tweeted a brilliantly simple explanation.)

More importantly, I have had a front row seat to this same movie more times than I can count. It always has a sad ending.

Misinterpretation of Data

It's not uncommon for business leaders to point to their innovations as the drivers for team success. A little self-promotion is needed in some organizations to navigate around sticky political situations or climb the corporate ladder. In this case, however, the SVP of Sales had hired our company to help his team achieve greater success through the application of advanced analytics. This was not self-promotion. He believed his story and encouraged us to believe that this change was the only factor in the year-over-year improvement. Nothing else had changed, he claimed.

Further, he pointed to an incredibly powerful statistic – his own data analysis showed that tracked activities (phone calls, opened opportunities, meetings, emails) did increase substantially in aggregate. Fifteen percent, to be precise – exactly matching the increase in sales.

This statistic is powerful, but ultimately misleading. It took a data scientist on our team less than an hour to poke holes in the sales leader’s theory that the company’s new management style caused the increased sales levels. We have seen many companies fall into this trap of over-reliance on activity metrics before, so we knew how to test the hypothesis that the 15% increase in activities directly led to the 15% uptick in sales.

Testing the Hypothesis

As a first step, we divided the sales reps into quartiles based on gross sales. We then divided the same reps into quartiles based on win rates. Not surprisingly, most reps landed in the same quartile in both analyses: reps with the highest gross sales tended to be the reps with the highest win rates.

When we looked at activities and results in these more granular buckets, things quickly became interesting. There was only a negligible increase in activities for the top two quartiles (less than 2%). The bulk of the activity increase came from the bottom half of sales performers, with the bottom quartile posting a whopping 34% increase in activity levels. The actual results (gross sales and win rates), however, were flipped upside down: the top half of performers, whose activity levels barely changed, saw the greatest increase in win rates and gross sales. Clearly, something other than the new activity-based management approach was driving the sales increase.
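The quartile check described above can be sketched in a few lines of pandas. Everything below is synthetic and illustrative (the column names, the distributions, and the latent "skill" variable are my assumptions, not the client's actual dataset), but it shows the mechanics: bucket reps into quartiles on each outcome measure, check how often the buckets agree, then break the activity change out by quartile.

```python
import numpy as np
import pandas as pd

# Synthetic rep-level data (illustrative only; not the real dataset).
rng = np.random.default_rng(0)
n = 200
skill = rng.normal(0, 1, n)  # latent "rep quality" driving both outcomes
reps = pd.DataFrame({
    "gross_sales": 50_000 * np.exp(0.6 * skill + rng.normal(0, 0.3, n)),
    "win_rate": 1 / (1 + np.exp(-(skill + rng.normal(0, 0.5, n)))),
    # Assume bottom performers padded activity the most, per the finding above.
    "activity_change_pct": 15 - 8 * skill + rng.normal(0, 4, n),
})

# Step 1: bucket reps into quartiles on each outcome measure.
labels = ["bottom", "Q3", "Q2", "top"]
reps["sales_q"] = pd.qcut(reps["gross_sales"], 4, labels=labels)
reps["win_q"] = pd.qcut(reps["win_rate"], 4, labels=labels)

# Step 2: how often does a rep land in the same quartile on both measures?
same_bucket = (reps["sales_q"] == reps["win_q"]).mean()
print(f"Same quartile on both measures: {same_bucket:.0%}")

# Step 3: average activity increase by sales quartile -- the telling split.
print(reps.groupby("sales_q", observed=True)["activity_change_pct"].mean().round(1))
```

With real CRM data, the same few lines of bucketing and grouping are often enough to reveal whether an aggregate lift is spread evenly across the team or concentrated in one performance tier.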


In this fairly standard case, we see evidence that the bottom half of the sales organization was pumping the pipeline full of questionable opportunities, and logging more calls and emails against them, in order to avoid negative consequences. These reps did as instructed, but there was no corresponding focus on the quality of the opportunity or the interaction. Managers who focus on the activity tally for their reps, because they themselves are being evaluated on their implementation of the strategy, are less likely to focus on the viability of the opportunity or the quality of the interaction. This leader made a common mistake in interpreting data. More importantly, when companies center their sales strategies on activity metrics, they are bound to be disappointed. Here’s why:

  • We expect to see lower-performing reps entering more unqualified opportunities and closing at lower win rates. More time is wasted fudging bad data; less time is focused on expanding skill sets.
  • Higher-performing sales reps tend to have a laser focus on achievement: hitting their sales targets, getting the next sale, achieving the next rung on the commission ladder. They understand how to sell, and loathe being micromanaged. Focusing on activities fixes a problem that doesn’t exist.
  • Even the best rep has weaknesses. If sales management has made the investment in coaching reps, that effort should include identifying and coaching to those weaknesses. These weaknesses might be visible only through some combination of other predictive variables: a certain type of account, size of the prospect, deal size, type of meeting, funnel stage, deals in which a certain competitor is involved, industry of the prospect, or product(s) being sold. The data might show that a rep's aggregate sales activities are high, but that the frequency, kind, or quality of those interactions is what predicts the outcome. Some combination of these individual weaknesses may be a blind spot for both the rep and the manager. Coaching to activity metrics typically means coaching to the average rep instead of coaching to the individual.
  • Coaching to average metrics takes significant management focus and prevents a search for all the other predictors of success to which managers can coach and train.

None of this should suggest that we don’t value activity metrics as an important part of the sales process. However, activities, like anything else, need to be considered in the context of all the other sales data when implementing sales strategies and interpreting results.

About the Author: Jim Dries is the Chief Executive Officer at piLYTIX.
