Product Management KPIs

This article originally appeared on Medium on September 18, 2020.

The Basis of Product KPIs: Quality, Quantity and Future Proofing

KPIs, KPIs, KPIs: there’s been a lot of talk about KPIs lately. To be honest, people have been talking about KPIs for a long time, especially around marketing, sales, and some R&D (though the latter mostly talk about KPIs without measuring them well enough, IMO, but that’s for another article). The difference today, I think, is that now everyone is speaking about them. I guess COVID-19 put the importance of KPIs into perspective: people are working remotely and it’s harder to understand what is being done. So now, whether you measure the individual, the team, or the company, how well you are doing should be measured in objective KPIs.

I mentioned sales and marketing above, so let me ask: why have sales and marketing been doing it so well? I’m assuming money was involved! In marketing, the costs for media and ads are very high. It’s mostly digital these days and somewhat easier to measure, partly because it’s a more mature market for measurement, and partly because the things being measured are the tools used for conversions, not only people. In sales, it’s not much different; salespeople are compensated for their sales, with cliffs, quotas, and bonus targets. Sales departments have always measured what they do and how well they do it.

Success, Product, and R&D are a step behind, while other departments (HR and Finance) are even further behind. There are some tools that try to help, but personally I haven’t found anything mature enough (though for Success, Totango and Gainsight are doing a decent job, I think). For Product, zilch.

I am not writing a tool, yet :), but I did want to go over the way I think the product team should be measured. In this post I will cover the three bases of measurement I think should be used in product management to create proper KPIs: quality, quantity, and future proofing.

I hope this, in turn, creates some meaningful discussions at some point somewhere.

The basics of KPIs

In my opinion, KPIs, no matter the department, should always cover three aspects: quality, quantity (speed), and building for the future, which covers both the team and the tools the team uses, as well as thought leadership and personal growth.

When diving into a specific department (R&D, Product, Success, etc.), different fields (B2B, B2C), and so on, the details (how we measure quality, for example) change, but the concept doesn’t. Once these three aspects are in place, you can then decide, based on your current needs, where the emphasis lies. Are you driving quality over quantity? Do you need, for the next two quarters, to speed things up? And so on. I tend to put:

  • Quality at 50%
  • Quantity at 30%
  • Growth at 20%

Surprisingly, I found that people are drawn to speed, and sometimes I lower quality to 20% and increase quantity to 60% to compensate for that.
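The split above is just a weighted average. As a minimal sketch (assuming each aspect has already been normalized to a 0–100 score, which the article doesn’t prescribe), combining them looks like this; the weights and example scores are illustrative, not a recommendation:

```python
# Default split from the article: quality 50%, quantity 30%, growth 20%.
WEIGHTS = {"quality": 0.5, "quantity": 0.3, "growth": 0.2}

def weighted_kpi_score(scores, weights=WEIGHTS):
    """Combine per-aspect scores (each 0-100) into one weighted score."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[aspect] * scores[aspect] for aspect in weights)

# Illustrative example: strong quality, decent quantity, modest growth work.
score = weighted_kpi_score({"quality": 90, "quantity": 70, "growth": 50})
print(score)  # 76.0
```

Shifting emphasis, as described above, is then just swapping in a different weights dict (e.g., quality 0.2, quantity 0.6, growth 0.2).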

Another important tip for KPIs: just like marketing and sales, make sure KPIs are understood, visible, and accessible to the people being measured (ideally digitally, not via a quarterly review). This is how you make sure the KPIs are not only used to measure, but also incentivize people to do the “right” thing. I’m putting “right” in quotes because obviously it’s what’s right for you and what you are aiming for.

Quality and quantity are pretty straightforward, I think, but building for the future is not, so I will touch on it quickly. Like I wrote before, the weight placed on each aspect is internal to the person, team, and company. Building for the future is two-fold. Internally, it’s about communicating product values, educating the team and the company, and self-education (a.k.a. growth). Externally, it’s not about marketing the product, but rather employer branding and strengthening your team’s position in the community via thought leadership, education, presentations, etc.

Another aspect of building for the future is building the tools of the trade: working on the roadmap; improving the tools and infrastructure; planning recruitment; onboarding future team members; and so on. All of these are things people already do (I hope), and once it’s a KPI, it’s taken more seriously and not as a “by the way”. It also means that management gives it the proper time and values the work it represents. If you do NOT want this to have value, simply give it a 0% weight, as I wrote above.

One final note: there are leading indicators, KPIs that tell us we are on the right track (e.g., users are landing on our page!), though success in them doesn’t mean we will succeed (those users may not register!). And there are lagging indicators, the ones that actually mean we are successful: users registered and are paying us.

It’s important to distinguish between them, measure both, and put the right emphasis on what you want to accomplish. I see many people who don’t understand the difference; I see companies setting KPIs only on lagging indicators. And I see companies that don’t understand how to play with both types (for example, when doing research, or starting a new team or a new field, leading indicators are often all you’ve got for a quarter or two).
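The landing/registering/paying example above can be sketched as a toy funnel. The event names and thresholds here are illustrative assumptions, not the article’s numbers:

```python
# Toy signup funnel distinguishing leading from lagging indicators.
funnel = {
    "landed": 10_000,    # leading: visitors reached the page
    "registered": 400,   # lagging: an actual success signal
    "paying": 80,        # lagging: the outcome we ultimately want
}

# A leading indicator tells us we are on the right track...
landing_rate_ok = funnel["landed"] > 5_000

# ...but only the lagging indicators confirm we actually succeeded.
registration_rate = funnel["registered"] / funnel["landed"]
paying_rate = funnel["paying"] / funnel["registered"]

print(landing_rate_ok, registration_rate, paying_rate)
```

A new team might only have `landed` to report for a quarter or two, which is exactly why both types need to be measured and weighted deliberately.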

Product KPIs

Below are the product team KPIs that I measure, what they mean, and why they are important. These are not all the KPIs; they are the ones I felt were worth discussing, that explain the line of thought, or that are unorthodox:

Quality

Like I wrote above, I give quality, ideally in a well-trained team, 50% of the emphasis. I always make sure people know that quality is first, but quantity is a close second. There is sometimes a discussion on why quantity matters at all: doesn’t it make sense to only measure quality, since product people who make a noticeable difference and move the needle are the ones you want to encourage? Some people go as far as measuring product people by revenues. While I agree with some of what Tyler writes there, I try to encourage product people not to think about revenues at all, but about usability and product-market fit. Salespeople will know how to translate that into revenues, but a long-lasting company has to chase product-market fit, and revenues can be misleading in the short term.

Quality of a product team is super hard to measure. Below are some of the things I think should be taken into account:

Number of customers who use a feature — a lagging indicator of a feature’s usage. This obviously should not take into account, or should only partially take into account, features that are mandatory, used by all customers, or even unknown to customers (using HTTPS for security, for example). However, if most of your customers choose to use a feature, it has value.

Feature usage data — lagging indicator on time spent, repeated usage, and other indicators of value gained from the feature.

Number of questions from R&D on features — leading indicator. While I strongly believe in encouraging discussion, questions and unclear feature requests are noise to R&D and product. If there are blocking questions, or a lot of back and forth on a feature, it was not written correctly. Be careful to measure this across all features and teams so a single developer-PM relationship is not hurt.

Back and forth with Success — leading indicator. This is a shared KPI with the Success team, or with anyone originating feature requests. Sometimes the requests are not clear, and when that happens, the faster the PM gets to a clear answer and need, the better it is for everyone involved.

Noise to Product, Noise to R&D — lagging indicators. Features prepared and finished from a product perspective, sent to R&D, and not picked up by R&D for another quarter or two (the time varies with the speed of the company). In an agile environment this means time and attention were wasted, first inside product and then with R&D when the request was verified, but for whatever reason the work was not done, i.e., it was not needed at the time. The two indicators (noise to product and noise to R&D) differ by how long we wait before deciding it was noise.
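This noise KPI boils down to a time cutoff over a feature backlog. A minimal sketch, assuming hypothetical field names and a two-quarter cutoff (the article leaves the exact window up to the company’s speed):

```python
from datetime import date, timedelta

# Assumed cutoff: roughly two quarters before a ready feature counts as noise.
TWO_QUARTERS = timedelta(days=182)

# Illustrative backlog; names and dates are made up for the example.
features = [
    {"name": "sso-login",  "ready_since": date(2020, 1, 10), "started_by_rnd": True},
    {"name": "csv-export", "ready_since": date(2020, 2, 1),  "started_by_rnd": False},
    {"name": "dark-mode",  "ready_since": date(2020, 8, 15), "started_by_rnd": False},
]

def noise_to_rnd(features, today, cutoff=TWO_QUARTERS):
    """Features spec'd by product but untouched by R&D past the cutoff."""
    return [
        f["name"] for f in features
        if not f["started_by_rnd"] and today - f["ready_since"] > cutoff
    ]

print(noise_to_rnd(features, today=date(2020, 9, 18)))  # ['csv-export']
```

Noise to Product would be the same calculation with a shorter (or longer) cutoff, which is exactly how the two indicators differ.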

NPS of users and customers — lagging indicator.

Quantity

Number of features ready per week — leading indicator. Once quality is established, and hopefully at a high score, we measure quantity in its simplest form: how much content is created for R&D to work on.

Response time for discussions on features — leading indicator. Whether the discussion comes from customers, internal or external, or from R&D, the speed of answers is critical to reducing context switching.

Build For The Future

Team Building:

Preparing and maintaining the hiring process — leading indicator. The hiring process is critical; how well are we doing it? Are we, as a team, synced on our purpose and deliverables? Is this in sync with the company’s goals and needs? Is the process fun for the candidate? Is it fast enough? All of these things are part of the hiring process.

Producing external content (blog posts, podcasts, and so on) — leading indicator. This brands the team and the company, and improves the ability to write and present ideas to external audiences.

Building KPIs, including measurable KPIs and the way to measure them — leading indicator. This covers all of the work described above.

Product Content Building:

  • Strategy (content 2–5 years ahead) — leading indicator
  • Roadmap (this year’s content) — leading indicator

Again, these product KPIs can be weighted differently per your company’s agendas and goals. But each has its place in ensuring that what we do is measured, accounted for, and respected for the value it brings.

Lior Sion
