Important product metrics you might be missing

The Product Metrics You’re Obsessed With… But Shouldn’t Be

Product managers and startup founders often fall into a familiar trap: celebrating glowing metrics that signal success on the surface while hiding the deeper issues silently derailing their products. It’s easy to track the obvious numbers, like user sign-ups, time-on-app, or MRR growth, and feel confident that the product is heading in the right direction.

But here’s the thing: not all metrics are created equal. Some of the metrics you’re tracking may look great on dashboards but reveal little about the true health of your product. Worse, these vanity metrics might be masking the very problems that lead to churn, lost revenue, or outright product failure.

So, what’s the fix? It’s all about shifting focus to hidden metrics that go beyond surface-level numbers and uncover what’s truly driving user behaviour, retention, and product value.

Time to Value (TTV): The Hidden Driver of Retention

Imagine this scenario: users are signing up for your product in droves, yet a huge portion of them churn within the first week. What’s going wrong? The answer often lies in Time to Value (TTV)—the amount of time it takes for a new user to experience their first “Aha!” moment.


That “Aha!” moment is critical because it’s the point where users get your product’s value and feel invested enough to stick around. If your TTV is too long, users may drop off before they ever truly understand what makes your product great.
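
If you want to put a number on this, TTV falls straight out of a basic event log. Here’s a minimal sketch in Python with pandas, assuming an events table with user_id, event, and timestamp columns, where "signed_up" and "aha_moment" are hypothetical stand-ins for your own sign-up and first-value events:

```python
# Minimal sketch: Time to Value per user, from a simple event log.
# Assumes an `events` DataFrame with user_id, event, timestamp columns;
# "signed_up" and "aha_moment" are hypothetical event names.
import pandas as pd

def time_to_value(events: pd.DataFrame) -> pd.Series:
    signup = (events[events["event"] == "signed_up"]
              .groupby("user_id")["timestamp"].min())
    first_value = (events[events["event"] == "aha_moment"]
                   .groupby("user_id")["timestamp"].min())
    # TTV only exists for users who actually reached the value moment;
    # everyone else is a drop-off worth investigating separately.
    return (first_value - signup).dropna()

# The median is usually more honest than the mean here, since a handful
# of slow users can drag the average way out.
# median_ttv = time_to_value(events).median()
```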


Case Study: Slack’s Focus on Message Volume  

Slack, the workplace communication tool, is often touted as one of the fastest-growing SaaS products of all time. While much of this success is attributed to product-market fit and viral adoption, a big part of Slack’s growth came from understanding TTV.

Slack discovered that teams who sent 2,000+ messages within the first month were significantly more likely to stick with the product long-term. They homed in on this insight and optimized their onboarding flow to push users toward collaboration faster. Instead of fixating on new sign-ups or app downloads, Slack’s product team focused on the metric that mattered most: how quickly teams hit that 2,000-message threshold.
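
The same idea works for activation thresholds like Slack’s. A rough sketch, assuming a hypothetical messages table with team_id, signup_ts, and sent_ts columns:

```python
# Sketch: time for each team to reach an activation threshold.
# Assumes a `messages` DataFrame with team_id, signup_ts, sent_ts columns;
# the 2,000-message cut-off is just the example threshold from above.
import pandas as pd

def time_to_threshold(messages: pd.DataFrame, n: int = 2000) -> pd.Series:
    # Timestamp of each team's n-th message; teams that never get there
    # simply drop out of the result.
    nth_message = (messages.sort_values("sent_ts")
                   .groupby("team_id")["sent_ts"]
                   .apply(lambda ts: ts.iloc[n - 1] if len(ts) >= n else pd.NaT))
    signup = messages.groupby("team_id")["signup_ts"].min()
    return (nth_message - signup).dropna()
```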


By accelerating TTV, Slack made sure more teams reached the milestone that made the product indispensable. This laser focus helped drive Slack’s explosive growth and cut churn, keeping users engaged for the long haul.

Engagement Consistency: Beyond DAUs and MAUs

Daily Active Users (DAUs) and Monthly Active Users (MAUs) are commonly cited metrics for measuring product success. After all, they show how many people are interacting with your product on a given day or month. But here’s the problem: these metrics often give a false sense of security.


A user might log in, poke around for a bit, and then leave, never to return. Your DAU and MAU counts might look fine, but how many of those users are consistently engaging with your core features? If users only touch your product once a month, or never use the main value-driving features, they’re at high risk of churning.
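
One way to get at this is to measure how consistently each user touches your core features, rather than whether they show up at all. A rough sketch, assuming the same kind of event log as above and a hypothetical set of core, value-driving event names:

```python
# Sketch: engagement consistency on core features, not just raw logins.
# Assumes an `events` DataFrame with user_id, event, timestamp columns;
# CORE_EVENTS is a hypothetical set of the actions that deliver real value.
import pandas as pd

CORE_EVENTS = {"created_report", "shared_dashboard"}

def core_consistency(events: pd.DataFrame, weeks: int = 8) -> pd.Series:
    core = events[events["event"].isin(CORE_EVENTS)].copy()
    core["week"] = core["timestamp"].dt.to_period("W")
    recent = sorted(core["week"].unique())[-weeks:]
    core = core[core["week"].isin(recent)]
    # For each user: in what share of the last `weeks` weeks did they take
    # at least one core action? 1.0 means every single week.
    return core.groupby("user_id")["week"].nunique() / weeks
```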

Case Study: Netflix and Watchlist Additions  

Netflix provides an excellent example of using engagement consistency as a guiding metric. Instead of focusing solely on daily logins, Netflix tracks how often users are adding content to their watchlists. Why? Because adding content to a watchlist signals intent—users are planning to come back to engage with that content later, showing deeper engagement with the platform.

By focusing on this specific engagement behaviour, Netflix ensures that users don’t just log in and browse but interact with the platform in a way that keeps them coming back. This is a far more reliable metric for long-term retention than DAUs alone.

For your own product, think beyond DAUs. Are users repeatedly coming back to your core features? If not, why not? Focus on driving consistency with the features that deliver the most value.

Feature Stickiness: Ensuring Long-Term Engagement

Launching new features is exciting, and it’s easy to be swayed by the initial spike in usage. But how do you know if that usage is sustainable? The answer lies in feature stickiness, or how often users return to a feature after its initial launch.

Many product teams celebrate new feature launches by tracking the number of users who try it out in the first week or month. But what happens after the hype dies down? Are users still finding value in that feature, or was it just a novelty?
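
One simple way to quantify this is to ask, for everyone who tried the feature, how many are still using it one, two, or ten weeks later. A minimal sketch, again assuming an event log with user_id, event, and timestamp, where "story_posted" is a hypothetical event name for the new feature:

```python
# Sketch: of the users who tried a new feature, how many keep coming back?
# Assumes `events` has user_id, event, timestamp columns; "story_posted" is a
# hypothetical event name standing in for whatever feature you just shipped.
import pandas as pd

def feature_stickiness(events: pd.DataFrame, feature_event: str) -> pd.Series:
    used = events[events["event"] == feature_event].copy()
    first_use = used.groupby("user_id")["timestamp"].transform("min")
    used["weeks_since_first_use"] = (used["timestamp"] - first_use).dt.days // 7
    adopters = used["user_id"].nunique()
    # Share of adopters still using the feature 1, 2, 3... weeks after they
    # first tried it. A novelty feature decays toward zero fast.
    returning = used.groupby("weeks_since_first_use")["user_id"].nunique()
    return returning / adopters

# stickiness = feature_stickiness(events, "story_posted")
```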


Case Study: Instagram Stories’ Long-Term Success  

When Instagram launched its Stories feature, there was a lot of excitement around it. In the first few weeks, millions of users tried it out. But Instagram’s product team didn’t just stop at measuring initial usage; they were interested in the long-term recurrence rate—how many users were posting Stories consistently day after day.


What they found was key to their continued success: Stories weren’t just a flash-in-the-pan feature. They had long-term stickiness, meaning users returned to the feature repeatedly. This sustained engagement signalled that Stories had become a core part of Instagram’s value proposition, contributing to the platform’s long-term growth.


To apply this to your own product, think about how to measure stickiness for each new feature. It’s not enough to get people to try it once; you need them to keep coming back. That’s the real measure of success.

Retention by Cohort: Diving into User Segments

Retention rates are often looked at in aggregate, but this can hide important differences between user segments. To get a clearer picture of how your product is performing, it’s essential to measure retention by cohort—grouping users by when they signed up or by specific actions they’ve taken.

Looking at cohorts allows you to see if users who joined during a specific period or took a particular action are behaving differently from others. Are certain cohorts retaining better than others? If so, why? By identifying the patterns, you can refine your product or marketing strategies to improve retention across the board.
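
A basic cohort retention table is straightforward to build. Here’s a sketch, assuming a users table with user_id and signup_ts and an events table recording any product activity with user_id and timestamp:

```python
# Sketch: a monthly cohort retention table.
# Assumes a `users` DataFrame with user_id and signup_ts, and an `events`
# DataFrame with user_id and timestamp for any product activity.
import pandas as pd

def cohort_retention(users: pd.DataFrame, events: pd.DataFrame) -> pd.DataFrame:
    df = events.merge(users, on="user_id")
    df["cohort"] = df["signup_ts"].dt.to_period("M")
    df["months_in"] = ((df["timestamp"].dt.year - df["signup_ts"].dt.year) * 12
                       + (df["timestamp"].dt.month - df["signup_ts"].dt.month))
    active = df.groupby(["cohort", "months_in"])["user_id"].nunique()
    cohort_size = users.groupby(users["signup_ts"].dt.to_period("M"))["user_id"].nunique()
    # Rows: signup cohort. Columns: months since signup. Values: share still active.
    return active.unstack("months_in").div(cohort_size, axis=0)
```

Reading across a row shows how a single cohort decays over time; reading down a column shows whether newer cohorts are retaining better than older ones.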

Case Study: Dropbox’s Early User Retention  

Dropbox, the cloud storage giant, used cohort analysis to identify a key driver of retention: users who completed the onboarding process and saved a file within the first two days had significantly higher retention rates than those who didn’t. Armed with this knowledge, Dropbox focused on making its onboarding process as seamless as possible, pushing new users toward saving their first file immediately.

By analysing cohorts, Dropbox was able to identify which actions led to long-term engagement and then optimize the user experience to drive those behaviours. This focus on early user actions helped Dropbox improve retention and scale rapidly.
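
Cohorts don’t have to be time-based, either. You can group users by behaviour instead, which is essentially what the Dropbox example describes. A sketch, with "file_saved" as a hypothetical stand-in for your own key early action:

```python
# Sketch: action-based cohorts, in the spirit of the Dropbox example.
# Assumes `users` has user_id and signup_ts, `events` has user_id, event,
# timestamp; "file_saved" is a hypothetical name for the key early action.
import pandas as pd

def retention_by_early_action(users: pd.DataFrame, events: pd.DataFrame,
                              action: str = "file_saved",
                              window_days: int = 2,
                              horizon_days: int = 30) -> pd.Series:
    df = events.merge(users, on="user_id")
    days_in = (df["timestamp"] - df["signup_ts"]).dt.days

    # Which users performed the key action within the first `window_days`?
    early = set(df.loc[(df["event"] == action) & (days_in <= window_days), "user_id"])

    # "Retained" here just means: seen again after `horizon_days`.
    retained = set(df.loc[days_in >= horizon_days, "user_id"])

    cohorts = users.assign(
        did_action_early=users["user_id"].isin(early),
        retained=users["user_id"].isin(retained),
    )
    # Compare retention between the two behavioural cohorts.
    return cohorts.groupby("did_action_early")["retained"].mean()
```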

NPS Decay: The Hidden Danger of Satisfaction Scores

Net Promoter Score (NPS) is a widely used metric for measuring customer satisfaction. But here’s the hidden danger: NPS decay. While your NPS may be high today, it could be slowly declining without you noticing. Tracking NPS over time can give you an early warning sign of product dissatisfaction before it turns into a bigger problem.

Many companies treat NPS as a snapshot, checking the latest score and moving on. By tracking NPS decay instead, that is, how your score trends across successive surveys, you can spot emerging issues and address them before they hit retention.
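
In practice that just means computing NPS per survey period and watching the trend rather than the latest number. A minimal sketch, assuming a responses table with a 0-10 score and a responded_at timestamp:

```python
# Sketch: NPS per month, so decay shows up instead of hiding in a snapshot.
# Assumes a `responses` DataFrame with a 0-10 "score" column and a
# "responded_at" timestamp.
import pandas as pd

def nps(scores: pd.Series) -> float:
    promoters = (scores >= 9).mean()
    detractors = (scores <= 6).mean()
    return 100 * (promoters - detractors)

def nps_over_time(responses: pd.DataFrame) -> pd.Series:
    monthly = responses.groupby(responses["responded_at"].dt.to_period("M"))["score"]
    return monthly.apply(nps)

# A crude decay check: average month-over-month change in the last half year.
# trend = nps_over_time(responses).diff().tail(6).mean()
```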


Track What Matters, Not What Looks Good

Chasing the wrong metrics can lead your product down a dangerous path, masking deeper issues that could threaten its success. To build a truly healthy, scalable product, you need to focus on the hidden metrics that go beyond vanity numbers and tell the real story.

By tracking Time to Value, Engagement Consistency, Feature Stickiness, Retention by Cohort, and NPS Decay, you can gain insights that will help you optimize your product, improve retention, and drive long-term growth. Don’t be blinded by flashy numbers—dig deeper and find the metrics that truly matter.

So, before you open that dashboard today, ask yourself: Are these metrics showing me the real story, or just a version of it?