If I could point to one turning point that made Codecademy’s revenue take off, it was the introduction of a metrics review process to the product team.
It dramatically transformed our understanding of how our metrics worked and how to improve them.
Metrics reviews are essentially the process of trying to draw the connection between the work that you did and how that impacted your key metrics.
This was hands down the most effective thing we did as product managers. Once you understand the exact relationship between your work and how your company makes money, you can never work any other way.
Before we did Metrics Review, we tried to increase conversion to paid by:
After we did Metrics Review, we started working on:
You get the idea. Understanding what drives a number lets you improve it far more reliably, and you feel very dumb for not having done it sooner.
The most common objection that I hear from companies is that “we know our numbers”.
However, when you drill into what they mean by this, it means that the founders/CEO occasionally look at top-level numbers like MRR and sign-ups.
The problem with this is that most subscription products have many, many more moving parts than people think about and there can be a long delay between improving the product and seeing big revenue growth.
The biggest danger is that you misinterpret your situation based solely on your top-line revenue and take the wrong actions in your product and business.
Your recurring revenue is influenced by multiple factors that are hard to intuit:
And the list goes on…
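To make this concrete, here is a minimal sketch (with entirely made-up numbers) of the standard MRR decomposition. The point it illustrates: two periods can show an identical top-line change while the underlying components describe very different businesses, which is exactly why looking only at MRR misleads you.

```python
def mrr_change(new, expansion, contraction, churn):
    """Net MRR movement for a period, split into its standard components:
    new business + expansion - contraction - churned revenue."""
    return new + expansion - contraction - churn

# Two hypothetical months with identical net growth but very different health:
month_a = mrr_change(new=10_000, expansion=2_000, contraction=1_000, churn=3_000)
month_b = mrr_change(new=18_000, expansion=500, contraction=2_500, churn=8_000)

print(month_a, month_b)  # both net +8000, but month_b is churning heavily
```

A top-line dashboard would report these two months as the same result; a metrics review that breaks out the components would immediately flag the churn problem in the second.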
The ultimate value of a good metrics review process is that it forces your team to understand the relationship between the work that they ship and how their numbers move.
The better they understand this, the more likely they are to build a roadmap that actually moves these numbers.
One thing to clarify up front is the type of meeting we’re talking about. At medium and larger companies, it is very common to have to explain your overall progress against a goal to senior leadership.
This is not the type of meeting that I’m talking about.
Presenting in these meetings is stressful: your narrative has to make sense, and you don’t want to appear not to know your numbers.
A Metrics Review meeting is one that is held between peers, maybe with a few leaders, and the whole goal is to discover what you don’t understand about your numbers.
The typical structure for a meeting like this is:
The big variables here that will change between companies are:
So if you ship once a month and it takes two weeks to get an answer back from your Data Science team, you need more time between meetings.
If you ship weekly and have self-service analytics tools, you can do this weekly. Find the cadence that works for you.
Before I get into exactly how these meetings function, one point to highlight is that setting the right tone for these meetings is key.
The wrong tone is senior leaders yelling at the team leads whenever a metric goes down. This will cause the team leads to stop presenting ambiguous or negative trends in the product, which turns the meeting into a “check the box” exercise that produces little value.
The right tone is somewhere between a thought exercise and a guided exploration of a problem. The presenters should be able to present what they understand but also honestly discuss what they don’t understand.
In each meeting, the presenter is trying to explain to the group:
To give an example, if I were the product manager in charge of free user sign-ups, that might look like this.
This leads to a prioritized list of questions that the presenter can look into and come back with answers next time.
Attempting to draw the relationship between the work that you do and how your business makes money is one of the few things that I believe 100% of companies should be doing.
That said, you need to have a few things set up before this process can work.
Once you have these, the foundational elements are in place.
Even with these in place, I would start with as small a group as possible and at as high a level as possible. It probably took the Codecademy team 3 months before we actually asked the right questions and looked at the right data.
Also, once companies start down this road, I have seen them hit the following bumps.
While these hurt to acknowledge, you now at least know about them and can start to fix them.