Data wins arguments
Data wins arguments. I wish I had kept this in mind when I started my career. As a product manager, I have been “bullied” quite a lot in my work when it comes to prioritisation. It seemed that everyone had a strong opinion and their own agenda about “what to build next?”. Usually, the winner was the one with the loudest opinion.
I once attended a company meeting where the CEO, with all the confidence in the world, announced that X was the direction to go. I thought, “he must know something,” right? I assumed that the analysis and validation of that strategy had been done, and I trusted management’s decision. A few months later, management realised we were heading in the wrong direction, and the strategy that had been celebrated so much in the beginning was stopped overnight, wasting hours of work, money, and energy.
Another time, the CPO called an urgent roadmap planning session. He listed all the things we were working on, the requests from customers, the technical roadmaps, etc., and together with the other PMs it was decided that Y items would be on our roadmap that quarter. Well, you guessed it: none of the items added much value. We were just a feature factory for the next quarter, and the goal was to deliver more and more items.
Have you ever been in a meeting with the UX team, arguing for hours or even days about whether the Z field should be a drop-down or a radio button? Luckily enough, I have been in that position too, wasting hours listening to arguments about the best way to represent the field. And don’t get me started on the conversations about the right copy.
Soon enough I started thinking that it doesn’t matter what I, the designer, or the copywriter want. In fact, it doesn’t matter what you want, what the CPO or CEO wants, or what the creator of the universe wants. What matters is what the customer wants!
If you have experienced the situations above, or at least similar ones, I have good news for you! No, you don’t need to predict the future to figure out what customers want or what to build next. You just need to bring data to the table and let the data speak! Hopefully, stakeholders and management will follow.
Product Data Management (PDM) is the process of collecting, organising, storing, and sharing data within an organisation. A Data Product Manager is like a Product Manager, but one who focuses more heavily on Product Data Management.
Product School
In my humble opinion, there is no need for a special product manager title such as “Data Product Manager”; this mindset should be in the DNA of every PM. Product managers should always prioritise based on what the data tell them, whenever that is applicable. If no data are available, the team can run product discovery sessions. Only then should the PM make an assumption, and that assumption must be tested before spending even more time and money on the idea.
Teams that build continuous customer discovery into their DNA will become smarter than their investors, and build more successful companies.
I find that there is no magic recipe for discovery work and the prioritisation of items. Over the course of her professional experience, a Product Manager learns how to balance all these different aspects of data analysis and pick up the right pieces to compose the puzzle of “What to work on next?”.
Below I describe some practices that I have identified and follow to win arguments and avoid waste by letting the data do the talking.
1. Use your analytics tools to understand behaviours
Most mature companies have data analytics tools in place, which means they capture the digital behaviour of their users. This information can range from which device the user logged in from and at what time, to how long the user stayed on each page and what actions the user took. The role of the PM is to gather all these data, analyse the user’s behaviour, and draw conclusions on what actions to take.
In one of the products I was involved in, at the early stage of product discovery, some colleagues made the assumption that the more images a user uploads each day, the more he will use the product, which in turn will convert him into a paid user.
Normally, I would rush to the development team and, based on that assumption, ask them to:
- optimise the uploading process (clear steps, clear messages, clear information when an upload was taking a long time, great UX/UI)
- work hard to increase upload speed (you could have your picture in the portal in milliseconds)
- focus a lot of energy on optimising our email marketing campaigns so the user would not “forget” to upload images
This time, I took a careful look at our available data, and after some analysis I noticed that our users used the upload functionality quite rarely! The day a user created an account was the same day he or she uploaded large batches of images (during the night), and then never again, or not very often. The user would come back to the product a couple more times and then stop.
That was quite an eye-opener for me. It meant that my assumption and hypothesis must have been wrong. At least I hadn’t dived into it immediately, wasting my team’s time and effort.
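As a rough illustration of this kind of analysis, here is a small Python sketch that buckets upload events by the age of the account at the time of the upload. The event data here is entirely made up for illustration; in practice it would come from your analytics tool’s export.

```python
from collections import Counter
from datetime import date

# Hypothetical export from an analytics tool: (user_id, signup_date, upload_date)
events = [
    ("u1", date(2023, 1, 10), date(2023, 1, 10)),
    ("u1", date(2023, 1, 10), date(2023, 1, 10)),
    ("u1", date(2023, 1, 10), date(2023, 1, 12)),
    ("u2", date(2023, 2, 1), date(2023, 2, 1)),
    ("u2", date(2023, 2, 1), date(2023, 2, 1)),
    ("u3", date(2023, 3, 5), date(2023, 3, 5)),
]

def uploads_by_account_age(events):
    """Count upload events by days elapsed since the user's signup."""
    buckets = Counter()
    for _user, signup, upload in events:
        buckets[(upload - signup).days] += 1
    return dict(buckets)

print(uploads_by_account_age(events))  # → {0: 5, 2: 1}
```

A distribution heavily skewed towards day zero, as in this toy data, is exactly the pattern I saw: users uploaded in bulk on signup day and rarely came back to upload again.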
2. A/B test whenever possible
A/B testing is essentially an experiment where two or more variants of a page are shown to users at random and statistical analysis is used to determine which variation performs better for a given conversion goal. In other words, A/B testing is a great way to decide which experience or message will improve your product.
However, that doesn’t mean you should go and A/B test everything in your product without knowing what question you are trying to answer. In other words, you need to have a goal. You must have an identified problem and a clear hypothesis that the test will either confirm or refute.
Step 1: Begin with a problem you’d like to solve. Maybe you have data or user research suggesting that there’s an issue, or just an informed hunch derived from knowledge of your product and audience.
Step 2: Define your hypothesis
Step 3: Run a test to gather evidence that will confirm or refute your hypothesis.
Step 4: Decide what you will do based on the evidence you collect.
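For conversion-style goals, the “statistical analysis” in Step 3 often comes down to a two-proportion z-test. The sketch below compares the conversion rates of two variants; the traffic and conversion numbers are invented purely for illustration.

```python
import math

def ab_test(conv_a, n_a, conv_b, n_b, alpha=0.05):
    """Two-proportion z-test: does variant B convert differently from A?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value, p_value < alpha

# Hypothetical experiment: 4,000 users per variant
z, p, significant = ab_test(conv_a=200, n_a=4000, conv_b=260, n_b=4000)
print(f"z={z:.2f}, p={p:.4f}, significant={significant}")
```

Note how the sample size enters the standard error: this is why the “meaningful traffic” caveat below matters, since with tiny samples the test will almost never reach significance.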
A/B testing can be a complete waste if you don’t consider at least the following:
- Make sure you have meaningful traffic. It doesn’t make sense to run a test when your traffic is 100 users per day.
- Make sure the hypothesis is clear. To do that, you need to have a clearly defined problem.
3. Raise your GAME
Has anyone ever asked you, “We need feature X. When do you think it can be ready?” Well, many PMs will go to the development team and start refinement in order to guesstimate a delivery date.
I always take a step back, breathe, and try to do proper discovery and analysis to present the facts and figures. I would do the following:
- Ask how this feature is linked to the product strategy.
- Figure out which OKR (if any) will be affected by the development of the feature.
- Decide how I will measure this feature: which metric do I expect to move by implementing it?
When I get questions like this from management, I quickly use the GAME framework to assess the feature:
- Goal
- Action(s)
- Metrics
- Evaluations
How does this feature contribute to my product goal? What actions will a user take when using the feature? How will I measure the success of the feature? Finally, I evaluate the impact of the feature and the effort it takes to develop, and compare that impact/effort with the other features in the backlog.
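To make the evaluation step concrete, a back-of-the-envelope impact/effort score is often enough to rank a request against the rest of the backlog. A minimal sketch follows; the feature names and scores are hypothetical, and the 1–5 scales would come from your own GAME assessment.

```python
from dataclasses import dataclass

@dataclass
class Feature:
    """A backlog item scored during GAME evaluation (1-5 scales)."""
    name: str
    impact: int   # expected movement on the chosen metric
    effort: int   # rough development cost

    @property
    def score(self) -> float:
        # Simple impact/effort ratio: higher means better value for the effort
        return self.impact / self.effort

backlog = [
    Feature("bulk upload", impact=4, effort=2),
    Feature("dark mode", impact=2, effort=3),
    Feature("requested feature X", impact=1, effort=4),
]

# Rank the backlog so the requested feature can be compared with alternatives
for f in sorted(backlog, key=lambda f: f.score, reverse=True):
    print(f"{f.name}: {f.score:.2f}")
```

The ratio is deliberately crude; the point is not precision but having a shared, numeric basis for the conversation instead of the loudest opinion.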
If the outcome of the analysis shows that the feature will not have an impact on your product and is not aligned with your product goals, you have the evidence to support not proceeding with its development. This approach will save you time arguing about the feature and spare your team wasted effort.
Just Do It!
Last but not least, beware of analysis paralysis. Validating ideas should be fast and cheap; building a product, on the other hand, can be slow and expensive. Find a balance between analysis and execution. In a world where time didn’t matter, gathering all possible data from all kinds of sources and doing extensive analysis before making a decision would be reasonable. In reality, time matters, especially if you want to stay ahead of the competition. So be mindful of the time you spend on your analysis, decide what your threshold is, and know when to say “Now I know enough” and go for it. At the end of the day, doing something is better than talking about doing it.