- It's all about asking the right questions;
- To grow, you need to measure your progress;
- However, you can't measure what you don't capture;
- Develop 'data discipline' and ensure you're collecting the data you need from day one.
One of the founding principles of Lean is build, measure, learn, and we're massive believers that it's the right way to go about your startup. To do it you need good data, which is why at Forward Partners we encourage you to be disciplined about data from the outset. Besides being a foundation for making good decisions, good data also helps you put together great pitch decks when speaking to investors.
The real key to good data is being able to ask good questions. Unless you ask good, challenging questions about your business, you won't know whether you have (or plan to have) the data to answer them. Take this a step further: you should also be asking yourself questions about future events your business might encounter, and what data you'd need to answer the questions those events will raise.
Experiments aka New Features
Each time you launch a feature or improve the product, you should be disciplined enough to be capturing data that will allow you to say for certain: "Yep, this made a difference." To do this, always treat new feature tests like a scientific experiment. Using this approach you should have:
- a clear hypothesis (what you think is going to happen)
- a method (how you're going to measure it)
- results (what actually happened)
- a conclusion (did what actually happened match what you thought would happen?)
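The structure above can be sketched as a simple record. This is a minimal illustration, not a prescribed tool; the feature name, metric, and numbers are invented:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Experiment:
    """Treat every feature launch as a scientific experiment."""
    name: str
    hypothesis: str            # what you think is going to happen
    metric: str                # how you're going to measure it
    baseline: float            # metric value before the change
    result: Optional[float] = None  # metric value after the change

    def conclusion(self) -> str:
        if self.result is None:
            return "still running"
        lift = (self.result - self.baseline) / self.baseline
        verdict = "confirmed" if lift > 0 else "rejected"
        return f"{verdict}: {lift:+.1%} change in {self.metric}"

# Hypothetical example: a one-page checkout test
exp = Experiment(
    name="one-page checkout",
    hypothesis="A single-page checkout will raise checkout conversion",
    metric="checkout conversion rate",
    baseline=0.042,
)
exp.result = 0.051
print(exp.conclusion())  # confirmed: +21.4% change in checkout conversion rate
```

Writing the hypothesis and metric down *before* launch is the point: if you can't fill in those two fields, you don't yet have the data discipline to know whether the feature worked.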
Launching new features without this framework in place will slow you down: you'll scratch your head wondering what happened, or worse, why you don't have valid data to tell.
Improve in increments
In the early stages of a startup you're constantly looking for your next source of growth. How will you get it? Where will it come from? It can all feel a little foggy when you don't have a clear answer, and good data can often provide one. One thing you can do is look at your funnels and ask: "Where do I see drop-offs?" Ensuring you have good conversion funnels set up in the analytics package you adopt (probably Google Analytics or Mixpanel) is a great way of making your first improvements to growth. Looking at poorly converting steps (sometimes called micro-conversions) allows you to isolate problematic pages, steps, and flows. These could be as small as a problematic form field on a sign-up form or as large as a broken piece of functionality (e.g. your checkout process has stopped accepting Visa credit cards). You will only be able to answer these questions if your analytics are set up correctly.

Sometimes it helps to dig into the data further and ask: "Are certain types of people struggling?" Maybe age is a factor, so perhaps you want to make your product different according to age. You might look at your analytics package and say: "Show me that funnel for people under 20. Show me the funnel for people over 50."
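Funnel drop-off analysis is just step-to-step division. Here's a minimal sketch; the step names and counts are made up, and in practice they'd come from your analytics package's export:

```python
# Each entry: (funnel step, number of users who reached it)
funnel = [
    ("landing page", 10_000),
    ("sign-up form", 3_200),
    ("form submitted", 1_400),
    ("first purchase", 420),
]

# Conversion rate between consecutive steps highlights where to focus
for (step, users), (next_step, next_users) in zip(funnel, funnel[1:]):
    rate = next_users / users
    print(f"{step} -> {next_step}: {rate:.1%} ({users - next_users:,} dropped)")
    # e.g. landing page -> sign-up form: 32.0% (6,800 dropped)
```

Running the same calculation over a filtered segment (under-20s, over-50s) is how you'd spot that a step converts badly only for one group.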
You need to make sure that you're capturing any data from the start that you might eventually want to use.
Unless you tell Mixpanel to capture people's ages, three months down the line you won't know whether people over 50 are really struggling and need a path that's specific to them. You won't even know that you'd have to build something more specific for them to get a conversion uplift. Agile development teaches you to build as little as possible, but data is the one thing it's useful to capture as much of as possible. (Just be aware that some analytics packages price on server calls, so you may want to limit the number of calls you make.) Always ask yourself: "What questions might I need answered?" That's what we mean by data discipline.
How to test & when to test?
For startups, testing should be built into your development and product process. However, tests create problems because they take a long time to reach statistical significance. Quite often this is time you can't afford if you're aiming for 20-30% month-on-month growth: a poorly rolled out test could cripple growth in that month and hamper your chances of raising further investment.

When choosing what to test, think about revenue. If a test isn't going to affect revenue, is there any point running it? If you decide there is a positive revenue objective for the test, you then need to weigh up speed versus risk. You could send the test to a large segment of your base to get a relevant result quickly, but at a much higher revenue risk if it goes badly wrong. Conversely, you could test a very small segment at small revenue risk, but the test could take you a month to reach a statistically sound result. You need to weigh up your business objectives here.

Crucial to all testing is having an A/B/n testing tool whose results you're confident in. Our tool of choice is Optimizely: it's free to begin with, super easy to use, and integrates well with analytics tools.
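You can put rough numbers on the speed-versus-risk trade-off with a standard rule of thumb for sample size (at roughly 5% significance and 80% power, n per variant is about 16 · p · (1 − p) / δ², where p is the baseline conversion rate and δ the absolute lift you want to detect). The traffic figures below are invented for illustration:

```python
def sample_size_per_variant(baseline: float, absolute_lift: float) -> int:
    """Rule-of-thumb sample size per variant (~5% significance, ~80% power)."""
    return round(16 * baseline * (1 - baseline) / absolute_lift ** 2)

# Detecting a lift from 4.0% to 4.8% conversion
n = sample_size_per_variant(0.04, 0.008)
print(n)  # 9600 visitors per variant

# At, say, 300 visitors/day split across two variants, that's 64 days --
# which is why a small, low-risk segment can take a month or more to
# reach a statistically sound result.
print(2 * n / 300)  # 64.0 days
```

Testing tools like Optimizely do this arithmetic for you, but running it yourself before launch tells you whether a test can possibly finish inside your growth window.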
Knowing your cohorts
You should capture enough data to be able to do cohort analysis. When a startup leaves Forward Partners after 12 months and tries to raise funding, one thing investors will say is: "I want to understand your cohorts." Cohorts work like this:
- Imagine you run some kind of subscription service
- You've calculated that when people subscribe, they stay for five months and you make £100
- In April you run a big PR campaign, and have a big spike in subscribers
You might think: "Great – now I have all this money I can spend acquiring more customers." But wait. With cohort analysis, what you might find is that April's new subscribers only stick around for a month because they were from a different cohort – your PR event. You would have started spending money you don't have. It's important to know how different cohorts behave. But you can only do cohort analysis if you capture the right data from the beginning. So whenever you launch a campaign or new feature, the first question to ask yourself is: "Have I got the data in place?"
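A basic cohort analysis for the subscription example above can be sketched like this. The numbers are invented: each cohort is the list of months each subscriber from that signup month stayed for:

```python
cohorts = {
    "2015-02": [5, 6, 4, 5],
    "2015-03": [5, 5, 6, 4, 5],
    "2015-04": [1, 2, 1, 1, 1, 2, 1, 1],  # PR-campaign spike
}

MONTHLY_VALUE = 20  # GBP per subscriber per month (5 months ~= GBP 100)

for month, retention in cohorts.items():
    avg_months = sum(retention) / len(retention)
    print(f"{month}: {len(retention)} subscribers, "
          f"avg {avg_months:.1f} months retained, "
          f"~GBP {avg_months * MONTHLY_VALUE:.0f} each")
```

Grouped this way, the April spike is obviously worth a fifth of a normal cohort, which is exactly the conclusion a blended average across all subscribers would have hidden.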