11 Comments

How to validate growth experiments?

I'm trying to think through how to validate or invalidate various customer acquisition strategies. Of course, I could lean on classical statistics, which would say that I need a very large sample size to conclude anything. As a startup, I don't have access to hundreds or thousands of users for every experiment. And if I did try everything on all my users, they would probably get tired of it and opt out :)

Has anyone run lots of experiments with smaller sample sizes (e.g., n = 10 or so) to get a signal from the market? Any blog posts you'd recommend about running smaller marketing experiments as a startup?

  1. 3

    I think in the beginning you just need to rely on your gut and talk to customers individually to get a feel for what is working and what isn't. Also, copy what seems to be working for others who are further along.

  2. 2

    Every stage requires a different approach to "get a signal from the market". For tens of customers - it's 1-on-1 interviews. Move to experiments and statistics only when interviews are too costly or don't give any new information. Good luck!

    1. 1

      Definitely! Have been leaning on my customers, but the challenge is that I don't have that many of them (yet) so I can't ask them questions every few days :)

      1. 1

        You don't want to run to your customers every time you're in doubt. :) They can't tell you what to do anyway.

  3. 2

    Maybe there's some inspiration for you in PG's essay "Do things that don't scale":
    http://paulgraham.com/ds.html

    1. 1

      Great suggestion! I'd forgotten about that gem. Just reread it, and it'll help me focus my marketing going forward.

  4. 1

    Hi Harris,

    Been there before and totally get your struggle.

    I’ve mentored hundreds of startups on experimentation in different accelerator programs and my advice boils down to:

    • focus on getting QUALITATIVE data first - mainly that means talking to your customers to validate your hypothesis :)
    • if you really need to run quantitative experiments, make sure you’re testing BIG changes. Testing button colours? Definitely not. Testing a completely different onboarding flow, email style, or website messaging (not just the headline!)? Yes, that could work. Think of the most “extreme” way to test your hypothesis. By doing that, you should be able to see big swings, which don’t need a big sample size to detect. And if you don’t see a big swing, the change probably won’t take your company forward anyway, so that area isn’t something to focus on.
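
    The "big swings" point can be made concrete with a quick back-of-the-envelope calculation. This is just a sketch; the 10% baseline conversion rate and the observed counts are made-up numbers for illustration:

    ```python
    from math import comb

    def binom_tail(k, n, p):
        """Exact P(X >= k) for X ~ Binomial(n, p): the chance of seeing
        k or more conversions out of n users if the true rate is p."""
        return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

    # Suppose the baseline conversion rate is 10% and the experiment has n = 10 users.
    # A modest lift (3/10 = 30% observed) is NOT distinguishable from noise:
    print(round(binom_tail(3, 10, 0.10), 3))   # ~0.070, above the usual 0.05 cutoff

    # A huge lift (7/10 = 70% observed) clearly is:
    print(round(binom_tail(7, 10, 0.10), 6))   # ~0.000009
    ```

    In other words, with n = 10 only "extreme" effects clear the bar, which is exactly why testing big changes is the only way to get a trustworthy signal at that scale.
    
    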

    Lastly, I’d challenge you on running any experiments on 10 users or so. Sounds like it could be better to focus on acquiring more customers and test things on the acquisition side?

    Best
    Johannes

    1. 1

      Hey @joradig,

      First, so sorry for my delayed reply! I appreciate all the depth you went into here.

      I've got an Airtable sheet that I'm using to keep track of all of the experiments. I usually test them on about 10 people, just to get some signal into whether it's a good idea. They're all on the customer acquisition side, so I'm glad that's what you'd suggest I do!

      Definitely can't waste my time working on button colors :) There are so many things I'd love to change about my UI, but they're not the most important things to do to move the business forward.

  5. 1

    It's really hard to get statistically significant results when the sample size is n <= 10.

    However, if you can get those numbers up a little, I've found Mixpanel to be the perfect tool for measuring such experiments.

    1. 1

      Nice...I'm on Amplitude, but similar technology. What sort of an n do you look for?
