
Don't optimize your landing page with A/B testing

Hey all, a few weeks ago I launched GrowthClub - a peer-to-peer coaching platform for founders. We have been actively iterating on the landing page, trying to communicate the message perfectly & get the perfect conversion rate.

Usually, we would iterate on the landing page using qualitative feedback and gut feeling. But I know how easy it is to fool yourself with these methods. I heard about A/B testing and it caught my eye right away. Basically, you simultaneously run A & B versions of your website (e.g. with Google Optimize) and see which one performs better. Isn't it great to be sure your new title idea is correct because the data shows you?

However, I had not taken statistical significance into account. What it means is that unless you have quite a large amount of traffic, there is no certainty that the conversion rate improved for any reason other than chance. Think of flipping a coin and getting the same side 3 times in a row.
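To make that concrete, here is a quick simulation (my own illustrative sketch, with made-up numbers - not data from Google Optimize): two variants with the exact same true conversion rate, 100 visitors each. Pure noise makes B "beat" A a large share of the time.

```python
import random

random.seed(42)

def fake_winner_rates(n_visitors=100, true_rate=0.05, n_trials=10_000):
    """Both variants share the same true conversion rate, so any
    'win' for B is pure chance. Count how often B beats A at all,
    and how often B even looks at least 50% better."""
    b_wins = b_wins_big = 0
    for _ in range(n_trials):
        a = sum(random.random() < true_rate for _ in range(n_visitors))
        b = sum(random.random() < true_rate for _ in range(n_visitors))
        if b > a:
            b_wins += 1
            if b >= 1.5 * a:
                b_wins_big += 1
    return b_wins / n_trials, b_wins_big / n_trials

wins, big_wins = fake_winner_rates()
print(f"B beats A in {wins:.0%} of trials; looks >=50% better in {big_wins:.0%}")
```

With only 100 visitors per variant, a variant that is actually identical still "wins" nearly half the time - which is exactly why small-traffic A/B results are so easy to misread.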

GrowthClub is a peer-to-peer coaching platform; it is a relatively new concept, and many people are confused about how it would work. We wanted to show them before they signed up. The A/B test I ran on Google Optimize was designed to test our 'examples' idea - basically, we would show real people on our website, with the real soft-skill goals they seek to improve (our platform focuses on soft skills), as well as where they have expertise. Version A is located at /home and B is at /home2, and Google Optimize redirects traffic 50/50 between these versions.

Now it has been 3 weeks since the A/B test was launched:
[Screenshot of the Google Optimize results]

The probability that the new version is best is only 75%, which is pretty low. Typically in an A/B test you want 95% or 99% confidence. Overall, it seems like a waste of time for an early-stage startup like ours.

Thanks to advice from folks at Startup School and YC's lecture on conversion rates, I am now focusing more on getting more traffic to the landing page. We still iterate on the landing page, but more to adjust product positioning. E.g. recently we started to put much more emphasis on coaching & methodology to differentiate from networking platforms.


    It's true what you've written: you need a lot of data in order to get statistically significant results in a traditional A/B test. You can calculate how much before starting a test, for example with this calculator:
    https://www.optimizely.com/sample-size-calculator/?conversion=13.32&effect=100&significance=95
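For intuition, the classic two-proportion sample-size formula can be sketched in a few lines (Optimizely's calculator uses its own sequential-testing math, so treat this as a textbook approximation; the baseline rate and lift below are taken from the URL parameters above):

```python
from math import sqrt
from statistics import NormalDist

def sample_size_per_variant(base_rate, relative_lift, alpha=0.05, power=0.8):
    """Textbook two-proportion sample size (normal approximation).
    base_rate: baseline conversion rate, e.g. 0.1332 for 13.32%.
    relative_lift: minimum detectable effect, e.g. 1.0 for +100%."""
    p1, p2 = base_rate, base_rate * (1 + relative_lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided 95% -> ~1.96
    z_power = NormalDist().inv_cdf(power)          # 80% power -> ~0.84
    pooled = (p1 + p2) / 2
    return ((z_alpha * sqrt(2 * pooled * (1 - pooled))
             + z_power * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
            / (p2 - p1) ** 2)

# Huge lifts are cheap to detect; realistic ones are not.
print(round(sample_size_per_variant(0.1332, 1.0)))  # +100% lift: ~140 per variant
print(round(sample_size_per_variant(0.1332, 0.2)))  # +20% lift: thousands per variant
```

The asymmetry is the whole point: doubling your conversion rate is detectable with modest traffic, but the more realistic 10-20% improvements need thousands of visitors per variant.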

    However, there is another way to run an A/B test - the Bayesian method. It doesn't need as much data and can give you results faster.
    https://medium.com/convoy-tech/the-power-of-bayesian-a-b-testing-f859d2219d5
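A minimal sketch of that Bayesian approach (my own illustration with hypothetical numbers, not the article's code): model each variant's conversion rate with a Beta posterior and estimate the probability that B beats A by sampling.

```python
import random

random.seed(0)

def prob_b_beats_a(conv_a, n_a, conv_b, n_b, draws=100_000):
    """Monte Carlo estimate of P(rate_B > rate_A) under independent
    Beta posteriors with uniform Beta(1, 1) priors."""
    wins = 0
    for _ in range(draws):
        rate_a = random.betavariate(1 + conv_a, 1 + n_a - conv_a)
        rate_b = random.betavariate(1 + conv_b, 1 + n_b - conv_b)
        if rate_b > rate_a:
            wins += 1
    return wins / draws

# Hypothetical traffic: 10/200 conversions on A vs 15/200 on B.
print(prob_b_beats_a(10, 200, 15, 200))
```

The output is a direct "probability B is better" - the same kind of number Google Optimize reports - and you can act on it at whatever threshold you're comfortable with, rather than waiting for a fixed significance level.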
