
A/B Test Results - Newsletter Landing Page

Hey everyone, thought I'd share some recent split-testing results in case you find them useful for your own projects.

For our Exploding Topics newsletter page we previously had a looooong landing page. As well as the hero CTA, it had a "how it works" section with graphics, tweet cards for social proof, and a CTA subscribe box at the bottom.

But then we saw Morning Brew's homepage and thought it was worth testing something like that.

Anyway, the results are in. The original had a 15.9% conversion rate, while the stripped-down "hero only" version (without even the footer) is at 20.4%, with 97% confidence of being better given the number of sessions.

Less is definitely more when it comes to landing pages. And I don't think there is such a thing as "too little" either. Well maybe if you don't have a subscribe input...

Also, if anyone is wondering, I used Google Optimize for the testing. It's really good actually and ties into Analytics. Plus it's FREE. And that's a lot cheaper than Optimizely :)

posted to Growth on February 6, 2020
  1. 3

    Who here is old enough to remember the first time they used Google? I remember thinking that the page hadn't finished loading. They were one of the first to demonstrate that less is more. It's a good lesson to be reminded of.

    1. 2

      Haha absolutely Eric.

  2. 2

    "Less is definitely more when it comes to landing pages"
    Don't follow this blindly though. If it's an LP for something as simple as a newsletter, then it's surely a good idea. If you're promoting a more sophisticated or novel product that visitors aren't familiar with, then it's not such an obvious decision.

    1. 1

      Great point dude, I should be more careful with my generalizations!

  3. 2

    Love Exploding Topic Tuesdays. Thanks for sharing.

    1. 2

      Thanks man - glad you like it! :D

  4. 2

    Awesome! Morning Brew's landing page also made me consider doing a more basic landing page.

    I will definitely test this in the future, thanks!

    1. 1

      Nice!! Let us know how it goes :)

  5. 1

    It's great that you are testing, but slow your roll a bit. "Less is definitely more when it comes to landing pages" is jumping to a flawed conclusion. You can't look at the results of one test and apply it broadly to every other situation.

    Having been part of many split tests going way back to my early direct mail days, I can tell you that as a general rule of thumb, the ideal length is as long as it needs to be. For a simple, low-stakes offer like signing up for a newsletter, you really don't need a lot, and you can lose your audience's interest if you keep going.

    But that's not the case with offers that have more complexity and/or importance. In those cases, you need the extra messages to clarify things, spell out benefits, overcome friction and create urgency.

    One final thought: make sure you're testing right. I agree with @RayT that the sample size is a bit low for statistical validity, though I don't think that's specific to Google Optimize. Most of the big testing tools call the results too early, which is one of the most common mistakes I see. In case you're interested, I wrote about that and the other ones here: http://www.seankirbycopy.com/ab-testing-mistakes/

    1. 1

      Good point - it is a false generalization to say "less is more" for all landing pages. But in the context of newsletter landing pages, I believe it's a good rule of thumb. Like you say though, it's good to be specific, and it's likely not the same for a lot of SaaS or more complex products.

      You and @RayT got me concerned that I called it too early! :P

      So I actually checked the stats myself and the result is indeed statistically significant + mirrors Google Optimize's numbers: https://abtestguide.com/calc/?ua=867&ub=773&ca=139&cb=161
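
      For anyone who wants to check the math themselves, here's a rough Python sketch of a plain two-proportion z-test on the counts from that link (Optimize reports a Bayesian-style "probability to beat baseline", so the exact percentage won't necessarily match this):

      ```python
      from math import sqrt
      from statistics import NormalDist

      # Counts from the abtestguide link above: sessions (u) and conversions (c)
      ua, ca = 867, 139   # A: original long landing page
      ub, cb = 773, 161   # B: stripped-down "hero only" variant

      pa, pb = ca / ua, cb / ub            # raw conversion rates (~16.0% vs ~20.8%)
      p_pool = (ca + cb) / (ua + ub)       # pooled rate under the null hypothesis
      se = sqrt(p_pool * (1 - p_pool) * (1 / ua + 1 / ub))

      z = (pb - pa) / se                   # ~2.5 standard errors apart
      p_one_sided = 1 - NormalDist().cdf(z)

      print(f"A: {pa:.1%}  B: {pb:.1%}  z = {z:.2f}  one-sided p = {p_one_sided:.4f}")
      ```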

  6. 1

    Hey Josh,

    Thanks for sharing! Sounds awesome. I'm curious about the total numbers: how many users participated in the test?

    Another question regarding Google Optimize: is there anything you don't like about it?

    1. 1

      Hey Kermin, over 1,500 people were included in the test, and Google Optimize does the math to calculate statistical significance and tells you when to end the test.

      I actually haven't had any issues with it, but I need to use it more and then I'm sure I'll find things I don't like :D

      1. 1

        Ok, sounds good.

        To be honest with you: this is the reason why most experts don't use GO. Sure, it tells you when to stop the test, but from my experience I can tell you that this amount of traffic is way too low for a reliable result.

        Google for topics like "test duration ab testing" or even check a calculator: https://www.convert.com/tools/ab-test-duration-calculator/#calculator

        It's really nice to have an A/B testing mindset and experiment a little bit, but my point is: don't blindly trust what a tool tells you ;)
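
        If anyone wants to sanity-check duration by hand, here's a rough Python sketch of the standard two-proportion sample-size formula (the same kind of math behind calculators like that one); the 16% baseline and 25% relative lift are just illustrative numbers, roughly in line with this thread:

        ```python
        from math import sqrt, ceil
        from statistics import NormalDist

        def sample_size_per_variant(p_baseline, relative_lift, alpha=0.05, power=0.8):
            """Visitors needed in EACH variant to detect `relative_lift` over
            `p_baseline` with a two-sided z-test at the given alpha and power."""
            p1 = p_baseline
            p2 = p_baseline * (1 + relative_lift)
            p_bar = (p1 + p2) / 2
            z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # e.g. 1.96 for alpha = 0.05
            z_beta = NormalDist().inv_cdf(power)            # e.g. 0.84 for 80% power
            numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                         + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
            return ceil(numerator / (p2 - p1) ** 2)

        # A 16% baseline and a 25% relative lift need roughly 1,450 visitors per arm;
        # divide that by your daily traffic per variant to estimate test duration.
        print(sample_size_per_variant(0.16, 0.25))
        ```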

        1. 1

          I checked the stats myself on this one. Had to dust off an old textbook :D The results are statistically significant (even with this sample size) and align with Google Optimize's numbers: https://abtestguide.com/calc/?ua=867&ub=773&ca=139&cb=161

