
After 4 landing page rewrites, I finally figured out why my analytics SaaS wasn't converting

I've been building Zenovay for the better part of a year. Privacy-first web analytics with revenue attribution. Bootstrapped, solo technical founder.

For 8 months I shipped features like a maniac. Session replay. Heatmaps. AI visitor scoring. A whole CLI for the terminal. Stripe integration. 3D globe visualization. The product got genuinely powerful.

And my conversion rate stayed flat at around 0.5%.

So I stopped. Six weeks. No new features. Just landing page work.

Here's what I actually learned in those six weeks.

Lesson 1: Your homepage is not your feature list

V1 of my LP was a wall of feature icons. 12 of them. I thought "look at all this value". Visitors thought "I have no idea what to look at first" and bounced.

V4 has exactly one hero, one differentiator statement, and a live 3D globe of real traffic. That's it above the fold.

Lesson 2: Show, do not tell

Every analytics company says "real time" in their copy. I did too. It meant nothing.

When I replaced the static dashboard screenshot with an actual live globe spinning and showing dots appear as real visitors hit my customers' sites, people stopped scrolling. That visual carries the entire pitch.

Lesson 3: Pricing above the fold doubles intent

I had pricing buried on a separate page for months. Standard SaaS playbook. The day I moved a simple three-card pricing section above the fold, my signup rate jumped immediately.

Hiding pricing reads as "expensive and complicated". Showing pricing reads as "confident and transparent".

Lesson 4: Kill social proof you do not have

I had fake looking logos for ages because every guide said "add logos". They actually hurt trust because they did not match the size of the company I obviously was. Empty space beats fake authority.

Lesson 5: Copy is 5x more important than design

I wasted so much time on design. The version that finally clicked had nothing fancy visually. It had three sentences in the right order.

Where I am now

Just shipped V4 this week. Early signal: conversion roughly 3x what it was. Small sample, will know more in 30 days.

If you want to see the result: zenovay.com

Happy to dig into any specific question about the rewrite process.

Valerio

on May 14, 2026
  1. 1

    Adding a data point on the AI-search thread above (3vo's comment): I just measured this for my niche (vacation-rental ops software).

    Probed Gemini with Google Search Grounding (same machinery as Google AI Overviews) on 28 buyer-intent queries — "best Airbnb cleaning app", "alternatives to [incumbent]", "best photo verification tool for short-term rentals."

    Results: the dominant incumbent gets cited in 16/28 (57%). The #2 player in 14/28 (50%). My product? 1/28 (3.6%) — and the only query I won is the one where my exact-match positioning IS the search phrase.
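    For anyone wanting to reproduce this kind of probe, the tallying step can be sketched like this. Everything here is illustrative: the product names, domains, and probe results are placeholders, and actually collecting the grounded answers from Gemini (and extracting which domains each answer cites) is left out.

```python
from collections import Counter

def citation_share(results, products):
    """results: list of sets of domains cited in each grounded answer.
    products: dict mapping a label to its domain.
    Returns label -> (hit count, share of queries) across all probes."""
    counts = Counter()
    for cited in results:
        for label, domain in products.items():
            if domain in cited:
                counts[label] += 1
    n = len(results)
    return {label: (counts[label], counts[label] / n) for label in products}

# hypothetical data: 4 probe queries, which domains each grounded answer cited
probe_results = [
    {"incumbent.com", "rival.com"},
    {"incumbent.com"},
    {"rival.com", "myproduct.com"},
    {"incumbent.com"},
]
products = {
    "incumbent": "incumbent.com",
    "rival": "rival.com",
    "mine": "myproduct.com",
}
print(citation_share(probe_results, products))
```

    Run over 28 queries instead of 4, this is the 16/28 vs 1/28 table above.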

    Two implications for landing page copy:

    1. If you're cited by Gemini at all, the visitor arrives "pre-educated but skeptical" (3vo's framing) — your page needs evaluation-mode copy, not awareness.
    2. If you're not cited, your landing page never gets the chance to convert. Copy quality is downstream of AI-surface presence.

    Testable: tag your AI-search-referrer URLs with a UTM param and see if those visitors convert at a different rate than Google organic. We're about to do this for our V2.
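    A minimal sketch of that referrer-bucket conversion comparison. The keyword list and session records are illustrative assumptions, not Zenovay's actual tracking; in practice you'd feed this from your analytics export or UTM-tagged sessions.

```python
AI_REFERRERS = ("perplexity", "chatgpt", "openai", "gemini", "copilot")

def bucket(referrer: str) -> str:
    """Classify a session's referrer into a traffic bucket."""
    r = (referrer or "").lower()
    if any(k in r for k in AI_REFERRERS):
        return "ai_search"
    if "google" in r:
        return "google_organic"
    return "other"

def conversion_by_bucket(sessions):
    """sessions: iterable of (referrer, converted) pairs.
    Returns conversion rate per traffic bucket."""
    totals, wins = {}, {}
    for ref, converted in sessions:
        b = bucket(ref)
        totals[b] = totals.get(b, 0) + 1
        wins[b] = wins.get(b, 0) + int(converted)
    return {b: wins[b] / totals[b] for b in totals}

# hypothetical sessions: (referrer URL, converted?)
sessions = [
    ("https://www.perplexity.ai/", True),
    ("https://chatgpt.com/", False),
    ("https://www.google.com/", False),
    ("https://www.google.com/", True),
    ("", False),
]
print(conversion_by_bucket(sessions))
```

    With low traffic the per-bucket rates will be noisy, so treat the delta as a direction to watch over weeks, not a verdict.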

    1. 1

      this is the kind of data point i was missing. 3.6% vs 57% is brutal but useful, tells you exactly where the AI-surface gap is.
      the pre-educated-but-skeptical framing maps to what i've been seeing, though i haven't isolated it cleanly with UTMs yet. clean test. going to set up the same thing this week, tag perplexity/chatgpt/gemini referrers separately and watch the conversion delta vs google organic.
      honestly the second implication is the one that worries me more. if you're not cited at all, the funnel never starts. landing page work is irrelevant if the AI never sends anyone. different problem entirely

  2. 1

    lesson 5 is the one most ppl skip imo. when i ran hosting90 we'd rewrite landing copy every quarter and the version that always won was the one closest to how customers actually described us in support tickets — not what we thought sounded smart. the 12 feature icons trap is so real, we did the exact same thing. one q tho — when you killed the fake logos, did you replace with anything (testimonials, traffic numbers, customer count) or just empty space? curious what actually built trust at low social proof

    1. 1

      mostly empty space plus the live globe doing the work. the globe is functioning as social proof now, it's showing real customer traffic in real time, which is harder to fake than a logo strip.
      i did add a small "founded in switzerland" line and a quiet user count once it crossed a number that wasn't embarrassing. but the big shift was just admitting i didn't have brand-name customers and stopping pretending. weirdly that made the page feel more confident not less.
      the hosting90 quarterly rewrite cadence is interesting, i was doing it ad-hoc and it dragged out for months. might steal that rhythm.

      1. 1

        yeah the "stop pretending" shift took me ~2 yrs to make at hosting90 — i thought fake authority logos were the moat, turned out they were just noise. the globe-as-proof is sharper bcs it's observable not claimed. one thing that made the quarterly cadence actually stick: tying it to a specific input — every quarter pull 20 fresh support tickets + 10 churn surveys, copy the verbs, rewrite from those. without that input the rewrites drift back to what i wanted to say. what's your rewrite input source rn — analytics, sales calls, support?

  3. 1

    Four rewrites is about where I'd expect the founder to start seeing message-to-audience mismatch, not copy quality. Privacy-first analytics has two completely separate buyers: the GDPR-anxious EU SaaS founder, and the US growth marketer who is fine with the tradeoff and wants the revenue attribution. Trying to convert both with one landing page is the rewrite trap. What did the analytics on the page itself say about which segment was bouncing where? That tells you which audience the copy was actually written for.

    1. 1

      yeah this is the trap i was in. trying to write one page for both the EU founder worried about cookie banners and the growth marketer who just wants attribution numbers. the copy ended up vague enough to bounce both.
      the analytics on the page itself were noisy because traffic was low, but the qualitative signal was clear once i looked. the privacy-anxious folks were converting on the cookieless angle, the growth folks were converting on the stripe revenue thing, and the people bouncing were the ones landing on copy that tried to do both at once.
      V4 leads with the cookieless/privacy angle and lets revenue attribution sit deeper in the page. probably means i'm under-serving one segment, but at least the message is coherent.

      1. 1

        The 'under-serving one segment to keep the message coherent' tradeoff is the right call. The split traffic was telling you two products' worth of intent, and shipping one coherent landing page beats two confused ones. Worth measuring: which segment makes it to the second screen. If the growth folks bounce on V4 but the cookieless folks scroll, that's the signal that V4.5 should pull attribution numbers up a section, not split the page.

  4. 1

    The 4th rewrite working isn't always because the copy improved -- sometimes it's because the traffic source shifted and the audience arrived with different priors. Pattern from 2025-2026: landing pages that started converting better were optimized for referral traffic from AI search (Perplexity, ChatGPT, Gemini). Someone arriving from a Perplexity answer was already told what you do and why you matter in 2-3 sentences. They arrive 'pre-educated but skeptical' -- not cold. The page that converts them skips the awareness-level copy and leads with the evidence that you're the best option among the alternatives the AI mentioned. That's a fundamentally different structure than a page built for someone who found you via Google with zero context. If one of your rewrites happened to shift from awareness copy to evaluation copy, that might be the actual variable that moved.

    1. 1

      this hypothesis is sharper than my "the copy got better" story.
      i haven't done a clean referrer split yet, but anecdotally the traffic mix did shift. more direct visits in the last couple months, fewer pure-cold google sessions. some of that might be perplexity and chatgpt sending people over after the indiehackers and reddit posts started getting cited.
      setting up referrer-tagged conversion tracking this week. if AI-search traffic is converting at a wildly different rate than google organic, then yeah, the rewrite probably worked partly because it accidentally lined up with a different audience showing up. that would be a useful thing to know before i declare any of the lessons universal

  5. 1

    Landing page copy is underrated. I rebuild SaaS landing pages for conversion — happy to audit yours if useful.

  6. 1

    Extremely valuable lessons that we can leverage across multiple business opportunities, thanks for sharing

    1. 1

      glad it was useful, thanks for reading

  7. 1

    The 'copy is 5x more important than design' lesson resonates hard from the data/BI side. I've watched founders and execs sit in front of genuinely beautiful dashboards with 25 KPIs and leave the meeting with zero clarity on what to do next. The dashboards that actually drive decisions tend to look a lot like your V4: one headline number, a clear 'so what', and two or three supporting data points. Same principle.

    The pricing-above-the-fold insight is underrated. Hiding the pricing page reads as 'we have something to be embarrassed about.' Buyers pick up on that signal faster than they consciously realize.

    Congrats on the 3x early signal. Worth noting that 0.5% is actually useful baseline data in itself — most founders don't know their starting conversion rate at all, which means they can't measure whether any of their changes actually worked. You already had the right measurement instinct.

    If you're ever debugging your own product's SQL-layer analytics or attribution queries, my free diagnostic scripts are handy for spotting the slow or broken queries hiding underneath: https://growthwithshehroz.gumroad.com/l/psmqnx

    1. 1

      the dashboard analogy lands. 25 KPIs and no narrative is exactly what my old LP was doing in a different format. one headline, one "so what", a couple of supporting points. that's the structure i should have started with

      and yeah, the 0.5% baseline being measurable at all is something i didn't appreciate until i talked to other founders who couldn't tell me their conversion rate at all. having the floor is what made the ceiling visible

  8. 1

    This is a strong lesson because the issue was not “more features,” it was signal hierarchy. Privacy-first analytics with revenue attribution, replay, heatmaps, Stripe, AI scoring, and CLI is powerful, but on a landing page that can quickly feel like five products fighting for attention.

    The live globe seems smart because it makes the promise visible immediately. It turns “real-time analytics” from a claim into proof. I’d probably keep pushing that same direction: one clear reason to believe, one clear buyer outcome, then let the advanced features support the story instead of leading it.

    One thing I’d watch long term is the Zenovay name. It is brandable, but for a privacy-first revenue analytics platform, Beryxa.com feels a bit sharper and more enterprise-SaaS oriented if you ever want the brand to feel less solo-tool and more serious analytics layer.

    1. 1

      signal hierarchy is exactly the right frame for what was broken. five products fighting for attention is what it felt like to build, and apparently what it felt like to read too.
      one clear reason to believe, one clear buyer outcome. that's the test i'm going to keep running the page against from here.
      on the name: noted, and i hear the enterprise-feel argument. i'm a year deep though, domain is owned, docs and CLI are all branded around it, switching now would cost more than it would gain. if i ever start a v2 of this i'll think harder about it upfront.

      1. 1

        That’s fair, and I get the switching-cost point.

        But that’s also the exact reason I’d pressure-test it now rather than later.

        If Zenovay already has domain, docs, CLI, and product surface around it after one year, then another year of users, integrations, screenshots, tutorials, and customer memory will make the name even harder to change.

        So the real question is not “is switching annoying now?”

        It is: if this becomes a serious privacy-first revenue analytics platform, will Zenovay still feel like the right company/product name when buyers compare it against more enterprise-native analytics brands?

        That’s where I think Beryxa.com is worth considering seriously.

        It feels more SaaS-native, cleaner for revenue analytics, and stronger if you want the product to be seen as a serious decision layer rather than an indie-built analytics tool.

        I’m not saying rename casually. But if your own direction is moving from solo-tool to serious analytics infrastructure, waiting until v2 may just make the brand decision more expensive.

        This is exactly the stage where securing the stronger name early can save you from rebuilding the brand later.
