
I thought more traffic would fix my SaaS. It didn’t.

I hit a plateau around low 5-figure MRR.

My first instinct:
→ More SEO
→ More content
→ More paid traffic

Traffic went up.

Revenue didn’t move proportionally.

Here’s what I learned the hard way:

1. Low-friction offers attract low-commitment users.

I optimized for easy signups.

It worked.

But those users:

Churned faster

Didn’t refer

Didn’t expand

I wasn’t building depth.
I was building volume.

2. "Everyone is my customer" is expensive.

When your ICP is broad:

Messaging gets weak

Landing pages get generic

Ads get expensive

Referrals don’t happen

The moment I narrowed the ICP, conversion went up without increasing traffic.

3. Feature shipping is addictive.

Shipping feels productive.

But most features didn’t move:

Activation

Retention

Expansion

The stuff that moved the needle:

Better onboarding

Clearer outcome positioning

Charging more

Not new features.

4. Paid acquisition is fuel, not a moat.

Ads kept revenue stable.

But the second I tried scaling spend:

CAC rose

Margins compressed

If your business only works at one ad spend level, it’s fragile.

5. Compounding channels take longer than you expect.

SEO, partnerships, integrations, community…

They don’t pay off in 30 days.

But once they start working, they’re much harder to compete against than ads.

The biggest shift for me:

Stop asking “How do I get more users?”

Start asking “How do I get better users?”

That single question changed how I think about everything:

Pricing

Positioning

Features

Channels

Curious if others hit this phase.

At what point did you realize traffic wasn’t the real bottleneck?

Posted to Saas Makers on February 20, 2026
  1. 2

    "Stop asking how do I get more users, start asking how do I get better users" - nailed it.

    For DTC pricing: Same shift. Not "how do I get cheaper" but "where do competitors sit, premium vs budget, and where do I position?" Traffic/volume amplifies what exists. If positioning is vague, more traffic just scales confusion. Your point on narrow ICP improving conversion without a traffic increase = exactly what I'm seeing in competitive pricing work.

    At what MRR did you make the ICP narrowing shift?

    1. 1

      Around $15k MRR. That’s when I saw 20% of customers driving most revenue and almost all expansion. The rest were signups, not customers. I narrowed ICP, raised pricing, and optimized for buyers who had urgency not curiosity. Conversion improved without more traffic. Traffic wasn’t the bottleneck. Quality of demand was.

      1. 1

        $15K MRR inflection point makes sense.
        "Optimized for urgency not curiosity" = filtering question I'm wrestling with now. Day 12 pre-revenue. Getting engagement (warm leads, questions, likes) but no audit acceptances yet.

        Trying to figure: Are they curious or urgent?

        Did you use qualifying questions upfront ("What's your timeline?" / "What happens if you don't solve this?") or let them self-select through pricing/positioning?

        (Asking as someone still figuring out demand quality)

        1. 1

          Engagement without acceptance is usually a signal, not a mystery.

          In my experience, urgency shows up in behaviour more than answers - people with real urgency will push the conversation forward, ask about next steps, or try to anchor timeline themselves.

          Are your warm leads leaning in after the first touchpoint, or staying at the “interesting…” stage?

          1. 1

            Mostly staying at 'interesting' stage — engagement without forward movement. Which is why I'm testing adding friction: instead of explaining the offer, just asking 'what's your competitor URL?' directly. Curious if that matches what you saw at your inflection point.

            1. 1

              Adding friction is useful but only if it filters for consequence, not curiosity.

              Asking for a competitor URL is smart because it forces context. It shifts the conversation from “idea” to “reality.”

              At the inflection point I mentioned, the shift wasn’t in qualifying questions; it was in positioning around consequences.

              When the messaging moved from “optimize pricing” to “protect margin in competitive markets,” the tone of replies changed.

              Are your leads feeling the cost of mis-pricing right now, or just exploring optimisation?

              1. 1

                That reframe hits hard. 'Protect margin' vs 'optimize pricing' = completely different urgency level. Honest answer: my leads are probably exploring optimization not feeling active margin pain right now. Which means I'm attracting curious founders not urgent ones. Shifting messaging to consequence language starting today — 'you're bleeding margin where competitors are intentionally protecting it' vs 'here's a pricing gap analysis.' Does that match the shift you made?

                1. 1

                  Yeah, that’s a good shift. When the language moves to consequences, people who actually feel the problem tend to respond very differently, I think.

                  One thing I noticed though is that the message alone isn’t always enough, it also matters where those people are in their business. Founders who are already competing on tight margins feel it immediately, while earlier-stage founders often don’t notice the leak yet.

                  Out of interest, are the people you’re talking to already operating in competitive markets, or are they still early enough that pricing mistakes haven’t really hurt them yet?

                  1. 1

                    Honestly mix of both. The ones feeling it immediately are established brands getting undercut or watching margins compress. Earlier stage founders engage intellectually but don't feel the bleeding yet. Starting to filter harder for the first group — founders already in competitive markets where pricing mistakes have a cost they can measure.

        2. 1

          At $15K MRR, I stopped trying to diagnose urgency through questions. I changed the offer structure.

          Curious people will answer qualifying questions well.
          Urgent people will tolerate friction.

          I didn’t ask about timelines. I:

          Charged for the audit

          Framed it around a clear trigger

          Made the cost of inaction explicit

          The signal wasn’t what they said; it was whether they committed.

          At pre-revenue, engagement is noise.
          Commitment is data.

          Add light friction and watch behavior. That tells you everything.

  2. 2

    This is the "leaky bucket" realization every founder hits eventually. You can pour water in as fast as you want, but if the bucket has holes, you'll never fill it.

    Your point about "better onboarding" vs "new features" is spot-on. I fell into the same trap — shipping features felt like progress, but the real leaks were in the first 5 minutes of the user journey. A confusing signup flow kills more users than a missing feature ever will.

    The "how do I get better users?" reframe is everything. For me, the shift happened when I stopped optimizing for "more signups" and started optimizing for "more users who complete their first meaningful action." That single metric clarified my entire roadmap.

    Curious: when you narrowed your ICP, did you have a specific signal that told you who the "good" users were? Or was it more intuition + trial and error?

    1. 1

      The “first meaningful action” shift is a strong filter.

      I’ve found the interesting part is what happens after that action - do they integrate it into workflow, or treat it as a one-off experiment?

      That’s usually where the difference between “good user” and “curious user” becomes obvious.

      Did your best users share any behavioural patterns beyond just activation?

    2. 1

      good question. it wasn’t intuition.

      the signal showed up in behavior, not demographics:
      who activated without hand-holding
      who stuck past the first outcome
      who asked about expanding usage, not discounts

      once i mapped those users back, patterns emerged (role, job-to-be-done, urgency).

      icp clarity came after watching who actually extracted value... not before.

  3. 1

    Point 3 (feature shipping is addictive) resonates — the stuff that moves retention almost never makes the roadmap.

    The category that's almost never on a SaaS retention checklist: payment recovery infrastructure. Involuntary churn — subscriptions lapsing because a payment failed, not because the customer chose to leave — runs at 5–9% of MRR for most subscription businesses. It doesn't show up on most dashboards because Stripe groups voluntary and involuntary churn together by default.

    So it looks like a product problem when it's actually a billing infrastructure problem. The traffic treadmill you describe often has an invisible drain running in the background: you add 10 customers through acquisition, lose 7 to voluntary churn and 1 to payment failure, net +2. Fix the payment failure leak and suddenly the same acquisition effort compounds differently.

    Checking the voluntary vs. involuntary split in your Stripe dashboard is worth doing before you optimize anything else. Every founder we've spoken to while building tryrecoverkit.com who ran that calculation for the first time was surprised by how much of the number was fixable without touching the product.
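    The leaky-bucket arithmetic above is worth seeing end to end. A minimal sketch, using the illustrative numbers from this comment (10 new customers/month, 7 lost to voluntary churn, 1 to failed payments); the function name and figures are examples, not from anyone's actual dashboard:

    ```python
    def net_customers(months, new=10, voluntary=7, involuntary=1):
        """Cumulative net customer growth over a number of months,
        given monthly acquisition and the two churn leaks."""
        total = 0
        for _ in range(months):
            total += new - voluntary - involuntary
        return total

    # Same acquisition effort, with and without the payment-failure leak:
    with_leak = net_customers(12)                  # +2/month over a year
    leak_fixed = net_customers(12, involuntary=0)  # +3/month over a year
    ```

    Plugging the leak adds just one customer per month, but that is a 50% lift in net growth (+3 vs +2) for the same acquisition spend, which is why the split is worth checking before optimizing anything upstream.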

  4. 1

    The “better users vs more users” shift is huge.

    The layer I’ve seen sit above that is decision discipline - once you commit to depth over volume, every feature, channel, and experiment has to justify whether it compounds that specific ICP or dilutes it.

    Plateau often isn’t a traffic ceiling - it’s a conviction ceiling.

    Was narrowing the ICP uncomfortable at first, or immediately clarifying?

    1. 1

      uncomfortable at first.
      saying no to segments always feels like shrinking the market.

      but the moment the ICP narrowed, a lot of decisions got easier:
      messaging sharpened, onboarding simplified, and feature requests started making more sense.

      the plateau wasn’t traffic. it was trying to serve too many different jobs with one product.

      1. 1

        Okay, got you. That sounds like really helpful data to take on board, especially with decisions getting easier because of it.
