
I lost 11 users in 30 days and had no idea why. Here's what I learned.

Every founder knows churn hurts. But what actually broke me was the silence after.

No email back. No reply to my survey. Just gone.

I tried the usual stuff. Sent exit emails. Added a cancellation survey. Got responses like "too expensive" or "not what I needed." Generic. Useless. I couldn't fix anything with that.

Then one day I just manually messaged a user who had cancelled. Asked them directly what happened.

They replied instantly. Told me exactly what went wrong. Specific feature. Specific moment. Something I could actually fix.

That one conversation was worth more than 3 months of survey data.

So I started thinking. What if every cancelling user got that conversation automatically? Not an email later. Not a survey after. Right at the moment they click cancel, when the reason is still fresh in their head.

I built that. A small chat that appears the second someone clicks cancel. They type or speak for 10 seconds. Real reason lands in a dashboard.

First week of testing I found out 3 users left because of one missing feature I could build in a day. I had no idea.

Also learned something that scared me. Almost 30% of my churn wasn't even a decision. Failed payments. Card expired. Those users didn't want to leave. They just did.

Same problem. Wrong moment to find out.

If any of you are dealing with churn right now and don't know exactly why people are leaving, drop a comment. Happy to share everything that worked for us.

Posted to Growth on April 24, 2026
  1. 1

This is exactly the gap GuestPulse was built to solve, just in hospitality instead of SaaS.

    We found the same thing during user research. Hotel managers were sending feedback forms after checkout. Guests had already mentally moved on. The moment was gone.

    So we put a QR code in the room. Feedback came in while the guest was still there, still feeling it. Completely different quality of response.

    The “30% didn’t want to leave” insight hit hard. In hospitality we call it involuntary dissatisfaction — guest had a bad experience but never said anything, so staff never fixed it, so the guest just didn’t come back. Silent churn.

  2. 1

    My fix here was embarrassingly low-tech. I started keeping a one-line note next to every churn event in a sheet, even when all I could write was "no clue." A month in, the "no clue" rows ended up clustered around the same two onboarding screens, which the dashboard never would have shown me. Cohort charts are great for telling you something happened, useless for telling you why. The only thing that ever moved the needle for me was being annoyingly specific while the loss was still fresh. I lost a couple of months pretending the data would eventually explain itself.

  3. 1

    This hits hard from the opposite angle. I'm at the stage where I'd kill to have users TO churn. Built an AI support tool for DeFi protocols - 77K lines of code, 46 chains, everything works. Zero paying customers. Sent 80+ cold DMs to protocol founders, nothing. Your point about "one conversation was worth more than 3 months of survey data" resonates, I recently switched from mass outreach to just being helpful in crypto communities one person at a time. Haven't converted anyone yet but the conversations are 10x more real than any cold DM ever was. The failed payments insight is smart too — 30% of churn being accidental is the kind of thing you'd never find without asking at the right moment.

    1. 1

      The community angle is the right move. Cold DMs to protocol founders are hitting people who get 50 of those a day. Being genuinely helpful in the community is how you become the person they think of when the problem gets painful enough.
      77K lines and zero customers is a hard place to be but the code is not the problem. Distribution is. The one conversation that converts you will probably come from someone who saw you help a stranger in a Discord three weeks earlier.
      Keep going with the community approach. It compounds slowly then all at once.

  4. 1

Thank you for this, this idea may help me with my projects!

    1. 1

      Glad it was useful. If you ever want to try it on your project, get started free at flidget.com or reach out at [email protected]

  5. 1

    One thing that helped me - adding a short exit survey right in the uninstall flow (for Chrome extensions it's the uninstall URL redirect). Even 1-question surveys with 3-4 radio options gave me more signal than any analytics dashboard. Most people won't write a paragraph, but they'll click a radio button.
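For a Chrome extension, that flow hangs off the real `chrome.runtime.setUninstallURL` API; everything else in this sketch (the survey page URL, the query parameters) is illustrative, not from the comment:

```javascript
// Hedged sketch: wiring a one-question exit survey into a Chrome
// extension's uninstall flow. chrome.runtime.setUninstallURL is the
// real API; the survey URL and params here are placeholders.

// Build the survey URL with context baked in as query params, so even
// a zero-click bounce off the survey page tells you which version left.
function buildUninstallUrl(base, { version, installedDays }) {
  const params = new URLSearchParams({
    v: version,
    days: String(installedDays),
  });
  return `${base}?${params.toString()}`;
}

// In the extension's background script (guarded so the module can
// also be loaded outside an extension context, e.g. in tests):
if (typeof chrome !== "undefined" && chrome.runtime?.setUninstallURL) {
  chrome.runtime.setUninstallURL(
    buildUninstallUrl("https://example.com/exit-survey", {
      version: "1.4.2",
      installedDays: 37,
    })
  );
}
```

The survey page itself can then be the 3-4 radio options the comment describes, prefilled with that context.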

    1. 1

      The uninstall redirect is an underrated move, most people skip it entirely. One click is the right friction level for that moment, low enough that people actually do it, specific enough that the answer means something.
      The limit though is the same as any survey. A radio button tells you the category, not the story. "Missing feature" as an option and someone naming the exact feature they needed are very different signals. Both have a place depending on what you're optimizing for.

  6. 1

    The timing insight is what got me. The number of times I've had a good product idea while my hands were busy doing something else, only to lose it because I couldn't type fast enough, is embarrassing. Same thing applies here. When you're in the moment of churn discovery, you've got the real signal, but most of us are typing it into a form after the fact, or worse, losing it entirely. The chat-at-cancel approach is basically capturing feedback right when it matters. I run a voice dictation tool, and that's exactly the problem we tried to solve, catching the thought while it's hot.

    1. 1

      The parallel is exact. Both are the same problem in different contexts, capture the signal while it's still live or lose it forever. A thought you had while driving and a churn reason you had while clicking cancel follow the same decay curve. Wait even a few minutes and the specificity is gone.
      The chat at cancel is basically voice dictation logic applied to retention. Get it while it's hot.

  7. 1

The exit survey is broken by design. People don't know why they're leaving; they just know they're gone. The real-time chat at cancellation is smart since you're catching them at the only moment they actually have context. That 30% involuntary churn stat is the buried lead here.

    1. 1

      Exactly. The survey assumes people have a formed opinion ready to report. Most of the time they just have a feeling, and feelings don't survive a text box. The cancel moment is the only place where the feeling and the context are still connected.

  8. 1

    The 30% failed payments line is the one I'd want to pull apart separately. Those aren't churned users, they're a billing problem dressed up as one. Most retention dashboards code them the same way, which means founders optimize the wrong levers entirely. Catching that as its own bucket might be worth more than the chat itself.
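That separate bucket can be sketched in a few lines; the event shape and reason codes below are illustrative, loosely modeled on Stripe-style cancellation metadata, not taken from the post:

```javascript
// Hedged sketch: split churn events into voluntary vs involuntary
// buckets before they reach the retention dashboard, so billing
// failures stop masquerading as product problems.
const INVOLUNTARY_REASONS = new Set(["payment_failed", "card_expired"]);

function bucketChurn(events) {
  const buckets = { voluntary: [], involuntary: [] };
  for (const e of events) {
    const key = INVOLUNTARY_REASONS.has(e.reason) ? "involuntary" : "voluntary";
    buckets[key].push(e);
  }
  return buckets;
}

const sample = [
  { user: "a", reason: "missing_feature" },
  { user: "b", reason: "card_expired" },
  { user: "c", reason: "payment_failed" },
];
const split = bucketChurn(sample);
// split.involuntary holds users who never decided to leave; they need
// a billing fix, not a win-back pitch.
```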

    1. 1

      This is the right way to split it. Involuntary churn and voluntary churn need completely different responses and mixing them in the same dashboard is how founders end up improving onboarding for people who never actually decided to leave. Flidget already flags these separately — it changes what you do next entirely.

  9. 1

    The 30% passive churn point hit me hard. I'm building for local business owners (auto shops, dental clinics, etc.) and the card-expiry problem is even worse in that segment these are not people who obsessively monitor their billing. They don't even realize they cancelled.

    The survey approach also fails completely with this audience. They won't fill out forms. What actually worked for my early users was a literal phone call, not a Calendly link, a call. The response rate vs. any written survey was not even comparable.

    The timing insight is real. I've noticed that the users most likely to give honest feedback are the ones who just hit a wall and are frustrated in the moment. Wait 24 hours and they've moved on mentally and the answer you get is sanitized. Catching them at the exact moment of friction is everything.

    1. 1

      The local business segment makes the passive churn problem even sharper. A SaaS founder at least gets a Stripe notification and investigates. An auto shop owner just assumes the software stopped working and moves on.
      The phone call point is something I keep hearing and it always comes back to the same thing — the format has to match the audience. For your users a form is friction, a call feels normal. The underlying principle is the same though, catch them before they've mentally filed it away.
      Timing really is everything. The honest answer has a very short window.

      1. 1

        yeah, session length is what I watch more than cancel reasons now. by the time they're filling out the survey they've already said goodbye - it's just courtesy at that point

        1. 1

          Session length as the leading indicator makes complete sense. It's behavioral, it's honest, and it moves before the decision closes. By the time someone fills out anything they're already gone mentally, you're just collecting their forwarding address at that point.
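The session-length signal above can be sketched as a simple trend check; the weekly averages and the 20% threshold are made-up illustrations, not numbers from this thread:

```javascript
// Hedged sketch: session length as a leading churn indicator. Flags a
// user whose average session keeps shrinking week over week, before
// they ever reach the cancel button.
function isQuietlyLeaving(weeklyAvgMinutes, minDrop = 0.2) {
  // Need at least three weeks to call it a trend rather than noise.
  if (weeklyAvgMinutes.length < 3) return false;
  const first = weeklyAvgMinutes[0];
  const last = weeklyAvgMinutes[weeklyAvgMinutes.length - 1];
  const strictlyFalling = weeklyAvgMinutes.every(
    (v, i) => i === 0 || v < weeklyAvgMinutes[i - 1]
  );
  // Strictly falling AND down at least minDrop (20%) from the start.
  return strictlyFalling && (first - last) / first >= minDrop;
}
```

A real version would smooth over noisy weeks, but even this crude check fires while the decision is still open.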

  10. 1

    exit surveys hit people who already decided to leave - wrong timing, wrong framing. the real signal lives in the 7 days before the cancel click.

    1. 1

      The cancel click is just the receipt, not the decision.
      By the time someone hits cancel they've already moved on. What they type is the easiest version of the truth, not the real one. The real signal lives in the week before, in the feature they quietly stopped using or the session that got shorter every day.
      That's exactly what we're building toward. Connecting the behavioral shift and the exit reason into one timeline so the decision becomes visible before it closes.

  11. 1

    The line about manually messaging a cancelled user being worth more than three months of survey data hit hard. On my own small iOS side project (a Captio replacement) I shipped a beautifully worded exit survey for weeks — got "too expensive" five times and zero usable signal. The moment I switched to a one-line DM with first-name personalization, three of four replies named the exact screen where I'd lost them. The variable, I think, is conversational reciprocity, not the channel. People owe a survey nothing; they owe a human a sentence. Curious — when you trigger the chat at cancel, what reply rate are you seeing vs the old survey?

    1. 1

      People owe a survey nothing, they owe a human a sentence. That's the cleanest way I've heard it put.
      Reply rate on the chat sits meaningfully higher than the old survey, mostly because of timing. The frustration is still live when someone clicks cancel, which is the same reason your one-line DM worked. You caught them before they'd mentally filed it away. The survey asked them to reconstruct something they'd already moved on from.
      The screen-level specificity you got from those DMs is exactly the signal that matters. "Too expensive" five times tells you nothing. One person naming the exact screen tells you everything.

  12. 1

    The 30% failed payment thing is nuts. I run a DTC supplement store and I've been so focused on getting new customers that I never really thought about how many people "leave" just because their card expired. Completely different problem, completely different fix. This reframed how I think about retention.

    1. 1

      The new customer focus is the default mode for most founders and it makes sense early on, but failed payments are recoverable revenue that's already yours. Someone who churned because their card expired never actually decided to leave, which means the conversation you need to have with them is completely different from someone who consciously cancelled.
      Glad the reframe landed. Worth auditing your payment failure rate if you haven't already, the number is usually surprising.

  13. 1

    This new world of AI continues to AMAZE me and leave me speechless. What will be our biggest challenges with these tools at our disposal? What was your biggest challenge with this tool that is already active? Is the tool solely for your use, or can anyone access it, and under what conditions is that possible?

    1. 1

      Anyone can use it, just one script tag on your site and it's live in under two minutes. Free to start at flidget.com.
      The biggest challenge building it was timing. Getting the chat to appear at exactly the right moment without feeling intrusive took a lot of iteration. Too early and it feels pushy. Too late and the reason is already gone.

  14. 1

    The cancel-moment is genius for subscription products, but what about free-tier apps where churn is just someone quietly no longer opening the app?

    No cancel button. No payment moment. Just silence which is exactly what you described at the start.

    The manual DM is probably the only equivalent, but curious if you've thought about how to build a trigger around that.

  15. 1

    That 30% involuntary churn stat is genuinely shocking — and the fact that you only found it by manually messaging is the real story here.

    The "silence after" problem is something I've been thinking about a lot too. Exit surveys fail because they're asynchronous and low-stakes. But a live conversation right at the cancel moment? That's when people are actually feeling the pain. Brilliant timing.

    What I found interesting is how this mirrors a problem on the pre-launch side too — developers asking for feedback and getting "looks great!" from friends who don't want to hurt feelings. Same dynamic: the structured, high-friction format (survey, form) fails. The human conversation works.

    Thanks for sharing the involuntary churn breakdown especially — that's the kind of specific, counterintuitive data that actually changes how you think about retention.

    1. 1

      The pre-launch parallel is spot on. Structured format gives people an easy way to be polite. A real conversation removes that escape. The reason the cancel moment works is the same reason a direct message works, the person has already made a decision so there is nothing left to protect. That honesty is what makes it actually useful.

  16. 1

    I keep seeing the same pattern among many entrepreneurs. Most - the majority that I have talked to - do not want to talk to customers. They just want to launch a product, then go take care of their flowers while the product does the selling and their email does the rest.
    So your case here just proves the fact, again, that people want to be treated as people, not just numbers. They do not want a faceless, remote, distant business.
    So well done for taking the courage - if we can call it that, excuse my pun - to send a personal message to your customer.

    I would like to suggest something. If you can, ask customers during signup to provide their "best phone number for immediate customer support". Test it as optional vs. mandatory. You will see that more than 60% will add it, because they will WANT to be called when they add their phone number.
    And, what we have done with my former agency clients is - that we always set up a call center - even if it was with just one person - to pick up the phone and call the customer to ask what happened.

    1. This beats email and chat by 85%
    2. Your brand solidifies, and earns trust in the customer's mind

    So in my own products, now that I am building them, I am adding "Call us for immediate customer support" OR "Add your phone number so we can call you if you experience any problems".

    Well done in your effort! Very few people do this and they miss out.

    1. 1

      The phone number idea is interesting and honestly underused in SaaS. Most founders default to email because it scales, but you are right that a real call at the right moment is a different conversation entirely. The signup friction argument against collecting phone numbers is probably overblown — if someone genuinely wants support they will add it. Worth testing.

  17. 1

    This hits hard—same thing happened to me, surveys gave nothing but one real convo showed the actual problem. Timing matters more than the question.

    1. 1

      Exactly. The conversation works because the reason is still alive in their head. A survey two days later is asking them to reconstruct something they've already moved on from.

  18. 1

    That's really good. One more way to fix this is to add Google Analytics (or any analytics) to every page, so you can see on which page most conversions break. By the way, asking for the reason before cancel is a good idea.

    1. 1

      Analytics definitely helps with where people drop off, but it tells you the what not the why. You can see someone spent 40 seconds on the pricing page and left, but you still don't know what stopped them. That gap is exactly what the cancel moment tries to close.

  19. 1

    This is gold! 🚀 The insight about 30% of churn being due to failed payments is a huge eye-opener. I'm currently building JewelViz, and I've been so focused on the tech that I almost overlooked the human side of silent churn.
    Talking directly to users to find that one missing feature is definitely the move. Thanks for sharing this; definitely going to implement a real-time feedback loop instead of just relying on generic surveys. Keep building!

    1. 1

      Good luck with JewelViz. The tech focus trap is real, easy to spend months on the product and forget that the person using it has feelings about it that they'll never type into a form. The real-time feedback loop will change how you see your users completely.

  20. 1

    The silence after churn is genuinely the worst part, "too expensive" and "not what I needed" tell you nothing actionable. You can't fix vague.

    The manual message approach is something I've been doing with my first few users on Trakly (trakly.pro) and you're right that the response quality is incomparable to any survey. People will tell a human things they'd never type into a form.

    The 30% involuntary churn stat is what got me though. Nearly a third of people who "left" didn't actually decide to leave, their card just failed at the wrong moment. That's recoverable revenue that most founders never even realize they're losing. I built a past_due grace period and "fix billing" banner into my SaaS specifically because of this but I hadn't thought about catching it at the cancellation moment itself.

    The chat-at-cancel idea is smart precisely because of timing, exit intent is highest right at that moment. A survey 24 hours later is asking someone to remember why they were frustrated yesterday. A chat right now catches the raw feeling.

    How are you handling the users who don't engage with the chat at all? Curious what your response rate looks like compared to traditional exit emails.

    1. 1

      Response rate is higher than exit emails simply because timing is different — email asks them to remember, the chat catches the raw feeling. Not everyone responds but even 60 percent gives more signal than any survey.
      On involuntary churn, you're right, the cancel moment is actually the perfect place to catch it. Someone whose card failed needs a completely different response than someone actively choosing to leave.

  21. 1

    The "no email back, no reply to my survey, just gone" line is the part that stings the most because it feels personal when it's probably not. Most users who churn aren't angry — they're indifferent. And indifference is harder to learn from than complaints.

    One thing I'd push back on: I think the AI conversational approach works great at scale, but at the pre-100-user stage, a personal email from the founder converts better than any automated system. The fact that the user "replied instantly" when you messaged directly proves it — people respond to humans, not workflows. Save the automation for when you can't keep up manually.

    1. 1

      Fair pushback and mostly agree. Under 100 users the personal email wins because people respond to a human with skin in the game. The automation isn't trying to replace that — it's capturing the reason at the exact moment someone clicks cancel, which the follow up email always misses regardless of who sends it. Both can exist. Manual outreach for win-backs, the chat for real-time signal you'd otherwise never get.

  22. 1

    Manual outreach > surveys is so true. The "one personal message reply was worth 3 months of survey data" insight matches what I see with my own indie app (a small Captio-style memo tool) — every time I treat a churned user like a person and not a data point, I get specific, fixable feedback. Generic exit surveys feel like extra work; a 1-on-1 message feels like being heard.

    Also the 30% involuntary-churn stat is wild and underrated. For me, Stripe Smart Retries plus a pre-dunning email ~7 days before card expiry recovered close to 40% of those.

    Out of curiosity — does the in-cancel chat ever feel intrusive, or are users actually willing to vent in that moment?
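The pre-dunning window mentioned above comes down to date math over card expiry fields. `exp_month` and `exp_year` mirror Stripe's card object fields; the customer list and the email step are illustrative assumptions:

```javascript
// Hedged sketch of a pre-dunning window: find customers whose card
// expires within `windowDays`, so they get a heads-up email before
// the charge ever fails.
function cardExpiryDate(expMonth, expYear) {
  // A card is valid through the end of its expiry month, so it stops
  // working at the start of the following month. expMonth is 1-based
  // (as in Stripe), which lands exactly on the next month's index in
  // Date.UTC's 0-based months.
  return new Date(Date.UTC(expYear, expMonth, 1));
}

function dueForPreDunning(customers, today, windowDays = 7) {
  const windowMs = windowDays * 24 * 60 * 60 * 1000;
  return customers.filter((c) => {
    const delta = cardExpiryDate(c.exp_month, c.exp_year).getTime() - today.getTime();
    return delta > 0 && delta <= windowMs;
  });
}
```

Run daily, the returned customers are the ones who should get the "your card is about to expire" email before Smart Retries ever has to kick in.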

  23. 1

    The shift from generic surveys to manual one-on-one messages is the part I keep underlining. On my small indie iOS side project (a lightweight Captio-style memo app), my first churned users gave me canned "too expensive" answers in a form, but a single direct DM uncovered that they couldn't find the export button on iPad — a 20-minute fix that stopped that bleed. The 30% failed-payment finding is sobering too; that's invisible churn no survey ever surfaces. Did you end up doing anything different for that cohort, like a personalized "your card expired" flow versus the standard dunning emails? Curious whether reaching out as a human there moved the needle as much as it did for the voluntary cancellers.

  24. 1

    This hits hard. The “silence after churn” is honestly the worst part.

    I’m currently building a job tracker SaaS, and I’m already worried about this exact problem — people just disappearing without context.

    The idea of capturing feedback at the exact moment of cancellation makes a lot of sense. Timing > method.

    Curious — did you see any drop in completion rate when showing the chat vs a simple cancel button? Or do most users actually engage when it’s immediate?

    1. 1

      Completion rate is actually higher than you'd expect because the timing does the heavy lifting. People who just clicked cancel have a reason fresh in their head — most of them want to say it, they just never had the right moment. The ones who skip were going to disappear anyway. Early stage worry about this is normal but the signal you get from even a handful of responses is worth it.

  25. 1

    Losing users without knowing 'why' is the worst kind of churn. I’ve found that automated exit surveys are often ignored, but a personal reach-out (even if it's just a manual email) usually gets the real truth. As a founder building a self-hosted tool, I’ve realized that sometimes users leave not because the product is bad, but because they hit a technical wall they didn't want to admit. Great lesson on the importance of 'listening' between the lines.

    1. 1

      The technical wall point is underrated. With self-hosted especially, users hit a setup issue, don't want to admit they're stuck, and just quietly disappear. "Too expensive" is easier to say than "I couldn't figure it out." That's exactly why the cancel moment matters — people are more honest when they've already decided to leave, nothing left to protect.

  26. 1

    That's a great idea! I feel like it's something we all would think is obvious, but in the moment, we may not think to include it, and it definitely beats getting surveys out after the fact. Perhaps even before letting the users cancel, having a feature request option where users can share what they would like to see from the product can also help prevent this churn.

    1. 1

      The failed payment stat hit hard. 30% of churn that isn't even a real decision - that's not a product problem, that's a timing problem. Most founders optimize for the wrong thing entirely. And the survey vs real conversation point is something I've felt too. People give survey answers that are socially acceptable, not actually true. "Too expensive" is almost never the real reason.

  27. 1

    Manual outreach beating survey data is one of those lessons that keeps showing up across every kind of product. The real signal lives in a 5 minute conversation, not a 200 person multiple choice survey. The 30 percent involuntary churn from card failures is wild too, that's basically free MRR sitting on the floor for most founders. Curious what tone you used for the in-cancel chat, because the line between "we want feedback" and "please don't go" is pretty thin and one of them feels grabby.

  28. 1

    That “silence after churn” is the frustrating part.

    What stood out is how different the answers are when you catch people in the moment versus asking later. By the time a survey shows up, the real reason is already diluted.

    The failed payments point is interesting as well. That’s not really churn, more like accidental loss, but it probably gets grouped the same way in most setups.

    Have you seen better response rates from the in-flow chat compared to email or surveys?

    1. 1

      Yes and the gap is bigger than I expected honestly.
      In-flow response rates are sitting around 60 to 70 percent. Emails after cancellation rarely broke 10 percent for us and even those replies were vague because the moment was already gone.
      Your point on failed payments is exactly right. It is not real churn but it looks identical in the numbers. That 30 percent finding scared me because those users did not want to leave, they just did. Catching that separately changed how we think about recovery flows entirely.
      Same question, two different moments, completely different answers. That is really the whole idea behind Flidget.

  29. 1

    The real shift here is timing.

    Most churn tools optimize for collecting reasons after the decision.
    But once someone has already left, you’re not collecting truth — you’re collecting a cleaner version of it.

    The closer the question is to the exact moment of friction, the more useful the answer gets.

    Also worth noting: when the problem is this tied to trust and retention, the product has to feel credible before they even hit cancel.

    Curious whether you’ve thought about how much of adoption here is the workflow itself vs how safe the product feels upfront (positioning / brand / trust)?

    1. 1

      That timing point is exactly right. The honest reason exists for maybe 60 seconds after someone clicks cancel. After that they've rationalized it, moved on, and whatever they tell you is a cleaned up version of the truth.
      On the trust question, the widget showing up at cancel is a brand moment whether you want it to be or not. If it feels pushy or automated people just close it. The ones that actually get responses feel like a genuine person asking, not a tool collecting data. So the credibility of the product upfront directly affects whether someone responds or bounces.
      What we found is that plain and low friction wins every time. No survey UI, no progress bars, just a simple question at the right moment. The less it looks like a feature, the more honest the answer gets.

      1. 1

        Exactly — and that’s the part most churn tooling misses.

        By the time someone hits cancel, the UX matters less than the intent they assign to it.

        People answer if it feels like:
        “someone is trying to understand what broke”

        They close it if it feels like:
        “the product is trying to extract one more thing before I leave”

        Same surface.
        Completely different response rate.

        That’s why this ends up being more positioning than widget design.

        The question isn’t just when you ask.
        It’s whether the product has earned enough trust by that moment for the question to feel credible.

        1. 1

          Exactly right and that framing of "earned enough trust by that moment" is something we keep coming back to internally.
          What we noticed is the widget itself almost does not matter. If the product has been quietly useful and stayed out of the way, people answer. If it has felt pushy or salesy at any point before, they close it without reading the question.
          So in a weird way the cancel moment is a trust audit for everything that came before it. The response rate tells you less about your offboarding and more about the relationship you built during the whole journey.
          That is why we keep the widget as plain as possible. No branding, no framing, just a question. Because by that point you either earned the answer or you did not.

