231 Comments

“After 600+ founder conversations, 90% are building the wrong thing”

After 600+ founder conversations, 90% are making the same mistake.

And I was doing the same.

A lot of people think their situation is unique.

But in reality - most of the mistakes repeat.

Out of ~30–40 founders I spoke to more deeply, around 8–9 out of 10 were struggling with the same thing:

Building… without really understanding what matters
(I saw this over and over again while talking to founders about contracts, risk, and real decisions they had to make)

One pattern stood out:

A lot of founders skip real validation.

They copy what seems to work for others,
jump into building based on чужой опыт (someone else's experience),
or define their market and competitors without really understanding the context.

As a result - wrong assumptions, wrong priorities, wrong product direction.

Here’s what kept repeating:

Building features before validating the problem
Listening to random feedback instead of patterns
Waiting for a “perfect launch” instead of testing early
Overcomplicating instead of making one thing clear

It’s not a lack of effort.

It’s a lack of focus.

One example - when I first launched on Product Hunt, it was just a single landing page.

No full product. Just testing if anyone cared.

That, combined with ~6 years building startups, going through accelerators, and speaking with founders across different stages, made one thing clear:

You don’t need more features - you need clarity on what actually matters.

Still learning, but this shift changed how I approach building completely.

Curious - what was the biggest mistake or shift in your journey?

For those curious what I’m working on:
https://joyful-granita-8415bc.netlify.app/index.html

Founders and business owners - curious to hear your take.

on March 27, 2026
  1. 1

    This resonates. I'm running six apps right now and the ones getting traction are the ones where I built the simplest version of something I personally needed. The ones where I tried to anticipate what users might want before shipping? Crickets. Solving your own problem first is the fastest validation loop there is.

  2. 1

It's not about building the wrong stuff, it's about distribution! Many founders think about the product, but the most important thing is to plan how you'll distribute it first, rather than building a product and hoping clients come to you.

    1. 1

      That’s a good point.

      Distribution matters a lot - but if the problem isn’t real, even good distribution won’t help.

  3. 1

Really insightful post! I’m a Flutter developer and I help founders turn validated ideas into functional apps quickly, without overcomplicating features. If anyone wants help building or testing an MVP efficiently, feel free to check my portfolio or DM me.

    1. 1

      Appreciate it - makes sense. Keeping things simple is underrated.

  4. 1

    This hits home. I spent 3 months building features for my macOS app before I had a single user. Rewrote the landing page 4 times. Added integrations nobody asked for. Then I stripped it down to literally one thing — show a number in the menu bar — and that's when people actually started buying. The 'overcomplicating instead of making one thing clear' point is the one I wish someone had tattooed on my forehead earlier. Every feature you add before product-market fit is just noise that makes it harder to figure out what's actually working.

    1. 1

      That’s a great example.
      One clear thing works better than many features.

  5. 1

    this matches what i've seen on the outreach side too. i've pitched 540+ agencies across 43 countries and the ones who reply are never the ones with the biggest teams or the fanciest websites. they're the ones actively looking for a specific solution to a specific problem they have right now. the founders building 'general purpose' tools get ignored. the ones who can say 'i found 47 broken links on your site and can fix them for $297/mo' get replies. specificity is the whole game.

    1. 1

      That makes a lot of sense.

      The more specific the problem and outcome, the easier it is for people to respond.

      General tools are easy to ignore, but something tied to a clear issue gets attention.

  6. 1

    The "overcomplicating instead of making one thing clear" point is the one that hits hardest for me. I spent weeks building a dashboard with charts and historical trends for my app when the thing users actually wanted was one number in their menu bar. Literally stripped out 70% of the features and signups went up. The instinct to add more is so strong but the products that win are usually the ones that do less, better. Your point about testing with a landing page first is something I wish more builders internalized — I've seen so many people (including past me) spend 3 months building before finding out nobody cares.

    1. 1

      That’s a great breakdown - really appreciate this.

Stripping out 70% of the features and seeing signups go up is the clearest proof that doing less, better, actually wins.

  8. 1

    Most founders don’t skip validation. They just validate something that never converts.

    1. 1

      That’s a great way to put it.

      Validation without conversion doesn’t really mean much.

  9. 1

    The pattern you describe — building before understanding what matters — is exactly why I built Tivoli as part of TaiwildLab.
    Instead of guessing, Tivoli scans 19 communities every 6 hours looking for repeated pain signals. The filter is simple: the same problem appearing 3+ times with high frustration and commercial intent = real market.
    In 30 days: 523 clusters analyzed, 520 rejected, 3 GO signals, 7 functional products built from real pain.
    The biggest shift for me: I stopped asking "what should I build?" and started asking "where are people already frustrated and paying for bad solutions?". Those two questions lead to completely different places.
    The 10% who get it right aren't smarter. They just validate against behavior, not opinions.

    1. 1

      That’s a great approach.

      The “paying for bad solutions” signal is especially strong - it removes a lot of guessing.

  10. 4

    Appreciate everyone sharing their experiences here - a lot of patterns are repeating.

    Curious to hear more:
    what was the biggest mistake or shift that actually changed how you build?

    1. 1

      Definitely understanding both sides if you're building something of a marketplace. Businesses have no reason to use something you built unless they see a difference in the bottom line.

      1. 1

        That makes sense.

        If it doesn’t impact revenue, people won’t use it.

    2. 1

I think spending enough time on building the project plan is one of the most important steps. The rest can go 'nowhere' if the plan isn't built tight enough.

    3. 1

      Honestly - painful reps. The first few times I built before validating I wasted months. Eventually the sting of shipping something nobody wanted outweighed the discomfort of sitting with uncertainty.

      What helped most practically: replacing 'I think users want X' with 'let me talk to 5 people this week.' The conversations make it feel purposeful rather than paralysing.

      Also helped to remind myself: building without validation is not progress. It is just moving fast in an unknown direction.

      1. 1

        That’s a great way to put it.

        “Moving fast in an unknown direction” is exactly how it feels.

        Replacing assumptions with even a few real conversations changes everything

    4. 1

      Good question.

      For me, the biggest mistake was not understanding the market before starting. I had an electronics business, didn’t study demand in my city properly, and had to shut it down after 4 months.

      The shift was focusing on customers first - now I try to understand demand before putting in time and money.

      After that, I started a smaller business, and with that experience, things became much more stable.

      Learned it the hard way - if I had seen a post like this earlier, it would’ve helped a lot.


      2. 1

        Yeah, that’s a great example of it.

        It’s interesting how often it’s not about execution, but starting with the wrong assumption about demand.

        Curious - what would you do differently now before starting something new?

        1. 1

          Now I’d definitely start by researching the market and demand - talking to potential customers and seeing what’s really missing before investing. Experience taught me to focus on the customer first, not just the product.

          1. 1

            Yeah, that makes sense - starting with demand changes everything.

  11. 1

    Going through this right now. Built an AI company name generator (shipped in 2 days), put it in front of founders, and both said "I'd just use ChatGPT for free." Humbling. The product works, the names are good, but the willingness to pay isn't there. Biggest shift for me: I used to think building the product WAS the hard part. Turns out building is the easy part. Finding someone who cares enough to pay — that's the actual work. Now I'm 50+ DMs deep trying to get 5 real conversations.

    1. 3

      That’s very real.

      If people aren’t willing to pay, it usually means the pain isn’t strong enough.

      I’d probably look for a different angle where people are already spending money - and start there.

      1. 1

        You're right — the pain isn't strong enough. Naming is a one-time problem and ChatGPT handles it well enough for free. The lesson: don't compete with "good enough and free." Currently exploring where the same AI tech solves a problem people already spend money on. What angle would you look at?

        1. 1

          Yeah exactly - competing with “good enough + free” is very hard.

          I’d look for parts of the workflow where naming actually blocks something important - like branding, positioning, or anything tied to revenue.

          Or where the cost of a bad decision is higher, so people care more.

  12. 2

I don't know what the Russian word was lol, but agreed otherwise. By trade I'm a validation/infra engineer, so this hits hard. I've seen many apps with such great potential, but with the weakest possible "core". This then creates AI slop and people bring pitchforks out of anger lol. Idk, a minor rant

    1. 1

      That makes sense.

      Without a strong core problem, everything else becomes noise - even if the tech looks impressive.

  13. 1

    Jumping into speed based on what???

    1. 1

      Based on assumptions instead of real signal from users.

      1. 1

Thanks for the reply, but why does it show some Cyrillic alphabet?

        1. 1

          Interesting - where exactly do you see that?

          1. 1

            Where you say "... jump into building based on...". Here, i copy-paste: jump into building based on чужой опыт,

            1. 1

              Yeah, that was me - I sometimes mix languages.
              I speak a few, so if I don’t know something in English, I might use Russian or Kazakh.
              I’ll clean it up, thanks for pointing it out.

  14. 1

    My biggest shift: realizing that "building the wrong thing" often means building the right thing for the wrong scope.

    I kept trying to build comprehensive tools — full dashboards, analytics suites, multi-feature platforms. They'd take months and launch to crickets because nobody understood what they did in 5 seconds.

    The thing that actually worked was building something stupidly narrow. One problem, one screen, one price. A macOS menu bar app that does exactly one thing. No onboarding flow needed because there's nothing to explain.

    The pattern I see in your 90% stat: it's not that founders don't know their users' problems. They just can't resist expanding scope until the product becomes "a platform" instead of "a tool." The founders who ship fast and charge immediately seem to be the ones who can tolerate their product feeling embarrassingly simple.

    1. 1

      That’s a great way to put it.

      “Right thing, wrong scope” explains a lot of cases really well.

  15. 1

    "building the wrong thing" hits close to home. i spent 6 weeks building a social media automation system with 5 platforms, 5 themes, AI content generation — the works. total revenue from that: $0.

    then i pivoted to something dead simple: automated cold email outreach for marketing agencies. scan their website, find real SEO issues, email them a personalized pitch. $297/mo.

    same technical skills, completely different outcome. the automation system was cool tech solving no problem. the outreach service is boring tech solving an obvious problem agencies already spend money on.

    90% building the wrong thing sounds about right. i was definitely in that 90% until i stopped building what was interesting and started building what people would pay for.

    1. 1

      That’s a great example.

      Same skills, completely different outcome - the difference is solving a real, obvious problem people already pay for.

  16. 1

    the validation gap you're describing hits close to home. we spent weeks building an automated SEO scanning + cold email system, thinking the tech was the product. turns out the tech was just the delivery mechanism. the actual product was "we'll find your clients' prospects and email them something specific about their website." nobody cared about the scanner. they cared about getting leads without doing the work.

    the shift from "look what i built" to "here's what it does for you" changed every conversation. agencies went from "cool tool" to "how much per month?"

    funniest part: the version that's getting replies cost $0 to build. python, gmail, a cron job. the fancy dashboard we spent a week on? nobody's asked about it once.

    1. 1

      That’s a great insight.

      The shift from “what it is” to “what it does for you” changes everything.

  17. 1

    This resonates with my own experience building AI products for LATAM. I made the classic mistake of building features for months before properly validating with users. The shift that worked for me: talking to 20 actual potential users in one week, then immediately throwing away 60% of what I'd planned to build. It's painful but the signal is invaluable.

    1. 1

      That’s a great shift.

      It’s painful to throw things away, but that’s usually where the real clarity comes from.

  18. 1

    The distribution point someone raised here is underrated. I validated my macOS menu bar app by literally just watching how I used AI tools myself — I kept alt-tabbing to check my OpenAI dashboard to see how much I was spending. That was the pain. Didn't need 600 conversations, just one honest look at my own workflow.

    Built the simplest possible version (just shows token count + cost in the menu bar), put it on Gumroad for $5, and the signal was immediate. People who use multiple LLMs daily felt the same pain.

    The mistake I almost made: adding model comparison charts, historical analytics, team dashboards — all before a single person asked for them. Glad I shipped small first.

    1. 1

      That’s a great example.

      Solving a small, real problem in a simple way often gives much stronger signal than building something more complex.

  19. 1

    This resonates deeply. My biggest shift was realizing that building in isolation was holding me back more than any technical challenge. I spent months perfecting a lightweight memo app before talking to a single potential user. The turning point came when I started actively participating in founder communities and sharing what I was working on openly. The feedback I got in one week of genuine conversations was worth more than three months of solo guessing.

    One pattern I've noticed from connecting with other indie builders: the ones who build consistent relationships in communities — not just dropping links but actually helping others — tend to find product-market fit faster. They get signal from real conversations instead of assumptions.

    Your point about founders skipping validation is spot on. Curious — of those 600+ conversations, did you notice whether founders who were already embedded in communities (IH, Reddit, niche forums) validated better than those building in isolation?

    1. 1

      That’s a great point.

      From what I’ve seen, founders who are closer to real users - whether through communities or direct conversations - tend to get much clearer signal.

      Building in isolation usually leads to more assumptions than actual understanding.

  20. 1

    What I’ve seen is slightly different.

    A lot of founders don’t skip validation.

    They do it… but in a way that feels like progress without actually reducing risk.

    Talking to people who are “kind of” the target
    collecting feedback that isn’t tied to a real decision
    asking questions that lead to polite yes answers

    So it looks like validation, but nothing actually gets ruled out.

    In my case, the shift happened when I started treating validation as elimination, not confirmation.

    Instead of “does anyone want this?”
    it became “what would make this clearly not worth building?”

    That changed what I asked, who I talked to, and how quickly I moved.

    One thing that surprised me: most early mistakes weren’t about wrong ideas, they were about not forcing a decision early enough.

    1. 1

      That’s a great way to frame it.

      A lot of what looks like validation is really just avoiding a real decision.

      I think a lot also depends on the questions being asked and who you’re talking to.

      In many cases, it still comes down to people not fully understanding the problem they’re solving.

  21. 1

    This resonates. The "building without understanding what matters" pattern is real.

    I'd add one layer: most founders also don't understand where their time is actually going. I tracked my own 168 hours/week and found I was spending 3x more time on things I couldn't even name than on the thing I said was my priority.

    Wrong product direction is one problem. But wrong time allocation is what lets it go unchecked for months. You can build the wrong thing for a year and never notice - because you never measured where the hours were actually going.

    The clarity you're describing isn't just about what to build. It's about what you're actually spending your weeks on while you're "building."

    1. 1

      That’s a great point.

      It’s easy to think the issue is the product, but often it’s how time is actually being spent that drives everything.

      I’ve had the same experience - it’s surprisingly easy to lose track of where the time actually goes.

  22. 1

    The pattern I keep seeing (and have been guilty of): founders who talk to 20 people, hear 3 mention a pain point, and treat that as "validated." Then they build for 4 months and wonder why nobody converts.

    Real validation isn't "yeah that sounds cool" — it's "can I pay you for that right now?" or at minimum "can I get on a waitlist?"

    What was the most common type of "wrong thing" you saw in those 600 conversations? Was it solution-first thinking or going after problems that aren't painful enough?

    1. 1

      That’s very real.

      The most common pattern I saw was solution-first thinking - people jumping into building before really understanding how painful the problem is.

      A lot of “validation” was just polite feedback, not real intent or behavior.

  23. 1

    The 'building features before validating' pattern is real. We spent the first couple months adding things to our product (bank imports, invoice generation, multi-language support) before realizing the actual bottleneck was much simpler: expat freelancers in Germany get letters from the tax authority they literally cannot read. One feature, one pain point, and suddenly the product made sense to people. The irony is we had the right insight from day one because we were building for ourselves. We still managed to overbuild first. The discipline to stop is harder than the insight.

    1. 1

      That’s a great example.

      It’s interesting how one clear problem can make everything click, even if you had the insight early on.

  24. 1

    The validation point hits hard. I built patchBay (API directory, 3,100+ public APIs) and the biggest shift for me was realizing the
    idea didn't need to be new. ProgrammableWeb was the go-to API directory for a decade. It shut down in 2023. The problem was
    validated, the audience was proven, I just had to execute better.

    The mistake I almost made: spending weeks on "unique" features nobody asked for instead of doing the basics well. A clean catalog
    that's actually maintained beats a feature-packed one that's half broken. Shipped fast, validated with real outreach, and now
    focused on distribution over features.

    Biggest shift: stop building for yourself, start building for the person already googling the problem.

    1. 1

      That makes sense.

      A proven problem with better execution and distribution often matters more than trying to build something completely new.

  25. 1

    Great point. Honestly this matches what I see too.

    A lot of founders treat building like progress, when in reality learning is the real progress. Shipping features feels productive, but if the core problem isn’t validated, you’re just moving faster in the wrong direction.

    The biggest shift for me was realizing that a simple landing page, demo, or even a conversation can teach more than weeks of coding. Once you see real user behavior, priorities change fast.

    1. 1

      That’s a great way to put it.

      Shipping feels like progress, but without real user signal it’s easy to move in the wrong direction.

  26. 1

This hits home for me as well. I spent 6-7 months building an app and only got a handful of users. I realized it was more of a vitamin than a painkiller. Wish I had known this before and validated lol. I'll take that lesson. How do you guys validate before you build? I've tried subreddits but they're usually anti-marketing lol. Cold DMs and cold emails haven't helped either. I'm terrible at sales, but is this the best way? To actually speak to prospective customers?

    1. 1

      That’s a very real situation.

      From what I’ve seen, it’s less about “sales” and more about understanding what people are already doing today.

      Instead of pitching, I usually just ask:
      what are you doing now, what’s frustrating about it, and how often it comes up.

      If it’s a real pain, people get very specific very quickly.

      1. 1

That's great advice! It also makes it seem less salesy lol. I've made that mistake - diving straight into the product without asking them what they currently do. Thanks for the solid advice!

        1. 1

          Glad it helped.

          Once you start from what people are already doing, things usually become much clearer.

  27. 1

    This hits close to home. I'm building a health analytics platform and spent the first months adding feature after feature (7 modules, AI coaching, predictions...). Only recently realized the real problem isn't features — it's that people don't even know why they need it. Shifted focus to free tools (alcohol impact calculator, caffeine half-life calculator) that solve a micro-problem instantly. No signup needed. It's basically a "show, don't tell" approach to distribution. Still early but already seeing organic traffic from Google. Sometimes the best product work is building the thing BEFORE your product.

    1. 1

      That’s a great approach.

      Solving a small, immediate problem first makes it much easier for people to understand the value without needing explanation.

    2. 1

      Very good point! I've seen some founders do this. It's a great strategy. Give the users some free tools no sign up hassle then they see value and you can sprinkle in your product and they trust you enough to try it.

      1. 1

        Yeah, lowering the barrier like that makes a big difference - people can experience the value before committing to anything.

  28. 1

    man this post is calling me out personally.

    i've been building a job search tool for 6 months now. resume scoring, auto-apply, the whole thing. spent so long making it "complete" that i have 3 people on my waitlist. three. i could literally text all of them right now.

    should've just shipped the ATS scoring feature alone back in december and seen if anyone even wanted it. instead i kept going "ok but what about interview prep, what about application tracking, what about..." and here i am with a full product and basically no users to show for it.

    launching next week on PH and honestly kind of terrified that i'm about to find out the hard way whether i built something people actually want or just something i thought was cool.

    curious what you saw in those 600 conversations. were the founders who validated early mostly doing cold outreach to potential users? or more like landing page tests? trying to figure out the fastest way to get honest feedback in the next few days before i go live.

    1. 1

      That’s very real.

      From what I’ve seen, the fastest signal usually comes from direct conversations - not landing pages.

      If you already have people on a waitlist, I’d just talk to them directly and ask:
      what they’re doing today, what’s frustrating, and whether this would actually change their behavior.

      Landing pages can help, but they tend to give softer signals.

      Also - launching and getting real usage will tell you more in a week than months of guessing.

  29. 1

    This hits close to home. When I first launched my tool I was convinced the main use case was monitor testing — but looking at actual user behavior, most people were using it as a lighting source for video calls and product photography. The "wrong thing" trap isn't always about the product itself, sometimes it's about which problem you think you're solving vs. which one you're actually solving. Took me longer than I'd like to admit to stop pushing the testing angle and lean into the lighting/photography use case in my copy and SEO. The data was telling me the whole time.

    1. 1

      That’s a great example.

      The gap between intended use and actual use can be surprisingly big - and it’s easy to miss until you really pay attention to behavior.

  30. 1

    I ran into that exact problem.

    For me it wasn’t about building more, it was about fixing something I kept dealing with.

    Bots and automated traffic were hitting my sites constantly and everything out there either felt overbuilt or too expensive.

    So I built BlockABot to solve my own problem and see if it actually worked.

    So far it has, and I’m hoping it helps others dealing with the same thing too.

    That shift made a big difference for me.

    1. 1

      Interesting.

      In some cases I’ve seen basic filtering (IP/device patterns) handle a lot of bot traffic.

      Curious what made it not enough in your case?

      1. 1

        Yeah that helped in some cases, especially for obvious patterns.

        For me the issue was I manage a few sites across different servers, so keeping rules and blocks in sync was difficult.

        I wanted something lightweight that could sit in front with a single JS and a central backend, so it could learn and apply across everything instead of managing each site separately.

        Still early, but it’s been working better than the pieced together approach so far.

        1. 1

          Got it, that makes sense.

          I was mostly dealing with simpler cases, but once you get into things like rotating IPs, residential proxies, or headless browsers, it becomes a different problem entirely.

  31. 1

    This hits close. I spent 4+ years as a creator thinking the problem was me, wrong content, wrong timing, wrong strategy. Turns out I was building for a platform that was never designed to let me win.

    The shift for me was realizing I wasn't solving the right problem. I wasn't bad at creating. The discovery system was broken.

    Now I'm building EchoLive, a live streaming platform for the 99% of creators the algorithm ignores. The validation wasn't a survey. It was 4 years of living the problem firsthand.

    Biggest lesson: the market doesn't care how hard you worked. It cares whether you solved the right thing.

    1. 3

      I get that.

      I’ve had similar experiences over the past couple of years - thinking the problem was me, when it was actually the direction or the system.

      With time it starts to make more sense, but it doesn’t always come quickly.

      1. 1

        Exactly. And honestly, that clarity only comes from being in it long enough to see the pattern. Appreciate you sharing this, good reminder that the best validation is usually lived experience, not a survey.

        1. 1

          100%. That’s where the real signal tends to come from.

          1. 1

            Appreciate you sharing this. Following your journey from here.

  32. 1

    The 'building features before validating the problem' one hits hard. I see this constantly with people using AI coding tools now. Claude Code made building so fast that people skip straight to implementation without spending even an hour checking if anyone would pay for what they're making. Speed without direction just means you arrive at the wrong destination faster. The founders who win aren't the fastest builders. They're the ones who spent a boring afternoon searching Gumroad and Reddit to see if real people are already complaining about the problem they want to solve

    1. 1

      That’s a great way to put it.

      Speed makes it easier to build, but not to choose what’s worth building. The direction part is where most of the real work still is.

  33. 1

    This really resonates.

    I’m currently building an invoice SaaS and caught myself doing exactly this — focusing on features instead of validating what actually matters.

    Recently shifted to just testing the core idea: can someone create an invoice in 30 seconds and actually find it useful.

    Still early, but that clarity already changed how I’m building.

    Curious — how do you personally validate before building now?

    1. 1

      That’s a solid shift.

      For me it’s mostly about conversations - understanding what people are already doing today and how painful that is.

      If it’s not a clear priority, it usually doesn’t hold.

      1. 1

        That makes sense.
        I’m seeing the same — if there’s no urgency, people don’t switch.
        Right now I’m trying to understand how painful invoicing actually is for users.
        Any specific questions you ask to uncover that?

        1. 1

          I usually try to keep it simple.

          Things like:
          – how are you handling this today?
          – what’s the most annoying part of that process?
          – have you ever tried to fix it before?

          If they can answer clearly, it’s usually a real problem.

          1. 1

            That’s really helpful, appreciate it 🙌
            I like how simple those questions are — going to try this approach.

            1. 1

              I think it depends on the situation - you can adjust the questions.
              These questions probably cover around 40%.

              1. 1

                Yeah that makes sense.
                Good starting point, then adjust based on the situation. Appreciate it 🙌

  34. 1

    This hit hard. I'm building JewelViz — an AI jewelry try-on tool for Indian jewelry sellers. Honestly, I almost fell into the same trap. I was obsessing over the tech stack, the model quality, the UI — while I hadn't even confirmed if sellers actually cared about the problem.
    Took me a while to realize — Indian jewelry shop owners don't think "I need AI." They think "photo shoots cost me ₹15,000 and I still don't get good results." That reframe changed everything about how I pitch and build.
    Still early, still figuring it out. But this post is something I'll keep coming back to.

    1. 1

      That’s a great reframing.

      People rarely think in terms of tools - they think in terms of outcomes and trade-offs.

  35. 1

    The "listening to random feedback instead of patterns" bit hit home. I burned like 3 weeks chasing feature requests from people who were never gonna pay. Biggest shift for me was forcing myself to validate with strangers, not friends. Friends always say "yeah that sounds cool" lol. Do you find founders who validate well tend to do more structured interviews or is it more scrappy than that?

    1. 1

      That’s very real.

      From what I’ve seen, it’s usually less about structured interviews and more about asking the right questions. The best signal comes when you understand what they’re already doing and where this fits in their priorities.

  36. 1

    Spot on! Clarity over features is the real game-changer.

    1. 1

      Exactly - once the core thing is clear, everything else becomes much simpler.

  37. 1

    I just killed a 6-week AI lead response project because nobody actually wanted it. Built the whole system, tested with 26 businesses - 15% interest rate but zero willingness to pay. Trust barrier was insurmountable.

    This week I pivoted to AI accounting for Portuguese freelancers because I AM the customer and the pain is real. Validation changed from "sounds interesting" to "when can I use this?"

    The hardest part isn't finding the wrong thing - it's killing it when you've already invested weeks.

    1. 1

      That’s a tough but important shift.

      That gap between interest and willingness to pay is real - and hard to see until you’re deep into it.

  38. 1

    The pattern I'd add: validating with the wrong people. With Dograh we were getting feedback from people curious about voice AI but not actually building with it. The moment we only talked to people who had already tried to build a voice agent and hit a wall - one conversation told us more than twenty of the previous ones.

    Individual feedback is noise. The same complaint from five unconnected people is signal.

    1. 1

      That’s very real.

      The same feedback repeating across different people is usually where the real signal starts to show.

  39. 1

    Validation is the part that hurts to skip but costs you everything if you do. The pattern I keep seeing: founders conflate interest with intent. People saying "I would use that" is not the same as people paying, or even changing their behavior. The most useful question I have found is: what are you doing right now to solve this problem? If the answer is nothing, the problem is not painful enough to build for.

    1. 1

      That’s a great point.

      The gap between “I would use this” and actual behavior is bigger than it seems. Asking what people are doing today really exposes whether it’s a real problem or not.

  40. 1

    If you are still pre-revenue, I would make the partner search smaller and more testable. Instead of "marketing partner," look for one person who can help run a single 2-week experiment for one audience with one offer. For example: indie hackers who want to ship a micro-SaaS faster, or freelancers who need a reusable client template. That forces clarity on ICP, channel, and success metric before you add relationship complexity. The best partners usually show up after you already have one message that gets replies and one channel that produces some signal.

    1. 1

      That’s a good point.

      Keeping it small and testable makes a lot of sense - otherwise it’s easy to add complexity before there’s any real signal.

  41. 1

    The biggest shift for me was treating validation as a ranking problem, not a yes/no problem. Early on I kept asking "is this a good idea?" and people would usually give polite encouragement. Much better signal came from asking where this problem sits in their weekly priorities, what they currently do instead, and whether the workaround is painful enough to pay to remove. If the problem is real, people get specific very quickly. If they stay abstract, that usually tells you everything.

    1. 1

      That framing is really interesting.

      The “ranking vs yes/no” shift explains a lot - people often say something sounds good, but it’s not actually a priority.

    1. 1

      Not an exact statistic - more of a pattern I kept seeing in conversations.

      The majority were building before really understanding what mattered.

  42. 1

    Interesting point. From what I’ve seen, it’s often not that founders pick the wrong problem, but that they jump into building too fast.

    They don’t spend enough time checking if people actually care - and then get stuck with the first idea.

    1. 1

      That’s true - jumping into building too fast is a big part of it.

      What I kept noticing though is that even when people do feel something is off, they still move forward anyway.

  43. 1

    The validation point resonates strongly. The pull toward building is real — it feels productive, and it's easier to measure progress when you're shipping features rather than talking to users.

    The biggest shift for me was realizing that early feedback from friends and colleagues is almost always too polite to be useful. The signal that actually mattered came from strangers who had no reason to be kind — they either cared about the problem or they didn't.

    The "single landing page on Product Hunt" approach is underrated. Testing whether anyone cares before building anything is a much harder discipline than it sounds, especially when you already have a strong conviction about the solution.

    1. 1

      That’s a great point.

      The “too polite” feedback is real - it took me a while to realize that as well. Strangers tend to give much clearer signals.

  44. 1

    This is so real. I wasted months building features no one asked for — things only changed when I started testing small and watching what people actually use.

    1. 1

      Hi bhavin
      I provide eligible AI businesses access to GCP/AWS credits for the next 24 months to help reduce infrastructure costs.

      If this is relevant, happy to share more details.

      https://www.linkedin.com/posts/sai-rithvik-2176302b1_eligible-ai-companies-can-access-up-to200k-activity-7442865181254209536-EiDB

    2. 1

      That shift to watching what people actually do is huge.

      It usually reveals what really matters much faster than building more features.

  45. 1

    Woke up to single-digit views on my first thread this morning.
    I spent three weeks getting the color right before I talked to a single potential buyer. Factory in Dongguan, rejected prototypes, all of that — and zero market validation the whole time.
    The part about building in a vacuum is where I’m still stuck honestly.

    1. 1

      That’s very real.

      Building in a vacuum feels like progress, but it’s hard to see what actually matters.

      Even a couple of conversations can change that completely.

  46. 1

    This is my first SaaS.

    And it didn’t start with a brilliant idea…
    it started with frustration.

    I spent months building things nobody wanted.
    Not because they were bad —
    but because I never stopped to ask the right question:

    👉 Is this even worth building?

    No technical background.
    No team.
    No clear direction.

    Just trial, error… and wasted time.

    Learning how to build was hard.
    But realizing I was building without clarity
    was even harder.

    That’s where GoOrDrop came from.

    Not as another tool,
    but as a response to that mistake:

    👉 decide before you build.

    GoOrDrop doesn’t build products for you.
    It doesn’t automate everything.
    It doesn’t sell hype.

    It helps you do something more important:

    Understand if an idea deserves your time… or not.

    Because most failures don’t come from bad execution.
    They come from executing the wrong thing.

    This project cost me time, doubt, and frustration…
    but it also gave me clarity.

    And that changes everything.

    Still learning.
    But this time — with direction.

    — Ricardo Alvarez

    1. 1

      That shift from building to deciding first is a big one.

      Realizing you’re solving the wrong thing is usually the hardest part.

  47. 1

    The "overcomplicating instead of making one thing clear" point is the one I keep relearning. I built an iOS alarm app that speaks motivational quotes when you wake up. My first instinct was to add every feature — multiple voice options, quote categories, streak systems, milestone rewards. Most of that ended up mattering, but the thing that actually got people excited was dead simple: "what if your alarm said something worth hearing instead of a noise you hate?"

    I spent way too long perfecting features before I could even articulate the core feeling I was solving. Once I nailed that one sentence, everything else fell into place — the landing page, the App Store description, the pitch.

    Your point about launching with just a landing page is underrated. The founders I see struggling most are the ones who want everything polished before anyone sees it. Shipping something ugly that people actually want beats shipping something beautiful that nobody asked for.

  48. 1

    This hits hard.

    I’ve noticed the same pattern:

    Most founders don’t fail because they lack effort…
    They fail because they validate too late.

    They build first.
    Then they try to understand if it matters.

    But by then, time is already gone.

    One shift that changed everything for me:

    👉 Stop asking “how do I build this?”
    👉 Start asking “is this even worth building?”

    That single question saves months.

    Lately, I’ve been focusing more on filtering ideas before execution — understanding signals, real demand, and whether something deserves time at all.

    It’s a completely different way of thinking.

    Curious —
    what’s one decision you made earlier that would have saved you months?

    1. 1

      That question changes everything.

      Asking “is this worth building?” early saves a lot of wasted time.

  49. 1

    The "building features before validating the problem" point really hits. I just shipped an iOS app and the thing that saved me was building the share extension first — the one feature that let people actually test the core use case — before touching anything else. Collections, search, AI features, all came after I confirmed people would actually use the share button daily. Would have wasted weeks on a tagging system nobody needed if I'd followed my original roadmap. The other thing that changed my approach: watching people use the app on their phone instead of reading survey responses. Two minutes of screen recording taught me more than 20 email replies. People say they want features. Their behavior tells you what they actually need.

    1. 1

      That’s a great example.

      Watching real usage usually makes things much clearer than feedback alone.

  50. 1

    That matches what I've seen, "wrong thing" is usually a solution people can describe but will not change behavior for. The best filter has been asking for a pre-commitment before building, even a small payment, a warm intro, or time blocked on their calendar. One pitfall is over-weighting the loudest interviews and missing quieter but real demand.

    1. 1

      That’s a good point.

      Behavior is a much stronger signal than opinions - especially early on.

  51. 1

    The counterweight to this: if you can observe a real user struggling with a specific problem, that is worth more than 100 interviews with hypothetical customers. Abstract feedback gets filtered through what people think they want. Watching someone actually struggle shows what they need.

    1. 1

      That’s a great point.

      Watching what people actually do is often more revealing than what they say.

  54. 1

    The "building features before validating the problem" one hits hard. When we started working on our SaaS, our first instinct was to build every ad format, every platform integration, every customization option. We spent weeks on features nobody asked for.

    The shift came when we just put a basic version in front of 10 small business owners and watched them use it. Turns out they didn't care about 90% of what we'd planned. What they actually wanted was dead simple — paste a URL, get something they could post right now. No creative direction, no template browsing, no design choices. Just "make me an ad from this."

    That single insight killed half our roadmap and probably saved us months. Your point about testing with a single landing page resonates too. We did something similar — threw up a waitlist page describing what we wanted to build before writing a line of code. The signup rate told us more in 48 hours than weeks of planning would have.

    What was the most common "wrong thing" you saw founders building in those conversations? Was it usually over-engineering the solution, or was it more fundamental — like solving a problem nobody actually had?

    1. 1

      More often it was solving something that wasn’t urgent.

      When the problem is real, things tend to simplify quickly.

  55. 1

    This resonates so much. I spent weeks building features nobody asked for before I started just talking to potential users. The gap between what founders think people want and what they actually need is wild. Curious what the other 10% were doing right from the start?

    1. 1

      From what I’ve seen, they talk to users much earlier and don’t wait to feel “ready.”

      They focus more on understanding the problem than building the solution.

  56. 1

    We should start building with proven systems and frameworks, not just build. I think people who structure things get better results.

    1. 1

      Structure helps, but without real user signal it’s easy to just follow a process without learning much.

  57. 1

    Biggest shift for me: stopped building features and started asking "what happens today when X breaks for your users?"

    One conversation revealed more than two weeks of coding. Building a security tool for SaaS founders right now — the only thing that actually moved me forward was getting on calls and asking dumb questions.

    1. 1

      That shift to asking better questions is huge.

      One good conversation can replace weeks of building.

  58. 1

    Love this take. “Wrong thing” here isn’t tech, it’s priorities: shipping features without ever really sitting in the messy reality of what actually matters to the people signing the contracts. The way you contrast a single landing page and real conversations with 600+ founders vs. months of heads-down building is such a sharp reminder that validation is about clarity, not volume. The biggest shift for me was similar: stop asking “what can I build?” and start asking “where is someone already making a painful, expensive decision today that I don’t fully understand yet?”

    1. 1

      “Painful decision today” is a great way to frame it.

      That’s usually where the real signal is.

  59. 1

    — multiple model options, granular controls, advanced customization. Then we actually talked to the people we were building for (founders, small business owners) and realized they didn't want any of that. They just wanted to paste a URL and get ads they could post immediately. That single insight — simplify ruthlessly — changed everything about our product direction. The founders who struggle most seem to be the ones who treat building as the goal rather than learning. Shipping a landing page first (like you did with PH) is underrated because it forces you to articulate the value before you've built anything. Curious — of the founders you spoke with who did validate well, was there a common pattern in how they approached it? Or was it more that they just had the discipline to talk to users before writing code?

    1. 1

      Mostly not a technical issue.

      More often it was solving something that felt interesting, but not urgent for users.

      Once the problem is real, the product usually becomes much simpler.

  60. 1

    this hits home. i spent weeks building features nobody asked for before i actually talked to service business owners lol. once i did, turns out the #1 thing they wanted wasnt a fancy dashboard - just automated follow up emails that go out without them thinking about it. simple problem simple solution. hardest part was getting out of my own head and actually listening to what they needed vs what i thought was cool to build

    1. 1

      That’s very relatable.

      The hard part is not building - it’s letting go of what you think is valuable.

  61. 1

    This really resonates. The validation piece is something I learned the hard way building an AI-powered ad creative tool. Early on, I spent weeks perfecting image generation features that I thought were impressive — but when I actually talked to founders and small business owners, what they really wanted was speed and simplicity. They didn't care about having 50 style options; they wanted to paste a URL and get ready-to-use ads in under a minute. That single insight from real conversations completely reshaped our product direction. We went from a complex design tool to a focused workflow: URL in, ads out. The shift from "what can we build?" to "what do people actually need right now?" was the biggest unlock. Totally agree that clarity beats features every time.

    1. 1

      That shift from “what can we build?” to “what do people need right now?” is huge.

      Going from a complex tool to a simple workflow usually says everything.

      Clarity really does beat features.

  62. 1

    I looked at "What VIDI finds, in real contracts" and found it very interesting how it detected the high-risk clause. Though I don't know which businesses sign contracts every day, I imagine what you're working on is very useful, especially for saving on legal fees. Looks like a tool I'd use regularly if I were one of those businesses. How has client reception been so far? I'm always curious about other professions, even ones I don't practice myself.

    1. 1

      Appreciate that - glad it stood out.

      The idea is to make it easy to understand what could actually matter before signing, without needing legal knowledge.

  63. 1

    Yes, I fell into this same trap. Trying to perfect something when I don't even know what client's problems are. Start simple, get paying clients then build features to solve the client's problems.

    1. 1

      That’s a great way to put it.

      It’s easy to try to perfect something before even knowing what actually matters to clients.

      1. 1

        my problem was trying to perfect and not knowing what was perfect.

        Do you think it's more of a validation issue or just founders trying to overbuild early?

        1. 1

          I think it’s the same thing in practice.

          Overbuilding usually comes from not having real validation - so you try to compensate by adding more.

          When the problem is clear, you naturally build less.

  64. 1

    Honestly? Painful reps. The first few times I built before validating I wasted months. Eventually the sting of shipping something nobody wanted outweighed the discomfort of sitting with uncertainty.

    What helped most practically: replacing "I think users want X" with "let me talk to 5 people this week." The conversations make the uncertainty feel purposeful rather than paralysing - you are gathering signal, not just waiting.

    Also helped to remind myself that building without validation is not making progress. It is just moving fast in an unknown direction.

    1. 1

      That shift from “I think” to “let me talk to a few people this week” is huge.

      It turns uncertainty into something actionable instead of just guessing.

  65. 1

    The "copying what seems to work for others" trap is the sneaky one because it looks like market research from the outside. You're watching successful founders and mimicking their visible moves — but you're missing the underlying reasoning that made those choices right for their specific context.

    The core problem is almost always the same: founders never developed their own "why customers actually buy this" thesis. Building features is a lot more comfortable than sitting with that question until you genuinely know the answer. The validation step feels slow and uncertain; shipping feels like progress. That's the trap.

    1. 1

      That’s a great point.

      It’s easy to copy what others are doing without understanding the context behind it.

      And yeah - validation feels slow, but skipping it usually just delays the real problem.

      Curious - what helped you get comfortable sitting with that uncertainty instead of jumping into building?

  66. 1

    "Building features before validating the problem" — this one stings because I've done it.

    We spent weeks building template variations for ad creatives before we even confirmed that the real pain point was speed, not variety. Turns out brands don't want 50 template options — they want to paste a URL and get ads ready in seconds.

    The moment we stripped everything back to that single workflow, signups jumped. Simplicity > feature count every time.

    The pattern I keep seeing: founders who talk to users weekly ship better products than founders who talk to users monthly, regardless of technical skill.

    1. 1

      That’s a great example.

      It’s interesting how often the real value comes down to one simple workflow, not a set of features.

      The “paste URL → get ads” shift makes it very clear.

      Curious - what made you realize speed mattered more than flexibility?

  67. 1

    This hits home so hard. As an indie dev, I’m currently going through the "pains of shifting from building features to validating the business loop." As you said, we get obsessed with solving technical puzzles (even the nightmare of cross-border payment infra) while forgetting the core question: Will anyone actually pay for this?

    My current shift: Make sure the billing works first, then polish the pixels. This is exactly why I’m building LicenseKit — helping devs nail that 1% of licensing & payment friction, so they can focus 99% of their energy on what truly matters.

    Check what I'm building: https://tinystrack.com

    1. 1

      Yeah, that makes sense.

      Getting the payment part right early changes how you think about everything else.

      Curious - what made you focus on billing first instead of the product itself?

  68. 1

    What kills me is how many founders show up to pitch meetings with 6 months of build time and zero paying users. The ones who close rounds fast are the ones who can point to 10 customers who paid before the product was done. Did you notice a difference in outcomes between founders who validated with real money vs just user interviews?

    1. 1

      Yeah, I didn’t really get it at first.

      Building felt like progress, but there was no real signal.

      Big difference once people actually take action instead of just giving feedback.

  69. 1

    Just shared on Indie Hackers:

    Building features before validating the problem.
    Listening to random feedback instead of patterns.
    Waiting for a “perfect launch” instead of testing early.
    Overcomplicating instead of making one thing clear.

    It’s not a lack of effort — it’s a lack of focus.

    When I first launched on Product Hunt, it was just a single landing page. No full product. Just testing if anyone cared.

    Years of building startups and talking to founders taught me: you don’t need more features — you need clarity on what actually matters.

    What was the biggest mistake or shift in your journey?

    1. 1

      Appreciate you sharing this - glad it resonated.

      For me, one big shift actually came earlier - I once spoke at a forum with ~10,000 people, and out of 100 projects I was 4th to present.

      That experience forced me to think less about features and more about clarity - what actually matters to people in the moment.

      Curious what kind of shifts had the biggest impact for you?

  70. 1

    That really resonates. In my experience working in digital marketing and with early-stage products, one of the biggest mistakes is exactly what you mentioned, building without validated demand.

    A common stat that always stands out is that around 42% of startups fail due to no market need (CB Insights). And from what I’ve seen, it’s rarely because founders didn’t work hard, it’s because they optimized for features instead of problem clarity.

    One shift that changed my approach was focusing on problem validation before execution:

    Talking to at least 10–15 real users before defining a solution
    Testing demand with landing pages or outreach campaigns
    Looking for behavioral signals (sign-ups, replies, willingness to pay) instead of just verbal validation

    1. 1

      Hi
      I provide eligible AI businesses access to GCP/AWS credits for the next 24 months to help reduce infrastructure costs.

      If this is relevant, happy to share more details.

      https://www.linkedin.com/posts/sai-rithvik-2176302b1_eligible-ai-companies-can-access-up-to200k-activity-7442865181254209536-EiDB

    2. 1

      Yeah, makes sense.

      Especially the part about behavioral signals - people saying they’re interested and actually taking action are very different things.

  71. 1

    That's a painfully high number, but it tracks. The biggest trap I see is building for a "problem" you found in an online forum, without ever talking to a potential user directly. A concrete tip: before writing code, try to manually solve the problem for 3 people. If they won't engage with a manual process, they definitely won't pay for software.

    1. 1

      Yeah, that’s a great point.

      Solving it manually first forces you to actually understand the problem, not just assume it.

      And if people don’t engage even then, that’s a pretty clear signal.

      Curious - have you ever had a case where manual work worked, but the product version didn’t?

  72. 1

    This is a fair take - with the explosion of coding agents, barriers to building have diminished significantly and people can now build a standard app within a week, probably faster if you're a dev who knows what you're doing.

    I think another angle on this discussion is distribution. Are people building the wrong product with no market, or are they not distributing it to the right people?

    The world is huge and more interconnected than ever before - so, I'm pretty sceptical of products with no buyers. There is probably a market for pretty much everything you can think of, but the challenge is whether you are talking to the right audience for it.

    A lot of people thought that coding was the main barrier to entry. Turns out distribution can be just as big of a problem...

    1. 1

      Distribution matters, but I think it’s often overrated at early stages.

      If people don’t immediately feel the problem, no amount of distribution really fixes that - it just amplifies something weak.

      In most cases it’s not “wrong audience”, it’s unclear or weak value.

      Distribution starts working once the signal is already there.

      Otherwise you just scale noise.

      1. 1

        Goes either way - if people testing it early are the wrong audience, you get the wrong signal about the value of the product.

        But if you think you are speaking to the right audience and still get no traction, that probably means either the idea needs tweaking or it's solving a non-problem.

        1. 1

          Yeah, that’s fair.

          But I think “wrong audience” is often used as an easy explanation when something doesn’t resonate.

          Even with a small or imperfect audience, if the problem is sharp enough, you usually see some signal.

          If there’s no traction at all, it’s rarely just distribution - it’s more likely the value isn’t clear or strong enough yet.

          1. 1

            Definitely agree that wrong audience is sometimes used as an easy explanation - but if you're trying to sell innovative video-editing software to a blogger, you're speaking to the wrong person and the feedback might discourage you.

            But if you are speaking to a vlogger and you get some rough feedback, then it's probably best to go back to the drawing board.

            Not to say that I disagree with your original thesis completely, we are on the same page - I'm basically shedding a different light to the same problem from a different angle.

            1. 1

              Yeah, that’s a fair distinction.

              If you’re clearly talking to the wrong segment, the signal can be misleading.

              I guess the tricky part is that a lot of founders assume it’s the wrong audience too early, instead of questioning the clarity of the problem or positioning.

              In practice it’s probably both - audience and how well the problem is framed.

  73. 1

    this hit close to home. i spent six weeks building 21 digital products, a free api, automated cold email system, the whole stack — before talking to a single customer.

    the first real reply i got was a guy telling me my $49 seo fix was insane because "changing a meta description takes 20 seconds." he was right. i was solving a problem nobody valued at the price i set.

    what actually started working was flipping the model — give the diagnosis free, charge for implementation. went from 0% reply rate to about 3-4% overnight.

    the 90% stat feels generous honestly. building is comfortable. selling is terrifying. most of us (me included) default to the comfortable thing and call it progress.

    1. 1

      Yeah, that’s real.

      It’s easy to spend weeks building before even talking to someone - feels productive but usually isn’t.

      The “diagnosis vs implementation” shift is interesting too.

      Curious - what made people actually start paying after that change?

  74. 1

    This hits close to home. I've been building SaaS products for a while now and the "building without validation" trap is incredibly easy to fall into — especially for technical founders. You get excited about the architecture, the stack, the features, and before you know it you've spent 3 months building something nobody asked for.

    The shift that changed things for me: I started treating the landing page as the product. Before writing a single line of backend code, I'd put up a page describing the problem and the solution, drive some traffic to it, and watch what happened. Not just signups — but how people described the problem back to me in support emails and questions. That language gap between how I described the problem and how users described it was often the real insight.

    The other underrated thing: willingness to pay is a completely different signal from interest. Someone saying "this is cool" costs them nothing. Someone putting in a card number (even for a free trial) is a much stronger signal. I've had posts go viral and convert to almost zero paying users, and I've had boring niche tools quietly hit $1k MRR from a tiny audience.

    What's the product you're building with VIDI? Curious what problem space you're validating in.

    1. 1

      Yeah, that’s real.

      Treating the landing page as the product early on makes a big difference.

      And agree - payment is a much stronger signal than interest.

      Still early on my side, just focusing on understanding what actually matters in the decision before someone signs.

      Curious - what made you realize something was worth paying for early on?

  75. 1

    The "patterns vs. random feedback" point is underrated.
    I used to assume each and every feature request was a data point. Took me embarrassingly long to realize the only real data is in the questions people are asking - not the feature they name, but where they're getting stuck. Same three questions over and over again. Different people asking them. That's a pattern. "Add calendar sync" is not.
    What's your go-to for recognizing patterns early? Especially before product when there's not a whole lot of data yet?
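
    To make that concrete, here's a toy sketch (the function, the stop-word list, and the sample notes are all mine, purely illustrative): tally which words recur across separate feedback notes, counting each note at most once per word, so repetition across *different* people is what scores.

```python
from collections import Counter
import re

def recurring_themes(notes, min_count=2):
    """Tally words that appear across separate feedback notes.
    Each note counts a word at most once, so the score reflects
    how many different people mentioned it, not how often one did."""
    stop = {"the", "a", "to", "i", "it", "is", "in", "my",
            "how", "do", "can", "where"}
    seen = Counter()
    for note in notes:
        words = set(re.findall(r"[a-z']+", note.lower())) - stop
        seen.update(words)
    return [(word, n) for word, n in seen.most_common() if n >= min_count]

notes = [
    "How do I export my data to CSV?",
    "Can I export reports?",
    "Where is the export button?",
    "Please add calendar sync",
]
print(recurring_themes(notes))  # "export" recurs; "calendar sync" is a one-off
```

    In this sample, "export" surfaces as the pattern while the single calendar-sync request stays noise - exactly the distinction above.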

    1. 1

      Yeah, that’s a good question.

      Early on it’s less about volume and more about repetition - even 2–3 people describing the same problem in a similar way is already a strong signal.

      Not the feature they ask for, but the situation they’re in.

      That’s usually where the pattern starts.

      Curious - what’s something you thought was a pattern early on, but turned out to be just noise?

      1. 1

        One user requested an integration with some third-party software that would be very time-consuming to build. It might make sense later, but since it's just one request, I prefer to keep it in the backlog for now.

        1. 1

          Yeah, that makes sense.

          Single requests like that can be dangerous - especially when they’re expensive to build.

          Feels like the right move to keep it in the backlog until you see the same need coming up again.

          Curious - have you ever built something like that early and later realized it wasn’t worth it?

  76. 1

    "Listening to random feedback instead of patterns" is the one that gets me. Early on we'd get one user asking for Unity support and immediately start scoping it out. Took us months to realize 80% of our actual paying users were on Godot and just wanted the existing thing to work better. The pattern was right there in the data the whole time, we just kept chasing the loudest voice instead.

    1. 1

      Yeah, that happens a lot.

      It’s crazy how one loud request can pull you in the wrong direction, even when the real pattern is right there.

      Easy to miss in the moment.

  77. 1

    The pattern I keep seeing: founders validate against their own assumptions, but ignore the validation already baked into the market.

    Competitors have been learning for years about who pays, for what, and at what price. That signal is hiding in plain sight — pricing pages show who they're actually targeting, G2 reviews surface the exact jobs customers are hiring the product to do, job listings reveal strategic bets before press releases do, changelog announcements tell you which features got traction.

    The irony: founders will spend weeks on customer interviews but skip a two-hour competitor intelligence pass that would give them ground truth from thousands of paying customers instead of a handful of conversations.

    This doesn't replace talking to customers. It filters which customers to talk to and which questions matter. The founders who get this right treat competitor data as a first draft of market knowledge — then customer interviews as refinement. The ones who get stuck tend to skip step one entirely.

    1. 1

      Yeah, I get what you’re saying.

      But I think a lot of founders already look at competitors, pricing pages, reviews - and still miss the point.

      The problem is not lack of information, it’s not understanding what actually matters for the decision.

  78. 1

    I'm a solo dev and I've been struggling with localization costs for my micro-SaaS. Tools like Lokalise are just too expensive for small projects.

    I'm thinking of building a very simple, pay-as-you-go API that translates JSON while keeping the UI context (character limits, etc.). It wouldn't have a dashboard, just a clean endpoint.

    Would you actually use something like this if it cost around 1 buck per translation? Or am I overthinking this? Just looking for some honest feedback before I write more code.
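
    For what it's worth, the core translation step could be tiny. A rough sketch of what I mean by "translate JSON while keeping the UI context" - the function name, the `limits` idea, and the stub translator are all hypothetical, not a real MT or Lokalise API:

```python
def translate_json(payload, translate, limits=None):
    """Translate every string value in a (possibly nested) JSON dict.

    `translate` is any callable str -> str, e.g. a call out to an MT API.
    `limits` optionally maps keys to max character counts so translated
    UI strings still fit their buttons and labels.
    """
    limits = limits or {}
    out = {}
    for key, value in payload.items():
        if isinstance(value, dict):
            out[key] = translate_json(value, translate, limits)
        elif isinstance(value, str):
            text = translate(value)
            max_len = limits.get(key)
            if max_len is not None and len(text) > max_len:
                # Truncate instead of overflowing the UI element.
                text = text[: max_len - 1] + "…"
            out[key] = text
        else:
            out[key] = value  # numbers, bools, etc. pass through
    return out

# Stub translator standing in for a real machine-translation call.
fake_de = {"Save": "Speichern", "Cancel": "Abbrechen"}
print(translate_json(
    {"save": "Save", "cancel": "Cancel", "retries": 3},
    lambda s: fake_de.get(s, s),
    limits={"save": 6},
))
```

    The pay-as-you-go API would basically be this behind one authenticated endpoint.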

    1. 1

      I’d probably try to validate it first.

      Feels like something that could work, but depends a lot on who actually needs it and how often.

      Have you talked to anyone who’s already dealing with this problem regularly?

  79. 1

    This resonates. I've been building an AI tool for Canadian government questions and the biggest shift for me was realizing that the problem validation was already there. People were already searching for these answers and getting lost in hundreds of government pages. I didn't need to invent demand, I just needed to make the existing pain go away. The mistake I almost made was overbuilding before putting it in front of real users.

    1. 1

      Yeah, that’s a good point - sometimes the demand is already there, just buried under a bad experience.

      Feels like the mistake is trying to invent something instead of fixing what already hurts.

      Sometimes you get it right early - but that’s pretty rare.

  80. 1

    Living this right now. Built a fully functional AI receptionist (multi-tenant, Stripe, Twilio, onboarding form, dashboard) before getting a single paying customer. Zero validation.

    The product works. But I skipped the hard part — proving anyone actually wants to pay £39/mo for it. Now I'm 30+ days in, £0 revenue, learning that 'build it and they will come' is fiction.

    Your line about 'clarity on what actually matters' hits hard. I thought the tech mattered. Turns out the conversation with the customer matters way more.

    1. 1

      Yeah, that’s real.

      It’s crazy how easy it is to build something that works but nobody actually needs.

      That shift to talking to users first is harder than it sounds.

  81. 1

    Not 90% - more like 99%?

    1. 1

      True, it's probably closer to 99% 😄

  82. 1

    Get up to $200K in GCP credits (24 months)

    Eligible AI businesses can access up to $200K in GCP credits (24 months)
    *Note: only for AI teams focused on building profitable, scalable business models from day 1

    https://www.linkedin.com/posts/sai-rithvik-2176302b1_eligible-ai-companies-can-access-up-to200k-activity-7442865181254209536-EiDB

    1. 1

      Oh nice, didn’t know about this - will check it out, thanks.

  83. 1

    I’ve been thinking about this a lot recently.

    I started building something from a problem I was personally experiencing (in my case, around worldbuilding tools).

    So in a way, it didn’t feel like “guessing” — it felt real.

    But reading this made me realize something:

    That’s still internal validation.

    I’ve had positive feedback from a few people who saw what I was working on, but I’m not sure yet how strong the demand actually is.

    Right now I’m trying to shift from:
    “this makes sense to me”
    to
    “how many people actually need this, and how urgently?”

    Curious — how do you personally distinguish between real validation and just “positive feedback”?

    1. 1

      Yeah, that’s a good realization.

      I think the difference shows up in what people do, not what they say.

      A lot of people will say “this is cool” - but real validation is when they actually try to use it, come back, or bring their own problem.

      Have you had anyone actually try to use it in a real situation yet?

  84. 1

    which database does the project use?

    1. 1

      Still experimenting with different setups - keeping things flexible for now.

  85. 1

    This applies to business in general, not just startups.

    I had an electronics business before - didn’t really study my local market or demand properly.
    Opened, but there just weren’t enough customers, and I had to shut it down after 4 months.

    Biggest lesson for me was understanding the market first, before building anything.

    1. 1

      Agreed. Finding your market is one of the hardest parts of the whole process, tbh.

    2. 1

      Yeah, makes sense - understanding demand upfront probably saves a lot of time and pain.

      How would you test that today before launching something?

      1. 1

        Today, I’d start by talking to potential customers directly - surveys, interviews, even small test offers - to see if there’s real interest. I’d also check competitors and see what’s missing in the market before investing time or money.

        1. 1

          Agreed - talking to real customers early is key.

