After 600+ founder conversations, 90% are making the same mistake.
And I was doing the same.
A lot of people think their situation is unique.
But in reality - most of the mistakes repeat.
Of the ~30–40 founders I spoke with in depth, around 8 or 9 out of 10 were struggling with the same thing:
Building… without really understanding what matters
(I saw this over and over again while talking to founders about contracts, risk, and real decisions they had to make)
One pattern stood out:
A lot of founders skip real validation.
They copy what seems to work for others,
jump into building based on someone else's experience,
or define their market and competitors without really understanding the context.
As a result - wrong assumptions, wrong priorities, wrong product direction.
Here’s what kept repeating:
Building features before validating the problem
Listening to random feedback instead of patterns
Waiting for a “perfect launch” instead of testing early
Overcomplicating instead of making one thing clear
It’s not a lack of effort.
It’s a lack of focus.
One example - when I first launched on Product Hunt, it was just a single landing page.
No full product. Just testing if anyone cared.
That, combined with ~6 years building startups, going through accelerators, and speaking with founders across different stages, made one thing clear:
You don’t need more features - you need clarity on what actually matters.
Still learning, but this shift changed how I approach building completely.
For those curious what I’m working on:
https://joyful-granita-8415bc.netlify.app/index.html
Founders and business owners - curious to hear your take.
What was the biggest mistake or shift in your journey?
This really resonates. The validation piece is something I learned the hard way building an AI-powered ad creative tool. Early on, I spent weeks perfecting image generation features that I thought were impressive — but when I actually talked to founders and small business owners, what they really wanted was speed and simplicity. They didn't care about having 50 style options; they wanted to paste a URL and get ready-to-use ads in under a minute. That single insight from real conversations completely reshaped our product direction. We went from a complex design tool to a focused workflow: URL in, ads out. The shift from "what can we build?" to "what do people actually need right now?" was the biggest unlock. Totally agree that clarity beats features every time.
That shift from “what can we build?” to “what do people need right now?” is huge.
Going from a complex tool to a simple workflow usually says everything.
Clarity really does beat features.
I looked at "What VIDI finds, in real contracts" and thought it was very interesting how it detected the high-risk clause. I don't know which businesses sign contracts every day, but I imagine what you're working on is very useful - especially for saving on legal fees. Looks like a tool I'd use regularly if I were one of those businesses. How has client reception been so far? I'm always curious about other professions, even ones I don't practice myself.
Appreciate that - glad it stood out.
The idea is to make it easy to understand what could actually matter before signing, without needing legal knowledge.
Appreciate everyone sharing their experiences here - a lot of patterns are repeating.
Curious to hear more:
what was the biggest mistake or shift that actually changed how you build?
Honestly - painful reps. The first few times I built before validating I wasted months. Eventually the sting of shipping something nobody wanted outweighed the discomfort of sitting with uncertainty.
What helped most practically: replacing 'I think users want X' with 'let me talk to 5 people this week.' The conversations make it feel purposeful rather than paralysing.
Also helped to remind myself: building without validation is not progress. It is just moving fast in an unknown direction.
That’s a great way to put it.
“Moving fast in an unknown direction” is exactly how it feels.
Replacing assumptions with even a few real conversations changes everything.
Good question.
For me, the biggest mistake was not understanding the market before starting. I had an electronics business, didn’t study demand in my city properly, and had to shut it down after 4 months.
The shift was focusing on customers first - now I try to understand demand before putting in time and money.
After that, I started a smaller business, and with that experience, things became much more stable.
Learned it the hard way - if I had seen a post like this earlier, it would’ve helped a lot.
Yeah, that’s a great example of it.
It’s interesting how often it’s not about execution, but starting with the wrong assumption about demand.
Curious - what would you do differently now before starting something new?
Now I’d definitely start by researching the market and demand - talking to potential customers and seeing what’s really missing before investing. Experience taught me to focus on the customer first, not just the product.
Yeah, that makes sense - starting with demand changes everything.
Yes, I fell into this same trap. Trying to perfect something when I don't even know what clients' problems are. Start simple, get paying clients, then build features to solve those problems.
That’s a great way to put it.
It’s easy to try to perfect something before even knowing what actually matters to clients.
My problem was trying to perfect something without knowing what "perfect" even was.
Do you think it's more of a validation issue or just founders trying to overbuild early?
I think it’s the same thing in practice.
Overbuilding usually comes from not having real validation - so you try to compensate by adding more.
When the problem is clear, you naturally build less.
The "copying what seems to work for others" trap is the sneaky one because it looks like market research from the outside. You're watching successful founders and mimicking their visible moves — but you're missing the underlying reasoning that made those choices right for their specific context.
The core problem is almost always the same: founders never developed their own "why customers actually buy this" thesis. Building features is a lot more comfortable than sitting with that question until you genuinely know the answer. The validation step feels slow and uncertain; shipping feels like progress. That's the trap.
That’s a great point.
It’s easy to copy what others are doing without understanding the context behind it.
And yeah - validation feels slow, but skipping it usually just delays the real problem.
Curious - what helped you get comfortable sitting with that uncertainty instead of jumping into building?
"Building features before validating the problem" — this one stings because I've done it.
We spent weeks building template variations for ad creatives before we even confirmed that the real pain point was speed, not variety. Turns out brands don't want 50 template options — they want to paste a URL and get ads ready in seconds.
The moment we stripped everything back to that single workflow, signups jumped. Simplicity > feature count every time.
The pattern I keep seeing: founders who talk to users weekly ship better products than founders who talk to users monthly, regardless of technical skill.
That’s a great example.
It’s interesting how often the real value comes down to one simple workflow, not a set of features.
The “paste URL → get ads” shift makes it very clear.
Curious - what made you realize speed mattered more than flexibility?
This hits home so hard. As an indie dev, I’m currently going through the "pains of shifting from building features to validating the business loop." As you said, we get obsessed with solving technical puzzles (even the nightmare of cross-border payment infra) while forgetting the core question: Will anyone actually pay for this?
My current shift: Make sure the billing works first, then polish the pixels. This is exactly why I’m building LicenseKit — helping devs nail that 1% of licensing & payment friction, so they can focus 99% of their energy on what truly matters.
Check what I'm building: https://tinystrack.com
Yeah, that makes sense.
Getting the payment part right early changes how you think about everything else.
Curious - what made you focus on billing first instead of the product itself?
What kills me is how many founders show up to pitch meetings with 6 months of build time and zero paying users. The ones who close rounds fast are the ones who can point to 10 customers who paid before the product was done. Did you notice a difference in outcomes between founders who validated with real money vs just user interviews?
Yeah, I didn’t really get it at first.
Building felt like progress, but there was no real signal.
Big difference once people actually take action instead of just giving feedback.
Just shared on Indie Hackers:
Building features before validating the problem.
Listening to random feedback instead of patterns.
Waiting for a “perfect launch” instead of testing early.
Overcomplicating instead of making one thing clear.
It’s not a lack of effort — it’s a lack of focus.
When I first launched on Product Hunt, it was just a single landing page. No full product. Just testing if anyone cared.
Years of building startups and talking to founders taught me: you don’t need more features — you need clarity on what actually matters.
What was the biggest mistake or shift in your journey?
Appreciate you sharing this - glad it resonated.
For me, one big shift actually came earlier - I once spoke at a forum with ~10,000 people, and out of 100 projects I was 4th to present.
That experience forced me to think less about features and more about clarity - what actually matters to people in the moment.
Curious what kind of shifts had the biggest impact for you?
That really resonates. In my experience working in digital marketing and with early-stage products, one of the biggest mistakes is exactly what you mentioned, building without validated demand.
A common stat that always stands out is that around 42% of startups fail due to no market need (CB Insights). And from what I’ve seen, it’s rarely because founders didn’t work hard, it’s because they optimized for features instead of problem clarity.
One shift that changed my approach was focusing on problem validation before execution:
Talking to at least 10–15 real users before defining a solution
Testing demand with landing pages or outreach campaigns
Looking for behavioral signals (sign-ups, replies, willingness to pay) instead of just verbal validation
Yeah, makes sense.
Especially the part about behavioral signals - people saying they’re interested and actually taking action are very different things.
That's a painfully high number, but it tracks. The biggest trap I see is building for a "problem" you found in an online forum, without ever talking to a potential user directly. A concrete tip: before writing code, try to manually solve the problem for 3 people. If they won't engage with a manual process, they definitely won't pay for software.
Yeah, that’s a great point.
Solving it manually first forces you to actually understand the problem, not just assume it.
And if people don’t engage even then, that’s a pretty clear signal.
Curious - have you ever had a case where manual work worked, but the product version didn’t?
This is a fair take - with the explosion of coding agents, barriers to building have diminished significantly, and people can now build a standard app within a week, probably faster if you're a dev who knows what you're doing.
I think another angle on this discussion is distribution. Are people building the wrong product with no market, or are they not distributing it to the right people?
The world is huge and more interconnected than ever before - so, I'm pretty sceptical of products with no buyers. There is probably a market for pretty much everything you can think of, but the challenge is whether you are talking to the right audience for it.
A lot of people thought that coding was the main barrier to entry. Turns out distribution can be just as big of a problem...
Distribution matters, but I think it’s often overrated at early stages.
If people don’t immediately feel the problem, no amount of distribution really fixes that - it just amplifies something weak.
In most cases it’s not “wrong audience”, it’s unclear or weak value.
Distribution starts working once the signal is already there.
Otherwise you just scale noise.
Goes either way - if people testing it early are the wrong audience, you get the wrong signal about the value of the product.
But if you think you are speaking to the right audience and still get no traction, that probably means either the idea needs tweaking or it's solving a non-problem.
Yeah, that’s fair.
But I think “wrong audience” is often used as an easy explanation when something doesn’t resonate.
Even with a small or imperfect audience, if the problem is sharp enough, you usually see some signal.
If there’s no traction at all, it’s rarely just distribution - it’s more likely the value isn’t clear or strong enough yet.
Definitely agree that wrong audience is sometimes used as an easy explanation - but if you're trying to sell innovative video-editing software to a blogger, you're speaking to the wrong person, and the feedback might discourage you.
But if you are speaking to a vlogger and you get some rough feedback, then it's probably best to go back to the drawing board.
Not to say that I disagree with your original thesis - we're on the same page. I'm just shedding light on the same problem from a different angle.
Yeah, that’s a fair distinction.
If you’re clearly talking to the wrong segment, the signal can be misleading.
I guess the tricky part is that a lot of founders assume it’s the wrong audience too early, instead of questioning the clarity of the problem or positioning.
In practice it’s probably both - audience and how well the problem is framed.
this hit close to home. i spent six weeks building 21 digital products, a free api, automated cold email system, the whole stack — before talking to a single customer.
the first real reply i got was a guy telling me my $49 seo fix was insane because "changing a meta description takes 20 seconds." he was right. i was solving a problem nobody valued at the price i set.
what actually started working was flipping the model — give the diagnosis free, charge for implementation. went from 0% reply rate to about 3-4% overnight.
the 90% stat feels generous honestly. building is comfortable. selling is terrifying. most of us (me included) default to the comfortable thing and call it progress.
Yeah, that’s real.
It’s easy to spend weeks building before even talking to someone - feels productive but usually isn’t.
The “diagnosis vs implementation” shift is interesting too.
Curious - what made people actually start paying after that change?
This hits close to home. I've been building SaaS products for a while now and the "building without validation" trap is incredibly easy to fall into — especially for technical founders. You get excited about the architecture, the stack, the features, and before you know it you've spent 3 months building something nobody asked for.
The shift that changed things for me: I started treating the landing page as the product. Before writing a single line of backend code, I'd put up a page describing the problem and the solution, drive some traffic to it, and watch what happened. Not just signups — but how people described the problem back to me in support emails and questions. That language gap between how I described the problem and how users described it was often the real insight.
The other underrated thing: willingness to pay is a completely different signal from interest. Someone saying "this is cool" costs them nothing. Someone putting in a card number (even for a free trial) is a much stronger signal. I've had posts go viral and convert to almost zero paying users, and I've had boring niche tools quietly hit $1k MRR from a tiny audience.
What's the product you're building with VIDI? Curious what problem space you're validating in.
Yeah, that’s real.
Treating the landing page as the product early on makes a big difference.
And agree - payment is a much stronger signal than interest.
Still early on my side, just focusing on understanding what actually matters in the decision before someone signs.
Curious - what made you realize something was worth paying for early on?
The "patterns vs. random feedback" point is underrated.
Used to assume that each and every feature request is a data point. Took me embarrassingly long to realize the only real data is in the questions people are asking. Not the questions they're asking in terms of "feature," but where they're getting stuck. Same three questions over and over again. Different people asking them. That's a pattern. "Add calendar sync" is not.
What's your go-to for recognizing patterns early? Especially before product when there's not a whole lot of data yet?
Yeah, that’s a good question.
Early on it’s less about volume and more about repetition - even 2–3 people describing the same problem in a similar way is already a strong signal.
Not the feature they ask for, but the situation they’re in.
That’s usually where the pattern starts.
Curious - what’s something you thought was a pattern early on, but turned out to be just noise?
One of the users requested an integration with some third-party software that would be very time-consuming to build. It might make sense later, but since there's just one request, I prefer to keep it in the backlog for now.
Yeah, that makes sense.
Single requests like that can be dangerous - especially when they’re expensive to build.
Feels like the right move to keep it in the backlog until you see the same need coming up again.
Curious - have you ever built something like that early and later realized it wasn’t worth it?
"Listening to random feedback instead of patterns" is the one that gets me. Early on we'd get one user asking for Unity support and immediately start scoping it out. Took us months to realize 80% of our actual paying users were on Godot and just wanted the existing thing to work better. The pattern was right there in the data the whole time, we just kept chasing the loudest voice instead.
Yeah, that happens a lot.
It’s crazy how one loud request can pull you in the wrong direction, even when the real pattern is right there.
Easy to miss in the moment.
The pattern I keep seeing: founders validate against their own assumptions, but ignore the validation already baked into the market.
Competitors have been learning for years about who pays, for what, and at what price. That signal is hiding in plain sight — pricing pages show who they're actually targeting, G2 reviews surface the exact jobs customers are hiring the product to do, job listings reveal strategic bets before press releases do, changelog announcements tell you which features got traction.
The irony: founders will spend weeks on customer interviews but skip a two-hour competitor intelligence pass that would give them ground truth from thousands of paying customers instead of a handful of conversations.
This doesn't replace talking to customers. It filters which customers to talk to and which questions matter. The founders who get this right treat competitor data as a first draft of market knowledge — then customer interviews as refinement. The ones who get stuck tend to skip step one entirely.
Yeah, I get what you’re saying.
But I think a lot of founders already look at competitors, pricing pages, reviews - and still miss the point.
The problem is not lack of information, it’s not understanding what actually matters for the decision.
I'm a solo dev and I've been struggling with localization costs for my micro-SaaS. Tools like Lokalise are just too expensive for small projects.
I'm thinking of building a very simple, pay-as-you-go API that translates JSON while keeping the UI context (character limits, etc.). It wouldn't have a dashboard, just a clean endpoint.
Would you actually use something like this if it cost around 1 buck per translation? Or am I overthinking this? Just looking for some honest feedback before I write more code.
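To make the "keeping the UI context" part concrete, here's a minimal sketch of how per-key character limits might be enforced around the translation step - all names are hypothetical, and uppercasing stands in for the actual translation backend:

```python
# Illustrative sketch of the proposed JSON-translation endpoint's core logic.
# Names and payload shape are hypothetical, not a real API.

def translate_payload(strings, limits, translate):
    """Translate a flat dict of UI strings, enforcing per-key character limits.

    strings:   {"save_button": "Save", ...}
    limits:    {"save_button": 10, ...}  # max chars the UI slot allows
    translate: callable(str) -> str      # stand-in for the real MT backend
    """
    out = {}
    for key, text in strings.items():
        translated = translate(text)
        limit = limits.get(key)
        # Flag overruns instead of silently truncating, so the caller decides.
        over = limit is not None and len(translated) > limit
        out[key] = {"text": translated, "over_limit": over, "limit": limit}
    return out

# Example with a fake "translator" that just uppercases:
result = translate_payload(
    {"save_button": "Save", "welcome": "Welcome back"},
    {"save_button": 4, "welcome": 50},
    str.upper,
)
print(result["save_button"]["text"])  # SAVE
```

The point of returning an `over_limit` flag rather than truncated text is that the client (which knows the UI) is better placed to shorten or rephrase than the API is.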
I’d probably try to validate it first.
Feels like something that could work, but depends a lot on who actually needs it and how often.
Have you talked to anyone who’s already dealing with this problem regularly?
This resonates. I've been building an AI tool for Canadian government questions and the biggest shift for me was realizing that the problem validation was already there. People were already searching for these answers and getting lost in hundreds of government pages. I didn't need to invent demand, I just needed to make the existing pain go away. The mistake I almost made was overbuilding before putting it in front of real users.
Yeah, that’s a good point - sometimes the demand is already there, just buried under a bad experience.
Feels like the mistake is trying to invent something instead of fixing what already hurts.
Sometimes you get it right early - but that’s pretty rare.
Living this right now. Built a fully functional AI receptionist (multi-tenant, Stripe, Twilio, onboarding form, dashboard) before getting a single paying customer. Zero validation.
The product works. But I skipped the hard part — proving anyone actually wants to pay £39/mo for it. Now I'm 30+ days in, £0 revenue, learning that 'build it and they will come' is fiction.
Your line about 'clarity on what actually matters' hits hard. I thought the tech mattered. Turns out the conversation with the customer matters way more.
Yeah, that’s real.
It’s crazy how easy it is to build something that works but nobody actually needs.
That shift to talking to users first is harder than it sounds.
Not 90% - more like 99%?
Right, it's probably closer to 99% 😄
Get up to $200K in GCP credits (24 months)
Eligible AI businesses can access up to $200K in GCP credits (24 months)
*Note: only for AI teams focused on building profitable, scalable business models from day 1
https://www.linkedin.com/posts/sai-rithvik-2176302b1_eligible-ai-companies-can-access-up-to200k-activity-7442865181254209536-EiDB
Oh nice, didn’t know about this - will check it out, thanks.
I’ve been thinking about this a lot recently.
I started building something from a problem I was personally experiencing (in my case, around worldbuilding tools).
So in a way, it didn’t feel like “guessing” — it felt real.
But reading this made me realize something:
That’s still internal validation.
I’ve had positive feedback from a few people who saw what I was working on, but I’m not sure yet how strong the demand actually is.
Right now I’m trying to shift from:
“this makes sense to me”
to
“how many people actually need this, and how urgently?”
Curious — how do you personally distinguish between real validation and just “positive feedback”?
Yeah, that’s a good realization.
I think the difference shows up in what people do, not what they say.
A lot of people will say “this is cool” - but real validation is when they actually try to use it, come back, or bring their own problem.
Have you had anyone actually try to use it in a real situation yet?
which database does the project use?
Still experimenting with different setups - keeping things flexible for now.
This applies to business in general, not just startups.
I had an electronics business before - didn’t really study my local market or demand properly.
Opened, but there just weren’t enough customers, and I had to shut it down after 4 months.
Biggest lesson for me was understanding the market first, before building anything.
Yeah, makes sense - understanding demand upfront probably saves a lot of time and pain.
How would you test that today before launching something?
Today, I’d start by talking to potential customers directly - surveys, interviews, even small test offers - to see if there’s real interest. I’d also check competitors and see what’s missing in the market before investing time or money.
Agreed - talking to real customers early is key.