A few weeks ago I posted here for the first time.
I had a half-finished tool, no users, and no idea if anyone needed it.
That post got 400+ comments.
I read every single one.
Here's something I didn't expect:
The comments didn't just give feedback.
They gave me the product.
One insight changed everything:
Founders don't think in terms of "contract analysis."
They ask one question:
"Is there anything in this contract that could cost me money later?"
That became the entire product.
So I rebuilt VIDI around that.
What's new in VIDI 2.0:
Today I launched VIDI 2.0 on Product Hunt:
→ https://www.producthunt.com/products/vidi-ai-contract-review?launch=vidi-2-0-ai-contract-review
Current numbers:
Still early - but it finally feels like a real product.
If you want to try it:
→ https://joyful-granita-8415bc.netlify.app
Curious:
Have you ever signed a contract that looked fine…
but later realized it had a clause that cost you money?
Nice, a few years back we launched on Product Hunt and got 650 signups in 2-3 days. I'm launching another product on PH today. Hope it works out.
That’s impressive - 650 signups in a few days is solid.
Curious what drove most of that - was it your existing audience or mainly Product Hunt itself?
Good luck with today’s launch 🚀
Honestly, I was feeling sick that day and felt guilty about not working, so I posted on PH just to make it a non-zero work day. That evening, my colleague called me and said either someone was spamming us, or we were blowing up with real users. Google Analytics confirmed it was from PH. Here is what I think happened.
Thank you - I'm already nervous about today's launch :)
That’s really interesting - especially the “market pull” part.
I feel like I’m starting to see small signs of that, but definitely not at that level yet. Still early, and a lot of it is coming from conversations rather than pure pull.
Trying to understand what actually makes people come back and use it again - feels like that’s the key.
Really appreciate you sharing this, super helpful.
Anytime and good luck!
Happy to answer any questions or hear your feedback 🙌
Curious how you usually review contracts today?
Before VIDI, I mostly skimmed contracts myself or relied on basic templates - definitely risky. Using VIDI 2.0 caught clauses I would have missed and gave me real peace of mind. Honestly, it’s become my go-to for any new contract now.
Love that - that’s exactly what I’m trying to build
Curious what else it flags for you as you go through more contracts
Stuff like termination clauses, hidden fees, and liability risks - the things you don’t notice until it’s too late.
That makes a lot of sense - those are exactly the areas where things get expensive fast.
Really appreciate you sharing that
Appreciate it
If you run into anything that feels off or missing as you keep using it, definitely let me know - that kind of feedback is gold.
The reframe from "contract analysis" to "will this cost me money later?" is the whole product in one insight. That's not just positioning — it changes what you measure, what features you prioritize, and who you're actually serving.
Building ShieldWays (security scanning for code), the same pattern came up. We started with "static analysis tool" and discovered users were really asking "does my codebase have vulnerabilities that will get me breached or fail a compliance audit?" Different framing, same underlying product, completely different conversion.
On the data security angle that came up in comments — this matters a lot for your specific use case. Contracts often contain commercially sensitive terms (pricing, liability caps, exclusivity). The trust question isn't just "will AI analyze this correctly" — it's "who can see this document?" Making your data handling visible in the product (not just a privacy page) is worth prioritizing early. That alone can unlock B2B customers who'd otherwise hesitate.
One thing worth exploring: the auto-renewal detection is probably your stickiest feature. That's the clause that bites quietly — users who get burned by one will come back to check every contract going forward. Surfacing that clearly in your marketing ("never miss an auto-renewal again") might convert better than the general risk framing.
That’s a great breakdown - really appreciate this.
The “who can see this document” angle is something I’ve been thinking about as well, and the auto-renewal point is spot on.
The Contract Health Score is a great way to productize complex AI outputs. Reducing a long legal analysis to a 0-100 score makes the value immediately graspable for founders who don't want to read the fine print. Curious if you found that users trust the score immediately or if they still dive straight into the 'missing clause' detection first?
Yeah, the score helps make it immediately understandable, but trust doesn’t really come from the number itself.
People usually look at the specific clauses or issues first - the score just gives quick context around it.
Still figuring out how that balance should work.
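For readers curious how a score like this might work under the hood, here's a minimal sketch of the idea discussed above - aggregating clause-level risk flags into one 0-100 number. Everything here (the category names, the weights, the `health_score` helper) is a hypothetical illustration, not VIDI's actual model:

```python
# Hypothetical sketch: weight each flagged clause category by severity
# and subtract the total penalty from a perfect score of 100.
# Categories and weights below are invented for illustration only.

SEVERITY_WEIGHTS = {
    "auto_renewal": 15,         # quiet renewals that lock you in
    "hidden_fees": 20,          # charges not obvious on first read
    "unlimited_liability": 25,  # uncapped exposure if things go wrong
    "missing_termination": 10,  # no clean exit path
}

def health_score(flags: list[str]) -> int:
    """Map flagged clause categories to a 0-100 contract health score."""
    penalty = sum(SEVERITY_WEIGHTS.get(flag, 5) for flag in flags)
    return max(0, 100 - penalty)

print(health_score([]))                               # clean contract: 100
print(health_score(["auto_renewal", "hidden_fees"]))  # two risks: 65
```

One design note: giving unrecognized flags a small default penalty keeps the score defined even as the detector learns new clause categories.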
Relaunching after that much feedback is a huge step. One thing I've learned from my own AI tool launches is that the most critical feedback often comes from users who almost loved it, but had one deal-breaking friction point. Did you find a common "almost" moment that became your top priority for this version?
Yeah, that’s been very true so far.
The biggest ‘almost’ has been after the first analysis - people see the value, but it’s not always clear what to do next or how it ties to an actual decision.
That’s been the main thing I’m focusing on now.
Congrats on the relaunch — the insight about reframing from "contract analysis" to "what could cost me money later" is exactly the kind of repositioning that changes everything.
I'm seeing the same pattern building Stratiic, which is on the proposal writing side. The original framing was "save time on proposals" — but the comment that actually resonated with consultants was "stop losing deals because your proposal looked like everyone else's."
Same product, completely different reason to care.
Did you find that the reframe also changed who your best users are?
We're finding the pain is sharpest for solo consultants, not firms — curious if you're seeing something similar.
Appreciate that - yeah, that shift in ‘why people care’ really changes how people look at it.
Still early, but I’m noticing context (who’s signing, how often, what’s at stake) plays a big role in how it’s perceived.
Interesting point about consultants - curious what made it resonate more with them in your case?
That sounds quite interesting, and perhaps I should adopt that approach. I'm also at the beginning of app development and wondering how best to involve the target audience to find out what they'd like. I've also considered ProductHunt, but I'm hesitant to go public with my still-nascent idea. Best regards.
Yeah, I was in a similar spot not long ago.
What helped most wasn’t trying to figure everything out upfront, but just talking to people and watching how they actually deal with the problem in real life.
PH is fine, but it didn’t really change much for me compared to direct conversations.
What kind of product are you working on?
An analysis tool for athletes (runners, triathletes, swimmers, cross-country skiers, etc.).
Got it - sounds like a pretty different space, interesting though. Good luck with it.
If anything, just talking to athletes early on and seeing how they actually approach it usually gives more clarity than trying to figure it out upfront.
Thank you for your input.
Super interesting pivot.
From an engineering perspective, turning unstructured legal text into a clear “risk score” is a big abstraction leap — that’s where the real value is.
Feels like the UX layer (score + highlights) matters as much as the model itself.
Curious — where has the system struggled most so far? False positives, missing clauses, or edge cases?
Appreciate it - yeah, a lot of it comes down to balancing simplicity and usefulness. Still early, so mostly observing how people interact with it.
Curious how you think about this space?
I think you’re on the right track. Congrats on the relaunch — the shift in positioning makes a lot of sense.
Appreciate it - that shift made a bigger difference than I expected.
Nice! I'm working on exactly the same kind of project, haha.
I'll try it now.
Congrats!
Nice, curious to hear what you think once you try it.
Would be great to get your feedback.
As someone also building an AI-powered contract tool (EqualDocs), this post hit home on every level. The reframe from "contract analysis" to "is there anything in this that could cost me money later" is exactly the kind of shift we went through too — and it changes everything about how you build, position, and talk to users.
What resonates most is the 400+ comments → rebuild approach. I've been posting about our progress and the feedback has been invaluable, but reading every single comment and then fully rebuilding around the pattern is the hard part. Most founders would cherry-pick. You actually listened at scale.
The Contract Health Score is a smart addition. Risk is abstract; a number is actionable. We've found the same thing — turning uncertainty into something concrete is where the real value lives.
Curious about your approach to onboarding new users — did you add any guided walkthrough or contextual help in 2.0? That's been one area where we've seen first-time users get the most confused despite the product being straightforward.
Following along and rooting for VIDI!
Appreciate this - sounds like you’ve gone through a very similar shift.
On onboarding, it’s still pretty minimal on my side for now - trying to keep things simple and let the output do most of the work. But I’m starting to see that first-time experience matters more than I expected.
Curious how you’re approaching it on your side - more guided flow or keeping it lightweight?
Also, I might be misunderstanding your direction a bit - are you leaning more toward a contract management/workflow tool (like Ironclad or DocuSign), or more on the risk/decision side?
Feels like there are a few different directions in this space.
Thanks for your quick reply — it does feel like a lot of us are arriving at the same place.
What’s been interesting on our side is realizing that “keeping it simple” only works up to a point. In law, simplicity without context can actually reduce trust — because the user doesn’t know what assumptions the system is making. So we’ve been trying to stay lightweight on the surface, but quietly introduce just enough structure to ground the output.
Not a guided flow in the traditional sense — more like capturing intent early, so the system can reason in a way that actually maps to how a lawyer would approach risk.
On direction — we’re intentionally staying away from the Ironclad / DocuSign category.
Those products are built around process — routing, approvals, execution.
What we’re more interested in is the layer before that: helping someone understand what they’re actually agreeing to, where the leverage is, and what decisions need to be made.
So less workflow, more judgment.
My sense is the space is starting to separate along that line:
• execution systems
• decision systems
And a lot of products are still sitting somewhere in between, which makes positioning (and onboarding) harder than it looks.
Curious how you’re thinking about that boundary as well — especially as the first-time experience starts to carry more of the product weight.
Interesting - the “less workflow, more judgment” framing makes sense.
Curious how you're thinking about that boundary in practice - where does your product lean more today?
Still figuring out how people actually expect to interact with something like this on first use.
With our current version, we intentionally built it UI-heavy, so users still need to perform quite a few operations to complete a task. That's been useful for learning, but with the trend toward more conversational, AI-first tools, a more minimal UI is clearly the way to go. We're already working on a second version that pushes more of the workflow into the AI layer so people can focus on judgment instead of clicking through screens. I just posted my very first story here - hope I get as much feedback as you did: https://www.indiehackers.com/post/im-a-lawyer-who-launched-an-ai-contract-tool-on-product-hunt-today-here-s-what-building-it-as-a-non-technical-founder-actually-felt-like-5bf3eb144d
Interesting - makes sense that you’re moving more toward a conversational flow.
Curious how you're thinking about that shift in practice - what changes most for the user compared to your current version?
Love this question, because that’s exactly what we’re trying to correct in v2.
Right now, our product still feels like a “classic SaaS tool” — lots of UI, multiple steps. But tools like Claude and ChatGPT have already trained people to expect a simple text box where they can just say what they want and let the system handle the rest. Once you get used to that, traditional contract UIs start to feel heavy and unfriendly.
So the shift in practice for us is: v2 should look and feel much closer to that plain text window — “Here’s my contract, here’s my role, here’s what I’m worried about” — and let the AI orchestrate the workflow behind the scenes. The hope is that by matching the habits people already have, we’ll get clearer, more honest feedback on the underlying value, instead of them fighting the interface.
Yeah, makes sense - curious to see how that plays out in practice.
I'm an indie developer too and this post resonated with me — not because of contracts specifically, but because of the lesson underneath it.
I spent months building features without truly understanding what my users were actually asking for. The moment I stopped describing what my tool does and started answering what they fear — everything shifted.
The health score (0–100) is a smart call. Numbers reduce anxiety. Instead of "here are 12 risks," you give users one clear signal to act on.
Reading 400 comments and rebuilding from scratch takes real courage. Most founders skim, take 3-4 notes, and move on. You found the pattern and built around it. That's the difference.
Congrats on the relaunch. Following this.
Really appreciate this - and yeah, that shift was a big one.
At some point I realized I was explaining what the product does, instead of answering what people are actually worried about before signing.
The score idea came from that - trying to turn uncertainty into something clearer to act on.
Sounds like you went through a very similar process.
this is a great idea! this line resonated with me - "still early, but it feels like a real product"
Appreciate that - glad it resonated.
Still early, but trying to make it feel like something people can actually rely on, not just test once.
the "400 founders taught me" part hits hard. we're at the very beginning of this — 6 weeks in, built an seo audit tool, started cold emailing small businesses with their actual site problems. 65 emails out, 3 real conversations back.
your point about listening before building the next version is something i wish i'd internalized earlier. we spent 3 weeks building features before talking to a single potential customer. now the cold emails are teaching us more about what people actually want fixed than any amount of market research did.
curious how you handled the feedback loop between what users said they wanted vs what actually converted for you on PH?
That’s a great insight - those conversations are where most of the real learning comes from.
For me, the key was focusing on patterns rather than individual feedback. When the same concern kept coming up from different people, that was usually the signal.
I also paid attention to behavior - what contracts people actually uploaded and how they reacted to the output, not just what they said.
For PH, the biggest shift was framing. Once it aligned with how people think (“could this cost me money later?”), engagement and conversion improved a lot compared to the first version.
Sounds like you’re on the right track with that approach.
Good lesson for anyone building tools. Going to try to apply this to what I'm working on.
Appreciate that - glad it was useful.
Curious what you're building?
Building tools which help in my current animation work.
I'm in beta testing with reformat.video, which transforms one long-form 16:9 video into 15-second chunks for social media.
I also had issues with instant file transfer reliability, so I'm looking at building a separate solution that's better and utilises a bit of AI in the process. Hoping to get that into beta testing next month if things go according to plan.
That’s interesting - building from your own workflow makes a lot of sense.
Curious what’s been harder so far: getting people to use it, or making it work reliably?
A little bit of both, to be honest. Not coming from a dev background has made building a challenge.
But yeah, I struggle with marketing. It's still early days though, so I'll be looking to push out product demos, promos, social reels, etc.
That makes sense - early days are always a mix of both.
One thing that helped me was focusing less on promos and more on a few real conversations. The signal tends to come from there much faster.
OK, thanks for the insight. I'll try to focus more on having conversations in that case. I've read that building in public really helps, so I might also try that and build a community around it.
Yes, take a look at this building-in-public post - there are many questions from beginners and people who just got started, and the post itself is worth reading. I think it will be useful to you: https://www.indiehackers.com/post/after-600-founder-conversations-90-are-building-the-wrong-thing-be6a4175e1
That shift in framing is the interesting part. People don’t care about “analysis”, they care about the outcome.
We’ve seen something similar where the way you describe the problem matters more than the feature itself.
Once it matches how people actually think, everything else gets a lot easier.
Exactly - that was the biggest shift.
Nothing fundamentally changed in the product at first, but once the framing matched how people actually think, everything started to click much faster.
It made both the product and the conversations around it a lot clearer.
Huge congrats on the launch, Meirambek! 🚀 It’s amazing to see how community feedback shaped VIDI 2.0. As a solo founder building a visual plotting tool for writers, your journey is incredibly inspiring and gives me a lot of hope for my own upcoming launch. Hope the PH run has been a huge success so far!
Thank you, really appreciate that - means a lot.
Good luck with your launch as well, building in public and getting feedback early makes a huge difference. Hope it goes great for you.
Sorry for the late reply! I've been heads down fixing some bugs for Gridance Studio. Thanks again for the encouragement-your advice on building in public really resonates with me. I'll keep pushing forward. Cheers!
No worries at all.
Keep pushing - you’re on the right track.
400 founders worth of feedback before relaunching is smart. most people (myself included) skip that step entirely and wonder why nobody shows up.
i'm six weeks into building an seo audit tool and my biggest lesson so far is basically your whole thesis — the product is maybe 20% of the work, the other 80% is talking to people and figuring out what they actually want.
how did you structure those founder conversations? was it mostly cold outreach or did you have an existing audience to tap into?
Yeah, that’s exactly what I realized too - most of the work is really understanding how people think.
For me it was mostly cold at the start. I posted on Indie Hackers and engaged in the comments, then followed up with people who were clearly dealing with contracts.
Over time it became more of a mix - some inbound from posts, but still a lot of direct conversations to really understand how they approach risk.
That’s where most of the insight came from.
yeah. great day
Yeah, definitely - didn’t expect this level of discussion, but really glad it resonated.
The reframe from "contract analysis" to "what could cost me money later" is such a good example of why talking to users matters more than building features. The tool didn't change that much technically — but the framing made it click.
I'm working on a content site and went through something similar. I kept thinking about it as "a database of songs with chords" but users just want to know "how do I play this song." Same data, completely different entry point.
Curious about your PH strategy this time around — are you treating it differently than the first launch? The AppSumo post from @sberkay was interesting because he basically said PH gives you one day and then it's over, while other channels let you build momentum. Wondering if you see PH differently now with an existing user base behind you.
That’s a great analogy - exactly the same shift.
This time I approached it differently. Instead of trying to “launch”, I focused more on understanding how people actually describe the problem and building around that before pushing again.
The Product Hunt relaunch was more of a validation step than the main driver - most of the real learning came from conversations and discussions around it.
Still early, but having some initial users definitely changes how I think about distribution now.
This is such a killer example of listening so hard to your users that they literally hand you the product. Reframing everything around one question “what here could cost me money later?” instantly makes VIDI feel 10x more valuable and founder-native than generic “contract analysis.” The new stack (health score, risk tiers, missing clauses, improved version, history) feels like a real workflow, not a toy. Huge respect for going from half-finished tool + no users to a focused 2.0 and a fresh Product Hunt launch off the back of 400+ comments. Rooting for you!
Really appreciate that - means a lot.
That shift only happened after going through hundreds of comments and realizing how consistently people framed the problem in terms of risk, not analysis.
Once that became clear, everything else (score, categories, workflow) started to align much more naturally.
Still a lot to improve, but that change made a big difference.
Great pivot - switching to "what could cost me money later" is exactly how founders think.
One thing I’d watch next is conversion. The product is clear, but making the risks more specific (like “this clause could cost you $X” or “hidden renewal detected”) could really increase usage.
where are you seeing the biggest drop-off right now?
That’s a great point.
Right now the biggest drop-off seems to be after the first analysis - some users test it once but don’t immediately come back.
Still early, but I’m starting to see that making the output more concrete and actionable (like you mentioned) could improve that.
Working on that now.
Makes total sense - the first analysis is your hook, but the follow-up is where retention happens.
One thing that often helps: give users a next-step roadmap right after the first report. Even something small like “Here’s what to check in your next contract” or “Top 3 clauses to watch for next time” can keep them coming back.
Have you thought about using small nudges or reminders to bring them back for a second analysis?
Both, honestly. I usually start with the CTA + proof to remove hesitation, then tweak messaging once the friction is low.
It’s amazing how small clarity shifts can lift conversions without changing the offer at all.
Yeah, that’s a good way to think about it - the first pass gets attention, but what happens after probably matters more.
Still early, so I’m trying to understand what naturally makes people come back vs. forcing it with reminders.
Curious - have you seen anything that worked consistently in similar flows?
Yeah one thing I’ve seen work consistently is when the first experience creates a slight “open loop.”
If users feel like:
“This was helpful, but I might be missing something else…”
they naturally come back without needing heavy reminders.
For something like this, it could be:
• showing 1–2 risks clearly, but hinting there could be more in other contracts
• or even "this looks safe, but here's what typically gets missed"
It shifts it from a one-time check → to an ongoing safety habit.
Also, patterns help - if users start seeing similarities across contracts, they're more likely to reuse it.
Are most of your current users testing with just one contract, or multiple already?
That’s an interesting way to think about it - the “open loop” idea makes sense.
Still early on my side, so I’m mostly focused on understanding how people interpret the output first.
Curious how you’ve seen that play out in practice - what actually triggers that second use?
Nice, I also started a small digital tools website, alldaytools. Would appreciate your suggestions on how I can improve it as well.
Nice, sounds interesting - feel free to share it, happy to take a quick look.
Nice progress, love that you let the community shape the product before v2.
Preparing for my own PH launch soon (Angular tooling), so posts like this are gold.
Best of luck!
Appreciate that!
Good luck with your PH launch - preparing early and getting feedback like this definitely helps a lot.
Just curious about privacy - is there any way to show how that part is handled?
Good question.
All contracts are processed securely and remain private - data is encrypted and not shared with anyone.
I’m also working on making the privacy side more visible in the product so users can clearly see how their data is handled.
This is a great example of letting users shape the product.
That insight shift is powerful; going from “contract analysis” to “what could cost me money later” makes it immediately actionable and relatable. It’s a completely different framing.
I’m in a similar phase right now where I’ve built products, but I’m still working on refining the positioning and making sure it matches how users actually think, not how I initially described it.
Also like how you turned that into clear features like the health score and risk categories. Makes the value obvious.
Curious, after that initial post, how did you decide which feedback to prioritize and what to ignore?
Really appreciate that - that was exactly the challenge.
There was a lot of different feedback, but I focused on patterns rather than individual opinions. When the same idea kept coming up from different people, that usually meant it mattered.
Anything that didn’t connect directly to the core question (“could this cost me money later?”) I tried to ignore for now.
Still refining that, but patterns over opinions has been the main approach.
Love this - feels like you actually listened instead of guessing. I'm really curious to see how it evolves from here :)
Appreciate that - that’s exactly what I was trying to do.
A lot of it came from seeing how differently people described the problem compared to how I initially thought about it.
Still evolving, but that shift made a big difference.
The reframe from "contract analysis" to what founders actually say is the whole lesson. I've reviewed a lot of contracts on the BD side of deals and the anxiety is never abstract - it's always "what am I signing that I'll regret in 6 months." The Health Score 0-100 is the right call too. Risk feels vague. A score you can act on doesn't.
Of the 400 founders you interviewed, what was the single most common clause they'd overlooked that came back to bite them? That one insight alone would make a great follow-up post.
That’s a great question.
One of the most common patterns I’m seeing is around payment terms and auto-renewals - things that don’t seem critical at first but can have real financial impact later.
Also, missing clarity around liability shows up quite often.
Thinking of putting together a deeper breakdown once I see more data across contracts.
The pivot from "contract analysis" to "will this cost me money later" is a masterclass in user psychology. The Contract Health Score idea is really clever - it directly hits the exact anxiety founders feel before signing anything. Btw, congrats on the VIDI 2.0 launch. This feels like a big step from tool to true trust product.
Really appreciate that - that shift in framing made a huge difference in how people interact with it.
Feels like it finally moved from a tool to something people can actually rely on before signing.
Exactly - once you solve the trust problem, the tool basically sells itself.
The insight about how founders actually phrase the problem is gold. 'Is there anything that could cost me money later' vs 'contract analysis' is a perfect example of why talking to users matters more than building features. Most people skip that step and build what they think users need instead of what users actually ask for. Congrats on the relaunch!
Really appreciate that - that shift came entirely from how people were describing the problem, not from what I originally planned to build.
It was actually surprising how different the real question was compared to “contract analysis.”
Still learning a lot from those conversations.
That gap between how builders describe the product and how buyers describe the problem is where all the money is. 'Contract analysis tool' is a feature. 'Will this contract cost me money later' is a purchase trigger. Sounds like you cracked that. Curious how your conversion changed after the repositioning
That framing definitely changed how people react to it.
Still early, but the difference in how people engage and describe their own situations has been noticeable.
Relaunching takes courage.
What changed the most between the first and second launch?
Appreciate that.
The biggest change was the shift in how I framed the product.
In the first version, it was more about “contract analysis.”
In the second, everything is built around one question:
“Is there anything in this contract that could cost me money later?”
That changed not just the messaging, but how the product works and how users interact with it.
the shift from "contract analysis" to "is there anything in this that could cost me money later?" is the whole strategy in one sentence - that's the actual question people are asking themselves. the fact that 400 comments surfaced that is rare, most feedback threads give noise or platitudes. rebuilding around it before the relaunch was the right call. curious whether conversion is already tracking better than the first launch, even in these early hours?
Really appreciate this - that’s exactly how I’ve started to think about it as well.
On conversion, it’s still early, but I’m already seeing a difference in how people engage. The interaction feels more intentional now - users come in with a specific contract and a clear question, not just curiosity.
There is conversion happening, though still gradual - which makes sense at this stage. What’s more interesting is that the quality of usage has improved.
Still tracking it closely, but the shift definitely changed how people approach the product.
Reading 400 comments before rebuilding is the right call. Most founders treat community feedback as a wishlist and miss the signal underneath.
The Contract Health Score as a single number is smart. Turning risk into a score removes the "I don't know if this clause is a big deal" paralysis that stops people from acting on what they've read.
One thing worth adding from a B2B angle: a negotiation priority flag on the risky clauses. When I was closing enterprise deals, the problem was never identifying risk. It was knowing which clause was actually worth a fight with the other side's legal. Some risks are deal-breakers. Some are just noise. Knowing the difference saves a lot of meetings.
What risk categories are showing up most in the contracts you've seen so far?
Really appreciate this - especially the point about removing that “is this a big deal?” uncertainty. That’s exactly what I was aiming for with the score.
The idea about negotiation priority is really interesting - I’ve started noticing a similar pattern, where not all risks matter equally in practice.
So far, the most common categories I’m seeing are around payment terms, auto-renewals, and liability clauses - things that can have direct financial impact.
Still early, but that direction is becoming clearer with each contract.
Congrats on the relaunch! The insight about user-reported friction vs actual usage patterns is so common — what people say they want and what makes them stay are often completely different things. The contract tool angle is interesting because contracts are usually where deals die or get complicated. A few questions:
- What's the biggest category of feedback you got from those 400 founders? Legal clarity? Speed? Price?
- How are you handli
That’s a great question.
The biggest pattern wasn’t really about price or even speed - it was more about clarity and confidence.
Most people weren’t looking for full legal analysis, they just wanted to understand if there’s anything in the contract that could cause problems later.
That’s what led me to focus on things like highlighting specific clauses, risk categories, and explaining it in simple terms.
Still early, but that’s been the most consistent signal so far.
Really interesting approach — I'm also building in the content space. How did you validate the idea before building?
This isn’t my first project, so before launching I usually spend a lot of time thinking through the problem and how it should work.
For VIDI, that phase took around 7 months to a year before I actually launched anything.
But the real validation came after posting early and seeing how people reacted. The 400+ comments helped me understand how they actually think about the problem, and that’s what shaped the product.
Curious how you’re approaching it on your side?
This is spot on. Most users don’t think in features, they think in outcomes. The moment you translate your product into the exact question they’re already asking in their head, everything becomes clearer.
100% this. The same thing happens in sales. The moment you stop describing what your product does and start repeating back the exact words customers use to describe their problem, conversion rates change noticeably. It's not just clarity. It's recognition. Are you building something in the SaaS space yourself?
Appreciate that - that shift really changed how I think about the product.
Once it became about answering a single question instead of adding features, everything got much clearer.
Congrats on the relaunch! The pivot from one-time payments to subscriptions resonates — we see the same pattern where founders undervalue recurring revenue early on. One thing I would add: the 400-founder feedback loop is gold, but be careful not to build for the loudest voices. Sometimes the best signal comes from churned users who never said anything. Good luck on PH today!
That’s a really good point - I’ve been thinking about that as well.
With 400+ comments, there were a lot of different opinions, and not all of them aligned. I tried to focus on patterns that kept repeating rather than individual suggestions.
But I agree - what people do matters more than what they say. I’m paying close attention to who actually uploads a contract and who doesn’t come back after trying it.
Still early, but trying to balance between feedback and real usage behavior.
The "is there anything that could cost me money later?" reframe is a good example of how product positioning should come from customer language, not feature descriptions.
400 comments is a lot of signal. Most founders would skim for patterns and act on 3-4 takeaways. Reading every single one and extracting the core question that drove the rebuild is the kind of thing that separates products that fit from ones that just approximate it.
The numbers (57 contracts, 19 users) are a real start. One question worth tracking: are users coming back to analyze a second contract, or is it one-and-done? Repeat usage tells you whether the product delivers on its promise.
Really appreciate this - especially the point about extracting the core question vs just patterns.
On usage - yes, there are already a few users from the initial launch who are still using the product and coming back with new contracts.
It’s still early, but that repeat behavior is starting to show, and I’m paying close attention to it.
Curious - have you seen any patterns in when repeat usage tends to kick in for products like this?
This reframing insight is gold. I see the exact same pattern in the AI agent space — founders building AI tools keep thinking in terms of features and capabilities, but their users are asking one simple question: "Can I get this thing live and in front of customers?" The gap between a working prototype and a production SaaS is where most projects die. You nailed it by listening to 400+ founders and rebuilding around their actual question instead of your original framing. That takes guts. How long did it take to go from "I need to rebuild this" to having 2.0 ready for launch?
Really appreciate this - especially the parallel with AI agents, that’s exactly the pattern I’ve been noticing.
From the initial post to VIDI 2.0 it was pretty fast - around a couple of weeks. Most of the time went into understanding how people described the problem, then rebuilding around that.
Still iterating, but that shift made everything much clearer.
Curious - what kind of tools are you building in that space?
Thanks for the thoughtful reply! The "understanding how people described the problem" part is key — I'm learning that the hard way right now.
I'm working on the infrastructure layer for AI agents. The idea is that once you have a working agent, you still need all the boring stuff to make it a real product — deployment, auth, billing, domain, monitoring. Most indie hackers I talk to spend 70%+ of their time on that plumbing instead of the AI itself.
Still very early — honestly trying to figure out if this is a problem people would pay to solve or if they'd rather just DIY it. That's the part I'm stuck on: the pain is real (I see it everywhere), but I'm not sure if the solution is "give me an API that handles it" or "give me a template I can customize."
What was your experience with VIDI's infra? Did you build auth, billing, etc. from scratch, or did you piece together existing tools?
Yeah, I built most of it myself end-to-end - from the analysis pipeline to the product.
For auth I used existing tools, and I haven’t implemented payments yet - focusing first on validating usage and repeat behavior before adding billing.
Early on I wanted to keep things as simple as possible and just get real usage.
Still iterating based on how people actually use it.
I've never read a contract in my life and I would love to have a solution like this. Good luck on your launch!
Appreciate that!
That’s actually something I’ve noticed as well - a lot of people don’t read contracts, they just want to know if there’s anything risky before signing.
That’s exactly what I’m trying to solve.
This is a great example of iterative product development. The insight about how founders think about contract problems (not just "contract analysis") is gold. It's a reminder that the best products solve the actual mental model of their users, not just the technical problem. The relaunch with specific features like risk categorization and missing clause detection shows you listened deeply to feedback. Kudos on the execution!
Really appreciate this - that’s exactly the shift I was trying to make.
The biggest change came from how people were describing the problem, not from adding more features.
Still early, but that insight changed how I think about the product.
This is a great example of letting users define the product for you instead of guessing upfront.
That one insight—“will this contract cost me money later?”—is way more concrete than “AI contract analysis.” It’s outcome-driven, not feature-driven.
Also interesting how your V2 isn’t just smarter, it’s more packaged:
Score → easy to understand
Categories → actionable
Report → shareable
Dashboard → sticky
That’s the difference between a tool and a product.
Curious—have you noticed if users actually come back after their first contract, or is it more of a one-off use case right now?
Really appreciate this breakdown - especially the “tool vs product” point.
On usage - yes, there are already some users from the initial launch who came back to analyze more contracts.
It’s still early, and finding users is the harder part right now - I’ve mostly been reaching out directly on LinkedIn and through conversations.
But the repeat usage signal is starting to show, which is encouraging.
Congrats on the Product Hunt relaunch! What was the single biggest insight from those 400 founders that changed your approach the most?
Thanks!
The biggest shift was realizing that people don’t think in terms of “contract analysis.”
They just want to know one thing before signing:
“Is there anything in this contract that could cost me money later?”
That insight changed how I built and positioned everything.
This is a great shift — “will this cost me later?” is exactly how people think.
I’ve signed contracts that looked fine but had hidden clauses… something like this would’ve saved me a lot of stress.
That’s exactly the situation I’m trying to solve.
Those “looks fine at first” clauses are the ones that usually cause the biggest issues later.
Out of curiosity - what kind of clause caught you off guard?
This is a great concept. Just tried testing it with an AI-generated contract containing a poison pill, and it spotted it - which I guess should be expected, but still good to see it didn't have issues.
Really appreciate you testing it like that - that’s a great edge case.
Poison pill clauses are exactly the kind of thing that can be easy to miss.
Curious - was the explanation clear enough, or was there anything that felt confusing?
The reframe from "contract analysis" to "what could cost me money" is such an underrated insight. I've seen this pattern in my own indie app journey too — I spent weeks describing my product in technical terms, but the moment I switched to the language my users actually used ("I just need to capture this thought before I forget it"), everything clicked. Downloads went up, retention improved, and support emails dropped.
Your 400+ comment thread becoming your product roadmap is basically the dream scenario for build-in-public. Most founders (myself included) resist this because it feels like giving up control, but it's really just letting the market pull you toward what works.
One question: now that you've rebuilt around this new framing, are you seeing a difference in the type of users signing up? I'm curious if the positioning shift attracted a different audience than V1 did.
That’s a great way to describe it - “letting the market pull you” is exactly how it felt.
And yes, I’m starting to see a shift.
Before, it was more curiosity-driven. Now it’s more people with a specific concern - they already have a contract and want to check if there’s something risky before signing.
Feels less like exploration and more like intent.
Curious - did you notice a similar shift in user behavior when you changed your messaging?
This is a great case study in product-market fit discovery. The shift from "contract analysis" to "what could cost me money" is exactly the kind of reframe that separates tools people try once from tools they depend on.
One thing I'd watch closely: your 57 contracts / 19 users ratio (3 per user) is actually a strong signal. If those users keep coming back with each new contract, you've found real retention. The next milestone I'd track is time-to-second-upload — if users come back within 30 days, you have a habit, not just a trial.
Also curious about your PH launch strategy — did you coordinate upvotes from your IH community, or was it mostly organic? The cross-pollination between IH posts and PH launches seems underutilized by most founders.
That’s a really interesting way to look at it.
The repeat usage point is something I’ve started paying more attention to - especially whether people come back with another contract after the first one.
It does feel like that’s where real value shows up.
On Product Hunt - the initial push was mostly direct outreach (LinkedIn, people I already knew), and then later Indie Hackers brought more organic conversations.
So it was a mix, but the discussions definitely made a big difference.
Curious - have you seen time-to-second-use as a strong signal in your own products?
57 contracts across 19 users is about 3 per user. That repeat usage number matters more than anything else right now. Single-use tools die fast here because contract review pain is intermittent. If those 19 keep coming back with every new contract, you have real retention. If most uploaded a few to test and bounced, you have a demo. What does the return-user curve look like after the first week?
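The return-user signal described above (and the time-to-second-upload metric mentioned earlier in the thread) is straightforward to compute from a raw upload log. The event shape below is hypothetical - just a minimal sketch, assuming each upload is recorded as a `(user_id, timestamp)` pair:

```python
# Sketch: per-user days between first and second upload, computed
# from a hypothetical raw event log of (user_id, iso_timestamp).
from collections import defaultdict
from datetime import datetime

def time_to_second_upload(events):
    """Returns {user_id: whole days between first and second upload}
    for users with at least two uploads; one-and-done users are omitted."""
    by_user = defaultdict(list)
    for user_id, ts in events:
        by_user[user_id].append(datetime.fromisoformat(ts))
    gaps = {}
    for user_id, times in by_user.items():
        times.sort()
        if len(times) >= 2:
            gaps[user_id] = (times[1] - times[0]).days
    return gaps

events = [
    ("u1", "2025-01-01T10:00"), ("u1", "2025-01-05T10:00"),
    ("u2", "2025-01-02T12:00"),  # uploaded once, never came back
]
print(time_to_second_upload(events))  # {'u1': 4}
```

Bucketing those gaps by week gives exactly the return-user curve asked about: a cluster of small gaps means habit, a dictionary dominated by omitted one-upload users means a demo.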
That’s a great way to frame it.
I’ve started to see some users come back within the first week with another contract, which feels like a strong signal.
Still early, but it does feel like the difference between “trying it out” and actually needing it.
Curious - have you seen a specific timeframe where second use usually happens if the product sticks?
The reframe from "contract analysis" to "is there anything in this that could cost me money later?" is genuinely brilliant. That's the difference between a feature and a product — one describes what it does, the other describes what the user is afraid of.
I'm at a similar day 0 moment right now. Posted about silent churn on r/SaaS today — founders losing MRR with zero warning before the cancellation hits. Same pattern as yours: people kept describing the same pain in different words until the real question became clear.
Quick question — when you got those 400 comments on your first post, how did you decide which feedback to act on and which to ignore? That filter feels like the hardest part.
That was honestly one of the hardest parts.
A lot of people described things differently, but the turning point was noticing the same underlying concern repeated in different ways.
Once that clicked, it became less about individual suggestions and more about the pattern behind them.
Curious how you’re approaching that with your product right now?
Solid insights on the feedback loop. Balancing what founders say they want vs. what they actually pay for is a massive feat. Thanks for sharing the grind.
Yeah, that gap is very real.
What people say and what they actually act on can be very different - that’s been one of the biggest lessons so far.
We are looking for someone who can lend our holding company 300,000 US dollars.
With the 300,000 US dollars you will lend to our holding company, we will develop a multi-functional device that can both heat and cool, also has a cooking function, and provides more efficient cooling and heating than an air conditioner.
With the device we're developing, people will be able to heat or cool their rooms more effectively, and thanks to its built-in stove feature, they'll be able to cook whatever they want right where they're sitting.
People generally prefer multi-functional devices. The device we will produce will have 3 functions, which will encourage people to buy even more.
The device we will produce will be able to easily heat and cool an area of 45 square meters, and its hob will be able to cook at temperatures up to 900 degrees Celsius.
If you invest in this project, you will also greatly profit.
Additionally, the device we will be making will also have a remote control feature. Thanks to remote control, customers who purchase the device will be able to turn it on and off remotely via the mobile application.
Thanks to the wireless feature of our device, people can turn it on and heat or cool their rooms whenever they want, even when they are not at home.
How will we manufacture the device?
We will have the device manufactured by electronics companies in India, thus reducing labor costs to zero and producing the device more cheaply.
Today, India is a technologically advanced country, and since they produce both inexpensive and robust technological products, we will manufacture in India.
So how will we market our product?
We will produce 2000 units of our product. The production cost, warehousing costs, and taxes for 2000 units will amount to 240,000 US dollars.
We will use the remaining 60,000 US dollars for marketing. By marketing, we will reach a larger audience, which means more sales.
We will sell each of the devices we produce for 3100 US dollars. Because our product is long-lasting and more multifunctional than an air conditioner, people will easily buy it.
Since 2000 units is a small initial quantity, they will all be sold easily. From these 2000 units, we will have earned a total of 6,200,000 US dollars.
By selling our product to electronics retailers and advertising on social media platforms in many countries such as Facebook, Instagram, and YouTube, we will increase our audience. An increased audience means more sales.
Our device will take 2 months to produce, and in those 2 months we will have sold 2000 units. On average, we will have earned 6,200,000 US dollars within 5 months.
So what will your earnings be?
You will lend our holding company 300,000 US dollars and you will receive your money back as 950,000 US dollars on November 27, 2026.
To learn how you can lend USD 300,000 to our holding company and to receive detailed information, please contact me by sending a message to my Telegram username or Signal contact number listed below. I will be happy to provide you with full details.
Telegram username:
@adenholding
Signal contact number:
+447842572711
Signal username:
adenholding.88
Thanks for reaching out.
This doesn’t seem like a fit for me. I’m currently focused on building my own product.
Wishing you the best with your project.
Nice tool! One question: where did you find your first users?
Started with Product Hunt, then Indie Hackers.
Product Hunt gave the initial push, but Indie Hackers is where most of the real conversations happened — and some of those turned into users.
It wasn’t really “traffic” - more like discussions that led people to try it.
Curious where you’ve seen early users come from in your case?
Nice build — genuinely. A few questions/thoughts that might be useful:
What was the hardest part of the technical build vs. what's the hardest part of getting traction? I ask because they're usually very different problems and sometimes the instinct after launching is to go back and improve the product when the actual bottleneck is distribution.
Also curious: who's your target user and where do they already hang out? That tends to be the fastest path from "launched" to "first 10 paying customers" just being where they already are, not trying to get them to come to a new place.
Rooting for you.
That’s a great point - and honestly, I’ve started to feel that difference very clearly after launching.
The technical side had its challenges, but traction is a completely different problem. What helped most so far was being part of conversations where the problem already existed, rather than trying to push the product.
Right now it’s mostly small and medium-sized businesses - people who deal with contracts regularly but don’t always have legal support.
Still figuring out where they naturally hang out beyond Indie Hackers.
Have you found a channel that worked especially well for getting those first paying users?
This insight about user language vs feature language is gold. In agency work, I've learned that clients never say "I need conversion optimization" - they say "people visit my site but nobody buys." Your pivot from "contract analysis" to "will this cost me money later" is exactly the mental shift that turns browsers into buyers. Curious - how are you pricing this now vs your original version?
That’s exactly what I started noticing - once the language clicked, everything else became much clearer.
Still early on pricing - right now more focused on usage and understanding how people actually use it in real situations.
Curious how you’ve approached pricing when the value is more “risk avoided” than something directly measurable?
What most people miss on a relaunch is that traction usually comes from the customer language you earned before launch, not the launch itself. If 400+ founders shaped the product, the real asset is the patterns in what scared them, confused them, and made them buy. That is the stuff that should drive the PH copy, demo, and follow-up.
That’s a great way to put it.
Looking back, the product itself came after the conversations - not the other way around.
A lot of the language and structure now is directly shaped by what people were actually reacting to.
Curious - have you seen cases where this kind of pre-launch discussion made a big difference in conversion later on?
Wow, perfect bro
Appreciate it bro!
Just curious - what made you sign up?
that's great man!!
Appreciate it, man!
Curious - what part stood out to you the most?
Good luck bud, seems awesome! If you're interested in making your site's security better, please take a look at www.gloqsec.comm and leave feedback.
Appreciate it, thanks!
Security is definitely something I’m thinking about, especially with contracts involved.
Will take a look - curious, what kind of vulnerabilities are you focusing on most?
My tool actually helps against fraudulent emails, bot signups, automated requests...
Think of it as a CAPTCHA, but with no friction and highly detailed insight into your clients.
Got it, makes sense - especially with bots and automated traffic.
Something I’ll definitely keep in mind as usage grows.
Out of curiosity, are you mostly working with SaaS products right now?
Yeah, mostly SaaS and developer-focused products right now, especially apps dealing with signups, auth flows, or anything exposed to automated abuse.
Also, I'm looking to make this even bigger in Q3 and Q4 of this year - it's all laid out on the site and in the GitHub docs. Hope you like it. Contact me whenever you're interested; I can help with integration and give you bigger and better trials.
Got it - sounds like a solid focus, especially around auth flows and abuse prevention.
Will keep it in mind as things scale on my side.
Appreciate you sharing
I just finished building my first project, and I gotta say, you did an amazing job, man.
Really appreciate that - means a lot, especially coming from someone who just built something themselves.
What did you end up building?
Also curious - did you run into anything unexpected during the process?
The reframe from "contract analysis" to "what could cost me money later" is a great example of letting users define the product for you. That's a positioning change, not a feature change, and it probably matters more than any of the new features combined.
400 comments turning into a full rebuild is also a good reminder that the first version is never the real product — it's just the thing that starts the right conversations.
This was honestly the biggest shift.
Before that, I was building features. After that, I was answering a question people already had in their head.
The interesting part - nothing really changed technically. Just how the problem was framed.
Curious if you’ve had a similar moment where positioning mattered more than the product itself?
This is really interesting.
I’ve been thinking about this space while building something myself, and one thing I’ve noticed is that people struggle more with execution than the idea itself.
I’m currently testing a small tool around this—not fully sure if it’s useful yet.
Would you be open to taking a quick look and sharing honest feedback?
Yeah, I’ve seen the same - ideas are usually clear, execution is where things get real.
Happy to take a look. What are you building it around?
Also curious - what made you start exploring this space?
Love this idea — are you targeting a specific niche?
Mostly small and medium-sized businesses right now.
The common pattern is they just want to quickly understand if there’s anything risky before signing, without going deep into full legal analysis.
Still figuring out if it makes sense to narrow it further.
Are you seeing a specific niche where this problem is more painful?
This is interesting.
One thing I’m curious about—
Who is the best user for this right now?
I can see this being useful for a lot of people, but usually there’s one group that gets the most immediate value.
Are you seeing it more with real estate, legal, or small business contracts so far?
Great question - right now it’s mostly founders, freelancers, and small to mid-sized business owners.
Especially people reviewing client agreements or vendor contracts without legal support.
That’s where it seems to create the most immediate value so far.
That makes a lot of sense.
I actually just went through something similar on the personal side—reviewing a legal/guardianship contract and realizing how unclear the total cost structure really is beyond the retainer.
Feels like that uncertainty is exactly where something like this could add a lot of value.
Have you seen users using it more for clarity before signing, or after they’ve already run into issues?
One thing that stood out reading this is how much more valuable the conversations were than the launch itself.
It feels like Product Hunt works best not as a traffic spike, but as a forcing function to talk to a lot of potential users quickly.
I’m curious — did you find that the biggest changes came from patterns across many founders, or from a few strong opinions that made you rethink things entirely?
I’m preparing for a launch myself and trying to think of it less as “launch day” and more as “accelerated learning day.”
That’s a great way to think about it - honestly, most of the big changes came from patterns across many founders.
Different people, but the same underlying concern kept coming up, which made it hard to ignore.
“Accelerated learning day” is a great way to frame it - that’s exactly how it felt.
What API are you using? Are your costs high? Curious because I'm working on a similar product, except it creates SOW contracts for outsourcing. Simple thing - right now just going for 100 users.
Still early - just focused on making it useful and improving it step by step.
What are you building?
A SOW generator specifically built for people outsourcing in the US-India corridor.
Nice, that’s a solid use case.
Still early on my side - mainly focused on making it useful and improving it step by step.
Curious how you're thinking about your approach?
Iteration
Which LLM model does the project use?
Still iterating on the approach - focusing on making the results reliable in real-world contracts.
Best of luck
Thanks, really appreciate it
Hey Meirambek, congrats on the VIDI launch! 🎉
I tried it recently and it literally saved me ~$4,000 by catching a risky clause I almost missed. Really love how you focused on what actually costs founders money. Excited to see where you take this next!
That’s exactly the kind of use case I’m aiming for - catching things before they turn into real costs.
Out of curiosity, what kind of contracts are you reviewing most often?
Mostly client agreements and vendor contracts for now.
Those tend to have the most hidden gotchas.