One thing I didn’t expect:
People don’t just try it once.
They come back with another contract.
Then another one.
I assumed this would be a one-time use tool.
But the behavior is different - once someone uses it on one agreement, they reuse it whenever the next one shows up.
Which made me realize:
This isn’t a “one-off check”.
It’s something people turn to at the moment of decision.
Still early, but that repeat usage was a much stronger signal than initial signups.
If anyone’s curious, you can try it here:
https://joyful-granita-8415bc.netlify.app/
Curious - have you seen users behave differently from what you originally assumed?
For anyone who asked, here’s the tool:
https://joyful-granita-8415bc.netlify.app/
Would appreciate any feedback - still early and improving it.
been using it for a while, uploaded multiple contracts - really convenient
definitely saves time and reduces headache before signing 👍
Appreciate that - means a lot.
That’s exactly the kind of signal I’m looking for.
This is interesting — especially the shift from “one-time use” to something people return to in decision moments.
Feels like that’s where a lot of products actually win or lose:
not in discovery, but in that exact moment where someone has to decide.
I’ve been seeing a similar pattern — tools that reduce the need to think in that moment tend to get reused.
Curious — do you think it’s the repetition itself, or the fact that it shows up exactly when the decision happens?
That’s a good question - I don’t think I know for sure yet.
What I’m seeing so far is that people tend to come back when something similar comes up again, not really out of habit.
So it feels less like repetition on its own, and more about timing - when there’s actually a decision to make.
That makes a lot of sense.
Feels like it’s less about people building a habit,
and more about the moment forcing them to decide.
What’s interesting is that in those moments,
people usually already have some idea of what they should do —
but not enough confidence to fully commit.
So they look for something that reduces that uncertainty,
not just something they use regularly.
Curious if you’ve noticed that too —
that it’s less about usage frequency, and more about how much it helps them feel sure in that moment?
That’s an interesting way to think about it - I’m not sure I have a clear signal on that yet.
I haven’t seen many cases where people explicitly say what they’re feeling in that moment. Most of what I see is just whether they come back and use it again.
So for now I’m mostly relying on behavior rather than trying to infer intent.
That’s interesting — and makes sense.
Feels like behavior is the easiest thing to observe,
but also the easiest to misinterpret.
Someone coming back doesn’t necessarily mean they’re confident —
it could also mean they’re still unsure and trying again.
I’ve been wondering if a lot of tools end up optimizing for repeated usage,
when the real problem is actually reducing hesitation in that first decision.
So instead of:
“did they come back?”
it’s more like:
“did they feel sure enough to act?”
Not sure how you’d measure that cleanly though — feels like that’s the missing piece.
That’s a good point - I’ve been thinking about something similar.
I’m not sure I have a clear way to measure that yet, so for now I’m mostly just watching what people actually do rather than trying to interpret how they feel.
What’s been useful is seeing if the same behavior repeats across different people and situations - that’s the only thing that’s starting to feel like a real signal.
That makes sense — looking for patterns across people feels like the closest thing to a real signal.
At the same time, I wonder if that’s still a bit indirect.
Even if the same pattern repeats across users,
it doesn’t necessarily tell you what to do differently with enough conviction.
It’s like:
you can see something is happening,
but not how strongly you should react to it.
Maybe that’s why a lot of insights end up as:
“this is probably important”
instead of:
“this is what I’m changing”
Feels like there’s still a gap between recognizing a pattern and actually trusting it enough to act on it.
Yeah, I see what you mean.
I’m not sure there’s a clean way to go from pattern → action yet.
For now it feels like noticing consistent signals is helpful, but still leaves a gap when it comes to actually deciding what to do differently.
Curious how you usually handle that - when you see a pattern but it’s not obvious how to act on it?
I don’t think I have a clean answer either yet.
What I keep coming back to is:
patterns alone don’t seem to create enough conviction.
It feels like something is missing between:
“this keeps happening”
and
“I’m confident enough to act on it”
Lately I’ve been thinking more about whether the signal needs to feel closer to the user’s actual situation — not just aggregated patterns.
Like instead of:
“many users do X”
something more like:
“this is what’s happening in your case right now”
Not sure yet what that looks like in practice,
but it feels like that might be the missing piece.
This framing maps exactly to what I'm seeing building Autoreport — a tool that sends Stripe founders a PDF report every Monday morning.
The external trigger (Monday, 8am, inbox) was a deliberate design choice precisely because of this. I didn't want users to have to remember to open a dashboard — the product shows up in their workflow whether they think about it or not.
What I didn't anticipate: the trigger also changes how people read the report. Because it arrives at a fixed moment, they treat it more like a briefing than a dashboard — skimming the narrative first, then drilling into numbers if something looks off. That's a completely different reading behavior than I designed for, and it's now shaping how I think about what the AI narrative should actually say.
Your point about the gap between first and second use is interesting. For a weekly-trigger product the gap is forced (7 days), which makes early churn signals harder to read — someone who unsubscribes after two reports gave the product a real shot, which is different from a tool they just forgot to come back to.
That’s a great point - especially the “shows up in workflow” part.
I’m seeing similar behavior where usage is triggered by new contracts coming in rather than habit.
The support framing is smart — turning it from a cost into a signal. Still pre-revenue so I haven't hit that stage yet, but logging it for when I do.
Yeah, that makes sense - especially using support as a signal early on.
Feels like at this stage, anything that shows where people get stuck is more useful than trying to track formal metrics.
For me it was the support load. My first paying customer ($6.28 PayPal sale) didn't message me, but the second one wrote 3 emails in the first hour — half asking for product help, half being friendly, and I realized I had no support channel set up.
Now I treat the first 100 customers as a feedback gold mine, not a support burden. Every email I get teaches me what's missing or unclear on the sales page, and the changes I make from those emails convert way better than anything I've A/B tested.
What was yours?
That’s interesting - I’ve noticed early users tend to reveal the biggest gaps just through how they use it.
How is this different from uploading the contract to Claude or ChatGPT?
Interesting approach. Have you run into stability issues over long-running agent sessions? I’ve been experimenting with runtime monitoring and drift detection, and reliability over time seems to be a major challenge.
So far keeping things simple and focused seems to help with reliability.
I see the same kind of behavior with Alora Home Health Software. People assume they’ll use it for one task, like setting something up or fixing a claim, but then it turns into the place they go every time something similar comes up.
In home health especially, it happens with billing and compliance stuff. Once someone figures out how to handle one Medicare issue in Alora, they keep coming back to it for every similar scenario instead of trying something new.
That’s exactly what I’m starting to see - it shifts from “try once” to “use whenever needed.” Interesting to hear it’s similar in other domains too.
The distinction between a "one-off check" and a "moment of decision" tool is one of the clearest early PMF signals I've seen discussed on here. Products that get pulled back by external triggers — a new contract lands, a quarterly review comes up — have structurally different retention from products where users have to remember to come back. The trigger lives in the customer's workflow, not your onboarding funnel.
One thing worth tracking closely: the time gap between first and second use. Short gaps (days) suggest you have power users who deal with contracts frequently — they're likely worth more per seat and worth targeting explicitly in your ICP. Longer gaps (weeks or months) mean the product is still essential but the positioning story shifts from "daily tool" to "trusted advisor for high-stakes moments." Both are strong, but they lead to different pricing and marketing strategies.
To answer your question — yes, we built something expecting weekly active usage and found that users actually cluster around a monthly audit cycle instead. Completely changed how we thought about engagement metrics and what "healthy" retention looks like for our product.
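For what it’s worth, tracking that first-to-second-use gap doesn’t need much instrumentation. Here’s a rough sketch assuming a simple per-user event log (the event format and user IDs are hypothetical, just to show the shape of it):

```python
from datetime import datetime

# Hypothetical usage events: (user_id, ISO timestamp of each use)
events = [
    ("u1", "2024-05-01T09:00:00"), ("u1", "2024-05-03T14:00:00"),
    ("u2", "2024-05-02T10:00:00"), ("u2", "2024-06-10T11:00:00"),
    ("u3", "2024-05-05T08:00:00"),  # only one use so far - excluded
]

def first_to_second_use_gaps(events):
    """Days between each user's first and second use (skips one-time users)."""
    by_user = {}
    for user, ts in events:
        by_user.setdefault(user, []).append(datetime.fromisoformat(ts))
    gaps = {}
    for user, times in by_user.items():
        times.sort()
        if len(times) >= 2:
            gaps[user] = (times[1] - times[0]).days
    return gaps

print(first_to_second_use_gaps(events))
```

Short gaps point at the power-user segment, long gaps at the “high-stakes moment” segment - same metric, two different positioning stories.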
That’s a really helpful way to frame it - especially the “moment of decision” part.
I’m starting to see those triggers (new contracts coming in) drive usage more than anything else.
Haven’t looked closely at the time gap yet, but that’s a great point.
Had a very different experience. Built a calibration quiz, launched through an HN post and got 1,900 completions in a day. The thing I didn’t expect was that users engaged with the quiz (sharing scores, debating methodology in comments) but almost nobody cared about the actual product behind it (decision journal, pre-mortem analysis).
What I didn’t expect was that something I built as a marketing hook ended up being more compelling than the platform I spent months building as the core product.
Reaching out to early users and getting some great feedback. I am curious to know how long before you started to see a definite repeat user trend? Was that something you noticed from data, or something users let you know?
Interesting - yeah, that happens more often than expected.
Still early on my side, but it started showing up pretty quickly just from seeing people come back with another contract rather than from any deep analysis.
I'm too early to have real retention data. What I can say is the handful of people who logged decisions came back to check outcomes, which is the intended loop, but I haven't hit the volume where I can distinguish the "repeat user trend" from a few friendly early adopters. If you've found something that bridges the gap between a viral moment and a repeat use case, I'd genuinely like to hear it.
That makes sense - I think I’m in a similar spot on volume.
What’s been helpful for me is not trying to prove retention yet, but just watching if the same behavior shows up across different people and situations.
Even with small numbers, if it keeps repeating in slightly different contexts, it starts to feel less like a one-off and more like a real pattern.
This is a cool idea - my past sales role saw some sketchy contracts and would've loved to have this before then.. lol. Best of luck!
Appreciate it - yeah, that’s exactly the kind of situation it’s meant for.
This is a valuable analysis. High retention and repeat usage are often much stronger indicators of product-market fit than initial acquisition metrics.
We experienced a similar divergence from our assumptions during the recent beta launch of KortexMail. While the tool was primarily developed for outbound email drafting, we quickly observed users adopting it to summarize lengthy internal threads instead. It is always instructive when actual user behavior clarifies the product's most significant value.
Yeah, seeing something similar - real usage shifts how you think about it pretty quickly.
Repeat usage is the metric most early founders underweight. New signups feel exciting but returning users are actually telling you "this fits my life" — which is the hardest thing to manufacture.
The "moment of decision" framing you landed on is powerful for distribution too. The tools that win on Twitter/X are the ones that show up exactly at the moment someone needs them. If your product is that tool for contracts, leaning into that in your content is valuable — "here's a scenario where this saved me $X" type posts perform really well because they create the mental anchor in the reader's mind.
On the audience-building side, the pattern I've noticed is that founders with products like yours benefit most from showing the use case in action rather than talking about features. Building in public around real decisions (not just wins) tends to compound. Tools like AlphaTweet (alphatweet.pro) help with keeping that kind of content consistent without it becoming a second job — worth a look if distribution is a bottleneck.
That makes sense - still early, just observing how it plays out.
This is interesting — I’ve been noticing similar things while building my own product.
What surprised you the most after launch?
Probably how people actually use it vs what I expected.
There's a massive difference between "useful once" and "this is now what I reach for every time." Sounds like IntroCave crossed into the second category faster than you expected.
Repeat usage early on is genuinely rare. Most early tools get tried once, forgotten, then uninstalled. The fact that people are coming back with the next contract means it actually fits into how they work, not just how they thought they might work.
Did you have any pricing intuition from this? Subscription feels way more justified when you see that pattern.
Still too early to think about pricing.
Repeat usage is the most underrated validation metric early on. First-time users tell you your acquisition is working. Users who come back tell you your product is working.
The 'moment of decision' framing you've identified is key — contract review is a high-stakes moment, and people don't risk high-stakes moments on tools they don't trust. The fact that they trusted it once and came back means you passed the test.
I've noticed the same pattern with voice-based tools: people try them when curious but only integrate them when the habit forms around a specific trigger. For you it's the contract signing moment. That's a very strong hook to build around.
Yeah, seems that way.
Repeat usage on contract review is a strong signal. It means people trust the tool with real decisions, not just experiments. That also means the data flowing through it matters more than a typical productivity app. Contract content, user behavior, potentially PII. Worth a surface check before it scales. https://scan.mosai.com.br runs 78 checks in 60 seconds if you want a baseline.
Interesting, will take a look.
I relate to this a lot.
One thing I’ve started noticing is that users don’t just “use” your product—they reshape what it actually is.
You think you’re building X, but once people start interacting with it, they reveal what they actually care about (and it’s usually not what you expected).
I’ve been thinking about this a lot while working on a small project—how hard it is to get useful, thoughtful responses instead of surface-level engagement.
Curious—did this change how you think about what your product is really for now?
Honestly, I’m not overthinking it too much - just building and seeing how people use it.
Yes — saw the same thing with Delineato (delineato.app), our minimalist diagramming/mind-mapping tool. We built it thinking people would fire it up to make a quick diagram, export it, and move on. What actually happened: a segment of users came back every Monday morning to map out their week, or used it as a running 'thinking canvas' they kept adding to. That repeat, ritual usage wasn't something we anticipated at all.
The signal you're describing — repeat usage at the moment of decision — is much more valuable than we gave it credit for early on. We were tracking new signups but missed the cohort that was quietly becoming weekly active users.
What changed for us: once we noticed the pattern, we added features that served the 'recurring use' case rather than just making the one-time creation flow easier. The product shifted from 'output tool' to 'thinking environment'. Still figuring out how to monetize that distinction, but the usage pattern is real.
Interesting - that shift sounds very similar. Still early on my side, just watching how those patterns develop.
That repeat usage pattern is honestly one of the best signals you can get as an early-stage builder. It means the product is solving a real recurring pain point, not just a curiosity. A lot of founders obsess over new signups, but returning users are where the real value is — it means you've built something that fits into a workflow.
I've been noticing something similar with a side project I built — a free chess puzzle trainer. I expected people to solve a few puzzles and bounce, but the streak and rating system keeps them coming back daily. Totally unplanned, but that retention loop became the core of the product.
Your insight about it being a "moment of decision" tool is key. That's a strong position to be in for pricing too — if people reach for your tool every time a new contract comes in, the value clearly compounds. Have you thought about tracking which types of contracts bring people back most? That could help you double down on the highest-value use case.
Yeah, still early - just seeing similar patterns show up.
That repeat behavior is a stronger signal than raw signup numbers. It usually means the product is becoming part of a real workflow, not just satisfying curiosity once. The framing around decision moments also feels important, because users may be buying confidence as much as analysis. If you keep seeing this pattern, it might be worth instrumenting what triggers the second and third use, since that will probably tell you more about retention than top-of-funnel metrics.
That’s a good point - still observing for now.
That’s actually a really strong signal.
It means your product isn’t just useful — it’s becoming a default step in a workflow.
Those are usually much more defensible than tools people just “try”.
Yeah, that’s what it’s starting to look like.
That’s usually the inflection point.
Once it becomes part of a workflow, people stop evaluating it and just default to it.
Feels like that’s where things start compounding.
Really interesting, that shift from “one-off tool” to something people return to at decision moments is a strong signal.
It almost sounds like the value isn’t just in the analysis itself, but in the confidence it gives users when they’re about to act.
In cases like this, I’ve seen it help to lean into that behavior more explicitly, positioning it less as a tool you try and more as something you rely on whenever a new contract comes up.
Have you explored making that repeat use more visible or intentional in the experience?
Yeah, starting to notice that pattern - still early, just observing how it plays out for now.
Really interesting to see that behavior emerging! Happy to take a closer look at the flow if you’d like, sometimes mapping these repeat-use moments helps make them more intentional in the experience.
Could be interesting to see where users might hesitate or need extra guidance.
Appreciate it - will keep that in mind.
This is the best part of building — users always surprise you. What was the most unexpected feedback you got?
Probably how differently people interpret the same thing.
The "moment" product category is underrated. If your tool is tied to a recurring life event — signing a contract, starting a new week, hiring someone — retention mirrors how often that event happens. You don't need to engineer habit loops. The moment brings people back. Good early signal to catch.
Interesting point.
That’s a strong signal — I’ve seen the same when a tool solves a “moment” problem, people just come back naturally.
Repeat use > signups any day.
Yeah, seems that way.
This is such a valuable insight. Retention often reveals the real use case — users show you how they actually value your product, not how you imagined they would. It sounds like you accidentally built a workflow tool, not a one-time utility. Are you now thinking about features that support that repeat behavior more deliberately?
Still early - just watching how people use it for now.
The repeat usage signal is one of the most underrated early indicators. First-time usage tells you the pitch worked. Repeat usage tells you the product actually fits into how someone works.
I've seen this pattern before — people assume they're building something people will "try," but the users who stick around have quietly made it part of their workflow. They don't announce it. They just keep coming back.
The "moment of decision" framing is a useful way to think about it. Products that sit at decision points — where someone needs to act and wants a second opinion or a structure to work within — tend to get used more than products that sit in the "nice to have" category.
Curious whether you've thought about how to deliberately trigger that return — or whether you're waiting to see more of the pattern before designing around it.
Still mostly observing for now - want to understand the pattern better before trying to force anything.
The repeat usage part means the product is becoming part of how people make decisions, not just something they test once out of curiosity. That behavior is a very strong foundation to build on.
Yeah, that’s what it’s starting to look like - more tied to actual decisions than just curiosity.
This is such a valuable insight about building in public. The gap between what you assume users will do and what they actually do is where the real product lessons live. I've been building a lightweight memo app and had a similar surprise - I designed it as a quick capture tool, thinking people would use it once in a while. Turns out, the users who stuck around were using it multiple times daily as part of their core workflow, not just for occasional notes.
That repeat usage pattern you're describing is honestly one of the strongest signals you can get. It means the product is solving a recurring pain, not just a curiosity. It's also a great foundation for thinking about pricing - if people come back with every new contract, the value compounds over time.
Did you notice any patterns in how users discover this repeat use case? Like, do they come back on their own, or does something prompt them to return?
Still figuring that out, but so far it looks more like they come back naturally when a new contract shows up, not because of any prompt.
Great insight
Repeat usage isn’t just interest; it signals real value and trust. Looks like the product became part of the decision-making process, not just a one-time tool.
Yeah, that’s what it’s starting to feel like - less of a one-off tool, more part of the decision process.
The gap between what you think users will do and what they actually do is where every real product insight lives. Curious what surprised you most — was it how they used it or who was using it?
Probably how they used it - I expected one-time checks, but some keep coming back with new contracts.
that's actually the most valuable signal early on — repeat usage means the product solves something real, not just a one-time curiosity. the contract use case especially makes sense, people keep coming back because the stakes are real each time. how are you handling the onboarding for those returning users?
Still pretty simple at this stage.
that’s a strong signal - when people come back on their own, you know it’s real
keep going, you’re building the right thing 👍
That’s great to hear - really helpful feedback.
Glad it’s actually saving time before signing 👍