8 Comments

I built a document analytics tool because I was tired of sending proposals into silence. Here is what I learned.

Every salesperson knows this feeling. You spend hours preparing a proposal, you hit send, and then nothing. Complete silence. You have no idea if they opened it, which parts they read, whether they forwarded it to someone else, or if it landed in their spam folder.

That silence is not just uncomfortable. It costs deals.

I built DocMetrics to fix that. It is a document analytics platform that tells you exactly what happens after you hit send. Who opened it. Which pages they read. How long they spent on each one. Whether they came back. Whether someone else at the company opened it too.

But here is what I discovered while building it. Raw metrics are not the real product. A founder here told me something that stuck — the value is not in showing the numbers. It is in turning those signals into deal judgment. Is this deal moving. Who is really involved. What does the silence mean. What should I do right now.

So I built that interpretation layer too. DocMetrics now generates a plain English summary per prospect that answers those exact questions automatically and sends it to your Gmail, Slack, HubSpot, or Microsoft Teams the moment something meaningful happens.

What it does today

- Per-viewer re-read detection across sessions — know exactly which pages each prospect keeps returning to
- Dead-deal scoring — combines multiple signals to tell you if a deal is alive or dying before you waste more time on it
- Gone-silent detection — fires after inactivity and tells you what the silence probably means
- AI-style deal intelligence per prospect — one plain-English paragraph with a clear recommended action
- Real-time viewer presence — see who is reading your document right now
- E-signature tracking with hesitation detection
- Heatmaps, page engagement charts, location tracking, bounce analytics

The hard, honest truth I learned

Building was not the hard part. Distribution is. I am still climbing that mountain. I have the product. I do not yet have the audience. That is the real work now.

If you send proposals, pitch decks, contracts, or any document where you need to know what happens after send — I would genuinely love for you to try it. Free to start.

Three questions for this community

When you send a proposal and get silence — what do you do? How long do you wait before following up?
What is the one signal that would tell you immediately whether a deal is worth chasing?
If your document analytics tool could tell you one thing it currently does not — what would that be?
Happy to answer anything about how I built it, what worked, what failed, and where I am headed. This community has already shaped the product through feedback I did not even ask for directly. That means a lot.

docmetrics.io · free signup · no credit card

posted to Startups on May 4, 2026
  1.

    This is such a relatable problem for anyone in sales. I actually know a few B2B sales professionals personally, and they'd probably be willing to answer your questions about their proposal process for free if you want.

  2.

    The interpretation layer is the right call; most founders would have shipped the metrics and stopped there. The question I'd want answered that raw data can't tell me: is the silence because they're busy, because they're shopping alternatives, or because the deal is dead? Those require very different next moves.

    1.

      That is exactly the distinction that took me the longest to figure out how to detect reliably.
      What I found is that busy silence and shopping silence actually leave different patterns in how someone interacts with the document before they go quiet.
      Busy silence looks like this. Strong first session, good completion rate, sometimes a return visit, then nothing for a few days. The engagement was genuine but life got in the way. These prospects almost always respond well to a single well-timed follow-up that gives them an easy next step.
      Shopping silence looks different. Multiple short sessions, low time on pricing or commitment pages, sometimes a second anonymous viewer appearing who you cannot identify. They are comparing you against someone else and do not want to show their hand.
      Dead deal silence is the clearest of all. One short session, low completion, no return. The document did not land and they moved on without telling you.
      The system I built assigns a different recommended action to each pattern automatically. Busy gets a gentle nudge. Shopping gets a differentiation message. Dead gets a complete reframe or a parking decision.
      Still not perfect but it is a lot better than guessing.
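
      As a rough sketch, those three patterns could be told apart from session data something like this. The field names and thresholds here are invented for illustration, not the actual DocMetrics implementation:

```python
from dataclasses import dataclass

@dataclass
class Session:
    duration_sec: int        # time spent in this session
    completion: float        # fraction of pages viewed, 0.0-1.0
    pricing_time_sec: int    # time spent on pricing/commitment pages
    anonymous_viewer: bool   # an unidentified second viewer appeared

def classify_silence(sessions: list[Session]) -> str:
    """Classify post-send silence as 'busy', 'shopping', or 'dead'."""
    if not sessions:
        return "dead"
    first = sessions[0]
    # Dead-deal silence: one short, shallow session and no return.
    if len(sessions) == 1 and first.completion < 0.5 and first.duration_sec < 120:
        return "dead"
    # Shopping silence: several short sessions, little time on pricing,
    # sometimes an anonymous second viewer who won't show their hand.
    short = [s for s in sessions if s.duration_sec < 120]
    low_pricing = all(s.pricing_time_sec < 30 for s in sessions)
    anon = any(s.anonymous_viewer for s in sessions)
    if len(sessions) >= 3 and ((len(short) >= 2 and low_pricing) or anon):
        return "shopping"
    # Busy silence: a strong, genuine first session, then life got in the way.
    if first.completion >= 0.7:
        return "busy"
    return "dead"

# Each pattern maps to its own recommended action.
ACTIONS = {
    "busy": "send one well-timed follow-up with an easy next step",
    "shopping": "send a differentiation message",
    "dead": "reframe completely or park the deal",
}
```

      In practice the real signal extraction is messier than fixed thresholds, but the shape of the logic is the same: classify first, then attach a different action to each class.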

  3.

    This is such a painfully relatable problem — you do all the work of crafting a proposal, send it out, and then you’re left guessing in silence. Most people assume the issue is follow-ups or pricing, but in reality you’re completely blind to intent once the document leaves your hands, and that’s where deals quietly die. What I find interesting about document analytics tools is that the real value isn’t just “who opened it,” it’s spotting the subtle signals — where attention drops, what gets re-read, and when interest starts fading before the prospect ever replies. That gap between sending and understanding is where most sales energy gets wasted.

    1.

      You just described exactly what I spent months trying to solve and you articulated it better than I have in most of my own writing.

      The blind spot is not just who opened it. You are right that the real signals are the subtle ones. A prospect who reads your proposal once and never returns is very different from one who keeps coming back to the same page three times across three different sessions. The first might be politely ignoring you. The second is almost certainly trying to build an internal case and getting stuck somewhere.

      What I found while building this is that silence after a proposal almost always means one of two things. Either genuine disengagement, which looks like a low completion rate, a single short session, and no return. Or internal friction, which looks like multiple sessions, re-reads on specific pages, and sometimes a second person from the same company quietly opening the document days later.

      Those two types of silence require completely different responses. The first needs a different approach entirely. The second just needs you to help your contact sell it internally.

      DocMetrics separates those two automatically and tells you in plain English which one you are dealing with and what to do next. Would love for you to try it and tell me honestly where it falls short.

      1.

        That distinction between “low engagement silence” vs “internal friction silence” is exactly the gap most tools miss — and honestly that’s where the real decision-making happens, not in the opens or clicks but in the pattern of return behavior across time.
        What I find interesting is that once you classify silence correctly, the next bottleneck becomes action timing — most people still follow up either too early on genuine disengagement or too late when internal momentum has already died. The interpretation layer you built feels like it's heading in the right direction because it turns raw behavior into a decision window instead of just another dashboard. I'd be curious how you're weighting "multi-view + multi-person activity" vs "repeat single-user engagement" in your scoring model — that's usually where most false positives creep in.

        1.

          The action timing point is sharp, and honestly it is the part I am still refining. Identifying the silence correctly is step one, but the decision window is narrow in both directions. Too early and you interrupt momentum that was building internally. Too late and the energy has already dissipated before you re-enter the conversation. Right now the system fires follow-up signals based on recency combined with pattern, not just activity volume, which helps, but it is not perfect yet.
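
          To make the recency-plus-pattern gating concrete, here is an illustrative sketch. The window boundaries are invented for the example, not the system's real tuning:

```python
# Hypothetical follow-up windows per silence pattern, in days since
# the prospect's last activity. Values are illustrative only.
FOLLOW_UP_WINDOW = {
    "busy": (3, 7),       # nudge after a few days, before energy fades
    "shopping": (1, 4),   # move fast while they are comparing options
    "dead": (7, 14),      # wait, then reframe or park the deal
}

def should_follow_up(pattern: str, days_silent: int) -> bool:
    """Fire a follow-up signal only inside the pattern's decision window.

    Too early interrupts momentum building internally; too late and
    the energy has already dissipated.
    """
    lo, hi = FOLLOW_UP_WINDOW[pattern]
    return lo <= days_silent <= hi
```

          The point of keying the window to the pattern rather than to a single global delay is exactly the failure mode described above: one fixed follow-up cadence is simultaneously too early for some deals and too late for others.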

          On the weighting question — this is exactly where the model gets nuanced.

          Multi-person activity from the same company gets weighted more heavily as a positive signal than repeat single user engagement alone. The reasoning is that a second person opening the document almost always means the first person is actively advocating internally which is a structural signal not just a behavioral one. It means the deal has moved beyond your initial contact.

          Repeat single-user engagement without a second viewer gets interpreted differently depending on the pattern. If the same person returns three times and re-reads the same page each time, that reads as hesitation or confusion on that specific section, not necessarily buying intent. If they return three times and progress further each visit, that reads as genuine momentum building.

          The false positive risk you are pointing at is real. A single highly engaged user can look identical to multi-person early stage interest in raw numbers. The way I separate them is by combining session depth with viewer count. High depth single viewer gets flagged as warm with a hesitation note. Any multi-viewer activity regardless of depth gets flagged as hot with an internal review note.
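
          A minimal sketch of that separation, with invented names and thresholds (not the production scoring model):

```python
def score_deal(viewer_count: int, sessions: int, max_depth: float,
               same_page_rereads: int) -> tuple[str, str]:
    """Combine viewer count with session depth so one highly engaged
    reader is not mistaken for multi-person interest.

    Returns (temperature, note). Thresholds are illustrative only.
    """
    # Any second viewer from the same company is a structural signal:
    # the first contact is advocating internally.
    if viewer_count >= 2:
        return "hot", "internal review underway"
    # Single viewer re-reading the same page reads as hesitation,
    # not buying intent.
    if sessions >= 3 and same_page_rereads >= 2:
        return "warm", "hesitation on a specific section"
    # Single viewer progressing deeper each visit: genuine momentum.
    if sessions >= 3 and max_depth >= 0.8:
        return "warm", "momentum building"
    return "cool", "monitor"
```

          The key design choice is that viewer count dominates: multi-viewer activity is flagged hot regardless of depth, while a single viewer can only ever reach warm, with a note explaining which single-viewer pattern was seen.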

          That said I would genuinely value your perspective on where that model breaks down. You clearly have experience with this specific problem.

          1.

            We are an experienced agency; we also offer professional consultations to businesses.
