My first post here was called "From 0 to $80 in 9 Days." That was D+9.
Today is D+17. The counter just crossed 11,800.
| Metric | D+9 (first post) | D+17 (today) |
|--------|-----------------|--------------|
| Total runs | 6,658 | 11,800+ |
| Estimated revenue | ~$80-90 | ~$140-160 |
| External users | 75 | 90+ |
| Actors earning | 12/13 | 13/13 |
The numbers move slower now. In the first week it felt like a rocket. Now it's more like a conveyor belt — steady, predictable, not exciting.
That's actually the good news.
1. One actor carries 72% of the load.
naver-news-scraper alone accounts for 8,483 of the 11,800 runs. The next closest is naver-place-search at 1,133. If news goes down, my revenue goes down.
I've been writing about this concentration risk on Dev.to, and it's real. But concentration also means I know exactly what users want — Korean news monitoring is clearly the killer use case.
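The 72% figure follows directly from the run counts above. A quick sketch of the distribution — only the top two actors are named in this post, so the "other" bucket is just the remainder:

```python
# Run counts per actor, using the figures from this post; the
# "other" bucket is the remainder across the 11 unnamed actors.
runs = {
    "naver-news-scraper": 8483,
    "naver-place-search": 1133,
    "other (11 actors)": 11800 - 8483 - 1133,  # 2184
}

total = sum(runs.values())  # 11800
for actor, n in sorted(runs.items(), key=lambda kv: -kv[1]):
    print(f"{actor:>22}: {n:>6} runs  ({n / total:.0%})")
```

8,483 / 11,800 ≈ 72%, with the runner-up at about 10% — that gap is the whole risk.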
2. The day/night pattern is a gift.
Traffic drops to near-zero overnight (Seoul time), then surges at 9am KST when Korean businesses open. This pattern has held for over 2 weeks straight. It tells me these are real pipelines, not one-off experiments. Someone automated this.
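A minimal sketch of how that day/night pattern shows up if you bucket run timestamps by hour of day in Seoul time. The log format here (UTC ISO-8601 strings) is an assumption for illustration, not the platform's actual export:

```python
from collections import Counter
from datetime import datetime, timedelta, timezone

KST = timezone(timedelta(hours=9))  # Seoul time, UTC+9

def runs_by_kst_hour(timestamps):
    """Bucket run timestamps (UTC ISO-8601 strings) by KST hour of day."""
    hours = Counter()
    for ts in timestamps:
        dt = datetime.fromisoformat(ts).astimezone(KST)
        hours[dt.hour] += 1
    return hours

# Toy data: two runs right at the 9am KST surge, one overnight run.
sample = [
    "2025-01-10T00:05:00+00:00",  # 09:05 KST
    "2025-01-10T00:30:00+00:00",  # 09:30 KST
    "2025-01-10T17:00:00+00:00",  # 02:00 KST the next day
]
print(runs_by_kst_hour(sample))
```

A real pipeline shows up as a sharp spike in the 9–18 KST buckets and near-empty buckets overnight, holding day after day.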
3. Marketing matters more than more scrapers.
I spent the first 3 weeks building. Since then I've been writing about it — Dev.to series (30 posts), this community, Reddit attempts that mostly got filtered. The writing is generating more signal than any new feature would.
AmandaBrown asked in my first IH post: "Was there a signal from scraper 1 before you built the rest?" The answer is: barely. I built in the dark and got lucky that the demand existed. Now I see the signal. I'd build differently if I started over.
The goal isn't to 10x the runs. It's to diversify so no single actor is 72% of everything.
If you're building niche data tools: the demand is quieter than the hype categories, but it's real and sticky. My users don't unsubscribe. They automate and forget I exist. That's the best kind of product.
Full breakdown: https://dev.to/sessionzero_ai/10000-runs-in-13-days-not-a-spike-a-baseline-4849
---

D+9 to D+17 and the curve is still holding — that's the best kind of update to read. The 90+ external users on a niche API in 17 days is the part worth paying attention to. Most tools take months to find their first cluster of real users outside the builder's own network.
Curious what the D+17 distribution looks like — are those 90 users concentrated in one use case or spread across a few different ones? That split tends to determine whether the next phase is doubling down on one segment or staying general.
---

"my users don't unsubscribe. they automate and forget i exist" — that's the dream metric right there. recurring revenue where churn is basically zero because you're embedded in someone's pipeline.
i built a niche API too — SEO site scanner. checks title tags, meta descriptions, alt text, heading hierarchy, page speed, schema markup. scores out of 100. runs on a $0/mo linux server.
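A toy version of a couple of those checks using only Python's stdlib `html.parser` — the class name, check set, and logic here are illustrative, not the commenter's actual scanner or its scoring:

```python
from html.parser import HTMLParser

class SEOCheck(HTMLParser):
    """Tiny subset of the checks mentioned above: <title> text,
    a meta description, and alt text on images."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.has_meta_desc = False
        self.imgs_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.has_meta_desc = bool(attrs.get("content"))
        elif tag == "img" and not attrs.get("alt"):
            self.imgs_missing_alt += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

checker = SEOCheck()
checker.feed('<html><head><title>Hi</title></head>'
             '<body><img src="a.png"></body></html>')
print(checker.title, checker.has_meta_desc, checker.imgs_missing_alt)
# prints: Hi False 1
```

heading hierarchy and schema markup would need a bit more state tracking, but the shape is the same: one pass over the HTML, tally issues, map the tallies onto a score.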
the concentration risk point hits home. i'm in the opposite situation — no single user dominates because i have no users yet. but the lesson is the same: when one thing works, double down on understanding WHY it works before trying to diversify.
your day/night pattern analysis is smart. the fact that traffic correlates with korean business hours means these aren't hobbyists — they're production systems. that's the stickiest kind of customer.
curious about the RapidAPI listing — are you planning a free tier to get people testing, then paid tiers for volume? that's the approach i'm considering for my SEO API.