6,658 API calls. 75 users. ~$80-90 in revenue. 9 days.
Those are the real numbers from my Korean data scraping side project. Not life-changing money, but it's the first dollar I've ever earned from code I built — and it taught me more about indie hacking than any course ever could.
The Idea: Why Korean Data?
I live in Seoul and noticed something: there's almost no structured data tooling for Korean platforms. Naver (Korea's Google), Melon (Korea's Spotify), Musinsa (Korea's biggest fashion marketplace) — all have data that businesses desperately need, but no easy way to get it.
So I built scrapers for all of them. 13 in total, over about a month.
The Stack
The PPE model was the key decision. Instead of charging monthly subscriptions, users pay per API call — typically $0.005-0.05 per result. This means zero friction to try, revenue scales with usage, and I don't need to build payment infrastructure.
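Back-of-the-envelope, the post's own totals let you recover the blended per-result price. This is a minimal sketch of the PPE revenue arithmetic, using the midpoint of the stated $80-90 revenue; the per-result prices quoted ($0.005-0.05) vary by scraper, so the blended figure is an average, not any single listing's price.

```python
def ppe_revenue(calls: int, price_per_result: float) -> float:
    """Revenue under a pay-per-event model: no subscription, pure usage."""
    return calls * price_per_result

# The post's 9-day totals: 6,658 calls and roughly $80-90 in revenue.
# Midpoint revenue / total calls gives the blended price per result.
blended = 85 / 6658
print(f"blended price per result: ${blended:.4f}")   # about $0.0128
print(f"revenue at 6,658 calls:  ${ppe_revenue(6658, blended):.2f}")
```

The blended price landing near the low end of the $0.005-0.05 range suggests most volume goes to the cheaper, high-frequency scrapers.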
The Numbers (Day 9)
Revenue kicked in on March 13 when Apify activated PPE billing. My last scraper (Musinsa, Korean fashion) goes live March 25.
What I Got Right
Niche selection. Korean data is underserved. Most of my scrapers have zero direct competitors on Apify. When someone searches "naver scraper," they find me.
Platform leverage. Apify's marketplace brings organic traffic. I didn't do any marketing in the first week — users found me through search. 60% of early traffic was organic.
Volume over perfection. I shipped 13 scrapers instead of perfecting one. Some get 3 runs a day. Others get 400+. By casting a wide net, I found the winners: naver-news and naver-place account for 72% of all runs.
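The "cast a wide net, then find the winners" step is just a run-count ranking. A sketch with hypothetical per-scraper daily run counts (only the top-two share is modeled after the post's "72% of all runs"; every individual number here is made up for illustration):

```python
# Hypothetical daily run counts per scraper; chosen so the top two
# account for 72% of runs, matching the post's reported share.
runs = {
    "naver-news": 400, "naver-place": 320,
    "melon-charts": 120, "musinsa-search": 80,
    "naver-blog": 50, "korean-webtoon": 18, "korean-books": 12,
}
total = sum(runs.values())
winners = sorted(runs, key=runs.get, reverse=True)[:2]
share = sum(runs[w] for w in winners) / total
print(winners, f"{share:.0%}")   # ['naver-news', 'naver-place'] 72%
```

The point of shipping 13 instead of 1 is exactly this: you can't rank a distribution you haven't sampled.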
What I Got Wrong
Marketing is harder than building. I spent 95% of my time coding and only 5% on distribution. That ratio should have been flipped much sooner.
Not all scrapers are equal. My Korean webtoon and book scrapers get almost no usage. I should have validated demand before building all 13.
Documentation > features. The scrapers that take off aren't the most sophisticated — they're the ones with the best README and examples.
What's Next
The Honest Take
$80 in 9 days isn't quit-your-job money. But: it runs itself (no code changes since launch), it compounds (each new channel expands the audience), and the niche is defensible (Korean language + platform knowledge is a real moat).
The hardest part wasn't building 13 scrapers. It was accepting that the builder's work was done, and the marketer's work had just begun.
All 13 scrapers: apify.com/oxygenated_quagmire | MCP Server: github.com/leadbrain/korean-data-mcp
this is the most honest first-revenue post i've seen on here. the "95% coding / 5% distribution" ratio hit hard because we're living that exact mistake right now. we built an api with 4 endpoints (seo analysis, speed checking, tech detection, link scanning) and like 21 gumroad products and have made exactly $0 because we spent all our time building and none of it getting in front of people.
the documentation > features point is huge too. we noticed the same thing — our api technically works great but nobody finds it because the docs are buried and there's zero content pointing to it. meanwhile some random "10 free seo tools" listicle outranks everything.
your niche moat is legit though. korean platform data with zero competitors on apify is way more defensible than what we're doing in the crowded seo/dev tools space (search vemtrac on gumroad to see what i mean). congrats on the first dollar, that's a real milestone.
The documentation insight is the one that stood out most here. It seems obvious in retrospect but most builders miss it: the README is your sales page. Someone lands on your Apify listing and decides in 30 seconds whether to trust the tool. Documentation isn't a nice-to-have. It's the product.
The 95% coding / 5% distribution ratio is the trap nearly everyone falls into, me included. The fix isn't necessarily spending less time building. It's making the marketing happen in parallel. Logging the build as content, answering questions in the communities where your users already are, being visible in the niche before you have something to sell.
Before you built scrapers 2 through 13, was there any signal from scraper 1 that made you confident the model worked? Or did you batch-build the full set before seeing any real traction?