4 Comments

How I built an SEO/GEO funnel and got traffic within 6 weeks from day 0.

Classic SEO is almost dead, while GEO (Generative Engine Optimization) is rapidly growing. I had a hypothesis about how to build a modern organic funnel targeting exactly my ideal customers, so I decided to test it.

Development & Strategy

I started in the OpenRouter playground, trying to figure out which model writes the most human-like, engaging, and fluff-free text. I played around with system instructions, tested different tasks, and compared models.
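A comparison setup like this can be sketched as follows. This is a minimal illustration, not my actual script: it builds one identical request per candidate model against OpenRouter's OpenAI-compatible chat endpoint, so the outputs are directly comparable.

```python
# Sketch: one identical chat-completion payload per candidate model,
# targeting OpenRouter's OpenAI-compatible endpoint. The model IDs you pass
# in are whatever you want to compare; sampling settings are held constant
# so differences come from the model, not the config.
OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_payloads(models, system_prompt, task):
    """Build identical chat-completion payloads for each candidate model."""
    return [
        {
            "model": model,
            "messages": [
                {"role": "system", "content": system_prompt},
                {"role": "user", "content": task},
            ],
            "temperature": 0.7,  # same sampling settings for a fair test
        }
        for model in models
    ]
```

Each payload is then POSTed with your API key, and the responses are judged side by side on the same writing task.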

Six weeks ago, the most capable LLM for this specific task was Gemini 3 Pro, with Claude 4.5 taking second place. (Side note: I re-tested them yesterday, and the new Claude 4.6 is now the absolute winner in writing).

The next quest was technical: producing perfectly structured SEO/GEO markup (OGP, meta tags, Schema.org JSON-LD). I spent a whole day building the markup pipeline, then wrote a script to generate and publish posts programmatically. I scaled velocity gradually: 1-3 posts/day in week one, 3-9 in week two, 5-13 in week three, and now 11-19 posts per day.
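The markup step boils down to emitting a consistent head section per post. Here's a minimal sketch (field names follow Schema.org and the Open Graph protocol; the helper itself is illustrative, not my exact pipeline):

```python
# Sketch: render the <head> fragment for one generated post: title,
# description meta, OGP tags, and an Article JSON-LD block. Every post gets
# the same structure, which is what makes the markup "perfectly consistent"
# across hundreds of programmatic pages.
import json

def render_head(title, description, url, published):
    jsonld = {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": title,
        "description": description,
        "url": url,
        "datePublished": published,
    }
    return "\n".join([
        f"<title>{title}</title>",
        f'<meta name="description" content="{description}">',
        f'<meta property="og:title" content="{title}">',
        f'<meta property="og:description" content="{description}">',
        f'<meta property="og:url" content="{url}">',
        '<script type="application/ld+json">',
        json.dumps(jsonld, indent=2),
        "</script>",
    ])
```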

First Stats

Google has already indexed 283 of almost 400 pages, with only a 2% "Crawled - Currently Not Indexed" rate — a fantastic result for programmatic AI content.

The first traffic started trickling in after 3 weeks. Now I consistently get 2-5 clicks/day from traditional search engines plus 1-3 clicks/day from AI engines (Perplexity / ChatGPT users). Keep in mind, my domain has basically zero authority (DR<1).

Conclusion

The numbers are small right now, but the pipeline works. I've ordered some backlinks this week, and a higher DR should lead to higher positions and better citation rates in AI answers. For an automated experiment, I consider this a massive win.

What's Next?

I'm currently building Achiv.com - a tool that finds high-intent leads and aggregates their actual pain points and objections from places like Reddit.

With that data in hand, the next logical step is to merge the two systems. I want to build a fully automated SEO/GEO writer agent (with API and webhook integrations) that drives organic traffic by writing content targeted at the exact aggregated pain points of your audience. No keyword guessing — just answering the real questions people are asking right now.

on March 9, 2026
  1.

    The 2% 'Crawled - Currently Not Indexed' rate on programmatic AI content from a DR<1 domain is genuinely impressive. Google has gotten stricter about crawling and indexing thin programmatic content, so that ratio suggests the structured markup + quality-first approach is working.

    The GEO angle is interesting — showing up in Perplexity/ChatGPT answers is a different optimization target than traditional SERP ranking. The citation-worthy content framing (structured, factual, canonical) overlaps with good SEO practice but the signals Perplexity responds to seem more focused on entity clarity and structured schemas.

    Your hypothesis about merging the pain-point aggregation (Achiv) with the automated writer is the right next step. The gap in most SEO tools is that they work from keyword data, not from actual user language. When your content uses the exact words people use to describe their problem, you get much higher intent match — both in search and in AI citations.

    Worth testing: whether FAQ/Q&A structured content outperforms article format for GEO specifically. Most AI engines seem to pull from conversational Q&A formats more reliably than long-form prose.
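    For concreteness, the FAQPage schema that AI engines tend to pick up looks roughly like this (placeholder content, abbreviated to one question):

    ```json
    {
      "@context": "https://schema.org",
      "@type": "FAQPage",
      "mainEntity": [
        {
          "@type": "Question",
          "name": "Does programmatic AI content get indexed?",
          "acceptedAnswer": {
            "@type": "Answer",
            "text": "It can, if pages are structured, factual, and listed in the sitemap."
          }
        }
      ]
    }
    ```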

    1.

      Thanks for such detailed feedback.

  2.

    Curious about one thing you didn't mention: what does your robots.txt look like for AI crawlers? You're getting Perplexity and ChatGPT traffic, so presumably you're not blocking their bots. But a lot of site owners do this accidentally: they block GPTBot to opt out of training, and in the process also block OAI-SearchBot (the one that indexes you for ChatGPT search results). Same story with ClaudeBot vs Claude-SearchBot.
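    For anyone auditing their own setup, the pattern looks roughly like this (verify the bot names against each vendor's current crawler docs before relying on them):

    ```text
    # Allow the search/index bots that surface you in AI answers
    User-agent: OAI-SearchBot
    Allow: /

    User-agent: Claude-SearchBot
    Allow: /

    User-agent: PerplexityBot
    Allow: /

    # Optionally block the training-only crawlers
    User-agent: GPTBot
    Disallow: /

    User-agent: ClaudeBot
    Disallow: /
    ```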

    The 2% crawled-not-indexed rate at DR<1 is really solid. Most sites I've looked at have way worse ratios, and they're usually missing basic stuff: canonical mismatches, not in their own sitemap, thin content. The fact that you nailed the structured markup from day one probably saved you from the quality filter that's catching everyone else post-June 2025 update.

    The scaling cadence (1-3 -> 11-19 posts/day) is smart too. Ramping gradually instead of dumping 400 pages at once avoids the "sudden programmatic content" signal that gets sites sandboxed.

    1.

      Nice catch about robots.txt. I'll add one more test case to onboarding.
