Hey Indie Hackers,
Like many of you, I’ve been shipping incredibly fast using Lovable. It’s a game-changer for prototyping, but I hit a massive technical wall early on: Google simply wasn't indexing my pages.
Because Lovable lacks a native CMS and has limited metadata controls, these AI-generated apps are often "invisible" to search crawlers right out of the box. After digging into the infrastructure, I realized that without server-side rendering (SSR) or a way to serve prerendered HTML, we’re essentially building high-speed ghost towns.
I decided to build a dedicated SEO layer to solve this—not just for my projects, but as a repeatable infrastructure for anyone building in this ecosystem.
The approach I took:
DNS-Level Proxy: Serving prerendered HTML to crawlers while human visitors still get the fast client-side React app.
Automated /blog: Adding a real content layer to rank for topic clusters without a custom backend.
Instant Indexing: Forcing discovery by Google and AI search engines by fixing the metadata bottleneck.
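For anyone curious what the proxy step looks like under the hood, here's a minimal sketch of the core decision: detect crawlers by User-Agent and route them to static HTML. To be clear, this is illustrative only — the function names and the exact bot list are my own assumptions for the example, not Lovable's or lovableseo.ai's actual API.

```javascript
// Illustrative sketch: User-Agent-based crawler detection, the kind of check
// a prerendering proxy runs before deciding which response to serve.
// The pattern below is a sample, not an exhaustive crawler list.
const BOT_PATTERN =
  /googlebot|bingbot|duckduckbot|yandex|baiduspider|gptbot|perplexitybot|facebookexternalhit|twitterbot|linkedinbot/i;

// True when the request's User-Agent looks like a search or AI crawler.
function isCrawler(userAgent) {
  return BOT_PATTERN.test(userAgent || "");
}

// Routing decision: crawlers get prerendered static HTML,
// human visitors get the normal client-side React shell.
function chooseResponse(userAgent) {
  return isCrawler(userAgent) ? "prerendered-html" : "spa-shell";
}
```

Same content either way, so this stays on the right side of Google's guidance on serving crawlers — it's rendering strategy, not cloaking.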
It takes about 5 minutes to set up and requires zero changes to your Lovable codebase. We are currently in pre-launch, and I’d love to get feedback from other founders who have struggled with indexing their AI apps.
Check it out here: https://lovableseo.ai/
How are you guys currently handling SEO for your AI-generated sites? Would love to hear your workarounds.