I’ve been experimenting with a fairly algorithmic SEO workflow for Lodo, the UK fintech startup I’m building (currently automated switching for mobile, broadband, and energy).
After ~12 days of publishing, the site is running at roughly a 50k impressions/month rate, extrapolated from the last ~48h of Search Console data.
It’s still early: rankings are volatile and clicks are modest, but the indexing velocity surprised me.
The workflow
The goal was to build something between manual content and programmatic SEO.
Instead of thousands of thin pages, the system produces structured comparison posts targeting decision-stage keywords.
A curated keyword CSV from Semrush, scored by:
Each batch pulls:
Claude generates a focused angle per keyword to avoid generic “ultimate guide” content.
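The angle step is essentially prompt construction. Here's a minimal sketch of what that could look like; the function name and prompt wording are my own illustration, not the actual pipeline's prompt:

```python
def build_angle_prompt(keyword: str) -> str:
    """Build a prompt asking an LLM for one specific editorial angle.

    Hypothetical sketch -- the real pipeline's prompt differs.
    """
    return (
        f"You are planning a comparison article for the keyword: '{keyword}'.\n"
        "Propose one specific, decision-stage angle for this article.\n"
        "Avoid generic 'ultimate guide' framings; focus on a concrete\n"
        "comparison or decision the reader is trying to make.\n"
        "Reply with a one-sentence angle only."
    )

prompt = build_angle_prompt("best sim only deals uk")
```

The key constraint is the explicit instruction against generic framings, which is what keeps each post differentiated.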
Perplexity gathers facts, comparison points, FAQs, and sources.
Claude generates structured HTML posts with:
Articles are generated as JSON, uploaded to S3, validated, deduplicated, and compiled for the site to read.
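The validate-and-dedupe step can be sketched roughly like this. The field names (`slug`, `title`, `html`) are assumptions for illustration; the real JSON schema isn't shown here:

```python
import hashlib

def validate_article(article: dict) -> bool:
    """Minimal structural check before an article is compiled for the site.

    Field names are hypothetical -- adapt to your own JSON schema.
    """
    required = ("slug", "title", "html")
    return all(article.get(k) for k in required)

def dedupe_articles(articles: list[dict]) -> list[dict]:
    """Drop articles whose slug (or, failing that, body hash) was already seen."""
    seen, unique = set(), []
    for article in articles:
        key = article.get("slug") or hashlib.sha256(
            article.get("html", "").encode()
        ).hexdigest()
        if key in seen:
            continue
        seen.add(key)
        unique.append(article)
    return unique
```

Deduplicating on slug first, then falling back to a content hash, catches both accidental re-uploads and near-identical regenerated posts with different slugs removed upstream.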
Still watching how this evolves as rankings stabilise and whether impressions convert into clicks.
Happy to share more detail on the pipeline, or even the codebase, if people are interested.
One more important note: spend some time brainstorming with Cursor plus another LLM on how to maximise your internal website structure (breadcrumbs, metadata, internal linking) so Google likes it. This matters a lot!
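For the breadcrumbs part specifically, Google reads schema.org BreadcrumbList markup as JSON-LD. A minimal generator sketch (the trail contents are made up; adapt names and URLs to your own site):

```python
import json

def breadcrumb_jsonld(trail: list[tuple[str, str]]) -> str:
    """Render a schema.org BreadcrumbList as a JSON-LD string.

    `trail` is a list of (name, url) pairs from the site root to the page.
    """
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
            {
                "@type": "ListItem",
                "position": i + 1,  # positions are 1-based per the spec
                "name": name,
                "item": url,
            }
            for i, (name, url) in enumerate(trail)
        ],
    })

markup = breadcrumb_jsonld([
    ("Home", "https://example.com/"),
    ("Broadband", "https://example.com/broadband/"),
])
```

The resulting string goes into a `<script type="application/ld+json">` tag in the page head.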
I’m also keen to explore how to 'double down' on the winning pages, the ones driving the most impressions and clicks.
You can see the output here: