
A complete guide to Programmatic SEO

Submitted to the SEO group on May 23, 2022

    I've got a question on programmatic SEO... how will it affect your overall website's link authority?

    I've been going through Brian Dean's Content-Led SEO course on Semrush, and in one of the sections he makes the point that it's better to have fewer high-quality content pages than thousands of long-tail content pages.

    The other big problem with the long-tail approach is that it dilutes your site's link authority, also known as PageRank.

    After all, for the long-tail keyword approach to work, you need to publish LOTS of pages, each optimized around a different long-tail term.

    And all those pages can really hurt your SEO.

    Why?

    It's simple: the more pages you have on your site, the more your link authority gets diluted across those pages.

    For example, at Backlinko right now, I've published relatively few blog posts. And as you've already seen, I rank for some very competitive keywords above sites with thousands of pages.

    And because each page on my site has a lot of concentrated link authority, those pages rank for super-competitive keywords. But if I had 10,000 pages, my link authority would be diluted across all of those 10,000 pages, and my site wouldn't rank nearly as well.

    In fact, back when I did SEO client work, I had a little secret weapon that would usually help improve my clients' rankings within days.

    What was that secret weapon? Deleting pages from their site.

    When I deleted pages, their link authority would be more concentrated on their important pages, and their rankings would improve.

    https://www.semrush.com/academy/courses/content-led-seo-course-with-brian-dean/choose-a-medium-tail-keyword
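
    The dilution argument above can be made concrete with a toy calculation. This is a simplified textbook-style PageRank power iteration, not Google's actual algorithm, and the "homepage plus N thin pages" site structure is hypothetical — it just shows how each page's share of link equity shrinks as pages multiply:

```python
# Toy illustration of link-authority dilution (NOT Google's real algorithm).
# A hub site: the homepage links to N internal pages, each linking back home.
# As N grows, each internal page's PageRank share shrinks.

def pagerank(links, damping=0.85, iters=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        # Every page gets a baseline share, plus damped shares from inlinks.
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            if not outs:
                continue
            share = rank[p] / len(outs)  # equity splits across outlinks
            for q in outs:
                new[q] += damping * share
        rank = new
    return rank

def hub_site(n_children):
    links = {"home": [f"p{i}" for i in range(n_children)]}
    for i in range(n_children):
        links[f"p{i}"] = ["home"]
    return links

small = pagerank(hub_site(10))     # lean site: 10 internal pages
large = pagerank(hub_site(1000))   # bloated site: 1,000 internal pages
print(small["p0"], large["p0"])    # per-page authority shrinks as pages multiply
```

    Real PageRank also depends on external backlinks, internal linking structure, and much more — but the mechanical point stands: with the same inbound authority, more pages means a thinner slice per page.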


      I completely agree with the point: having thousands of pages does dilute the site's link authority.

      And there are 2 ways to approach this problem:

      • Create thousands of pages targeting long-tail keywords on another domain or a subdomain, or
      • Keep creating lots of backlinks to the site to maintain the site's link authority (recommended)

      However, the most important thing is to weigh the losses and profits of creating pages targeting long-tail keywords.

      After creating thousands of pages, your rankings for highly competitive keywords might come down a bit and your site might get less traffic; but since long-tail keywords are often highly transactional, the trade-off might even bring you more customers.

      So all I would say is, experiment!

      Hope it helps.

      Thanks.


        Thanks for your reply.

        I've been thinking more on this... if you take a website like Zapier, it's clear that SEO is a big focus of their marketing.

        They do a lot of high-touch content SEO with their blog articles. E.g. if you search 'best calendar scheduling app', this is the #1 organic result: https://zapier.com/blog/best-calendar-apps/

        But they also do an enormous amount of programmatic SEO with terms like 'integrate Product X with Product Y' e.g. https://zapier.com/apps/woocommerce/integrations/xero or for specific zaps e.g. https://zapier.com/apps/gmail/integrations/xero/12415/send-gmail-messages-when-new-payments-are-received-in-xero
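
        The Zapier pattern above — one template, many pages generated from structured data — can be sketched in a few lines. This is a hypothetical illustration; the app names and URL structure below are made up for the example, not taken from Zapier's actual codebase:

```python
# Minimal sketch of programmatic page generation from structured data.
# App names and URL structure are illustrative assumptions only.

APPS = ["Gmail", "Slack", "Trello"]

def slugify(name: str) -> str:
    return name.lower().replace(" ", "-")

def integration_pages(apps):
    """Generate one landing page per ordered pair of apps."""
    pages = []
    for a in apps:
        for b in apps:
            if a == b:
                continue
            pages.append({
                "url": f"/apps/{slugify(a)}/integrations/{slugify(b)}",
                "title": f"Integrate {a} with {b}",
                "h1": f"Connect {a} + {b}",
            })
    return pages

pages = integration_pages(APPS)
print(len(pages))  # 3 apps -> 6 ordered pairs, i.e. 6 landing pages
```

        With a catalog of thousands of apps, this pairwise template is how a site plausibly ends up with 100,000+ pages — the SEO question is whether each generated page matches a real search intent.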

        So, I'm wondering: perhaps link authority is only negatively affected if your programmatic SEO strategy isn't creating value, e.g. creating thousands of pages targeting terms such as "UX design in Oak Valley, Cupertino".

        There's no real reason why locally optimised pages like that add tangible value to Google's search results. Unless the company actually resides in that location, UX design can easily be provided across cities, states or even countries, so as Google gets smarter it will probably penalise websites that use programmatic SEO in that way.

        However, when it comes to what Zapier is doing, their integration and individual zap pages do provide content that people are searching for. Those pages are legitimately relevant results because they capture the searcher's intent, and that's why Zapier isn't penalised for having probably over 100,000 programmatically created pages.

        Thoughts?
