We all know the value of evergreen content. This long-term content is key to passive income: produce it once and reap the benefits for months and years to come.
However, the Internet isn't always kind to long-term content. It's inherently fragile: things move fast, links break, pages get rewritten.
Let's say you publish a long-form blog post with 60-ish external links. How do you know that in 3 months, 6 months, or 2 years, all of those links will still work?
What's more, how do you know those links still lead to the same content as when you first published? Maybe the "Top Regional Universities" page you linked to has transformed into "Top Regional Colleges" 6 months later - same URL but different content, which undermines your own content.
I am imagining a service that periodically crawls a site for external links and makes sure that (1) the links aren't broken, and (2) the links' content has not significantly changed since you linked to it.
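To make the idea concrete, here's a minimal sketch of those two checks in plain Python (standard library only). All names and the drift metric are illustrative assumptions on my part, not an existing product's API: check (1) is a simple HEAD request, and check (2) compares a stored text snapshot against the current page text with a similarity ratio.

```python
# Sketch of the two checks: (1) is the link broken, (2) has the
# content drifted since we snapshotted it. Function names and the
# drift threshold are hypothetical, for illustration only.
import difflib
import urllib.request


def is_link_alive(url: str, timeout: float = 10.0) -> bool:
    """Check (1): does the URL still resolve without an error status?"""
    try:
        req = urllib.request.Request(
            url, method="HEAD", headers={"User-Agent": "link-checker"}
        )
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status < 400
    except Exception:
        return False


def content_drift(snapshot: str, current: str) -> float:
    """Check (2): how much has the page text changed since the snapshot?

    Returns 0.0 (identical) through 1.0 (completely different).
    """
    return 1.0 - difflib.SequenceMatcher(None, snapshot, current).ratio()


# Example: the "Universities" -> "Colleges" rename from above would
# register as nonzero drift even though the URL is unchanged.
old = "Top Regional Universities - 2023 rankings"
new = "Top Regional Colleges - 2024 rankings"
drift = content_drift(old, new)
```

A real service would need more than this (rendering JavaScript pages, stripping boilerplate before diffing, tuning what counts as a "significant" change), but the core loop is roughly: store a snapshot per link, re-fetch on a schedule, and alert when a link dies or its drift crosses a threshold.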
Is this a pain point that bloggers, newsletter writers, and academic journals (anyone with long-term content online) actually encounter? Or am I imagining a problem they don't really care about?