
Development Update 12

Today's release makes our broken link checks better internet citizens.

Previously our crawler would fetch up to 100 pages on a site, making two simultaneous connections every quarter of a second. We noticed we were sometimes getting 429 ("Too Many Requests") responses from some websites as we hit their rate limits.
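When a site does return a 429, the polite thing to do is honour its Retry-After header before trying again. A minimal sketch of how a crawler might parse that header (the helper name is ours; only the stdlib is used):

```python
# Hedged sketch: parse a Retry-After header, which per RFC 9110 may be
# either a number of seconds or an HTTP-date.
import datetime
from email.utils import parsedate_to_datetime

def retry_after_seconds(headers: dict, default: float = 60.0) -> float:
    """Return how long to back off, falling back to a default wait."""
    value = headers.get("Retry-After")
    if value is None:
        return default
    try:
        # Simple form: an integer number of seconds, e.g. "120".
        return float(value)
    except ValueError:
        pass
    try:
        # Date form: wait until the given HTTP-date.
        when = parsedate_to_datetime(value)
        now = datetime.datetime.now(datetime.timezone.utc)
        return max((when - now).total_seconds(), 0.0)
    except (TypeError, ValueError):
        return default
```
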

With today's release we've added per-site settings to limit the number of pages we crawl, the number of concurrent requests we make and the time between each request.
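Conceptually, the new limits look something like this sketch (names like `CrawlSettings` and `max_pages` are illustrative, not our actual schema):

```python
import time
from dataclasses import dataclass

@dataclass
class CrawlSettings:
    # Hypothetical per-site limits; defaults mirror the old global behaviour.
    max_pages: int = 100         # cap on pages crawled per site
    max_concurrency: int = 2     # simultaneous connections
    min_delay: float = 0.25      # seconds between requests

DEFAULTS = CrawlSettings()

def settings_for(site: str, overrides: dict) -> CrawlSettings:
    """Return a site's custom settings, falling back to the defaults."""
    return overrides.get(site, DEFAULTS)

class Throttle:
    """Ensure at least `min_delay` seconds pass between dispatches."""
    def __init__(self, min_delay: float):
        self.min_delay = min_delay
        self._last = 0.0

    def wait(self) -> None:
        now = time.monotonic()
        sleep_for = self._last + self.min_delay - now
        if sleep_for > 0:
            time.sleep(sleep_for)
        self._last = time.monotonic()
```

A fragile site could then be given, say, `CrawlSettings(max_pages=20, max_concurrency=1, min_delay=1.0)` while everything else keeps the defaults.
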

Hopefully this improves the reliability of our crawler results and avoids placing too much load on our customers' websites.
