Today's release makes our broken link checks better internet citizens.
Previously our crawler would crawl up to 100 pages on a site, making two simultaneous connections and starting a new request every quarter second. We noticed that we were sometimes getting 429 ("Too Many Requests") responses from some websites as we hit their rate limits.
With today's release we've added per-site settings to limit the number of pages we crawl, the number of concurrent requests we make, and the delay between each request.
This should improve the reliability of our crawler results and avoid placing too much load on our customers' websites.