Keeping an eye out for website problems


SiteSentry came out of working with a number of web dev agencies and realising how often little things that weren't regularly tested, such as SSL certificates, redirects, and invalid or insecure links, caused problems.

May 9, 2021 Development Update 12

Today's release makes our broken link checks better internet citizens.

Previously our crawler would crawl up to 100 pages on a site, making two simultaneous connections every quarter of a second, but we noticed that we were sometimes getting 429 ("Too Many Requests") responses from some websites as we hit their rate limits.

With today's release we've added per-site settings to limit the number of pages we crawl, the number of concurrent requests we make and the time between each request.

Hopefully this will improve the reliability of our crawler results and avoid placing too much load on our customers' websites.
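The idea behind the per-site settings can be sketched roughly like this (a simplified, sequential illustration, not SiteSentry's actual code; the `CrawlSettings` and `crawl` names, and the injected `fetch` function, are made up for the example):

```python
import time
from dataclasses import dataclass

@dataclass
class CrawlSettings:
    max_pages: int = 100        # per-site page budget
    max_concurrency: int = 2    # simultaneous connections per batch
    min_delay: float = 0.25     # seconds to wait between batches

def crawl(start_urls, settings, fetch):
    """Fetch up to max_pages URLs, in batches of at most
    max_concurrency, pausing min_delay between batches."""
    results = {}
    queue = list(start_urls)
    while queue and len(results) < settings.max_pages:
        batch_size = min(settings.max_concurrency, len(queue),
                         settings.max_pages - len(results))
        batch = [queue.pop(0) for _ in range(batch_size)]
        for url in batch:  # a real crawler would fetch these concurrently
            results[url] = fetch(url)
        time.sleep(settings.min_delay)
    return results
```

Dropping `max_concurrency` or raising `min_delay` for a site that returns 429s is then just a settings change, not a code change.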

March 21, 2021 Development Update 11

Today's release adds another new check to SiteSentry - Domain Name Expiry!

For every site it monitors, SiteSentry will now do a daily check of the site's WHOIS record and send a notification if the expiry date is approaching.

We've all read stories of huge companies going offline because their domain names expired; SpamCop, among others, suffered an outage in 2021 due to exactly this.

As with SSL Certificates, it's easy to forget when a domain registration expires, or to miss the reminder the registrar sends (assuming they do, of course!), but now SiteSentry can check both SSL Certificate and Domain Name expiry dates and let you know well in advance so this doesn't happen to you!
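The core of a check like this is parsing the expiry date out of the raw WHOIS record and comparing it against today. A minimal sketch (the `days_until_expiry` helper is hypothetical; a real check would first query the registry's WHOIS server, and date formats vary by registrar, so this only handles the common `.com` format):

```python
from datetime import datetime, timezone

def days_until_expiry(whois_text, now=None):
    """Parse the 'Registry Expiry Date' line from a raw WHOIS record
    and return the number of whole days until the domain expires,
    or None if no expiry line is present."""
    for line in whois_text.splitlines():
        if "Registry Expiry Date:" in line:
            stamp = line.split(":", 1)[1].strip()
            expiry = datetime.strptime(
                stamp, "%Y-%m-%dT%H:%M:%SZ").replace(tzinfo=timezone.utc)
            now = now or datetime.now(timezone.utc)
            return (expiry - now).days
    return None
```

A notification then just needs a threshold, e.g. alert when `days_until_expiry(...)` drops below 30.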

March 9, 2021 First (“real”) user!

SiteSentry has been in private beta for a while and hasn't officially opened up to the whole world, but there's nothing to stop someone signing up, and this weekend someone did! It's a tiny but significant win...the first user that isn't someone I know, and they've started using it to monitor a site.

Not a paying customer (yet) but it’s a start.

I really need to switch focus to marketing in the very near future!

March 7, 2021 Development Update 10

I'm really pleased with today's release as it includes two big new features.

The first is a new type of check - SiteSentry will now check for broken links. This is the first SiteSentry check that crawls your website. I've deployed it in MVP form for now, meaning that it crawls a maximum of 100 pages per domain and only checks internal links. Once I get a good sense of how demanding this feature is, I'll be making these limits configurable, though possibly only for paying customers.
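In MVP form, a check like this amounts to a breadth-first crawl that stays on the same domain, stops at the page cap, and records any link returning an error status. A sketch under those assumptions (the `find_broken_links` name and the injected `fetch` function are made up for illustration; real code would also handle timeouts, redirects, and non-HTML responses):

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag in a page."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def find_broken_links(start_url, fetch, max_pages=100):
    """Breadth-first crawl of internal links only, capped at max_pages.
    fetch(url) -> (status_code, html_body). Returns URLs with status >= 400."""
    domain = urlparse(start_url).netloc
    seen, broken = {start_url}, []
    queue = deque([start_url])
    pages = 0
    while queue and pages < max_pages:
        url = queue.popleft()
        status, body = fetch(url)
        pages += 1
        if status >= 400:
            broken.append(url)
            continue
        parser = LinkExtractor()
        parser.feed(body)
        for href in parser.links:
            absolute = urljoin(url, href)
            # internal links only: skip anything on another domain
            if urlparse(absolute).netloc == domain and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return broken
```

The `max_pages` cap and the internal-only filter are exactly the two MVP limits that would later become configurable.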

The second enhancement is a small, but important, improvement - it's now possible to enable or disable checks on an individual basis. This is really useful if SiteSentry has notified you of a problem and you need some time to fix it, or if you have some planned downtime. You can temporarily disable the check so that SiteSentry won't nag you whilst you're working.

More details on the blog.

February 21, 2021 Development Update 9

I’ve started work on the next set of checks that SiteSentry will support: checking for mixed content and broken links.

Behind the scenes I'm also starting to work on a more flexible and scalable scheduling engine, with the core scheduler handing off check execution to separate independent tasks that can run in parallel, rather than running them itself sequentially. The broken links and mixed content checks are much more demanding on the servers, and this should mean that they can be run for more sites and more often without adversely impacting overall performance.
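The handoff pattern can be sketched in a few lines (a simplified illustration, not the actual engine; the `run_checks` name is made up, and real scheduling would involve queues and retries rather than an in-process thread pool):

```python
from concurrent.futures import ThreadPoolExecutor

def run_checks(checks, max_workers=4):
    """Hand each check off to a worker task instead of running them
    sequentially in the scheduler loop, then collect the results.
    checks maps a check name to a zero-argument callable."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = {pool.submit(check): name for name, check in checks.items()}
        return {name: future.result() for future, name in futures.items()}
```

The point is that a slow, crawl-heavy check only ties up one worker, not the scheduler itself.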

February 14, 2021 Development Update 8

After a lot more time and effort than expected, I've just pushed a version of SiteSentry live with Stripe integration - I'm using Stripe Billing to handle subscription processing.

Stripe Billing includes a Customer Portal, which handles upgrading, downgrading, invoices and payment receipts, and more. It's pretty straightforward to integrate and provides massive functionality for the effort (and cost - an extra 0.5% on top of Stripe's standard fees).

February 1, 2021 Development Update 7

Added the same configuration option available for the robots meta tag and HTTP header check to the robots.txt check.

After initially making this super-configurable and realising it was getting super-confusing too, I dialled it back so you can simply say whether a robots.txt file should or shouldn't exist and, assuming it should, whether it should or shouldn't block search engines.

That's all core features complete. Next up...Stripe integration then LAUNCH!

January 13, 2021 Development Update 6

Sometimes you want your site to be blocked from being crawled or indexed by search engines - this can now be specified for the robots meta tag and robots HTTP header checks. No more getting notifications for something that is actually the way you want it to be!

January 4, 2021 Development Update 5

Squashed a few bugs over the year-end break and added one key new feature - checking the robots.txt file to make sure that it isn't blocking access to a site. This complements the check for the x-robots-tag header and robots meta tag introduced late last year.

December 14, 2020 Development Update 4

Blocking search engines from indexing a site whilst under development is a common thing to do and good practice, but it's easy to forget to remove those blocks when a site goes live, so I've added a new check to SiteSentry - Search Engine Indexability.

SiteSentry will now check a site's HTTP headers for the x-robots-tag and its home page for a robots meta tag, and let you know if either is blocking search engines from indexing the site.
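In essence the check looks for a `noindex` directive in either place. A minimal sketch, assuming the page's response headers and HTML body are already in hand (the `is_blocked_from_indexing` name is invented for this example):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Capture the content attribute of <meta name="robots" ...>, if any."""
    def __init__(self):
        super().__init__()
        self.content = ""
    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.content = a.get("content", "")

def is_blocked_from_indexing(headers, html):
    """True if either the X-Robots-Tag response header or the robots
    meta tag contains a noindex directive."""
    header_value = next(
        (v for k, v in headers.items() if k.lower() == "x-robots-tag"), "")
    parser = RobotsMetaParser()
    parser.feed(html)
    directives = (header_value + "," + parser.content).lower()
    return "noindex" in directives
```

When this returns `True` on a live site, that's exactly the forgotten dev-time block the check is there to catch.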
