30 Comments

How do you keep your databases safe from future disasters?

As an indie hacker, I sometimes neglect backing up my databases to protect them from attacks or future disasters. But I feel this behavior is mostly due to the lack of existing cloud backup solutions out there.

Well, I am mainly pointing out the lack of an easier, "effortless" solution that just keeps snapshots of my entire database somewhere in the cloud. I am not talking about backing up the files, since most of us keep those in a git repository anyway.

Though I found a few solutions like DropMySite, they don't offer enough flexibility in the backup schedule. And I'd rather not use AWS services unless I have to, because they tend to be on the "complicated" side.

So, is it just me, or are there not enough backup solutions out there specifically designed for databases?
Let me know how you keep backups of your databases.

How do you back up your project's databases?
  1. I use a cloud-based service (write the name in the comments)
  2. I download backups manually to local hard drives
  3. I have backup servers for that
  4. My server is protected from hackers and future disasters
  5. Is it really necessary?
  6. "Backup what?"
posted to Developers
on December 24, 2021
  1. 3

    I use render.com and I'm pretty happy with how everything can be set up. They also handle db backups.

    1. 1

      Thanks for sharing

  2. 2

    I'm using the native backup utilities where I'm able, e.g. mysqldump, mongodump, etc. on VPSes, but getting the backup file off the box is a bit of a problem, as the files are too large to send by email (which I've done for several tiny databases).

    I have other databases with GCP and Atlas that include backups with their service but I'd be interested in an off provider backup solution, possibly to a dedicated box running in my office or one of the cloud storage providers.

    The bigger part of this is that, when there's a "hair on fire" situation, restoring from a well organized backup should be drop-dead simple. That is, upload the last known good backup file(s) and restore to the appropriate database.

    As a bonus, I'd be able to test a backup by restoring it side-by-side with the original db. This way I can be assured that it restores on the same DB version where it was created. I don't want to deal with errors because of MySQL vs MariaDB, or various installation configs or version differences.

    I'm happy to write custom code or complex bash scripts but haven't really applied myself to this problem.
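    A minimal sketch of that side-by-side restore check, assuming MySQL, a gzipped mysqldump file, and purely hypothetical database/table names and paths:

```shell
# Restore last night's dump into a scratch database next to the original,
# so the backup is verified without touching the live data.
# The dump path and the myapp_restore_test/orders names are hypothetical.
mysql -e "CREATE DATABASE myapp_restore_test;"
gunzip -c /backups/myapp-2021-12-23.sql.gz | mysql myapp_restore_test

# Spot-check a table, then drop the scratch copy.
mysql -e "SELECT COUNT(*) FROM myapp_restore_test.orders;"
mysql -e "DROP DATABASE myapp_restore_test;"
```

    Running this on the same server version as production is what catches the MySQL-vs-MariaDB-style mismatches mentioned above.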

    1. 1

      FYI, I am doing the "12 startups in 12 months" challenge from January.
      Probably this type of seamless backup solution will be one of the projects. :D

      Please follow me for updates. ;)

      1. 2

        This discussion prompted me to add scp to my mysqldump scripts. So now the backups are sent to the server in my office. I'm keeping 15 days of backups there and 7 days on the VPSs. I still get the emails every night. I'd prefer just getting an email when it didn't successfully backup, but this will do for now!

        Following!
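        The dump-and-ship script described above might look like this sketch; the host name, database name, and retention windows here are hypothetical stand-ins (it assumes an SSH key for passwordless scp and MySQL credentials in ~/.my.cnf):

```shell
#!/bin/sh
# Nightly dump, ship to the office server, prune old local copies.
DB_NAME="myapp"                              # hypothetical database name
BACKUP_DIR="$HOME/backups"
REMOTE="backup@office-server:/srv/backups"   # hypothetical office box
STAMP=$(date +%Y-%m-%d)
FILE="$BACKUP_DIR/${DB_NAME}-${STAMP}.sql.gz"

mkdir -p "$BACKUP_DIR"
mysqldump --single-transaction "$DB_NAME" | gzip > "$FILE"
scp "$FILE" "$REMOTE/"

# Keep 7 days locally; the office server keeps its own 15-day window.
find "$BACKUP_DIR" -name "${DB_NAME}-*.sql.gz" -mtime +7 -delete
```

        `--single-transaction` gives a consistent InnoDB snapshot without locking tables, which matters when the dump runs while the site is live.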

        1. 1

          Awesome! I am glad that you added proper measures to make it more secure. :)

          Thanks for the follow.
          Btw, I've published a newsletter to document the 12-month journey. Please feel free to subscribe too. :D
          http://digest.kazi.rocks/

          1. 2

            Followed on Twitter! The 12 months project sounds intriguing, good luck! Looking forward to watching your progress!

            1. 1

              Thanks man! Followed back.
              Yeah, this is gonna be an epic journey, I hope.

              FYI, my first startup is probably gonna be a low-code tool. Let me know if you're interested in trying it. :)

    2. 1

      Yes, I also think that having another backup solution with another provider is important.
      I agree that the restoration process should be as easy as possible, too.

      It would be awesome to have an option for multiple restorations of the same database (from different timestamps) in parallel. I am glad that you mentioned it. The idea is really interesting. :)
      But I don't think any of the cloud providers has this feature. Maybe it could be an idea for an indie project. ;)

  3. 2

    If you are comfortable with the Linux command line, then I suggest investigating a database dump tool (pg_dump for Postgres, mysqldump for MySQL; every database has one). Schedule your db dumps with cron at times of low traffic. Open an account with http://www.tarsnap.com/ and use their command line tool to upload the backups.
    Easy! This method should serve you well at least until you reach database sizes of around 100GB. By that time you might already have some staff to do this for you.

    If you use a hosted database somewhere and they don't do automatic backups then you are paying too much, whatever the price is.
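    The dump-and-upload steps above can be sketched as follows, with a hypothetical database name and archive naming scheme (assumes tarsnap is installed and its machine key is registered per the tarsnap docs):

```shell
#!/bin/sh
# Dump the database in custom format (compressed, restorable with pg_restore),
# upload it as a dated tarsnap archive, then remove the local copy.
DB_NAME="myapp"                    # hypothetical database name
STAMP=$(date +%Y-%m-%d)
DUMP="/tmp/${DB_NAME}-${STAMP}.dump"

pg_dump -Fc "$DB_NAME" -f "$DUMP"
tarsnap -c -f "${DB_NAME}-${STAMP}" "$DUMP"
rm -f "$DUMP"
```

    Scheduled from cron at a low-traffic hour, e.g. `0 3 * * * /usr/local/bin/db-backup.sh` (path hypothetical).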

    1. 1

      I also have been using tarsnap for a few years. It’s a great service.

    2. 1

      Wow. Tarsnap seems to be an awesome tool! Will definitely try it out. Thanks for the recommendation. It may really solve my problem. :)

      Btw, normally cloud databases take backups daily or weekly. I think we need more flexibility once the website becomes popular.

  4. 2

    I also don't have a good solution for this yet. I do it manually atm.

    I am not sure it would be a great SaaS business though, because as seen here in the replies, many engineers seem to see it as "just do it yourself". And most, if not all, database hosting providers offer it right out of the box.

    1. 1

      FYI, I am doing the "12 startups in 12 months" challenge from January.
      Probably this type of seamless backup solution will be one of the projects. :D

      Please follow me for updates. ;)

    2. 1

      That's what I wanted to know as well. :) Just checking the demand before jumping into building the product.

      But if users aren't interested in using an additional backup service, then there is no market either. Still trying to analyze the data.

  5. 2

    Probably not the best, but haven't had/seen any problems with it:
    pg_dump (Postgres) backs up to a file on a cron schedule, then a backup server periodically pulls the file over SFTP to a safe location.

    It takes some basic Linux knowledge.
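    The pull side of that setup might look like this sketch on the backup server; the host, user, and directory names here are hypothetical:

```shell
#!/bin/sh
# Fetch last night's dump from the VPS over SFTP.
VPS="deploy@my-vps.example.com"    # hypothetical host and user
REMOTE_DIR="/var/backups/pg"       # where the cron job writes dumps
LOCAL_DIR="/srv/db-backups"        # safe location on the backup server
STAMP=$(date +%Y-%m-%d)

mkdir -p "$LOCAL_DIR"
sftp "$VPS" <<EOF
get ${REMOTE_DIR}/myapp-${STAMP}.dump ${LOCAL_DIR}/
EOF
```

    Pulling from the backup server (rather than pushing from the VPS) means a compromised VPS holds no credentials for the backup box.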

    1. 1

      It's actually not a bad idea for a small DB. I just need to write a script and use it in all of my projects.

  6. 2

    "I sometimes ignore backing up all of my databases"
    Wait, wat?!?

    Are you also handling customer data this way???

    There are tons of free and great resources out there for creating database backups. There is zero excuse not to perform regular, encrypted, restorable database backups.

    1. 1

      Currently, I am taking the backups manually and I am ashamed of it. :(

      Can you please list some of the tools you use to back up databases? I am looking for options.

      1. 3

        Well, it depends on what underlying technology you are using. A big list of repos can be found here: https://github.com/search?q=database+backup

        If you don't want to code something yourself, or use a script and extend it, take a look at a commercial db backup service provider. I recently stumbled over https://snapshooter.com/. Don't have any experience with them myself, but a quick google search will definitely come up with lots of these services.

        1. 1

          Thanks a lot for your help. Seems like SnapShooter supports different types of DBs. Cool. :)

          1. 1

            Founder and CEO here.
            Happy to help if you have questions :)

  7. 1

    It's a good idea, but you need a sense of proportion.

    1. 1

      Can you please elaborate on what you meant by the sense of proportion?

      1. 1

        I mean, you need to focus on solving one problem. You don't need to try to tackle every topic at once.

      2. 0

        I understand that my answer is very vague. Sorry for giving so few specifics. You are a great fellow and your material was very useful to me. If you have any clarifying questions, I will be happy to answer them.

    1. 1

      RDS is not a backup solution on its own though. If your database is hacked, the damage gets replicated to the replica servers as well.
      Do you use AWS Backup for your RDS servers?

      1. 2

        Not really, but RDS has an automatic backup that runs daily, I'm relying basically on that. Do you think it's not enough?

        1. 1

          Yes, I think it's enough while your site is relatively new. Once you have high traffic to your website, you may want the option to keep a backup every hour or so (for example).
