I’ve returned after taking into consideration everyone’s advice on what I should do with my basic VPS and…listened to none of it, for reasons. Well, I had a good reason. I wanted to diversify my internet consumption amid all this Reddit API mess and have gone back, in part, to RSS. In that vein I have stood up Miniflux and Wallabag on said VPS. Both are excellent if you need an RSS reader and a read-it-later app. And they can tightly integrate with one another, which is rad.

So now that I’ve got it set up the way I want, what is the recommended method for backing it up in case of failure/data loss? It’s running Ubuntu 20.04, if that helps. I have Google Drive space as well as Backblaze B2 storage I could leverage. I just need to know which direction to look for solutions. The VPS is rented through RackNerd, and I confirmed they don’t have a snapshot function, unfortunately.

  • dillydogg@lemmy.one · 1 year ago

    On my VPS, every night I shut down the Docker containers, then back up everything (including the Postgres and MariaDB databases) with Borg using borgmatic, upload to Backblaze B2, then restart the containers.
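
    Not their actual script, but a rough sketch of what that kind of nightly job can look like, assuming a docker compose project, a borgmatic config that writes to a local Borg repo, and an rclone remote named "b2" (all paths and names here are placeholders):

    ```bash
    #!/usr/bin/env bash
    # Sketch of a stop-backup-upload-restart nightly job; adjust paths to your setup.
    set -euo pipefail

    cd /srv/apps
    docker compose stop              # quiesce the apps so files and databases are consistent

    borgmatic create --verbosity 1   # run the backup defined in /etc/borgmatic/config.yaml
    borgmatic prune                  # apply the retention policy from the same config

    docker compose start             # bring the services back up

    # replicate the local Borg repo to Backblaze B2
    rclone sync /var/backups/borg b2:vps-backups/borg
    ```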

  • Freeman@lemmy.pub · 1 year ago

    Personally I would use rsync for config files and schedule database dumps; rsync those too.

    In fact, that’s exactly how I do it.
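
    A sketch of what that could look like; the paths, database names, and destination host below are made up for illustration, assuming both apps sit on PostgreSQL:

    ```bash
    #!/usr/bin/env bash
    # Dump the databases, then rsync configs and dumps to another machine.
    set -euo pipefail

    STAMP=$(date +%F)
    DUMPS=/var/backups/dumps
    mkdir -p "$DUMPS"

    # scheduled database dumps
    pg_dump -U miniflux miniflux | gzip > "$DUMPS/miniflux-$STAMP.sql.gz"
    pg_dump -U wallabag wallabag | gzip > "$DUMPS/wallabag-$STAMP.sql.gz"

    # copy config files and the dumps offsite
    rsync -az /etc/miniflux.conf /var/www/wallabag/app/config "$DUMPS" \
          backupuser@backup.example.com:vps-backups/
    ```

    Scheduled from cron (e.g. `0 3 * * * /usr/local/bin/vps-backup.sh`, a placeholder path), that is the whole workflow.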

    • Father_RedbeardOP · 1 year ago

      Would you be willing to share the details of how you set it up? Looking for a jumping-off point to roll my own workflow for this purpose.

  • flatbield@beehaw.org · 1 year ago

    You might look at what your hosting provider will do for you too. I am at Linode; I pay them $2/month, just turn on backups, and they do it. Plus I can take one of my own snapshots any time. Like someone else said, if state matters, think about that too, i.e. dumping databases, or shutting the VM or its services down and snapshotting it yourself.

    I like the Backblaze idea too but have not done that yet.

  • Mysterious Bread@startrek.website · 1 year ago

    Since you have Backblaze B2, you can just regularly back up your config files and database exports to a B2 bucket, either with rclone and a cron job, or with something like Duplicati for a more integrated solution.
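
    The rclone route can be as small as this (the remote name, bucket, paths, and script name are placeholders):

    ```bash
    # One-time setup: define a "b2" remote interactively with your key ID and application key.
    rclone config

    # Sync config files and database exports into a B2 bucket.
    rclone sync /srv/apps/config   b2:vps-backups/config
    rclone sync /var/backups/dumps b2:vps-backups/dumps

    # crontab -e: run a script containing the sync commands every night at 02:30.
    # 30 2 * * * /usr/local/bin/b2-sync.sh
    ```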

    • Father_RedbeardOP · 1 year ago

      I’ll have to look into cron jobs. Not something I have experience with, but I’m generally aware of the concept.

  • sporif@lemmy.world · 1 year ago (edited)

    I use BorgBackup over ssh to rsync.net, then run rclone to sync the backup to Backblaze B2. Rsync.net allows you to run rclone on their server, so there is no inefficient extra step of downloading the backup and then uploading to B2. I also use Healthchecks.io to monitor the backups, which are run daily. You could try restic instead, as Borg doesn’t support backing up directly to B2 (only local and ssh).
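
    A sketch of that pipeline, in case it helps; the repo path, remote names, and check UUID are placeholders rather than sporif's actual setup, and step 2 assumes an rclone remote named "b2" is already configured on the rsync.net account:

    ```bash
    #!/usr/bin/env bash
    # Daily Borg backup to rsync.net, replicated to B2, with a Healthchecks.io ping.
    set -euo pipefail

    # 1. Borg backup over SSH to rsync.net
    export BORG_REPO=ssh://user@user.rsync.net/./backups/vps
    borg create --stats "::{hostname}-{now}" /etc /srv/apps /var/backups/dumps
    borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6

    # 2. Replicate the repo to B2, run on the rsync.net side so the data
    #    never has to come back down to the VPS first.
    ssh user@user.rsync.net "rclone sync backups/vps b2:vps-backups/borg"

    # 3. Ping Healthchecks.io so a missed run raises an alert.
    curl -fsS -m 10 https://hc-ping.com/your-check-uuid
    ```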

  • paperemail@links.rocks · 1 year ago

    You should do application-level backups and put those in Backblaze B2 (there is a rough sketch after this list):

    • for Postgres, look here.
    • look at all the software you’re running and what its documentation says about making backups.
    • for files that don’t change often, making an archive (with tar) is probably good enough. But if a file changes while the archive is being made, the backup will be inconsistent.
    • think about your RPO: how much data are you willing to lose in case of a crash? 1 day? 2 hours? 15 min? Schedule your backups to be at least as frequent.
    • Don’t forget to test your backups! Otherwise you’ll only find out that the backup is unusable when you need it most…
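
    Rough sketch of what such an application-level backup could look like; the database name, config paths, and rclone remote below are placeholders:

    ```bash
    #!/usr/bin/env bash
    # Dump the database, archive config files, and push both to a B2 bucket.
    set -euo pipefail

    STAMP=$(date +%Y%m%d-%H%M)
    WORK=/var/backups/app
    mkdir -p "$WORK"

    # logical database backup: pg_dump gives a consistent snapshot even while the app runs
    pg_dump -U miniflux miniflux | gzip > "$WORK/miniflux-$STAMP.sql.gz"

    # archive config files: fine for rarely-changing files, inconsistent if they change mid-archive
    tar czf "$WORK/config-$STAMP.tar.gz" /etc/miniflux.conf /var/www/wallabag/app/config

    # ship to B2; schedule this script at least as often as your RPO
    rclone copy "$WORK" b2:vps-backups/app
    ```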