My dad also used to self-host. Now I run all of the services he used to, and more.

Anyway, his server is still around, so I thought I might as well use it as an offsite backup box. I run a Matrix instance and Nextcloud, as well as other things, but those two hold the data that matters most.

How would you set this up?

Just a nightly rsync over SFTP? That seems inefficient. Is there a better way to do this?
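To be concrete, the baseline I'm describing is just a cron job plus an incremental rsync, something like this (host and paths invented for illustration):

```shell
# Hypothetical nightly job -- "dad-server" and all paths are placeholders.
# In crontab: 0 3 * * * /usr/local/bin/offsite-backup.sh

# offsite-backup.sh: rsync only transfers changed files, so after the
# first full copy the nightly runs are incremental, not full re-uploads.
rsync -az --delete -e ssh /srv/data/ dad-server:/srv/backups/data/
```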

  • brownmustardminion · 2 points · 1 year ago

    You can connect your main server and backup server to a VPS with WireGuard. The main server backs up Proxmox VMs and CTs to a Proxmox Backup Server instance on the backup server. Nextcloud data can be backed up with something like Duplicati, encrypted over SFTP to the backup server. The only hiccup with backing up Nextcloud is that you should put it into maintenance mode first. You can write pre- and post-backup scripts for Duplicati that enable and disable maintenance mode.
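    A rough sketch of that pre/post idea, assuming a standard Nextcloud install under /var/www/nextcloud; the remote URL and all paths are placeholders:

    ```shell
    #!/bin/sh
    # Wrapper that toggles Nextcloud maintenance mode around a backup run.
    # Paths, the hostname, and the remote URL are assumptions -- adjust them.
    set -eu

    OCC="sudo -u www-data php /var/www/nextcloud/occ"

    $OCC maintenance:mode --on

    # Lift maintenance mode even if the backup fails partway through.
    trap '$OCC maintenance:mode --off' EXIT

    # Duplicati can also run hooks like this itself via its
    # --run-script-before / --run-script-after options.
    duplicati-cli backup "ssh://backupbox//srv/backups/nextcloud" \
        /var/www/nextcloud/data
    ```

    Duplicati encrypts with AES by default, so the data lands on the remote box already encrypted.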

    • MentalEdge@sopuli.xyz (OP) · 2 points · 1 year ago

      A VPS is worth considering. I run no VMs.

      Duplicati seems like a good option.

      Aware of maintenance mode. Thanks!

  • David of the Ferns@fernchat.esotericmonkey.com · 2 points · 1 year ago

    Tell us a bit about your environment! Are you all Linux, or do you have Windows as well? Are you running a hypervisor like Proxmox or VMware, or using containers? Are you just making complete backups, or can you foresee yourself needing granular file restores? There are a number of ways you could go, depending on your setup.

    • David of the Ferns@fernchat.esotericmonkey.com · 4 points · edited · 1 year ago

      I personally run a Proxmox cluster with both Windows and Linux servers. I perform local full-VM backups using the hypervisor to a USB disk, which gives me a fast way to restore VMs if I need to. I also run Veeam, which handles the offsite copy and provides granular file restores. It’s nice because the Community Edition supports hardened-repository immutability, which can help prevent ransomware attacks and Unfortunate Incidents. That just runs over SSH and installs a Veeam agent/repo on the remote Linux box.

    • MentalEdge@sopuli.xyz (OP) · 2 points · edited · 1 year ago

      All Linux.

      Mostly running stuff directly, though I have some things in containers.

      I’ve consolidated configs and such into just a few folders that I can bring over to a new system to get everything running again without losing anything.

      This backup will likely only ever be needed in a catastrophic failure scenario where my local system is entirely lost.

      Unless that happens, I already have enough redundancy locally to recover from any lesser mishaps.

  • Walter_Ego@lemmy.arpatubes.net · +3/−1 · 1 year ago

    WireGuard tunnel, carve off an LV on his network, expose it as an AoE/iSCSI/whatever device over wg, connect it on your end, write a LUKS volume to it. Periodically connect, unlock, mount, rsync all your shit to it, unmount, lock, disconnect?
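    Spelled out, that flow might look roughly like this; the wg interface name, IQN, portal address, device node, and paths are all placeholders:

    ```shell
    # Bring up the tunnel and attach the remote block device.
    wg-quick up wg-backup
    iscsiadm -m node -T iqn.2024-01.example:backup -p 10.0.0.2 --login

    # Unlock and mount the LUKS volume that lives on his LV.
    cryptsetup open /dev/sdX backupvol
    mount /dev/mapper/backupvol /mnt/backup

    # Sync, then tear everything back down.
    rsync -aAX --delete /srv/data/ /mnt/backup/
    umount /mnt/backup
    cryptsetup close backupvol
    iscsiadm -m node -T iqn.2024-01.example:backup -p 10.0.0.2 --logout
    wg-quick down wg-backup
    ```

    Nice property of this setup: LUKS runs entirely on your side, so the remote box only ever sees ciphertext.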

    Or just create a gpg’d backup set and rsync it over wg? Or just rsync over wg if you don’t give a shit about encryption or anything.
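    The gpg variant is even simpler; key ID, host, and paths below are invented for illustration:

    ```shell
    # Tar and encrypt locally, then ship only the ciphertext over the tunnel.
    tar -czf - /srv/data \
      | gpg --encrypt --recipient backup@example.com \
      > /var/tmp/backup-$(date +%F).tar.gz.gpg
    rsync -av /var/tmp/backup-*.tar.gz.gpg 10.0.0.2:/srv/backups/
    ```

    The trade-off versus the block-device approach is that each run produces a full archive rather than an incremental sync.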