Last night I was writing a script and it accidentally created a directory literally named “~”. It being 3am, I ran rm -rf ~ without thinking and destroyed my home dir. Luckily some of the files were mounted into Docker containers which my user didn’t have permission to delete. I was able to get back to an OK state, but I lost a bit of data.

I now realize I really should be making backups, because shit happens. I self-host a PyPI repository and a Docker registry, both in containers, plus some game servers in and out of containers. What would be the simplest tool to back up to Google Drive and easily restore?

  • Shimitar@feddit.it · 11 points · 16 days ago

    Restic or Borg. For restic I use the great Backrest web GUI.

    I mounted a USB drive on one of my OpenWRT access points and back up to that.

    Rclone or a FUSE mount can access Google Drive and can be used as the backend for whichever backup tool you choose.

    Simplest backup ever: restic/Borg to a folder on the same PC. Not really recommended, but indeed a good starting point.

    ZFS/btrfs seems like a complex solution for a simple problem. True, appetite comes with eating, so it may still be worthwhile.
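
    A minimal sketch of the restic-over-rclone route, assuming an rclone remote named `gdrive` (the remote name and all paths are examples, not a fixed recipe):

    ```shell
    # one-time: create an rclone remote for Google Drive (interactive OAuth)
    rclone config create gdrive drive

    # initialize a restic repository on that remote
    restic -r rclone:gdrive:backups init

    # back up, list snapshots, restore
    restic -r rclone:gdrive:backups backup ~/services
    restic -r rclone:gdrive:backups snapshots
    restic -r rclone:gdrive:backups restore latest --target /tmp/restore
    ```

    Backrest can then point at the same repository to give you a web GUI over it.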

    • Kelly Aster 🏳️‍⚧️@lemmy.world · 7 points · 16 days ago

      For those on the fence about Borg Backup because it’s a command-line app, FYI there’s a great GUI frontend for it called Vorta (yeah, in line with the Trek theme lol) that works really well. I don’t see it mentioned often, so I thought I’d pass that along. You might want to avoid the Flatpak version if you need to back up stuff outside your /home dir.

  • april@lemmy.world · 11 points · edited · 16 days ago

    There are different kinds of backups, and for this scenario you don’t even need off-site storage.

    For this I set up ZFS auto-snapshotting, which means that when I delete something it isn’t really gone: a snapshot still points at it until it ages out of the retention window.

    Both ZFS and btrfs can do this, but you do need to migrate your filesystem to use them, which can be a lot of work.
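
    A rough sketch of what that looks like with plain zfs commands (the pool/dataset names are made up; tools like zfs-auto-snapshot or sanoid automate the scheduling and retention):

    ```shell
    # take a snapshot of the home dataset
    zfs snapshot tank/home@before-cleanup

    # list existing snapshots
    zfs list -t snapshot

    # pull a single file back out of the hidden snapshot directory
    cp /tank/home/.zfs/snapshot/before-cleanup/script.sh /tank/home/

    # or roll the whole dataset back (discards changes made since)
    zfs rollback tank/home@before-cleanup
    ```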

  • PhilipTheBucket@ponder.cat · 7 points · 16 days ago

    Borg borg borg

    You can combine it with a FUSE mount of Google Drive. I’m not sure whether that works well, but I don’t see why it wouldn’t.
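
    Roughly, that combination might look like this (remote name and paths are illustrative; note that Borg over a network FUSE mount can be slow, since repositories see lots of small random I/O):

    ```shell
    # mount Google Drive via rclone's FUSE support
    rclone mount gdrive:backups /mnt/gdrive --daemon

    # initialize an encrypted Borg repo on the mount, then back up
    borg init --encryption=repokey /mnt/gdrive/borg-repo
    borg create /mnt/gdrive/borg-repo::{hostname}-{now} ~/services
    borg list /mnt/gdrive/borg-repo
    ```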

  • Nine@lemmy.world · 7 points · 16 days ago

    Restic. It has native S3 compatibility, and when you combine it with something like B2 it makes amazing off-site storage, so you can enjoy the tried-and-true 3-2-1 backup strategy.

    Also, Fedora Magazine did a few posts on setting it up with systemd that make it SUPER EASY to get going if you need a guide.

    I have an Ansible role that configures it on everyone’s laptops so that they have both local (NAS) and remote (B2) backup locations.

    It’s worked like a charm for the past 8+ years.
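
    For reference, the restic-to-B2 side of such a setup might look something like this (bucket name, endpoint region, and paths are placeholders; the Fedora Magazine articles wrap the backup command in a systemd service plus timer):

    ```shell
    # credentials for B2's S3-compatible endpoint
    export AWS_ACCESS_KEY_ID=<b2-key-id>
    export AWS_SECRET_ACCESS_KEY=<b2-application-key>
    export RESTIC_PASSWORD_FILE=/etc/restic/password

    # repo on B2 via its S3-compatible API (endpoint is region-specific)
    restic -r s3:s3.us-west-002.backblazeb2.com/my-backup-bucket init
    restic -r s3:s3.us-west-002.backblazeb2.com/my-backup-bucket backup /home

    # prune old snapshots on a retention schedule
    restic -r s3:s3.us-west-002.backblazeb2.com/my-backup-bucket forget \
        --keep-daily 7 --keep-weekly 4 --prune
    ```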

  • atzanteol@sh.itjust.works · 4 points · 16 days ago

    rclone & restic work okay together to create backups in a Google Drive mount. There are “issues” with backing up to Google Drive, since it doesn’t guarantee file names are unique (which is… a choice), but it should be reliable enough.

  • Atherel@lemmy.dbzer0.com · 3 points · 16 days ago

    I use Duplicacy. It’s free as a CLI and pretty cheap if you want to manage backups via the GUI. Restoring via the GUI is always free, and I’d recommend it because it’s way easier to navigate the backups when you want to restore single files or folders.
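
    For the CLI side, a session might look roughly like this (the `gcd://` storage URL is Duplicacy’s Google Drive backend; the snapshot ID and paths are examples):

    ```shell
    cd ~/services                        # run from inside the directory to back up
    duplicacy init my-services gcd://backups   # one-time: bind this dir to the storage
    duplicacy backup                     # create a new revision
    duplicacy list                       # show existing revisions
    duplicacy restore -r 1               # restore files from revision 1
    ```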

  • axzxc1236@lemm.ee · 2 points · edited · 16 days ago

    simplest tool to backup to Google drive

    If you don’t need versioning, I think rclone fits the description. For backups to a USB drive or a remote SSH server I’d recommend rsync.
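
    Assuming an rclone remote named `gdrive` is already configured, both suggestions boil down to one-liners (paths are examples):

    ```shell
    # mirror a directory to Google Drive (no versioning: deletions propagate)
    rclone sync ~/services gdrive:backups/services

    # same idea for a USB drive or an SSH target, via rsync
    rsync -a --delete ~/services/ /mnt/usb/services/
    rsync -a --delete ~/services/ user@host:/backups/services/
    ```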

  • schizo@forum.uncomfortable.business · 1 point · 16 days ago

    I just uh, wrote a bash script that does it.

    It dumps databases as needed, then makes a single tarball per service (or a couple, depending on what’s needed to capture a full backup of the data).

    Once all the services are backed up, I push everything to an S3 bucket, but you could use rclone or whatever instead.

    It’s not some fancy cool toy kids these days love like any of the dozens of other backup options, but I’m a fan of simple, and a couple of tarballs in an S3 bucket is about as simple as it gets: restoring doesn’t require any tools or configuration, you just grab the tarballs you need, unarchive them, done.

    I also use a couple of tools for monitoring the progress, plus a separate script that does a full restore to make sure shit works, but that’s mostly just the make-and-upload steps run backwards.
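
    The core of such a script can be sketched in a few lines of bash (the directory layout and names are hypothetical, not the commenter’s actual script; database dumps and the upload step would go around this):

    ```shell
    #!/usr/bin/env bash
    # Tar up each service directory under a source root into per-service,
    # dated tarballs. Restoring is just `tar -xzf` on the tarball you need.
    set -euo pipefail

    backup_services() {
        local src_root="$1" dest="$2"
        local stamp svc name
        stamp="$(date +%Y%m%d)"
        mkdir -p "$dest"
        for svc in "$src_root"/*/; do
            [ -d "$svc" ] || continue      # nothing to do if no service dirs
            name="$(basename "$svc")"
            # one self-contained tarball per service
            tar -czf "$dest/${name}-${stamp}.tar.gz" -C "$src_root" "$name"
        done
    }

    # e.g.: backup_services /srv /var/backups
    # then: aws s3 sync /var/backups s3://my-bucket/   (or rclone copy for Drive)
    ```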

  • GrapinoSubmarino@lemmy.world · 1 point · 15 days ago

    I have used Duplicati and Kopia. Duplicati works kinda well, but the installation is such a pain in the ass (it actually never worked as intended), which is why I switched to Kopia. Its installation is a lot easier, it has a GUI and a system tray icon, it has better documentation, and Google Drive backups with the GUI only require a little rclone configuration, nothing crazy.
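
    For anyone curious, Kopia’s rclone-backed Google Drive setup is roughly this (remote name and paths are examples):

    ```shell
    # one-time: create the repository through an existing rclone remote
    kopia repository create rclone --remote-path gdrive:kopia-repo

    # snapshot a directory, list snapshots, restore one
    kopia snapshot create ~/services
    kopia snapshot list
    kopia snapshot restore <snapshot-id> /tmp/restore
    ```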