Hopefully someone can shed some light on this idea, or point me to something that fits the use case. I am looking for a basic operating system that can be updated across multiple devices, like a living OS.

For instance: I have a high-end desktop PC running the same operating system as a laptop or tablet, but live-synced, meaning apps, files, and changes made on one device appear on all of them. I've looked at cloning drives and have done it; it's far too slow and cumbersome.

Essentially I would switch devices based on how much hardware power I need, while the same living operating system stays synced across all of them, so all my data and capabilities are available whenever I need them.

Maybe I'm being far-fetched, and this might be the wrong sub, but I assumed it falls under self-hosted. I've considered a NAS, and I'm open to other ways to structure the concept. ALL IDEAS WELCOME; feel free to expand on it in any way.

Dealing with different operating systems and architectures across devices is wildly difficult sometimes: software availability, mobility, power requirements (not watts, but processing power), cross-compatibility. I've seen apps that sync across devices, but some desktop apps and mobile apps aren't cross-compatible. And when you self-host so many services that run well across networks and devices, after years of uptime you sort of forget the configs of everything; it's a nightmare when a single app or container update causes a domino effect. Thanks everyone, hopefully this is helpful to others with similar needs.

  • Deckweiss@lemmy.world · 5 hours ago

    I run this somewhat. The question I asked myself was - do I R-E-A-L-L-Y need a clone of the root disk on two devices? And the answer was: no.


    I have a desktop and a laptop.

    Both run the same OS, with overlapping but not identical package sets.

    I use Syncthing, plus a Syncthing instance on a VPS, to sync some directories from my home folder: Downloads, project files, .bashrc, .local/bin scripts, and everything else I would actually really need on both machines.

    The Syncthing VPS is always on, so I don't need both computers on at the same time to sync the files. It also acts as an offsite backup this way, in case of the catastrophic destruction of both my computers.

    (The trick with Syncthing is to give the same directories the same ID on each machine before syncing. Otherwise it creates a second dir like “Downloads_2”.)
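    Syncthing stores shared folders as `<folder id="..." path="...">` entries in its config.xml, so one way to double-check the same-ID trick is to dump the ID-to-path mapping from each machine's config and compare. A minimal sketch (the XML below is a made-up example of that config shape, not a real machine's config):

```python
# Sketch: list Syncthing folder IDs and paths from a config.xml so you can
# confirm both machines use the same ID for each shared folder.
import xml.etree.ElementTree as ET

# Hypothetical minimal example of the <folder> entries in Syncthing's config.xml.
SAMPLE_CONFIG = """\
<configuration>
    <folder id="downloads" label="Downloads" path="/home/me/Downloads"/>
    <folder id="projects" label="Projects" path="/home/me/projects"/>
</configuration>
"""

def folder_ids(config_xml: str) -> dict:
    """Map folder ID -> path for every shared folder in the config."""
    root = ET.fromstring(config_xml)
    return {f.get("id"): f.get("path") for f in root.iter("folder")}

print(folder_ids(SAMPLE_CONFIG))
# {'downloads': '/home/me/Downloads', 'projects': '/home/me/projects'}
```

    Run it against each machine's config (typically under the Syncthing config directory) and diff the output; any folder whose ID differs between machines is the one that will spawn a duplicate directory on sync.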

    That setup is easy and gets me 95% there.

    The 5% that isn't synced is packages (which are sometimes only needed on one of the computers, not both) and system modifications (which I wouldn't even want to sync, since a lot of those are hardware-specific, like screen resolution and display layout).


    The downsides:

    • I have to configure some settings twice. Like the printer that is used by both computers.

    • I have to install some packages twice. Like when I find a new tool and want it on both machines.

    • I have to run updates separately on both systems. I have been thinking about setting up a shared package cache somehow, but was ultimately too lazy to do it; I just run the update twice.
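    The install-twice downside can at least be made visible: if each machine exports its explicitly installed package list (on Arch-likes, something like `pacman -Qqe > pkgs.txt`), a few lines can diff the two lists. A sketch, with made-up package names standing in for the real exported lists:

```python
# Sketch: diff two explicitly-installed package lists to see what is
# missing on each machine. In practice, load these sets from the files
# exported on each machine; these names are made-up examples.
desktop = {"firefox", "syncthing", "steam", "git"}
laptop = {"firefox", "syncthing", "git", "tlp"}

print("only on desktop:", sorted(desktop - laptop))  # ['steam']
print("only on laptop:", sorted(laptop - desktop))   # ['tlp']
```

    That won't install anything for you, but it turns "did I remember to add that tool on the laptop too?" into a one-command check.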


    I find the downsides acceptable, the whole thing was a breeze to set up and it has been running like this for about a year now without any hiccups.

    And as a bonus, I also sync some important documents to my phone.

    • Ulrich@feddit.org · 4 hours ago

      This is what I was going to suggest. Have all computers running the same OS and then just sync the home directory with Syncthing.

    • OhVenus_BabyOP · 5 hours ago

      This seems like a nice approach that isn’t a full root disk live clone.

      • corsicanguppy@lemmy.ca · 3 hours ago

        Yeah. And the full root disk clone thing is honestly gonna be more trouble than it's worth. Ensure the big-bang stuff is the same: packages (not even a perfect match, as above, just same-version where installed) and general settings. Then sync the homedir.

        God help me, I'm thinking Gluster between 2-3 machines, running a VM off that (big files, so lock negotiation isn't an issue) and having it commandeer the local video card for gaming. It's doomed but it'll be fun ha ha learning ha ha.

        There are exciting ways to keep some settings and configs the same, too, when they're outside that synced home space: Ansible if you like thunky 2002 tech, Chef or Salt for newer but overkill, or mgmtconfig if you want modern decentralized peer-to-peer reactive config management.

        • OhVenus_BabyOP · 2 hours ago

          Perhaps this idea is way overkill and going to be a constant headache. I am trying to simplify between devices and data, and this is looking like a labor of love; I'm more into using tools for what they are rather than always tinkering and working. I'm at the age where shit just needs to work. Some sort of remote desktop, a NAS, or a combo might be the better, easier route.