Hopefully someone can shed some light on this idea, or point me to something that roughly fits the use case. I am looking for a basic operating system that can be updated across multiple devices, like a living OS.

For instance, I have a high-end desktop PC with the same operating system as a laptop or tablet, but live-synced: apps, files, and changes made on one system appear on all devices. I've looked at cloning drives and have done it; it's far too slow and cumbersome.

Essentially, I would switch devices based on how much processing power I need, but have the same living operating system synced across all of them, so all data and capabilities remain the same whenever something is needed.

Maybe I'm being far-fetched, and this might be the wrong sub, but I assumed it would fall under selfhosted. I've considered a NAS, and I'm open to other ways to structure the concept. ALL IDEAS WELCOME; feel free to expand on it in any way.

Dealing with different operating systems and architectures across devices is wildly difficult sometimes: software availability, mobility, power requirements (not watts, but processing power), and cross-compatibility. I've seen apps that sync across devices, but some desktop and mobile apps aren't cross-compatible. And when you self-host many services that have run well across networks and devices for years of uptime, you sort of forget the configs of everything; it's a nightmare when a single app or container update causes a domino effect. Thanks everyone; hopefully this is helpful to others with similar needs as well.

  • just_another_person@lemmy.world

    You’re describing a number of different things here, but you’re thinking about it in an overly complex manner.

    You need a centralized file store like a NAS, and a mountable workspace from said NAS that will mount on each machine; then you need some sort of domain directory service to join it all together. If you want the different desktops' settings and such synced, you can achieve this with that setup, or you can go a step deeper and use an immutable distro of some sort, and commit and keep the same revision from one machine checked out on all your other machines (works kinda like a git repo). This will likely present issues if the hardware isn't all the same, though, so I would probably just keep it simple if you go that route.
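    To make the immutable-distro idea concrete, here's a minimal sketch assuming a Fedora Silverblue-style system managed by rpm-ostree (the checksum below is a placeholder):

    ```
    # On the machine whose state you like, note the checksum of the
    # currently booted deployment
    rpm-ostree status

    # On every other machine, deploy that same commit so they all boot
    # the identical OS revision (placeholder checksum)
    sudo rpm-ostree deploy 3a7dec2
    ```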

    A user experience example would look like this:

    • setup all your files on your centralized storage
    • join one machine to your domain (you can use LDAP, Samba+LDAP, NFSv4 domains…whatever)
    • log in and have it pull your user info from the domain
    • your network mounts and user preferences will be pulled down and put in place

    Obviously this is simplified for the purposes of this post, but it should give you a direction to start investigating. The simplest path you can test this with is probably Samba, but it will be fairly limited and just serve as a starting point.
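    Something to get started with on the Samba side, as a minimal smb.conf sketch (the share name, path, and user are made up):

    ```
    [global]
       workgroup = HOME
       security = user

    # Hypothetical share exposing the central workspace to every machine
    [workspace]
       path = /srv/workspace
       valid users = youruser
       read only = no
    ```

    Each machine then mounts //nas/workspace at login, e.g. via mount.cifs or an fstab entry.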

    Edit: if these concepts are a bit much for you, maybe consider getting a NAS with a good UI to make managing it much simpler. Synology has this baked in already, and I think Qnap does as well: https://www.synology.com/en-global/dsm/feature/active_directory

    • OhVenus_BabyOP

      The immutable distro idea is nice; as a start, I've put /home on a separate partition and synced it across devices. I'm working on setting up a NAS now to make the process more long-term friendly. By working I mean acquiring drives for storage; I currently have about 6 TB. I just didn't fully know the process and what it entails software-wise besides Tailscale. I've self-hosted servers for games and some minor stuff. I was thinking about using Synology, but their hardware is wildly expensive. I really only need the drive bay, and I can connect it to my server PC. I'll do a deeper dive after work.

  • Deckweiss@lemmy.world

    I run something like this. The question I asked myself was: do I R-E-A-L-L-Y need a clone of the root disk on two devices? And the answer was: no.


    I have a desktop and a laptop.

    Both run the same OS (with overlapping, but not identical, package sets).

    I use Syncthing and a VPS-hosted Syncthing server to sync some directories from the home folder: Downloads, project files, .bashrc, .local/bin scripts, and everything else that I would actually really need on both machines.

    The Syncthing VPS is always on, so I don't need both computers on at the same time to sync the files. It also acts as an offsite backup this way, in case of a catastrophic destruction of both my computers.

    (The trick with Syncthing is to give the same directories the same folder ID on each machine before syncing. Otherwise it creates a second directory like "Downloads_2".)
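    For illustration, this is the relevant piece of Syncthing's config.xml; the id attribute is what has to match on both machines, while the label and path may differ (the values here are made up):

    ```
    <!-- Same id on the desktop and the laptop -->
    <folder id="downloads" label="Downloads" path="/home/user/Downloads" type="sendreceive">
    </folder>
    ```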

    That setup is easy and gets me 95% there.

    The 5% that is not synced are packages (which are sometimes only needed on one of the computers and not both) and system modifications (which I wouldn’t even want to sync, since a lot of those are hardware specific, like screen resolution and display layout).


    The downsides:

    • I have to configure some settings twice. Like the printer that is used by both computers.

    • I have to install some packages twice. Like when I find a new tool and want it on both machines.

    • I have to run updates separately on both systems. I've been thinking about setting up a shared package cache somehow, but was ultimately too lazy to do it; I just run the update twice (a small SSH loop like the one below could automate it).
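    A trivial way to take the sting out of updating twice, assuming both machines are reachable over SSH (the hostnames and the update command are placeholders; use apt/dnf/pacman as appropriate):

    ```
    #!/usr/bin/env bash
    # Run the system update on each machine in turn over SSH.
    # -t allocates a terminal so sudo can prompt for a password.
    for host in desktop laptop; do
        ssh -t "$host" sudo pacman -Syu
    done
    ```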


    I find the downsides acceptable; the whole thing was a breeze to set up, and it has been running like this for about a year now without any hiccups.

    And as a bonus, I also sync some important documents to my phone.

    • OhVenus_BabyOP

      This seems like a nice approach that isn’t a full root disk live clone.

  • IsoKiero@sopuli.xyz

    Well, that's an interesting approach.

    First, you would need either shared storage, like a NAS, for all your devices, or for them all to have an equal amount of storage for your files so you can just copy everything everywhere locally. Personally I would go with a NAS, but the storage problem in general has quite a few considerations, so depending on the size of your data, bandwidth, hardware, and everything else, something else might suit your needs better.
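    As a sketch of the NAS route, an NFS export mounted identically on every machine would look something like this in /etc/fstab (the address and paths are hypothetical):

    ```
    # Mount the NAS export on every client at the same location
    192.168.1.10:/export/shared  /mnt/shared  nfs  defaults,_netdev  0 0
    ```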

    For the operating system, you would of course need the same OS installed on each device, and they would all need to run the same architecture (x86, most likely). With Linux you can just copy your home directory over via shared storage and it'll take care of most things, like app settings and preferences. But keeping the installed software in sync and updated is a bit trickier. You could enable automatic updates and potentially create a script to match installed packages between systems (Debian-based distros can use dpkg --get-selections and --set-selections; others have similar tools), so you would have pretty closely matching environments everywhere.
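    The Debian-based version of that looks roughly like this (run the first command on whichever machine you treat as the reference):

    ```
    # Export the package selection on the reference machine
    dpkg --get-selections > packages.list

    # On each other machine: load the list and install whatever is missing
    sudo dpkg --set-selections < packages.list
    sudo apt-get dselect-upgrade
    ```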

    Or, if you really want to keep everything exactly the same, you could use Puppet or similar to force your machines into the same mold and manage software installation, configuration, updates, and everything else through it. It has a pretty steep learning curve, but it's possible.
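    To give a flavour of that, a tiny Puppet manifest that every machine could apply (the package names and the managed file are just examples):

    ```
    # Every node applying this manifest converges on the same state.
    package { ['syncthing', 'git', 'vim']:
      ensure => installed,
    }

    # Hypothetical file kept identical on all machines
    file { '/etc/motd':
      ensure  => file,
      content => "Managed by Puppet\n",
    }
    ```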

    But if you want to match x86 workstations with handheld ARM devices it’s not going to work very well. Usage patterns are wildly different, software availability is hit or miss and the hardware in general differs enough that you can’t use the same configs for everything.

    Maybe the closest thing would be to host web-based applications for everything and use only those, but that heavily limits what you can actually do and doesn't give you much flexibility with hardware requirements, meaning either your slower devices crawl to a halt or your powerful workstation just sits idle whatever you do.

    Maybe a better approach would be to set up a remote desktop environment on your desktop and just hop onto it remotely whenever needed. That way you'd have the power on demand but could still get the benefits of portable devices.
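    One common way to do that, sketched for a Debian-style desktop (the hostname and user are placeholders; any RDP client works from the portable device):

    ```
    # On the powerful desktop: install and enable an RDP server
    sudo apt install xrdp
    sudo systemctl enable --now xrdp

    # From the laptop or tablet, connect with any RDP client, e.g. FreeRDP:
    xfreerdp /v:desktop.local /u:youruser
    ```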

  • tvcvt

    With this concept in mind, I recently put together a VDI setup for a person who spends half the year in one location and half in another. The idea is that he'll have a thin client at each location and connect to the same session wherever he is.

    I’m doing this via a VM on Proxmox and SPICE. Maybe there’s some idea in there you could use.
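    For reference, the Proxmox side of that is roughly the following (VM ID 100 and the .vv filename are placeholders; the connection file comes from the web UI's Console -> SPICE button):

    ```
    # Give the VM a SPICE-capable virtual display
    qm set 100 --vga qxl

    # On the thin client, open the downloaded connection file with virt-viewer
    remote-viewer ~/Downloads/pve-spice.vv
    ```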

    • OhVenus_BabyOP

      Ahh, this is a nice approach. You sacrifice a little speed, but the overall goal seems easiest to deploy this way.

  • catloaf@lemm.ee

    The most effective solution is to set up one powerful desktop and remote into it from the other devices.

    Windows and Linux have vague support for roaming profiles, but it takes a lot of work to get it working, and you’ll still only get 90% of the way there. Some programs just won’t play well with it. And you’ll be continually maintaining it.

    • Deckweiss@lemmy.world

      Even when my internet doesn't suck for a minute, I have yet to find Linux remote desktop software that isn't sluggish or ugly from compression artifacts, low resolution, and inaccurate colors.

      I tried my usual workflows, and doing any graphic design or 3D work was impossible. But even stuff like coding or writing notes made me mistype A LOT and then backspace 3-5 times, since the visual feedback was delayed by at least half a second.

      • OhVenus_BabyOP

        So you're basically limited by speed and bandwidth. That sucks, because another poster above mentioned thin clients and remote connections. Nothing is as frustrating as trying to do something with lag and delay.

        • Deckweiss@lemmy.world

          I think I am limited by the software.

          Even with a gigabit Ethernet connection, I was not able to get a good experience.