Hopefully someone can shed some light on this idea, or explain something that roughly fits the use case and need. I’m looking for a basic operating system that can be updated across multiple devices, like a living OS.

For instance, I have a high-end desktop PC with the same operating system as a laptop or tablet, but live-synced: apps, files, and changes made on one system are the same on all devices. I’ve looked at cloning drives and have done it; it’s far too slow and cumbersome.

Essentially I’d change devices based on how much hardware power I need, while keeping the same living operating system synced across all of them, so all data and abilities remain the same anytime something is needed.

Maybe I’m being far-fetched, and this might possibly be the wrong sub, but I assumed it would fall under selfhosted. I’ve considered a NAS and I’m open to other ways to structure the concept. ALL IDEAS WELCOME, feel free to expand on it in any way.

Dealing with different operating systems and architectures across devices is wildly difficult sometimes: software, mobility, power requirements (not watts, but processing power), cross-compatibility. I’ve seen apps that sync across devices, but some desktop and mobile apps aren’t cross-compatible, and when you self-host so many services that work well across networks and devices, after years of uptime you sort of forget the configs of everything; it’s a nightmare when a single app update or container causes a domino effect. Thanks everyone, hopefully this is helpful to others with similar needs.

  • just_another_person@lemmy.world · 5 hours ago

    You’re describing a number of different things here, but you’re thinking about it in an overly complex manner.

    You need a centralized file store like a NAS, and a mountable workspace from said NAS that will mount to each machine, then you need some sort of Domain Directory service to join it all together. If you want the different desktops’ settings and such synced, you can achieve that with this setup, or you can go a step deeper and use an immutable distro of some sort, and commit and keep the same revision from one machine checked out on all your other machines (works kinda like a git repo). That will likely present issues if it’s not all the same hardware though, so I would probably just keep it simple if you go that route.

    The user experience would look something like this:

    • set up all your files on your centralized storage
    • join one machine to your domain (you can use LDAP, Samba+LDAP, NFSv4 domains…whatever)
    • log in and have it pull your user info from the domain
    • your network mounts and user preferences will be pulled down and put in place

    Obviously this is simplified for the purposes of this post, but it should give you a direction to start investigating. Simplest path you can test this with is probably Samba, but it will be fairly limited and just serve as a starting point.
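
    As a very rough sketch of that Samba starting point (the share name, paths, hostname and user below are placeholders, not anything you have to use):

        # /etc/samba/smb.conf on the server: one shared workspace
        [shared]
            path = /srv/shared
            read only = no
            valid users = youruser

        # /etc/fstab on each client: mount that share in a fixed place
        //nas.local/shared  /home/youruser/shared  cifs  credentials=/etc/samba/creds,uid=youruser  0  0

    The LDAP/domain pieces layer on top of this later; even on its own it gives every machine the same file space.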

    Edit: if these concepts are a bit much for you, maybe consider getting a NAS with a good UI to make managing it much simpler. Synology has this baked in already, and I think Qnap does as well: https://www.synology.com/en-global/dsm/feature/active_directory

    • OhVenus_Baby@lemmy.ml (OP) · 4 hours ago

      The immutable distro idea is nice; as a start I’ve put /home on a separate partition and begun syncing it across devices. I’m working on setting up a NAS now to make the process more longer-term friendly. By working I mean acquiring drives for storage; I currently have about 6 TB. I just didn’t fully know the process and what it entails software-wise besides Tailscale. I’ve self-hosted servers for games and some minor stuff. I was thinking about using Synology, but their hardware is wildly expensive. I really only need the drive bay and I can connect it to my server PC. I’ll do a deeper dive after work.

      • just_another_person@lemmy.world · 3 hours ago

        You certainly don’t need the Synology; it’s just an option if you want things in a clean and tidy package with a UI to manage some things.

        • OhVenus_Baby@lemmy.ml (OP) · 2 hours ago

          I like things that are easy to set up. While I don’t mind tinkering, the older I get the more I want shit to just work. Hassle and diagnostic days are few and far between as time gets shorter. Any good simple setups for some SSDs? I don’t mind a little setup, but simple is preferred.

            • OhVenus_Baby@lemmy.ml (OP) · 1 hour ago

              SSDs for fast transfers, and then maybe HDDs for large, slow stuff? I’m afraid transfers will be too slow on an HDD, but I’ve never used a NAS, so I’m unsure what determines transfer speeds; network speed has to play a role, but read and write speed as well. I’ve had HDDs in several PCs and they’re just ungodly slow for file transfers and gaming, but large sizes are cheaper. So essentially I was asking which storage type is best for fast NAS speeds, and what’s a good setup to start with as far as software and hardware?

              • just_another_person@lemmy.world · 27 minutes ago

                If you’re talking about network attached storage, you won’t see much benefit from SSDs, since network transfer speed is the bottleneck. Example: an NVMe SSD can read at around 3000 MB/s, but your Wi-Fi might only manage ~500 Mb/s (roughly 60 MB/s), and gigabit Ethernet tops out around 125 MB/s.

                You’re better off just going HDD on a NAS, at least for the $/TB. Just start with two disks in RAID1. More than good enough for what you’re trying to do.
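
                For reference, on a plain Linux box that two-disk RAID1 is roughly the following with mdadm (device names and mount point are examples, check yours with lsblk first); NAS distros like TrueNAS or OpenMediaVault do the same behind their UI:

                    # mirror two disks, then format and mount the array
                    sudo mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sda /dev/sdb
                    sudo mkfs.ext4 /dev/md0
                    sudo mount /dev/md0 /srv/nas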

  • acockworkorange@mander.xyz · 3 hours ago

    The good thing is someone has thought of this before: that’s basically the concept of Inferno OS. The bad news is it’s defunct and, even if it weren’t, it was only really used for network appliances, such as routers, print servers, etc.

    Remote access seems to tick most of the boxes for you and it’s quite easy to install and maintain.

  • Deckweiss@lemmy.world · 4 hours ago

    I run this somewhat. The question I asked myself was - do I R-E-A-L-L-Y need a clone of the root disk on two devices? And the answer was: no.


    I have a desktop and a laptop.

    Both run the same OS (with some package overlap, but not identical)

    I use syncthing and a VPS syncthing server to sync some directories from the home folder. Downloads, project files, bashrc, .local/bin scripts and everything else that I would actually really need on both machines.

    The syncthing VPS is always on, so I don’t need both computers on at the same time to sync the files. It also acts as an offsite backup this way, in case of a catastrophic destruction of both my computers.

    (The trick with syncthing is to give the same directories the same ID on each machine before syncing. Otherwise it creates a second dir like “Downloads_2”.)
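
    For illustration, the part that has to match is the folder id in Syncthing’s config (config.xml, or the folder ID field in the web UI); labels and paths can differ per machine. A trimmed example with made-up device IDs:

        <folder id="downloads" label="Downloads" path="/home/me/Downloads" type="sendreceive">
            <device id="DESKTOP-DEVICE-ID"></device>
            <device id="LAPTOP-DEVICE-ID"></device>
            <device id="VPS-DEVICE-ID"></device>
        </folder>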

    That setup is easy and gets me 95% there.

    The 5% that is not synced are packages (which are sometimes only needed on one of the computers and not both) and system modifications (which I wouldn’t even want to sync, since a lot of those are hardware specific, like screen resolution and display layout).


    The downsides:

    • I have to configure some settings twice. Like the printer that is used by both computers.

    • I have to install some packages twice. Like when I find a new tool and want it on both machines.

    • I have to run updates separately on both systems. I’ve been thinking about setting up a shared package cache somehow, but was ultimately too lazy to do it, so I just run the update twice (rough sketch of the cache idea below).
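
    (If both machines are apt-based, one low-effort version of that shared cache is apt-cacher-ng on the always-on VPS, with each computer pointed at it; the hostname below is a placeholder:)

        # on the VPS
        sudo apt install apt-cacher-ng

        # on each computer, in /etc/apt/apt.conf.d/02proxy
        Acquire::http::Proxy "http://my-vps.example:3142";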


    I find the downsides acceptable; the whole thing was a breeze to set up and it has been running like this for about a year now without any hiccups.

    And as a bonus, I also sync some important documents to my phone.

    • Ulrich@feddit.org · 3 hours ago

      This is what I was going to suggest. Have all computers running the same OS and then just sync the home directory with SyncThing.

      • corsicanguppy@lemmy.ca · 2 hours ago

        Yeah. And the full root-disk clone thing is honestly gonna be more trouble than it’s worth. Make sure the big-bang stuff is the same: packages (even if not a perfect match, as above, just the same version wherever they’re installed) and general settings, and then sync the homedir.

        God help me, I’m thinking Gluster between 2-3 machines, running a VM off that (big files, so lock negotiation isn’t an issue) and having it commandeer the local GPU for gaming. It’s doomed, but it’ll be fun, ha ha, learning, ha ha.

        There are exciting ways to ensure some settings and configs are kept the same, too, when they’re outside that synced home space: Ansible if you like thunky 2002 tech, Chef or Salt for newer but overkill, or mgmtconfig if you want modern decentralized peer-to-peer reactive config management.

        • OhVenus_Baby@lemmy.ml (OP) · 1 hour ago

          Perhaps this idea is way, way overkill and going to be a constant headache. I am trying to simplify between devices and data. This idea is looking like a labor of love, and I’m more into using the tools for what they are rather than always tinkering and working. I’m at the age where shit just needs to work. Some sort of remote desktop, or a NAS, or a combo might be the better, easier route.

  • tvcvt@lemmy.ml · 5 hours ago

    With this concept in mind, I recently put together a VDI setup for a person who’s in one location for half of the year and another the other half. The idea is he’ll have a thin client at each location and connect to the same session wherever he is.

    I’m doing this via a VM on Proxmox and SPICE. Maybe there’s some idea in there you could use.
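
    For anyone wanting to try the same, the SPICE part is mostly just the VM’s display type plus a client; the VM ID 100 below is a placeholder:

        # on the Proxmox host: give the VM a SPICE-capable display
        qm set 100 --vga qxl

        # on the thin client: the web UI's SPICE console button hands you a .vv file
        remote-viewer pve-spice.vv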

    • OhVenus_Baby@lemmy.ml (OP) · 2 hours ago

      Was this functioning at a good performance level? Remote setups are sometimes finicky and unresponsive or very delayed. I really like this idea and would set up a NAS to go along with it for the other devices. One or two devices is all I want to connect. I’m very interested if the performance is good.

  • catloaf@lemm.ee · 5 hours ago

    The most effective solution is to set up one powerful desktop and remote into it from the other devices.

    Windows and Linux have vague support for roaming profiles, but it takes a lot of work to get it working, and you’ll still only get 90% of the way there. Some programs just won’t play well with it. And you’ll be continually maintaining it.

    • Deckweiss@lemmy.world · 4 hours ago

      Even when my internet doesn’t suck for a minute, I have yet to find Linux remote-desktop software that isn’t sluggish or ugly from compression artifacts, low resolution and inaccurate colors.

      I tried my usual workflows, and doing any graphic design or 3D work was impossible. Even stuff like coding or writing notes made me mistype A LOT, then backspace 3-5 times, since the visual feedback was delayed by at least half a second.

      • OhVenus_Baby@lemmy.ml (OP) · 4 hours ago

        So you’re basically limited by speed and bandwidth. That sucks, because another poster above mentioned thin clients and remote connections. Nothing is as frustrating as trying to do something with lag and delay.

        • Deckweiss@lemmy.world · 4 hours ago

          I think I am limited by the software.

          With a gigabit ethernet connection, I was not able to have a good experience.

          • OhVenus_Baby@lemmy.ml (OP) · 2 hours ago

            So what was your end solution? I love the idea of remote desktops, but it always seems laggy, basically the issues you described. I’ve tried stuff like teams and some other legit remote connection software, but it never was smooth flowing. Responsiveness is sort of the killer here, or it would be awesome.

  • IsoKiero@sopuli.xyz · 5 hours ago

    Well, that’s an interesting approach.

    First, you would need either shared storage, like a NAS, for all your devices, or for them all to have an equal amount of storage for your files so you can just copy everything everywhere locally. Personally I would go with a NAS, but the storage problem in general has quite a few considerations, so depending on the size of your data, bandwidth, hardware and everything else, something else might suit your needs better.

    For the operating system, you would of course need the same OS installed on each device, and they would all need to run the same architecture (x86 most likely). With Linux you can just copy your home directory over via the shared storage and it’ll take care of most things, like app settings and preferences. Keeping the installed software in sync and updated is a bit more tricky: you could enable automatic updates and potentially create a script to match installed packages between systems (Debian-based distros can use dpkg --get-selections and --set-selections, others have similar tools), so you would have pretty closely matching environments everywhere.
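
    A rough sketch of that package-matching trick on Debian-based systems, assuming the shared storage is mounted at ~/nas (the path is just an example):

        # on machine A: dump the installed package list to shared storage
        dpkg --get-selections > ~/nas/package-selections.txt

        # on machine B: replay the list and install whatever is missing
        sudo dpkg --set-selections < ~/nas/package-selections.txt
        sudo apt-get dselect-upgrade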

    Or if you really want to keep everything exactly the same you could use Puppet or similar to force your machines into the same mold and manage software installations, configuration, updates and everything via that. It has a pretty steep learning curve, but it’s possible.

    But if you want to match x86 workstations with handheld ARM devices it’s not going to work very well. Usage patterns are wildly different, software availability is hit or miss and the hardware in general differs enough that you can’t use the same configs for everything.

    Maybe the closest thing would be to host web-based applications for everything and use only those, but that heavily limits what you can actually do and doesn’t give you much flexibility with hardware requirements, meaning either your slower devices crawl to a halt or your powerful workstation just sits idle whatever you do.

    Maybe a better approach would be to set up a remote desktop environment on your desktop and just hop onto that remotely whenever needed. That way you could have the power on demand but still get the benefits of portable devices.

    • OhVenus_Baby@lemmy.ml (OP) · 2 hours ago

      Do you know of a way to make it less laggy and more responsive? Working remotely sucks, at least in my past experience. Otherwise I love the idea. I think it may be easier to set up a NAS and just deal with a little hassle transferring files back and forth. I’m super aggravated dealing with 10 devices every day; I’m trying to simplify everything possible.