Better internet connection - a lot of hosts have 40 Gbps connections now, and it's a data-center-grade link with a lower contention ratio.
And also better infrastructure in general. VPSes run in a data center with (most likely) failsafes for everything: multiple internet connections, a pretty beefy setup for power redundancy with big battery banks and generators, multiple servers to take over your stuff in case a single unit fails, climate control with multiple units and so on.
I could get a 10 Gbps connection (or theoretically even more) to my home, but if I want all the toys the big players are working with, that would mean investing at least several tens of thousands of euros to get anywhere, and more likely a hundred or two hundred thousand to build anything even near the same level. And that doesn't include things like having mechanics to maintain the generators, security staff to guarantee physical safety and so on, so even if I had a few million to throw at a project like this it wouldn't last too long.
So, instead of all that I have a VPS from Hetzner (I've been a happy customer of theirs for a long time) for less than a hamburger and fries per month. And that's keeping my stuff running just fine. Obviously there are caveats to watch for, like backups in case Hetzner suddenly doesn't exist anymore for whatever reason, but the alternative might as well be setting up a server farm on the Moon, as that's about as difficult to reach as getting similar reliability for ~100€/year.
Well, that's an interesting approach.
First, you would need either shared storage, like a NAS, for all your devices, or for them all to have an equal amount of storage for your files so you can just copy everything everywhere locally. Personally I would go with a NAS, but storage in general has quite a few considerations, so depending on the size of your data, bandwidth, hardware and everything else, something else might suit your needs better.
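If you go the NAS route, mounting the share on every device can be a single fstab line. A hypothetical example over NFS (the hostname `nas.lan` and both paths are assumptions, adjust to your setup):

```
# /etc/fstab entry: mount the NAS export at boot once the network is up.
# "soft" avoids processes hanging forever if the NAS goes away.
nas.lan:/export/files  /mnt/files  nfs  defaults,_netdev,soft  0  0
```

The `_netdev` option tells the system to wait for networking before attempting the mount, which matters for portable devices that may boot faster than Wi-Fi comes up.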
For the operating system, you would of course need the same OS installed on each device, and they would all need to run the same architecture (x86 most likely). With Linux you can just copy your home directory over via the shared storage and that takes care of most things, like app settings and preferences. But keeping the installed software in sync and updated is a bit trickier. You could enable automatic updates and write a script to match installed packages between systems (Debian-based distros can use dpkg --get-selections and --set-selections, others have similar tools), so you would have pretty closely matching environments everywhere.
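A minimal sketch of that package-matching idea, assuming Debian-based systems. The package lists here are simulated so the comparison logic runs anywhere; on real machines you would generate them with dpkg as shown in the comments:

```shell
# On the source machine you'd dump the selections with:
#   dpkg --get-selections > machine-a.txt
# and on the target you'd apply a list with:
#   sudo dpkg --set-selections < machine-a.txt && sudo apt-get dselect-upgrade
# Below, two hypothetical package lists stand in for real dpkg output.

printf 'curl\ngit\nvim\n'  | sort > machine-a.txt
printf 'git\nhtop\nvim\n'  | sort > machine-b.txt

# Packages present on A but missing from B, i.e. candidates to install on B.
# comm -23 prints lines unique to the first (sorted) file.
comm -23 machine-a.txt machine-b.txt
```

Running this prints `curl`, the one package machine B is missing; the reverse check (`comm -13`) would list what B has that A lacks.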
Or if you really want to keep everything exactly the same you could use Puppet or similar to force your machines into the same mold and manage software installations, configuration, updates and everything via that. It has a pretty steep learning curve, but it’s possible.
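As a taste of what that looks like, a bare-bones Puppet manifest that forces the same package set onto every agent might be something like this (the package names are just placeholders):

```
# Applied to every node that checks in; Puppet installs anything missing
# and leaves already-installed packages alone.
node default {
  package { ['vim', 'git', 'curl']:
    ensure => installed,
  }
}
```

The same mechanism extends to config files, services and users, which is where the real payoff (and the learning curve) of a tool like Puppet lives.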
But if you want to match x86 workstations with handheld ARM devices it’s not going to work very well. Usage patterns are wildly different, software availability is hit or miss and the hardware in general differs enough that you can’t use the same configs for everything.
Maybe the closest thing would be to host web-based applications for everything and use only those, but that heavily limits what you can actually do and doesn't give you much flexibility with hardware requirements, meaning either that your slower devices crawl to a halt or that your powerful workstation sits idle no matter what you do.
Maybe a better approach would be to set up a remote desktop environment on your desktop and just hop onto it remotely whenever needed. That way you could have the power on demand while still getting the benefits of portable devices.