Love to see upgrades with a negative net size lmao. Software should get more optimized with time, not more bloated. Oop, just got the gnome console popup notification saying that my install command finished running, sweet – it took as long as making this post
To be fair Windows also uses less disk space after an update to Linux :p
Nix store go: 😭
So tired of android eating my carefully set aside free space.
If you don’t delete your package cache, it will still use more disk space overall, regardless of this output.
paccache -r
There, done
I just got the hook from the AUR, don’t even have to think about it lol
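For anyone curious, the hook mentioned is presumably something along these lines — a libalpm hook that runs paccache after every transaction (the path and the exact flags here are assumptions; paccache itself ships with pacman-contrib):

```ini
# e.g. /usr/share/libalpm/hooks/paccache.hook
[Trigger]
Operation = Upgrade
Operation = Install
Operation = Remove
Type = Package
Target = *

[Action]
Description = Trimming the package cache (keeping the 2 newest versions)...
When = PostTransaction
Exec = /usr/bin/paccache -rvk2
```

With `-k2`, you keep two cached versions of each package around for easy downgrades, instead of `-r`'s default of three.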
Back in the day there was a Mac OS update (Snow Leopard) that took gigabytes off. They dropped support for PowerPC CPUs. So the compiled binaries basically got slashed in half.
The goals of Snow Leopard were improved performance, greater efficiency and the reduction of its overall memory footprint, unlike previous versions of Mac OS X which focused more on new features. Apple famously marketed Snow Leopard as having “zero new features”.[13] Its name signified its goal to be a refinement of the previous OS X version, Leopard.
God, Snow Leopard was peak Apple.
As an avid apple disliker, they really got a lot of things right with 10.x, with snow leopard hitting it out of the park. Everything from them around that era was slick. If I wasn’t a poor college kid running a 5 year old eBay Thinkpad I would have been sucked into their oppressive ecosystem in a heartbeat.
There’s a different timeline where the board also brought back Wozniak, OS X has Linux under the hood, and all third-party software was cross-compatible.
I can’t quite imagine iTunes on Ubuntu, but think if all the annoying office software that keeps workplaces from switching to Linux were suddenly available?
That indeed was a great OS.
Wait they pushed binaries for both architectures to everyone?
Yes. That’s how they made everything seem magical to the end user.
Two architectures, and two binaries in the single package.
All those programs that only had binaries in the old architecture ran through the emulator Rosetta.
Once the old architecture had been deprecated long enough, they dropped the PPC compilation in the binaries.
There was also a technique to regain disk space: deleting the unused architecture’s binaries from the app bundles.
I don’t know if that’s what they did for the PowerPC -> Intel switch, but now with the Intel -> ARM switch, Xcode’s compiler tools spit out dual-arch binaries, so the same binary runs natively on x86 or ARM. Things that aren’t compiled that way yet and only have x86 binaries will be run using Rosetta 2.
Doesn’t matter much to the end user though. It’s all just pretty seamless if you’re on an ARM Mac and idk if there’s much or any problems on x86 Macs yet regarding binary compatibility. I actually doubt there is.
It was exactly the same.
It’s why the Intel -> ARM translator is called Rosetta 2 and not Rosetta.
That OS was the last from Apple to come on optical media. So, no pushing; you bought it physically.
It probably made the downloaded binary smaller, but the actual install size for x86 machines probably didn’t change much.
…what?
We’re talking about the end of the transitional period from PowerPC (the G3 and G4 iMacs and iBooks) to the Intel architecture (about the time they went to the MacBook nomenclature). If I read this right, they didn’t push separate PowerPC and Intel architecture versions; you’d just get MacOS (or in those days, OS X) and it would ship with both binaries. Compiled binaries would be quite different for different architectures; data files, graphics, interpreted code etc. would be the same, but the precompiled binaries would differ.
I know that for a while a lot of applications were only available for PowerPC, so they made the Rosetta translation layer, which is why you’d find PowerPC binaries running on an Intel system. They did exactly the same again with the transition from x86 to ARM.
I already responded to you in another comment, but:
If I read this right, they didn’t push separate PowerPC and Intel architecture versions, you’d just get MacOS (or in those days, OSX) and it would ship with both binaries.
No, it’s even crazier than that. You didn’t get separate PowerPC and Intel binaries either. You got fat binaries that had machine code for both architectures!
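To make the fat-binary idea concrete: a universal Mach-O file really is just a small big-endian header listing per-architecture slices, each a complete binary at some offset in the file. Here’s a toy sketch that parses such a header (the header bytes and slice sizes below are fabricated for illustration, not taken from a real binary):

```python
import struct

FAT_MAGIC = 0xCAFEBABE          # big-endian magic of a universal (fat) binary
CPU_NAMES = {7: "i386", 18: "ppc", 0x01000007: "x86_64", 0x0100000C: "arm64"}

def list_architectures(blob: bytes):
    """Return (arch_name, offset, size) for each slice in a fat header."""
    magic, nfat = struct.unpack_from(">II", blob, 0)
    if magic != FAT_MAGIC:
        raise ValueError("not a universal binary")
    archs = []
    for i in range(nfat):
        # each fat_arch entry: cputype, cpusubtype, offset, size, align
        cputype, _, offset, size, _ = struct.unpack_from(">5I", blob, 8 + i * 20)
        archs.append((CPU_NAMES.get(cputype, hex(cputype)), offset, size))
    return archs

# A fake Snow Leopard-era header: one ppc slice and one i386 slice.
header = struct.pack(">II", FAT_MAGIC, 2)
header += struct.pack(">5I", 18, 0, 4096, 1000, 12)   # ppc slice
header += struct.pack(">5I", 7, 0, 8192, 900, 12)     # i386 slice

print(list_architectures(header))
# → [('ppc', 4096, 1000), ('i386', 8192, 900)]
```

The loader just picks the slice matching the machine’s CPU, which is why it felt seamless to users — and why dropping an architecture roughly halves the binaries.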
Might happen again one day if they decide to drop x86 support. Which they likely will.
in maybe 30 years, lol
idk, apple is very trigger-happy when it comes to discontinuing things (outside of the iphones, strangely.) i think by 2030 we will be long gone from apple x86 machines.
Well they haven’t made a single x86 machine in what, 4 or 5 years?
The 2024 version of MacOS doesn’t support anything older than 2017 and for most models it’s more like 2018-2020
I’d say in 2-3 years they’ll drop support for all x86 machines, at which point first-party binaries can stop shipping with x86 code. Then eventually, several years later, they’ll drop support for x86 emulation via Rosetta 2, so that’s another thing they can drop from the OS. And once Xcode stops producing those fat dual-arch binaries, other software will also take a bit less space.
OS is bloat, if you’re not shifting CPU registers by hand are you even a Linux user?
No, because Linux is a kernel/OS, and OS is bloat
Exactly, you boot the kernel, then get out the electron microscope to twiddle those bits (which is why Linux users are perverts)
electron microscope
Bloat, why should my microscope be running an entire chromium browser?
I’d diddle a bit
Ninja Edit: wait…
Decided to try this out on Tumbleweed. I last updated yesterday. Today I have 4 packages to upgrade, and doing so will drop Ruby 3.3. I also have Ruby 3.4 installed, so likely one package depended on 3.3 and another on 3.4, and the 3.3 one has now moved to 3.4. I regained a whopping 30 MB of disk space!
I’m so used to it I never realized it’s unusual.
Exactly. Same here. The fact that “Linux” isn’t a product that has to ship a shiny new thing with every update, has no deadlines to meet and no manager to keep happy, makes it a fundamentally different thing, one that is very much in line with ideas of efficiency, progress and evolution as a whole. At least that’s how I view it.
The shiny new thing can be better code to do the same thing.
IMO, that’s the shiniest thing
If you’re a cave dweller like me who stares at code for pleasure, yes.
I’m not a programmer by any means, but I’m guessing, they are just removing old redundant features and code, but I could be very wrong here.
a new version of a program can also move to a different set of dependencies that is shared with another program, so you don’t need to keep both around.
It wouldn’t show up like this when upgrading the system with pacman, though. pacman does not automatically remove orphaned dependencies during upgrades; you have to query for them and remove them explicitly as a separate operation afterwards. So what we’re seeing in the OP is the new versions of the packages themselves getting smaller.
til!
Good ol’
pacman -Rns $(pacman -Qqtd)
, or as I’ve aliased it, orphankiller
Removing some deprecated old library or just good old optimization.
I keep forgetting to run apt autoremove to save even more space.