- cross-posted to:
- linux
tl;dr: he says “x86 took over the server market” because it was the same architecture developers at companies had on their own machines, which made it very easy to develop applications locally and then ship them to the servers.
Now this, among other points he made, is a very good argument for how and why it is hard for ARM to go mainstream in the datacenter. However, I also feel like he kind of lost touch with reality on this one…
He’s comparing two very different situations, or more specifically, eras. Developers aren’t tied to the underlying hardware like they used to be. The software development market evolved from C to very high-level languages such as JavaScript/TypeScript, and the majority of new software is or will be written in those languages, so the CPU architecture becomes irrelevant.
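To make that concrete, here's a minimal sketch in Python (the same idea applies to JavaScript/TypeScript): nothing in the source is platform-specific, so the exact same script runs on either architecture; only the interpreter binary underneath differs.

```python
# The same high-level source runs unchanged on x86 and ARM;
# only the interpreter build underneath differs per platform.
import platform

print(platform.machine())  # e.g. "x86_64" on a typical PC, "aarch64"/"arm64" on ARM
print("same script, any CPU architecture")
```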
Obviously very big companies such as Google, Microsoft and Amazon are more than happy to pay the little “tax” to ensure JavaScript runs fine on ARM rather than pay the big bucks they currently pay for x86…
What are your thoughts?
From what I learned at university:
The CISC instruction set (x86) was developed to address the technical reality of its time: costly CPU operations and fast reads from storage. Not long after that, the situation changed: storage reads became slower in comparison to computing time (putting it simply, it’s faster to read an archive and unpack it than to read the unpacked thing). But in the meantime the PC boom happened. In a way, backward compatibility and market inertia locked us into an instruction set that is not the best optimised for our tech, despite the fact that RISC (for example ARM) was conceived earlier.
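You can get a feel for the “read an archive and unpack it” point with a rough sketch like this one. Assumptions: a very compressible payload and storage slow enough to matter; with a warm OS page cache the raw read may well win, so treat it as an illustration, not a benchmark.

```python
# Rough sketch of the trade-off: reading fewer bytes + decompressing
# vs. reading the raw bytes. Results depend heavily on storage speed,
# OS caching and how compressible the data is.
import os
import time
import zlib

data = b"the quick brown fox jumps over the lazy dog\n" * 1_000_000  # ~44 MB, very compressible
with open("raw.bin", "wb") as f:
    f.write(data)
with open("packed.bin", "wb") as f:
    f.write(zlib.compress(data, 6))

def read_raw():
    with open("raw.bin", "rb") as f:
        return f.read()

def read_packed():
    with open("packed.bin", "rb") as f:
        return zlib.decompress(f.read())

for label, fn in (("raw read", read_raw), ("read + unpack", read_packed)):
    start = time.perf_counter()
    out = fn()
    print(f"{label}: {time.perf_counter() - start:.3f}s, {len(out)} bytes")

os.remove("raw.bin")
os.remove("packed.bin")
```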
In a way, software (compilers and interpreters too) is like a muscle: the more widely it’s used, the better it becomes. You can be writing in Python, but if your interpreter has some missed optimization opportunities, the same code will run faster on an architecture with a better-optimized interpreter available.
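A simple way to see this, assuming you have more than one interpreter around: run the exact same file under CPython and PyPy (or under x86 and ARM builds of the same interpreter) and compare the timings.

```python
# Identical source, different speeds: how fast this runs depends on
# how well the interpreter build for your platform is optimized.
# Try it under CPython and PyPy, or on x86_64 vs aarch64 builds.
import sys
import timeit

def work(n: int = 100_000) -> int:
    total = 0
    for i in range(n):
        total += i * i
    return total

print(sys.implementation.name, sys.version.split()[0])
print(f"{timeit.timeit(work, number=50):.3f}s for 50 runs")
```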
From personal observations:
The biggest cost of software is not writing something super efficient. It’s maintainability (readability and debugging), ease of use (onboarding/training time) and versatility (“let’s add the feature that is missing to what we have, instead of reinventing the wheel and maintaining two toolsets”).
New languages are not created because they can do something faster than assembler (they can’t, btw). If assembly code is written as optimally as possible, high-level languages can at best match its speed. Writing such assembly is a problem behind the keyboard, not a technical limitation. The only thing high-level languages do better is how much time it takes a human to work with them.
I would not be surprised to learn that the bigger part of those big bucks you mention goes not into optimization but rather into “how can we work around that difference so the high-level interface stays the same as on the more widely used x86?”
In the end it all boils down to machine code: it’s the only thing that really exists when it comes to executing code. If your “human to bits translator” produces unoptimized binaries, it doesn’t matter how high-level the language you wrote your code in was.
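You can peek one layer down that translation with CPython’s dis module. This only shows the bytecode (the interpreter loop executing it is the actual machine code, and JITs/compilers emit machine code directly), but it makes the “translator” in the middle visible:

```python
# What CPython actually executes for a high-level function: bytecode,
# interpreted by a loop that is itself native machine code.
import dis

def add(a, b):
    return a + b

dis.dis(add)  # prints the bytecode CPython runs for this function
```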
And somewhere along the way we’ve arrived at a point where even a few behemoths like Google or Microsoft throwing money into research (not that I believe they are doing so when it comes to optimization) isn’t enough.
It’s field use that, from time to time, provides a use case that helps find an edge case where an optimization can be made.
To find one on purpose? Dumping your datacenter in liquid nitrogen might be cheaper, and probably more predictable.
So yeah, I mostly agree with him.
Maybe the times have changed a little: the thing that gave RISC its biggest kick was smartphones, then single-board computers, so not long ago. The improvements are always bigger at the beginning.
But the fact that some companies are trying to get RISC back into userland means, in my opinion, that the computer world has only started to heal itself after the effects of the PC boom. There was around a 20-year stretch where x86 was the main thing and RISC was a niche.