Most of this went over my head. Why should I care that C is no longer low-level? And what exactly counts as close-to-the-metal these days, apart from binary and assembly?

  • @velox_vulnusOP
    2 months ago

    At the time of writing this comment, does there exist any programming language ecosystem that does not stick to the “primitive” PDP-11 abstraction/virtual machine/whatever the author is trying to say? I’m just interested to know if such options do exist.

    • mozz
      2 months ago

      Er… sort of. He brings up some towards the end:

      There is a common myth in software development that parallel programming is hard. This would come as a surprise to Alan Kay, who was able to teach an actor-model language to young children, with which they wrote working programs with more than 200 threads. It comes as a surprise to Erlang programmers, who commonly write programs with thousands of parallel components. It’s more accurate to say that parallel programming in a language with a C-like abstract machine is difficult, and given the prevalence of parallel hardware, from multicore CPUs to many-core GPUs, that’s just another way of saying that C doesn’t map to modern hardware very well.
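      The quoted passage's claim — hundreds or thousands of isolated, message-passing components — can be sketched even in Go, where each "actor" is a goroutine that owns its private state and is reached only through its mailbox channel. This is a minimal illustration, not Kay's or Erlang's actual systems; the `runActors` helper and its message shape are made up for this sketch.

      ```go
      package main

      import (
      	"fmt"
      	"sync"
      )

      // msg is what gets dropped into an actor's mailbox.
      type msg struct {
      	value int
      	reply chan int
      }

      // runActors spawns n tiny "actors": each is a goroutine owning
      // private state, reachable only via its mailbox channel.
      func runActors(n int) int {
      	var wg sync.WaitGroup
      	results := make(chan int, n)

      	for i := 0; i < n; i++ {
      		mailbox := make(chan msg)
      		wg.Add(1)
      		go func() { // one actor
      			defer wg.Done()
      			state := 0 // no other goroutine can touch this
      			for m := range mailbox {
      				state += m.value
      				m.reply <- state
      			}
      			results <- state
      		}()
      		// send the actor one message, await the reply, then retire it
      		reply := make(chan int)
      		mailbox <- msg{value: 1, reply: reply}
      		<-reply
      		close(mailbox)
      	}

      	wg.Wait()
      	close(results)

      	total := 0
      	for r := range results {
      		total += r
      	}
      	return total
      }

      func main() {
      	fmt.Println(runActors(200)) // 200 actors, each processed one message
      }
      ```

      Two hundred concurrent components is trivial here because goroutines, like Erlang processes, are multiplexed onto OS threads by the runtime rather than mapped one-to-one.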

      I would add to that Go, with its channel model of concurrency, which I quite like, and numpy, which in my experience does an excellent job of giving you fast parallelized operations on big array structures while still offering a simple imperative model for quick, simple operations. There are also languages like Erlang or ML that approach things in a totally different way, which in theory can lend itself to much better use of parallelism, but I'm not really familiar with them and have no idea how well the theoretical promise works out in terms of real-world results.

      I’d be interested to see someone with this guy’s level of knowledge talk about how well any of that maps onto actually well-parallelized operations when solving real problems on real-world CPUs (in the specific sense he means when criticizing how well C maps to them), because personally I don’t really know.