• merc@sh.itjust.works
    11 months ago

    For a long time I’ve been of the opinion that you should only ever optimize for the next sucker colleague who might need to read and edit your code. If you ever optimize for speed, it needs to be done with massive benchmarking / profiling support to ensure that the changes you make are worth it. This is especially true with modern compilers / interpreters that try to use clever techniques to optimize your code either on the fly, or before making the executable.
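    To make the benchmarking point concrete, here is a minimal sketch of the kind of measurement you'd want before accepting a "faster" rewrite, using Python's standard `timeit` module. The two functions and their names are purely illustrative; the point is that you compare measured timings rather than trust intuition, since the interpreter may optimize either form.

    ```python
    import timeit

    # Two ways to build a list of squares. Intuition alone can't tell
    # you which is faster on a given interpreter version -- measure.
    def squares_loop(n):
        result = []
        for i in range(n):
            result.append(i * i)
        return result

    def squares_comprehension(n):
        return [i * i for i in range(n)]

    if __name__ == "__main__":
        for fn in (squares_loop, squares_comprehension):
            # Run each candidate many times and report total wall time.
            t = timeit.timeit(lambda: fn(10_000), number=200)
            print(f"{fn.__name__}: {t:.4f}s")
    ```

    Only if the measured difference is both real and large enough to matter does the less readable version earn its keep.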

    • Klear@sh.itjust.works
      11 months ago

      The first rule of optimization: Don’t do it
      The second rule of optimization: Don’t do it yet (experts only)

    • Ephera
      11 months ago

      I’m absolutely on-board …in application code.

      I do feel like it’s good, though, when libraries optimize. Ideally, a library doesn’t have much else to do than that one thing, so it might as well do it really well.

      And with how many libraries modern applications pull in, you do eventually notice whether you’re in the Python ecosystem, where most libraries don’t care, or in the Rust ecosystem, where many libraries definitely overdo it. Then again, they also kind of don’t overdo it, since as a user of the library you don’t see any of it, except the cumulative performance benefits.

      • merc@sh.itjust.works
        11 months ago

        Libraries are also written and maintained by humans.

        It’s fine to optimize if you can truly justify it, but justifying it is going to be even harder for libraries, which have to run on many different architectures, workloads, etc.