I’m curious about how software gets created and evolves over time. I’m afraid that at some point we’ll realize there are issues with the software we’re using that can only be remedied by massive changes or a complete rewrite.

Are there any instances of this happening? Where something is designed with a flaw that doesn’t get realized until much later, necessitating scrapping the whole thing and starting from scratch?

  • nycki@lemmy.world · 8 months ago

    Starting anything from scratch is a huge risk these days. At best you’ll get something like the Python 2 -> 3 overhaul (leaving scraps of legacy code all over the place); at worst you’ll get something like GNOME/KDE (where the community schisms rather than adopting a new standard). I would say that most of the time, there are only two ways to get a new standard to reach mass adoption.

    1. Retrofit everything. Extend old APIs where possible. Build your new layer on top of HTTPS, or JavaScript, or ASCII, or something else that already has widespread adoption. Make a clear upgrade path for old users, but maintain compatibility for as long as possible (see the small sketch after this list).

    2. Buy 99% of the market and declare yourself king (cough cough Chromium).
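
    Point 1 is basically how UTF-8 won, for what it’s worth: every ASCII file is already valid UTF-8, so old ASCII-only tools kept working while new ones got full Unicode. A rough Python sketch of that compatibility guarantee:

    ```python
    # A tiny illustration of "build on top of something that already has
    # widespread adoption": UTF-8 was designed so that every ASCII byte
    # sequence is already valid UTF-8, so ASCII-only tools keep working.
    ascii_text = "plain old ascii"
    assert ascii_text.encode("ascii") == ascii_text.encode("utf-8")

    # New producers can still emit the full range of Unicode...
    fancy_text = "naïve café"
    data = fancy_text.encode("utf-8")

    # ...and consumers that understand UTF-8 round-trip it losslessly.
    assert data.decode("utf-8") == fancy_text
    ```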

      • flying_sheep · 8 months ago

        In a good way, though. Using an unvalidated bytes type for strings was such a giant source of bugs. Text is complicated, and pretending it isn’t won’t get you far.
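
        For anyone who never dealt with Python 2: roughly, the change means bytes and str no longer mix silently, so the bug surfaces where the data enters the program instead of deep inside it. A small Python 3 sketch (the exact error wording varies by version):

        ```python
        # Python 3 keeps bytes (raw data) and str (decoded text) as separate
        # types, so mixing them fails immediately instead of "working" until
        # non-ASCII input shows up at runtime, as Python 2's str allowed.
        raw = b"caf\xc3\xa9"  # bytes as they might arrive from a file or socket

        try:
            message = "hello " + raw  # TypeError: str and bytes don't concatenate
        except TypeError as err:
            print("caught:", err)

        # The boundary has to decode explicitly, and invalid data fails loudly:
        text = raw.decode("utf-8")  # "café"
        print("hello " + text)

        try:
            b"\xff\xfe not utf-8".decode("utf-8")
        except UnicodeDecodeError as err:
            print("caught:", err)
        ```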