• @pinknoise
    3 years ago

    Just a bunch of stupid bugs can turn your well-behaved code into malware. You should not trust any code to do what you think it does, especially if you have written it yourself. If it’s possible to enforce fine-grained access control and isolation, then it should be done.

    > The web would be pretty much unusable without JavaScript.

    Imo it would be a better place without it.

    • @poVoq
      1 year ago

      deleted by creator

      • @pinknoise
        3 years ago

        > And a lot of the impacts of bugs can be more easily mitigated against with general system improvements

        Yes, and these improvements will converge toward a sandboxed environment. Even original Unix had (weak) process isolation and ACLs. Should we go back to cooperative multitasking because a scheduler is bloat? No, because it’s not practical. Should we remove all exploit mitigations and fix all the bugs instead? No, because it’s not practical. For reasonably complex programs we can’t tell if they are bug-free, and even if we could, the hardware they run on may have bugs. The best we can do is minimize the impact a glitched program can realistically have.

        > Rust is a better idea than wrapping everything in a sandbox.

        Rust prevents a range of stupid bugs that don’t have to happen (plus other cool stuff), but it can’t prevent logic bugs. Say, e.g., you have a server with an unintentional arbitrary file inclusion. Would you rather wait for the bug to be fixed and be completely vulnerable in the meantime, or have the impact limited to the files the server process/user is explicitly allowed to access?
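        The scenario above can be sketched in a few lines; the `resolve` helper and the paths are hypothetical, chosen only to show that a path-traversal logic bug sails straight through Rust's compile-time checks:

        ```rust
        use std::path::{Path, PathBuf};

        // Hypothetical request handler: joins a user-supplied name onto the
        // web root. This compiles and is perfectly memory-safe, yet "../"
        // components in the input walk out of the root -- a logic bug no
        // borrow checker can catch.
        fn resolve(web_root: &Path, requested: &str) -> PathBuf {
            // Bug: no check that the result stays under `web_root`.
            web_root.join(requested)
        }

        fn main() {
            let escaped = resolve(Path::new("/srv/www"), "../../etc/passwd");
            // `join` does not normalize, so the ".." components survive:
            println!("{}", escaped.display()); // /srv/www/../../etc/passwd
        }
        ```

        Whether such a request can actually reach /etc/passwd then depends entirely on what the OS lets the server process read, which is exactly the impact-limiting that sandboxing buys while the fix is pending.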

        > In fact it had stuff like Shockwave and Flash

        Sure, compared to those (whose Turing completeness JavaScript predates, btw.) it’s nice, but no built-in RCE at all is still the better solution.