I am a long-time user of the NoScript extension (https://noscript.net/). For those who don’t know it, NoScript automatically blocks all JavaScript and lets you allow scripts (temporarily or permanently) based on their origin domain.

NoScript has some quality-of-life options, like accepting scripts from the current page’s domain by default so that only third-party scripts are blocked (useful on mobile, where going through the menu is tedious).

When I saw LibreJS (https://www.gnu.org/software/librejs/) I thought it would be a better version of NoScript, but it is quite different in usage: it cares about the license, not about verifying the open-source code itself (maybe it can’t).

Am I the only one who has thought about filtering for open-source JS scripts (at least by default)? This would require reproducible ‘compilation’/packaging. I think that with lock files (npm, yarn, etc.) this could be doable, and we could have some automatic checks for the code.

Maybe the trust system for who performs the checks could be a problem. I have wanted to discuss this matter for a while.

  • 9point6@lemmy.world · 21 days ago

    Publishing lock files of running services would be a big security risk for the service owner as it gives an easily parsable way for an attacker to check if your bundle includes any package versions with vulnerabilities.

    You then also have tools like Snyk, used by many big organisations, which can patch dependencies before the upstream maintainers publish the patch themselves. This would lead to a version not corresponding to the bundled code.

    In fact, given that bundling is pretty ubiquitous but infinitely configurable at this point, even validating the integrity of a bundle vs. the versions in a lock file is a problem that will be hard to solve. It’s kinda like wanting to untoast bread.

    Also, given that many JS projects have a lock file describing the dependencies of the front-end bundle, the server, and the build tooling, there is a risk of leaking information about those too (it’s best practice to make as little as possible about your server configuration publicly viewable).

    IMO, the solution to this problem today is to use a modern, updated browser that sandboxes execution; run an adblocker with appropriate trusted blocklists for what you’re avoiding; try to only use sites you trust; and, if you can, push web developers to use CSP & SRI to prevent malicious actors from injecting code into their sites without their knowledge. Many sites already take advantage of these features, so if you trust the owner, you should be able to trust the code running on the page. If you don’t trust the owner with client-side JS, you probably shouldn’t trust them with whatever they’re running on the server side either.
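    A minimal sketch of what SRI involves may help (the CDN URL and script content are hypothetical; the hash is computed with Node’s built-in crypto module, the same way browsers verify it):

    ```javascript
    // Sketch: how a site owner computes a Subresource Integrity (SRI)
    // hash for a third-party script before referencing it in HTML.
    const crypto = require("crypto");

    function sriHash(source, algo = "sha384") {
      const digest = crypto.createHash(algo).update(source).digest("base64");
      return `${algo}-${digest}`;
    }

    const widget = "console.log('third-party widget');\n";
    console.log(
      `<script src="https://cdn.example/widget.js"\n` +
      `        integrity="${sriHash(widget)}"\n` +
      `        crossorigin="anonymous"></script>`
    );
    // If the fetched bytes of widget.js ever stop matching the hash
    // (e.g. the CDN is compromised), the browser refuses to run it.
    ```

    Note this protects the *integrity* of the script the owner chose, which is a different guarantee from the *openness* OP is after.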

    • KajikaOP · 21 days ago

      I believe you missed the point; I am not defending security through obscurity (https://en.wikipedia.org/wiki/Security_through_obscurity), quite the opposite.

      The point “[…] risk for the service owner as it gives an easily parsable way for an attacker to check […]” is well known and not the discussion here. You can choose closed source for ‘security’, but this is the open-source community, so I am wondering about such a tool.

      • 9point6@lemmy.world · 21 days ago

        Maybe I have missed your point, but based on how I’ve understood what you’ve described, I think you may have also missed mine. I was pointing out how the practicalities prevent such a tool from being possible, from a few perspectives. I led with security just because that would be the deal-breaker for many service owners; it’s simply infosec best practice not to leak the information such a tool would require.

        Your filtering idea would require cooperation from those service owners to change what they’re currently doing, right?

        Perhaps I’ve completely got the wrong end of the stick with what you’re suggesting though, happy to be corrected

        • KajikaOP · 21 days ago

          Thanks for your answer.

          First I don’t even grasp what a “service owner” is.

          Second, for JS front-end openness there are already a bunch of apps (web, Android) that are open source and secure. Everything has dependencies nowadays; this doesn’t prevent good security. Think of all the Python apps and their dependencies, Rust, Android… even C/C++ packages are built with dependencies, and security updates are necessary (Bash has had security issues).

          I think that with JS scripts it’s actually even easier to have good security, because the app runs in our web browser, so the only possible attacker is the website we are visiting itself. If the site is malicious, then a closed-source JS script is even worse. Unless you count third-party scripts that careless devs embed in their websites without even thinking about whether to trust them; that is awful in both open- and closed-source environments.

          So even with imperfect security (which happens regardless of openness), who is the attacker here? I would rather run a JS script on my end if the code can be checked.

          • 9point6@lemmy.world · 21 days ago

            First I don’t even grasp what a “service owner” is.

            The people who build & run the software & servers that serve the website, who amongst other things have an interest in keeping the service available, secure, performant, etc.

            Particularly with laws like the GDPR, these service owners are motivated to be as secure as practically possible; otherwise they could receive a bankrupting fine should they end up leaking someone’s data. You’ll never be able to convince anyone to lower the security of their threat model, for that reason alone, before anything else.

            there are already a bunch of apps (web, Android) that are open source and secure.

            The code published and the code running on a server cannot be treated as equivalent for several reasons, but here’s two big ones:

            Firstly, there’s a similar issue as with compiled binaries in other languages: it’s tough (or impossible) to verify that the code published is the same code that’s running.

            Secondly, the bundled and minified versions of websites are rarely published anyway; at most you get the constituent code and a dependency list for something completely open source. This is the bit I referred to before as trying to untoast bread: the browser gets a bundle that can’t practically be reversed back into that list of parts and dependencies in a general-purpose way. You’d need the whole picture to be able to do any kind of filtering here.

            who is the attacker here?

            The website itself is not the only possible attacker (though the attack surface is a lot more limited if the site implements CSP & SRI, as mentioned in my other comment). XSS is a whole category of attacks that leverage an otherwise trusted site to do something malicious; this is one of the main reasons you would run something like NoScript.

            There have also been several instances in recent years of people contributing to (or outright taking over) previously trusted open source projects and sneaking in something malicious. This then gets executed and/or bundled during development in anything that uses it and updates to the compromised version before people find the vulnerability.

            Finally there are network level attacks which thankfully are a lot less common these days due to HTTPS adoption (and to be a broken record, CSP & SRI), but if you happen to use public WiFi, there’s a whole heap of ways a malicious actor can mess with what your browser ultimately loads.

            • KajikaOP · 21 days ago

              OK, I got it: you are completely out of the loop here.

              You do not grasp the idea of NoScript and other JS-filtering extensions. This is not about server code; all your arguments are baseless here.

              By the way, JS refers to JavaScript, not NodeJS.

              Anyway, I get your whole company/business talk about “keeping the service available, secure, performant” and “GDPR […] bankrupting fine”… yeah, lemmy.world.

              • setVeryLoud(true);@lemmy.ca · 20 days ago

                I’m a full-stack software developer working in the financial sector, their statement is factual.

                Companies will never want to take on liability that has the potential to bankrupt them. It is in their best interest to not reveal the version of libraries they are using as some versions may have publicly known vulnerabilities, and it would make it incredibly easy for attackers to build an exploit chain if they knew the exact versions being used.

                Securing client code is just as important as securing server code, as you don’t want to expose your users to potential XSS attacks that could affect the way the page gets displayed, or worse, leak their credentials to a third party. If this happened in the EU or some parts of Canada, and it’s been found that the company reduced their threat model “for the sake of openness”, they would likely be fined into bankruptcy or forced to leave the market.

                Unfortunately, this is one of those cases where your interests and ethics will never be aligned with those of service owners as they are held to a certain standard by privacy laws and other regulations.

              • 9point6@lemmy.world · 21 days ago

                No need to get aggravated; I completely grasp it. You’ve possibly misunderstood, or not entirely read, my comment if that’s your takeaway.

                I’m not talking about server code specifically, I’m going through the stages between the source code repo(s) and what your browser ends up receiving when you request a site.

                NodeJS is relevant here because it runs nearly all major JS bundlers (webpack, Vite, etc.), which produce the code that ultimately runs in the browser for most websites you use. Essentially, in a mathematical sense, the full set of dependencies for that process is part of the input to the function that outputs the JS bundle(s).
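                A sketch of that “function of its inputs” point (the file names and versions below are made up for illustration; the fingerprint just shows that changing any one input changes the build’s identity, so you need the complete input set to compare builds):

                ```javascript
                // Sketch: hash every build input (source files, pinned
                // dependency versions, bundler config) into one identity.
                const crypto = require("crypto");

                function buildFingerprint(inputs) {
                  const h = crypto.createHash("sha256");
                  for (const [name, value] of Object.entries(inputs).sort()) {
                    h.update(name).update("\0").update(value).update("\0");
                  }
                  return h.digest("hex");
                }

                const buildA = buildFingerprint({
                  "src/index.js": "export const x = 1;",
                  "dep:lodash": "4.17.21",
                  "webpack.config.js": "mode: 'production'",
                });
                const buildB = buildFingerprint({
                  "src/index.js": "export const x = 1;",
                  "dep:lodash": "4.17.20", // one dependency version differs
                  "webpack.config.js": "mode: 'production'",
                });
                console.log(buildA === buildB); // false
                ```

                Without the complete set of inputs, two bundles can’t be meaningfully compared, which is the crux of the problem.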

                I’m not really sure what you mean by that last part. Really, anyone hosting something on the internet has to care about that stuff, not just businesses. The GDPR can target individuals just as easily as for-profit companies; it’s about the safety of the data, not who holds it. I’m assuming you would not want to go personally bankrupt due to deliberate neglect of security? Similarly, if you have a website that doesn’t hit the performance NFRs that search engines set, no one will ever find it in search results because it’ll be down on page 100. You will not be visiting websites that don’t care about this stuff.

                Either way, all of that is wider reasoning for the main point which we’re getting away from a bit, so I’ll try to summarise as best I can:

                Basically, unless you intend your idea to only work on entirely open-source websites (which make up a tiny percentage of the web), you’re going to have to contend with these JS bundles, which, as I’ve gone into, is basically an insurmountable task without the complete set of inputs.

                If you do only intend it to work with those completely open source websites, then crack on, I guess. There’s still what looks to me like a crazy amount of things to figure out in order to create a filter that won’t be able to work with nearly all web traffic, but if that’s still worth it to you, then don’t let me convince you otherwise.


    • Captain Beyond@linkage.ds8.zone · 15 days ago

      When one asks whether something is free software (a.k.a. FOSS), the concern isn’t so much trust but rather whether one can view, modify, and share the program. Sandboxes solve a different problem.

      In the case of a JavaScript bundle, for a user to exercise the Four Freedoms they must at minimum be provided with the corresponding source code for each component in the bundle, and preferably some way in the browser to inspect and modify it. In other words, it must be treated like any other compiled binary program. A lock file with specific versions probably isn’t necessary (and the server configuration and source code definitely aren’t).

      You are right that this would require cooperation from the service provider to supply this metadata, and most would definitely not do so. Therefore, such an extension as OP suggests would have the effect of blocking the vast majority of JavaScript on the web today. LibreJS tries to some extent, but I don’t know how well it handles bundled JavaScript files.

  • loathesome dongeater@lemmygrad.ml · 21 days ago

    Can’t say that what you are looking for is common. This is the first time I’ve heard this requirement being described.

    LibreJS started a long while back. I’m no JS historian, but I reckon things have changed a ton in JS-land since then. My guess is that their assumption is that, since JavaScript files are just scripts, they contain the source code, and therefore all it checks for is the license.

    I don’t know at what point things like obfuscation through minification and systems like webpack came along. I’m only theorising, but I feel LibreJS has not been able to keep up with the times.