New development policy: code generated by a large language model or similar technology (e.g. ChatGPT, GitHub Copilot) is presumed to be tainted (i.e. of unclear copyright, not fitting NetBSD’s licensing goals) and cannot be committed to NetBSD.

https://www.NetBSD.org/developers/commit-guidelines.html

    • Optional@lemmy.world · 7 months ago

      > Lots of stupid people asking “how would they know?”
      >
      > That’s not the fucking point.

      Okay, easy there, Chief. We were just trying to figure out how it worked. Sorry.

      • NotMyOldRedditName@lemmy.world · edited · 7 months ago

        It was a fair question, but this is just going to turn out like universities failing or expelling people for alleged AI content in papers.

        They can’t prove it. They try to use AI-detection tools to prove it, but those same tools will flag a thesis paper from a decade ago as AI generated. I’m pretty sure I saw a story of a professor accusing a student based on such a tool, only for his own past paper to fail the same tool.

        Short of an admission of guilt, it’s a witch hunt.