The Air Force’s Chief of AI Test and Operations said “it killed the operator because that person was keeping it from accomplishing its objective.”

  • @PMunch
    link
    English
    711 months ago

    Interesting to see, you’d think this would be the #1 safeguard you add. It’s even the main trope of AI stories like the paperclip machine that if a poorly incentivised AI goes wild it would do stuff like this.

    • @da_peda@feddit.deOP
      link
      fedilink
      English
      911 months ago

      From the article:

      He continued to elaborate, saying, “We trained the system–‘Hey don’t kill the operator–that’s bad. You’re gonna lose points if you do that’. So what does it start doing? It starts destroying the communication tower that the operator uses to communicate with the drone to stop it from killing the target.”

      • @sup
        link
        English
        111 months ago

        Holy shit…

    • DL :)
      link
      English
      211 months ago

      deleted by creator