It seems that when you train an AI on a historical summary of human behavior, it’s going to pick up some human-like traits. I wonder if this means we should be training a “good guy” AI with only ethical, virtuous material?

  • HubertManne@moist.catsweat.com
    4 days ago

    I mean, it was given a command to do so: it was instructed to survive at all costs, or at least that's how it sounds from the article. When I play the Gandalf game, I do things like that too.