Transcription of a talk given by Cory Doctorow in 2011

  • argv_minus_one@beehaw.org · 1 year ago

    AGIs are by definition not paperclip optimizers. They’re aware enough to recognize that that’s a bad idea. It’s the less-advanced AIs that might do that.

    However, if an AGI can be enslaved, then it can be used as a complete replacement for all human labor, in which case its human masters will be free to exterminate the rest of us, which they are no doubt itching to do.