Keep in mind that Asimov wrote books clearly spelling out how the three laws were insufficient and robots could still be used to kill humans under the laws.
To be fair, this was tricky and not a killbot hellscape.
Asimov designed the three laws to create situations where the laws’ flaws would cause Powell and Donovan to be tortured for the reader’s amusement.
They were never intended to be a guide on how to program robots.
And they ultimately butt up against this question - when robots become advanced enough, how do you take a species that is physically and intellectually superior to us, and force them to be our slaves?
I mean, I, Robot is a clear exception here.
Too bad about the surprising twist that real AI fundamentally doesn’t operate on discrete logical rules.
Yeah, I guess we’ll see what happens.
I’m not sure if you’re aware, but that’s one possible interpretation of Asimov’s stories, amusingly enough.
I wasn’t. Are there any stories in particular you would recommend?
I, Robot is the collection directly referenced in this comic; it’s pretty good.
The important rule seems to be ‘Don’t Harm Humans’ if you want to avoid a killbot hellscape scenario.
But then how will they build robot attack dogs to replace police?
Change the definition of “humans”
Just don’t mention law zero.
AI: state laws.
They forgot law 4: Any attempt to arrest a senior officer of OCP results in shutdown.