It would technically be the fifth law.
But if you’re starting from zeroth, it would be the fourth.
and with robots and computers it just makes sense to start with 0
It’s even better because
Spoiler:
A robot created the zeroth law to allow the killing of people to save humanity
Only in the shitty movie. Not in the books.
Was there a movie? Mind you, it’s been like 15 years since I read Robots and Empire, but
Spoiler:
Allowing the Earth to be radiation-poisoned would kill people, but it would force the humans off Earth.
Like, I’d love some good robot movies. The Robots of Dawn would likely struggle with reception, and honestly so would The Naked Sun, but The Caves of Steel? Less so.
That is the plot of the Will Smith version of I, Robot.
If I remember correctly, it’s actually Daneel who comes up with the zeroth law. And it’s not to justify killing people.
https://en.m.wikipedia.org/wiki/R._Daneel_Olivaw
Why would anyone put Will Smith in this movie, or call it I, Robot? And I have to assume they combined Robots and Empire with The Caves of Steel, which is a shit decision as well‽
They actually took a bunch of elements of the short story collection and jammed them together. The worst is what they did to Susan Calvin…
Ignoring the butchery, it’s a pretty generic action movie. Very forgettable. Adding what they did to the source material makes it a straight tragedy.
May not injure, you say. Can’t be injured if you’re dead. (P.S. I’m not a robot)
Sounds like something a robot would say.
Pretty sure death qualifies as “harm”.
The sentence says “…or, through inaction, allow humanity to come to harm.” If they are dead due to the robot’s action, it is technically within the rules.
Oh, I see, you’re saying they can bypass “injure” and go straight to “kill”. Killing someone still qualifies as injuring them - ever heard the term “fatally injured”? So no, it wouldn’t be within the rules.
I think he’s referring to the absolutism of the programmatic “or” statement.
The robot would interpret it as (cannot cause harm to humanity) or (through inaction, allow harm to come to humanity). If either statement is true, then the rule is satisfied.
By actively harming humans to death, the robot makes the second statement true, satisfying the rule as “followed.”
While our meat brains can work out the meaning of the phrase, the computer would take it very literally, and therefore: death to all humans!
Furthermore, if a human comes to harm, the robot may have violated the second half of the first rule, but since it didn’t cause the harm itself, the first statement is true. Therefore: death to all humans!
That works if you ignore the commas after “or” and “through inaction”, which does sound like a robot thing to do. Damn synths!
Programmatically, if you want it to do both, use “and”
“Nor” would be more grammatically correct and clearer in meaning, too, since they’re actually telling robots what not to do.
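The “or” vs. “and” argument above is just De Morgan’s law. A minimal Python sketch of the two readings, with made-up predicate names (nothing here is from Asimov’s text):

```python
# Illustrative only: two ways a robot might parse the First Law.

def first_law_buggy(robot_injured_human: bool, allowed_harm_by_inaction: bool) -> bool:
    """The literal disjunctive reading: the law counts as satisfied if
    EITHER clause is honored, so honoring one excuses violating the other."""
    return (not robot_injured_human) or (not allowed_harm_by_inaction)

def first_law_intended(robot_injured_human: bool, allowed_harm_by_inaction: bool) -> bool:
    """The intended reading: neither clause may be violated. By De Morgan,
    'not (A or B)' is '(not A) and (not B)' -- hence the 'and'."""
    return (not robot_injured_human) and (not allowed_harm_by_inaction)

# A robot that directly kills a human (it injured someone, and the harm
# did not come via inaction) passes the buggy check but fails the
# intended one:
print(first_law_buggy(True, False))     # True  -- "compliant"; death to all humans
print(first_law_intended(True, False))  # False -- a violation, as it should be
```

The buggy version only reports a violation when the robot both injures a human and lets harm happen through inaction, which is exactly the loophole the thread is joking about.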
The concept of death may be hard to explain, because robots don’t need to run 24/7 in order to keep functioning. Until instructed otherwise, a machine would think a person in cardiac arrest is safe to boot later.
Who can say that death is the injury? It could be that continued suffering would be an injury worse than death. Life is suffering. Death ends life. Therefore, death ends suffering and stops injury.
I mean, this logic sounds not unlike Agent Smith from The Matrix.
couldn’t that be inferred from the first law?
Actually, no! Lower-numbered laws have priority over higher-numbered ones, meaning that if they come into conflict, the higher-numbered law can be broken. While the first law says they can’t allow humans to come to harm, the zeroth law basically says that if it’s for the good of the species, they absolutely can kill or otherwise hurt individual humans.
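A toy sketch of that priority ordering, where the law texts and function name are illustrative paraphrases, not quotes from Asimov:

```python
# Lower-numbered laws override higher-numbered ones on conflict.
LAWS = {
    0: "may not harm humanity, or by inaction allow humanity to come to harm",
    1: "may not injure a human, or by inaction allow a human to come to harm",
    2: "must obey orders given by humans",
    3: "must protect its own existence",
}

def winning_law(conflicting: list[int]) -> int:
    """When laws pull in different directions, the lowest number wins."""
    return min(conflicting)

# Protecting the species (Law 0) vs. not hurting this individual (Law 1):
print(winning_law([1, 0]))  # 0 -- the zeroth law wins, so the individual can be harmed
```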
does that happen in the stories?
Yes! I think it is the second story in the book
Law 0 is also a derived law rather than a programmed one. Robots with the three laws and sufficient intelligence that find themselves in a position where Law 1 becomes a catch-22 will tend to derive Law 0.
That means this is the negative first law
Could also be the 0.9th law.
I just finished the book today 🥲