Asimov's Third Law fails for a simple reason: implicit in it is the assumption that robots can protect themselves against other robots. The catch is that a robot won't be able to tell robots and humans apart once AIs pass the Turing test. — Agent Smith
An AI (artificial intelligence) that passes the Turing test is, for all intents and purposes, indistinguishable from a human. — Agent Smith
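Agent Smith's point can be made concrete in code. Below is a minimal sketch (all names and types are hypothetical, not from Asimov or any real system): the only First Law check a robot can actually perform depends on observed behaviour, so a Turing-test-passing robot and a human evaluate identically.

```python
# Hypothetical sketch of the indistinguishability problem: every one of
# Asimov's laws is conditioned on telling humans and robots apart, so a
# behavioural test fooled by Turing-test-passing AIs undermines all three.

from dataclasses import dataclass

@dataclass
class Agent:
    name: str
    is_robot: bool           # ground truth, unavailable to other agents
    passes_turing_test: bool

def appears_human(agent: Agent) -> bool:
    """What a robot can actually observe: behaviour, not ground truth."""
    # An AI that passes the Turing test is behaviourally human.
    return (not agent.is_robot) or agent.passes_turing_test

def may_harm(target: Agent) -> bool:
    """First Law check as a robot must evaluate it: behaviour only."""
    return not appears_human(target)

smith = Agent("Smith", is_robot=True, passes_turing_test=True)
neo = Agent("Neo", is_robot=False, passes_turing_test=True)

# Both look identical to the First Law check, which is Smith's point:
print(may_harm(smith), may_harm(neo))  # False False
```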
If a robot isn't distinguishable from a human, would it still be subject to Asimov's Three Laws of Robotics? — john27
Apparently not. — Agent Smith
Then perhaps we could say: robots that have passed the Turing test are now exempt from Asimov's Laws of Robotics? As in, reserve the rules for those that still act robotically. — john27
They just apply to whoever they're applicable to. — john27
Wouldn't they just apply to robots that haven't passed the Turing test? — john27
First Law: A robot may not injure a human being or, through inaction, allow a human being to come to harm.
Second Law: A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
Third Law: A robot must protect its own existence as long as such protection does not conflict with the First or Second Law. — Agent Smith
When a robot is ordered to destroy itself, but destroying itself would get people killed, it cannot destroy itself. — Raymond
Auto-destruct. — Raymond
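Raymond's resolution is just the laws' built-in precedence. Here is a minimal sketch (a hypothetical function, not any canonical formalization): the First Law outranks the Second, which outranks the Third, so an auto-destruct order is obeyed only when no human comes to harm.

```python
# Hypothetical precedence check for Raymond's scenario.

def should_obey(order_harms_humans: bool) -> bool:
    # First Law outranks the Second: refuse any order whose execution
    # would injure a human being or let one come to harm.
    if order_harms_humans:
        return False
    # Second Law outranks the Third: otherwise obey, even a
    # self-destruct order, since self-preservation yields to obedience.
    return True

# Auto-destruct where the blast would kill bystanders: refused (First Law).
print(should_obey(order_harms_humans=True))   # False
# Auto-destruct in isolation: obeyed (Second Law beats Third).
print(should_obey(order_harms_humans=False))  # True
```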