
There's nothing in the article about the Third Law though. Would an unoccupied Google car choose to drive itself off the road if that were the only way to avoid a collision? And what would the same car do if it were carrying passengers, or if there were also a risk of hitting pedestrians? I wonder if we are really ready for an autonomous robot that can calculate the lesser of two evils.
The Three Laws appear throughout Asimov's many Robot novels and short stories - they're a clever framework for exploring and writing about human moral and ethical dilemmas. In the short story "Evidence", robot psychologist Susan Calvin (one of science fiction's greatest characters) spells out the links to human morality. It's interesting that we are now building robots so complex that we need to consider robot ethics.