Johnny Pneumatic
Master Poster
- Joined
- Oct 15, 2003
- Messages
- 2,088
The Three Laws of Robotics are:
1. A robot may not injure a human being, or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
When we finally create intelligent machines, do we have the right to program The Laws into them?