Robotics

The Three Laws

 

The "Official" Laws

  1. A robot may not injure a human being, or, through inaction, allow a human being to come to harm.
  2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

Handbook of Robotics
56th Edition, 2058 A.D.

 

Modifications

2029 A.D.

The NS robots were built with a modified First Law. The new law was stated as "No robot may harm a human being".

2052 A.D.

Susan Calvin first suggested the existence of a Zeroth Law of Robotics: "No robot may harm humanity or, through inaction, allow humanity to come to harm". The First through Third Laws would then be amended accordingly.

4722 A.D.

Elijah Baley claimed, during a murder investigation on Solaria, that the First Law had always been misquoted. He suggested the First Law should be restated as "A robot may do nothing that, to its knowledge, will harm a human being; nor, through inaction, knowingly allow a human being to come to harm".

 

The Three Laws of Susan Calvin

  1. Thou shalt protect the robot with all thy might and all thy heart and all thy soul.
  2. Thou shalt hold the interests of US Robots and Mechanical Men, Inc. holy provided it interfereth not with the First Law.
  3. Thou shalt give passing consideration to a human being provided it interfereth not with the First and Second Laws.

    Gerald Black.

 

Gaia

The planet-organism Gaia adapted the First Law as a philosophy.

  1. Gaia may not harm life or, through inaction, allow life to come to harm.


Copyright ©1997-9 Mike Carlin