The three new laws of robotics

The three laws of robotics that Isaac Asimov developed in 1940 are so ingrained in the literature and movies of the last 70 years that many people literally believe that robots cannot be built without incorporating these laws. Of course nothing could be further from the truth, and the consequence is that people expect today’s robots to be far safer than they are. I saw this firsthand back in the 1970s, when a warehouse-sized, pallet-handling robot crushed a person to death. Relatively strong, blissfully unaware robots permeate our work scene, and it’s up to us to watch out for them and the potential harm they can wreak. It’s also up to us as engineers to infuse awareness into these tools as quickly as we can to reduce the hazard.

Asimov’s three laws were supposedly embedded and diffused into the matrix of a robot’s positronic brain in the form of mathematical concepts so that the robot could not ignore the laws. The three laws are:

  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

The first problem with implementing these three laws is that we’ve never developed positronic brains. All we have are microprocessors, but they’re probably up to the task if we get smart enough about using them.

The second problem, a much bigger one, is that the robot must be advanced enough to be able to identify a human being. No robot I know of can do this. There are plenty of robots that can identify infrared hot spots that don’t belong and act accordingly. Other robots can pattern-match facial expressions and react according to a pre-programmed set of rules. But no real robot I’ve heard of can point its sensors at a scene and conclude: there’s a human being there.
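
To make “pattern-match” concrete, here is a minimal sketch of roughly what such detection looks like today, using OpenCV’s stock HOG pedestrian detector (the detector and its API are real; the image file name is a placeholder). Note what it actually delivers: person-shaped rectangles with confidence scores, not the certain “there’s a human being there” judgment Asimov’s laws assume.

```python
import cv2

# Stock HOG + linear-SVM pedestrian detector that ships with OpenCV.
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

frame = cv2.imread("factory_floor.jpg")  # placeholder input frame
boxes, weights = hog.detectMultiScale(frame, winStride=(8, 8))

for (x, y, w, h), score in zip(boxes, weights.ravel()):
    # Each hit means "something shaped like an upright pedestrian,"
    # with a score -- not a guarantee that a person is actually there.
    print(f"possible person at ({x},{y}), size {w}x{h}, score {score:.2f}")
```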

That’s pretty much the end of the line for Asimov’s three laws of robotics because they’re all based on the premise that the robot can distinguish a human from a dog or a cat or a hot steam iron. You need some good sensors and some pretty sophisticated programming to make the distinction.

If that were all there was to it, there wouldn’t be much point to this blog. However, I just read an article on msnbc.com about three new Laws of Robotics proposed by David Woods of Ohio State University and Robin Murphy of Texas A&M for the less-advanced robots we actually can build today: the ones without those magic positronic brains. These three new Laws of Robotics recognize that today’s robots are largely not autonomous and are usually under the care and supervision of human beings. If not under direct human control, the minimally autonomous robots we can build today are released “into the wild” in very specific situations where they probably won’t hurt anyone. (Note: I’m already aware of lethally armed battlefield robots, particularly UAVs, most of which are under direct human control when it matters. Those robots kill, on command.)

These three new laws of robotics for the early 21st century are:

  1. A human may not deploy a robot without the human-robot work system meeting the highest legal and professional standards of safety and ethics.
  2. A robot must respond to humans as appropriate for their roles.
  3. A robot must be endowed with sufficient situated autonomy to protect its own existence as long as such protection provides smooth transfer of control which does not conflict with the First and Second Laws.
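
The third law’s “smooth transfer of control” is something we can already prototype with ordinary microprocessors. Here is a minimal, hypothetical sketch (the mode names and supervisor class are my own invention, not from Woods and Murphy) of a supervisor that grants the robot just enough autonomy to protect itself while always handing control back to a human cleanly:

```python
from enum import Enum, auto

class Mode(Enum):
    TELEOPERATED = auto()  # a human is driving (Second Law)
    AUTONOMOUS = auto()    # limited "situated autonomy" (Third Law)
    SAFE_STOP = auto()     # halt and wait for a human to take over

class ControlSupervisor:
    """Hypothetical supervisor sketching a smooth transfer of control."""

    def __init__(self):
        self.mode = Mode.TELEOPERATED

    def tick(self, operator_link_ok: bool, hazard_detected: bool) -> Mode:
        if hazard_detected:
            # Self-preservation never trumps the First and Second Laws:
            # any detected hazard forces a stop and a handover to humans.
            self.mode = Mode.SAFE_STOP
        elif not operator_link_ok:
            # Operator link lost: fall back to just enough autonomy to
            # protect the robot until supervised control is restored.
            self.mode = Mode.AUTONOMOUS
        else:
            self.mode = Mode.TELEOPERATED
        return self.mode
```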

These three new laws are still difficult to achieve, but they’re at least within reach given today’s technology. As for those other laws, the durable ones that Asimov conjured in 1940, we’ll just have to wait for the singularity.

This article was previously published on Embedded.com’s sister publication, EDN Magazine.
