1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey orders given to it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
When Asimov wrote the I, Robot stories, the word “robot” viscerally meant to John Q. Public a mechanical being which invariably looked human. Indeed, that perception lived even into Star Trek: Data not only looked entirely human, he grappled with emotions.
But with the advent of inexpensive computing, “robot” no longer means a human-like machine. The car's cruise control is a form of robot; the agents that scour the net are called “bots,” and mail-delivering robots roam the halls of the Pentagon and other buildings. They look more like filing cabinets on wheels than like Data. So I think it's reasonable to substitute “computer” for “robot” in the Three Laws.
And it's time to add a Fourth Law:
4. A computer may not piss off a human, as long as such behavior doesn't conflict with the First, Second, or Third Laws.
For example, I was on a long flight recently and the seatback TV screens were, as usual, behaving badly. They reset constantly. They'd freeze. The flight crew cycled power a couple of times and eventually the units started running reasonably well.
But I wanted to sleep, so I turned my unit off. Abid, seated to my right in 31H, also flicked his unit off. We each reclined.
And, minutes later, both screens came back on. We turned them off, and they came on again. And again and again.
Abid and I shared only the universal language of annoyed computer users, so we could merely point at these poorly implemented devices and laugh. The screens won, we lost, and on they stayed while the Ambien finally kicked in.
Embedded computers are unlike any other devices on this planet. Other engineered products respond to a user's actions: the car follows inputs into the steering wheel and pedals. Lights go on when we throw the switch. But a computer does the commanding; it uses both inputs and a set of rules defined by a programmer to control something.
Though I have no idea how to actually build Laws into firmware, I can't help but think these products could use some sort of analogous meta-guidance. In Asimov's stories the Laws are somehow distinct from, and above, the programming details, just as the Ten Commandments supersede the hundreds of thousands of pages of extant law and precedent that guide judges.
Redefining the Four Laws using “computer” instead of “robot” moves the responsibility for conforming to the Laws to the designers rather than to the robots. It's up to us to build systems that are safe and that don't piss off our users.
Jack G. Ganssle is a lecturer and consultant on embedded development issues. He conducts seminars on embedded systems and helps companies with their embedded challenges. Contact him at . His website is .