In a world increasingly shaped by technology, engineers have a moral obligation to consider the consequences of their choices.
In 1985, I sold six in-circuit emulators to the government of Iraq. Today, Iraq is part of the “Axis of Evil.” Then, they were our allies, partners in the U.S. Mideast strategy. I suppose our government still had some reservations about sending high-tech equipment there, since getting a valid export license took six months and several trips to the Department of Commerce in Washington, DC. But the approvals arrived and we shipped the gear. Iraq paid on time. We never heard a word from them again. Where were these things used? Were they part of a humanitarian medical program? Were they used in the development of weapons of mass destruction? At the time, Iraq was engaged in a mortal struggle with Iran. Iran was our enemy, Iraq our partner in the region.
After the 1990 invasion of Kuwait, two burly men unexpectedly showed up at the office, sporting suit coat bulges in the wrong places and business cards claiming employment with the Customs Department. The phone number was one I recognized from many years spent in the intelligence business, one that led to Langley, not DC.
These gorillas were mostly interested in all of the paperwork associated with the years-old sale. Every document related to the export licenses was duplicated and questioned; the products' technical capabilities and features went unexamined. We were squeaky clean and quickly satisfied their inquiries.
Fast forward to 2002. Everything has changed. Our relations with Iran are warming, those with Iraq headed from bad to worse. Today it might be possible to send equipment to Iran, but surely Iraq is a no-ship zone.
Though our paperwork satisfied the government inspectors, I was left in a quandary. What was right? Was being legal the same as being ethical? Should we have sold these products to such an unhappy part of the world?
Like most tool vendors, we had distributors in all of the major technology-consuming countries of the world, particularly in Western Europe and Japan. These are "safe" countries, bound by treaties, members of various mutual defense pacts, whose democracies neither oppress their people nor antagonize the world. Yet one cannot ignore the shifting alliances of Realpolitik. My parents were part of a generation where even some of these countries were quite at odds with America and others. The enemy is a nebulous character whose identity varies with the times.
Can we separate our engineering efforts from politics? In my naïve youth, we claimed that everything is political, that even personal relationships impact the world at large. Those nearly-Marxian dialectics were more the product of sophomoric philosophies and frustrations with the times. But now I cannot help but wonder if, indeed, even the actions taken by us little people have some grave consequences of international scale. It's perhaps extravagant to suggest that the sale of a handful of emulators for 8-bit processors may have contributed to significant evil. But did my sale of these products, this decision made by inconsequential me and my tiny company, increase the amount of evil in the world?
It's too easy to suggest, as the ethical alternative, that we ship high-tech products only to the most benign of countries. These tools could be used to build products that pump clean water into villages in Pakistan, generate power for a community in Tibet, clean up pollution in Romania.
How do we make these decisions? Searching one's own soul, especially one much more grounded in ones and zeroes than international politics, doesn't always provide the right answer. The law of unintended consequences means we cannot understand the implications of our actions over the long haul. It can twist our very best efforts into the worst of results. Yet, in my opinion, we cannot abdicate our responsibility to strive for the maximum amount of good. We engineers are the architects of the new order, and as such must consider how our work might spawn good or evil. My experience with the Iraqi sale taught me that it's not enough to simply trust the government to decide. Ethics are personal.
These thoughts came to mind after facilitating a Shop Talk discussion on engineering ethics in June at Chicago's Embedded Systems Conference. I raised the big issues of products and politics, using the crisis in Kashmir as a focal point. What does that mean to those doing business with either India or Pakistan? Does arming the belligerents constitute a violation of our moral responsibilities? I was looking not for an answer, but for discussion and thoughtful insights.
The group wasn't interested. Perhaps those locales are too remote; maybe I'm too far off into metaphysics for nuts-and-bolts engineers. Maybe people feel powerless and unable to pursue change on regional scales. To my frustration, several people suggested that we should simply do what the boss wants. I'm sure many people would react appropriately to an outrageous order. But big evils grow from small decisions.
The Nuremberg trials brought monsters to the dock. But what of the little people, the clerks who stamped the papers ordering a family onto the train to Dachau? If those people had even a glimmer of the consequences of their actions, they were active participants in an unimaginable horror. Yet they were doing what the boss ordered. Small cogs in a huge machine who were perhaps unable to effect change. Does that excuse their actions? Does it excuse ours when we follow the company's marching orders even when we're uncomfortable with the possible outcomes?
I asked the group about other ethical issues. A letter writer in the RISKS digest suggested that ignoring buffer overflow problems is far more than bad engineering. Maybe it's criminal, he argued, especially when repeated attacks against this oh-so-preventable bug cost customers big bucks. It's trivial to write code that's immune to a buffer overflow attack, yet so many of Microsoft's products, for instance, have been repeatedly compromised. Don't Bill Gates and his company have an ethical and legal responsibility to their user community to fix at least this simple vulnerability? Most of the attendees agreed, but it's easy to blame others, and easier still to target Microsoft.
All ethics are personal
The ethical life is not that of a moral dilettante; it's one that encompasses sometimes heavy burdens.
I was struck by comments made by the commencement speaker at a high school graduation this week. He said ethical behavior means understanding the difference between right and wrong, and then accepting both the responsibility and the accountability to do the right thing. Wow: we have to seek out accountability for our actions. Too many of us avoid that at all costs.
The crowd was full of happy parents and relatives, each cheering on their own graduate. Many parents had tears in their eyes; quite a few were single moms who had raised these kids more or less alone. Where was dad? Divorce does not mean abandoning one's children. Dads should be held accountable for doing the right thing.
These are difficult times. Kennedy's "ask not what your country can do for you" admonition seems a quaint reminder of a kinder age. This feels like the gimmee and get-rich-quick era. The headlines speak of corruption and dissolution. Enron, Andersen, Tyco, Merrill Lynch, and too many other corporate names splayed across the front pages this summer suggest that corporate America is the realm of sleaze, that CEOs will do anything, legal or otherwise, ethical or not, to inflate stock prices and build personal wealth.
It's up to each of us, individually, to effect change. Ignore the headlines. Act ethically. Deal with the agony of divorce but take care of the kids. One's own perceived needs pale compared with the responsibilities procreation incurs. Our responsibilities transcend the rules set down by the courts or the legislature.
This week I met an electrician at a local boatyard, a guy who volunteered many free extracurricular hours to help rewire a friend's mast. George is one of the nicest, friendliest, and funniest fellows at that yard. Later I learned he had manipulated his company's health care system to wangle eight months of disability pay for what was the most minimal of injuries. Though not at all an uncommon practice in today's scheming society, this is deceitful and unacceptable to me. At work and at home we must adhere to the highest of standards. Even when it's hard.
The Brooklyn Bridge is one of America's icons. The first large suspension bridge, it incorporated many new construction ideas, including the massive use of structural wire rope. Roebling designed and constructed the bridge, yet his company, the best wire rope vendor of the day, lost the wire contract to another firm. That firm provided, knowingly and with almost criminal intent, substandard material that could have jeopardized the safety of thousands of commuters. Only Roebling's sensationally redundant design saved the project. The history of civil engineering is filled with stories of crooked contractors and lousy materials. In pursuit of the quick buck, ethics are consistently tossed to the winds.
We developers have a responsibility to our customers, and we stand in a position that closely parallels that of Roebling's corrupt wire rope vendor. Quality is hidden deep inside the product. No one sees the guts of our creations. A simple user interface might conceal hundreds of thousands of lines of code. Is it beautifully structured or a spaghetti mess?
A couple of the Shop Talk attendees voiced what we all know but seldom admit: we're lazy. It's easier to hack stuff out than do it right. Disciplined development is a core value of any workable approach to reliable firmware, but it's tedious and, well, disciplined. Banging away on the debugger making motors whir and lights flash is a lot more fun than sitting in front of a desk thinking, especially when the cubicle is so noisy that deep thought is impossible.
I'm struck by the correlation between beautiful and reliable code. The Therac-25, the earliest known embedded disaster, killed several people. Proximate causes included bad design and completely unstructured, unaudited, and totally convoluted code. A British helicopter accident resulted mostly from firmware so awful the review board gave up trying to read it after getting through just 17%.
Check out the µC/OS-II RTOS (www.ucos-ii.com). Read the C listings. Then check out the source code to Windows CE (www.microsoft.com/windows/embedded/ce/tools/source/default.asp). One is beautiful, written almost like poetry, with pride and discipline. The other looks like the mutterings of an insane software terrorist. One is safety-certified to DO-178B standards. The other, well, let's just say it's great in easily rebooted hand-held appliances.
The beauty of the great code lies deep in its innards, invisible to any consumer. Its elegance cannot be observed functionally. You'll never hear a customer say, "Hey, this thing hasn't ever crashed!" Yet the beauty that stems from making difficult and ethical development choices yields great, reliable, portable code. It's a standard we must all hold ourselves to.
Think globally, act locally
Act now. Do the right thing in your daily engineering efforts. Don't wait for others to take the lead.
Do you check for buffer overruns? Vast bodies of experience show that input strings from untrusted sources crash code. If you skip these trivial checks, you're writing code that increases the unhappiness in the universe.
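The check in question costs only a few lines. Here's a minimal sketch in C of reading untrusted input into a fixed-size buffer without risking an overrun; `read_command` and its rejection policy are illustrative, not from any particular product:

```c
#include <stdio.h>
#include <string.h>

#define CMD_MAX 64  /* fixed capacity of the command buffer */

/* Read one line from an untrusted source into buf, never writing more
 * than len bytes. Returns 0 on success, -1 on error or when the line
 * exceeds the buffer (reject it rather than overrun). */
static int read_command(char *buf, size_t len, FILE *src)
{
    if (fgets(buf, (int)len, src) == NULL)
        return -1;                      /* EOF or read error */

    size_t n = strlen(buf);
    if (n > 0 && buf[n - 1] == '\n') {
        buf[n - 1] = '\0';              /* strip the trailing newline */
        return 0;
    }
    return -1;  /* no newline seen: input was longer than the buffer */
}
```

The key design choice is that over-long input is an error the code reports, not something that silently scribbles past the end of the array.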
If you use malloc(), do you check its return value? We all know that heap fragmentation can lead to malloc failures, so I'd argue that writing code that assumes success is more than poor design; it's unethical. It's dumping problems that we should deal with ourselves onto our users.
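One hedged illustration of that point, again in C; `dup_string` is a made-up helper, but the pattern of checking the allocation and propagating the failure is the whole argument:

```c
#include <stdlib.h>
#include <string.h>

/* Duplicate a string, returning NULL rather than crashing when the
 * heap is exhausted or fragmented. The caller must check the result. */
static char *dup_string(const char *src)
{
    size_t len = strlen(src) + 1;       /* include the terminator */
    char *copy = malloc(len);
    if (copy == NULL)
        return NULL;    /* report failure up; never dereference NULL */
    memcpy(copy, src, len);
    return copy;
}
```

Writing `strcpy(malloc(len), src)` instead is exactly the dumping-on-the-user the paragraph describes: the failure still happens, just later and in the customer's hands.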
Lots of us feel cranking code is a lot more fun than detailed design, code inspections, and adherence to standards. Yet I contend it's worse than lazy to jump into coding. It leads to lousy products, frustrated users, and is a fundamentally unethical way to build a product.
Is it ethical to accept an arbitrary, capricious, and impossible delivery date from the boss? That's engaging in a dysfunctional cycle of lying, one that's doomed to get worse.
Ethical behavior means accepting responsibility for your actions. As Harry S Truman said, “The buck stops here.” On your desk. Not your boss's.
Jack G. Ganssle is a lecturer and consultant on embedded development issues. He conducts seminars on embedded systems and helps companies with their embedded challenges. Contact him at j.