The Agile Manifesto reads, in part, that the signatories value “individuals and interactions over processes and tools.”1 The field of management was revolutionized by “empowering” people to make decisions and take charge of their work, which led companies to realize the importance of hiring wisely.
In my experience, most companies do prize their engineers. No, they're not given CEO-like rock-star status, and we all wish salaries were higher. But engineers do make a decent middle-class wage, and in these days of nearly full employment in our industry, businesses fearful of losing their developers generally treat the engineers well. I do hear some dramatic exceptions, but most correspondents claim to be satisfied with their jobs.
But there are persistent complaints that never seem to go away. The first of these is overtime; the 40-hour work week is but a dream to many, and panicked overtime is de rigueur at many outfits in the last few months of a project.
Tired people make mistakes. You'd think it would be self-evident that overtime leads to buggier products. Or that safety issues in dangerous environments will escalate when workers take a shortcut that they'd never consider in a less sleep-deprived state.
In my collection of embedded disasters, a common theme is tired engineers. Investigating bodies routinely cite “60 to 80 hours of work a week for months on end” as a contributing cause of a system failure. And, since only the most expensive systems, like space missions, are investigated when something goes wrong, the dollars lost when a weary worker puts the decimal point in the wrong place are staggering.
Most professions suffer from the curse of OT. Doctors get midnight calls from the hospital, lawyers work late into the night to respond to an unexpected judicial ruling, and accountants dread the arrival of tax season. Is this good? I doubt it, and mounting evidence suggests that medical mistakes, at least partly caused by fatigue, kill.
Maybe we should plan better, but even the best planning fails when something unexpected occurs. And the unexpected is one of the things we should expect most when designing a new product.
“Unexpected” sometimes comes because we're confused about R&D. One of my top ten reasons projects fail is bad science: engineers developing the science in parallel with building the product. For instance, the algorithm changes constantly as we play with the system's chemistry to get meaningful data. That's a sure-fire route to scheduling disaster, and it often results in cancellation of the project. I have seen companies go out of business because engineering was so wrapped up in R&D that it never got a product to market.
We don't do R&D–or, if we do, we shouldn't. It's either R, or it's D. Sure, development contains an element of research but is mostly about achieving a pretty well-understood objective using known science or algorithms.
It's possible to plan D; no one can schedule R, since that intrinsically explores the depths of the unknown. If research could be scheduled, we'd know when the cure for cancer would be available.
Although it's possible to plan development, it's impossible to be perfect, so overtime will never go away. Wise managers, though, understand the costs.
Circadian's Shiftwork Practices 2005 survey found that productivity can decrease by as much as 25% when workers put in 60+ hour weeks.2 Clearly overtime leads to diminishing returns for everyone involved.
The same survey showed that turnover is nearly three times higher among those working long weeks. Consider that a headhunter might charge a third of a year's wage to find a replacement; even before retraining costs, each person lost represents $30,000 or more out the door.
Absenteeism is twice the national average at companies that routinely resort to long weeks. Stress leads to sickness, and even people shackled to the desk need free time to get all of the routine activities of life done.
Although overtime will always be part of the fabric of our profession, some toxic companies use it as a cost-saving strategy. Unless fairly compensated, that's servitude we should all reject. Engineers sell their skills to their employer; their inventory is time. The company strategically translates that inventory into revenue. So should we.
What about superprogrammers? They are the folk that some companies rely on to defy all productivity statistics and crank a lot of code fast. I've been fortunate to work with some brilliant developers who hole up in their office and create wondrous products, fast, and without a fuss. They love to build stuff, eschew office politics, and take great pride in their work.
In a private study conducted for IBM in 1977, Capers Jones found that the best developers are about six times more productive than the worst.3 That's a pretty impressive number, but holds only for small (1,000 lines of code) projects. The difference decreases quickly as projects grow in size, till, at half a million lines of code, the best and worst developers are equally awful. Big systems require a lot of collaboration, so we spend much of our time on meetings, e-mail, reports, and more meetings. A superprogrammer and the worst person on the team attend meetings at exactly the same speed.
There's another breed of superprogrammer, though. Some deeply believe they are the best engineer in town when, well, they might actually be among the worst. In fact, in “Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments,” the authors show that performers in the bottom quartile, who scored at the 12th percentile on average, estimated themselves to be in the 62nd percentile.4 The highly competent, by contrast, tend to underestimate their relative standing, assuming that what comes easily to them comes easily to everyone.
It sounds almost like an evolutionary effect meant to protect one's ego: at least everyone feels good about their skills, whether warranted or not.
The study was hopeful, though: training that boosted the bottom group's abilities also improved their ability to assess themselves.
I've worked with more than a few self-described superprogrammers who couldn't code their way out of main(). All were resistant to training efforts. The very ego that led to bloated self-assessment blinded them to their flaws and inoculated them against any attempt, however softly presented, to bolster their skills.
Tom DeMarco looked at superprogrammers. In Controlling Software Projects, he suggests that developers should specialize, an idea that somewhat mirrors IBM's old concept of structuring a team around a chief programmer.5 Although Joe might be an incredible designer, that's no reason to suspect he's any good at debugging. Perhaps Jill can crank code at a dizzying speed but is awful at testing it. Every other discipline specializes; isn't it logical that we should, too?
I've seen no data that suggests such specialization works, though it does tickle the common-sense bone. But organizing people into specializations requires quantitative data about how each person performs in different roles, which can be hard to get. And I suspect most of us would reject the idea simply because it takes much of the fun out of the job. Hey, I like doing it all, and take enormous personal satisfaction out of building much of the system.
CS, EE, or BA?
If you can't write it down in English, you can't code it. So does that imply English majors might have a role in development?
When hiring firmware developers I've found little difference among CS, CE, and EE graduates. In some industries, though, the degree might matter. One company I know hires mechanical engineers exclusively. They, too, get quite a bit of programming experience in college, and their sound understanding of machine control is what's important to this maker of material-handling equipment.
ME, EE, CS, CE–all are engineering or engineering-like programs that stress practical problem-solving abilities. We're taught to build things, to decompose big problems into solvable chunks. That in a large sense differentiates us from our friends studying the liberal arts.
Yet, in my career, I've found that some of the best developers of all are English majors! They'll often graduate with no programming experience at all and certainly without a clue about the difference between DRAM and EPROM.
They can write. That's the art of conveying information concisely and clearly. Software development and writing are both the art of knowing what you're going to do and then lucidly expressing your ideas. The worst developers, regardless of background, fail due to their inability to be clear. Their thoughts and code tend to ramble rather than zero in on a solution.
It's easier to train someone in a new language than to teach them to think clearly. C really isn't that hard to learn; it has but a handful of constructs. Most folks can learn the fundamentals in a week. Debugging takes longer, but all new programmers find themselves at sea when first faced with bugs. What do I do now? Should I single step through the entire program? How do I decide where to set breakpoints?
Too many engineering-trained developers have a total disregard for stylistic issues in programming. Anything goes. Firmware is the most expensive thing in the universe, so it makes sense to craft it carefully and in accordance with a standard style guide. That is: make sure it clearly communicates its intent. This is where the English majors shine; they've spent four years learning everything there is to know about styles and communication. And they're used to working to a standard, like the Chicago Manual of Style.
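To make the point concrete, here's a small, hypothetical firmware fragment written twice: once the “anything goes” way, and once so the code communicates its intent. The register, bit position, and function names are invented for this sketch, not taken from any real part:

```c
#include <stdint.h>
#include <stdbool.h>

/* Opaque version: what does this check, and why 0x08? */
static bool chk(volatile const uint32_t *r)
{
    return (*r & 0x08) != 0;
}

/* Same logic, written to communicate intent.
   Bit 3 of the (hypothetical) UART status register: transmitter ready. */
#define UART_STATUS_TX_READY  (1u << 3)

static bool uart_tx_ready(volatile const uint32_t *status_reg)
{
    return (*status_reg & UART_STATUS_TX_READY) != 0;
}
```

Both functions compile to the same code; the difference is that the second one documents the hardware interface at the point of use, so a maintainer never has to guess what the magic constant means.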
In a Dilbert cartoon, the pointy-haired boss, apparently frustrated by the company's sub-par products, announces that he'll reward each bug fix with a $10 bill. Wally says: “Hooray! I'm gonna code me a minivan!”
Unfortunately the heroes, those who seem to save the organization in a great flurry of activity, are often reacting dramatically to the problems they created. Like Wally, they're rewarded for the successes while no one notices that furious activity is no substitute for doing things carefully.
Solving problems is a high-visibility process; preventing them is much better, but earns few rewards. This is illustrated by an old parable:
In ancient China, there was a family of healers, one of whom was known throughout the land and employed as a physician to a great lord. The physician was asked which of his family was the most skillful healer. He replied, “I tend to the sick and dying with drastic and dramatic treatments, and on occasion someone is cured and my name gets out among the lords.
“My elder brother cures sickness when it just begins to take root, and his skills are known among the local peasants and neighbors.
“My eldest brother is able to sense the spirit of sickness and eradicate it before it takes form. His name is unknown outside our home.”
Unfortunately, sometimes the very best developers get the least acknowledgement, even from their own teams.
Jack Ganssle is a lecturer and consultant specializing in embedded systems development issues.
3. Jones, Capers. Program Quality and Programmer Productivity (IBM Technical Report TR 02.764), 2nd edition. San Jose, California: IBM Corporation, January 1977; out of print. According to Capers Jones's website (www.spr.com/news/bibliography.shtm), some sections have been reprinted separately in Programming Productivity: Issues for the Eighties.
4. Kruger, Justin and David Dunning. “Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments.” Journal of Personality and Social Psychology, December 1999, Vol. 77, No. 6, pp. 1121-1134.
5. DeMarco, Tom. Controlling Software Projects: Management, Measurement, and Estimates. Prentice Hall, 1986.