A battle for hearts and minds - Embedded.com

A battle for hearts and minds

Microsoft's future depends on embedded systems. To succeed, the company must satisfy engineers who care more about technical merit than marketing.

In the dozen years I've been writing for Embedded Systems Programming, I've learned that using the proper noun “Microsoft” without immediately modifying it with “sucks” floods my mailbox with messages from angry readers who feel I'm a Microsoft shill. Mob thinking demands that the president's axis of evil include Microsoft. Anti-Redmond rhetoric borders on hysteria.

Yet I'd wager that without Microsoft many of us wouldn't be computing as easily as we now do. The combination of DOS and IBM made the PC market explode. My first PC cost $7,000, which included two (optional!) floppy disks. Hard drives were all but unheard of. At the time, dozens of competitors sold decent machines, mostly running CP/M, for much less money. IBM brought little to the party except its stellar reputation and a software strategy based on DOS.

Microsoft bought DOS for $50,000 from a small Seattle outfit. After a bit of repackaging, Microsoft began its rise to enormous success. Many fault Microsoft for buying the software rather than making its own, and a later lawsuit put substantially more money in the inventors' hands, but it was a shrewd business move. Was it luck or vision?

IBM's marketing and the truly useful DOS, coupled with a variety of applications, made the PC a natural winner. The choice of the hermaphrodite 8088, a 16-bit CPU with an 8-bit bus and the ability to address a full megabyte of memory, was wise and empowering. Sure, the 68000 was a better architecture, but Motorola was too late to meet IBM's market window.

People today sagely shake their heads and wonder how IBM could plop video memory in the middle of the 8088's address space, which created the 640KB limitation. Fact is, at the time no one ever dreamed we would need or could use so much memory. A few years earlier, I'd worked on a $10 million mainframe that had three quarters of a meg of 36-bit words of memory. (Yes, 36; that's not a misprint. Univac's Fielddata was an alternative to ASCII that packed six 6-bit characters into a word. Needless to say, there was no lower case.) That machine, with a 750ns cycle time, supported hundreds of concurrent users. Back then any sort of computer access was a godsend. So, 640KB for a home machine seemed like absurd overkill. How things have changed in two decades!

The PC came in a nice package, one that looked professional and reasonable in a business environment, something that could not be said of the Ohio Scientifics, SOLs, MITSes, and other machines, some of which had wooden enclosures. Companies snapped them up faster than home users. Killer apps flowed like water: word processors, spreadsheets, databases, and many more.

Philippe Kahn started Borland, selling a $29.95 Pascal compiler for CP/M that was fast, interactive, and fun. Borland quickly caught on to the PC market, though it promised its customers it would never abandon CP/M support. That promise evaporated faster than the S-100 bus.

Soon everyone was building PC apps in Pascal. Borland followed with the Turbo C compiler, which was just as much fun as their Pascal. I credit Borland with making C affordable and launching a wave of C-coded applications for the PC platform.

The early PCs came with the complete source code to the BIOS as well as full system schematics. This was long before the days of custom ICs whose 250 pins go to a mysterious black box; all of the parts were standard devices available off the shelf. Before long, competitors offered clones at far lower prices than IBM anticipated.

The resulting volumes drove computer prices to the astonishing numbers we see today. Twenty years ago, who would have imagined, in their wildest dreams, that we'd get 1GHz processors with 20GB of disk and 128MB RAM for $695?

DOS, though, was just a barely souped up version of CP/M. Early versions matched the hardware with no hard disk support and the most primitive of filesystems. Millions of non-computer folks learned to use DOS, though generally only in superficial ways. It was hardly user friendly. People entered long cryptic command strings with a blind fervor and lack of understanding that was horrifying to those of us who knew our way around a computer.

Before real GUIs, most people couldn't use a computer. UNIX machines of the time cost far too much for the average person. And how many nontechies, especially before X, could use UNIX? It's a fantastically powerful OS. You can do anything with it, but you have to be an expert to do anything at all.

I worked on an early Lisa machine from Apple, which titillated with a completely unworkable yet compelling GUI. Then the Mac arrived; I still remember the moment of epiphany when I first tried one in a computer store. My rookie mouse driving was awkward yet thrilling. The ability to select desktop icons and launch apps was a true revelation. I bought one on the spot and forever loved that machine, despite its unbearably slow single-floppy drive.

Microsoft's real genius surfaced a few years later with Windows. Win 1.0 was worse than awful. Version 2.0 wasn't much better. Even 3.0 was practically unusable. Through failure after failure, Microsoft pursued the GUI desktop that finally succeeded after years of effort. That's almost unheard of in our failure-averse society.

Some argue Windows was a stolen knock-off of the Mac interface. No doubt that's true. Steve Jobs's trips to Xerox PARC, though, make Apple look not so innocent. The free flow of ideas out of PARC benefited both Microsoft and Apple, and more than anyone, their customers.

Windows brought computing to the masses. The Mac was arguably better and certainly had a cleaner and more reliable design. But Apple's closed-architecture philosophy doomed the machine to a small niche market. In this business, technical excellence is not what determines who wins.

Windows won. Despite its flaws, it dominates. Is Windows the optimum OS? Of course not. It's a god-awful mess, too complex, too fragile, and at the moment unsuited for the security threats implicit in a networked and mean-spirited world. But it sure beats that old CP/M system I had years ago that crashed constantly, the PDP-11 that used impossibly slow mag tape as the only mass storage medium, and the 1108 hidden behind secure walls and protected by a priesthood of operators who gave us 24-hour turnarounds on our runs—if we were lucky.

Today I exchange Word files with people all over the world; the universality of this file format (Okay, these file formats—I hate it when they change formats with every upgrade) makes such communication possible and easy. It works across Apple and PC platforms and sometimes even on Linux boxes running various open-source office platforms. That's pretty cool—and essential for universal computing.

Computing has never been better, but our expectations are so high that it has never been more frustrating. We expect the machines to perform flawlessly and are disappointed and angry when troubles hit. When my ISP glitches and Internet access disappears for an hour, I'm left adrift, not quite sure how to cope. Rebooting Windows prophylactically to avoid resource depletion and unexpected crashes is absurd and annoying.

Despite all of the flaws in their products, Microsoft truly enabled the age of the computer for everyman. I credit them for having the vision and patience to pursue a GUI despite so many failed versions and for generating (eventually) an integrated Office suite that offers more capability than I could ever use, yet works amazingly well. Perfect? Nope. But really, really good.

And by the other measure of success in our capitalistic economy, profits and net worth, they've excelled beyond all expectation. For their stockholders (I'm not one and kick myself for not having jumped in when they first went public), they are doing the right things.

What next?
The computer world is changing, though. It's saturated; most Americans have easy access to all the computing power they require. The replacement market isn't as healthy as it has been, since new generations of machines now offer only incremental, not revolutionary, performance improvements. Windows and Office are mature and probably can't offer much more that the usual customer would care about. New growth opportunities will have to come elsewhere for Microsoft to retain its dominance.

The world has been searching for the next “killer app” when it's right under its nose: embedded. That's where all of the market growth will be. Our lives will be so computationally rich in the future we won't be able to imagine a processor-deprived existence. Plenty of PCs or PC-like devices will be around, but a fabric of interconnected computing will facilitate and guide our every action. This is where the action will be, and Microsoft seems to recognize it.

Microsoft has to be successful in embedded systems or it will wither. Linux and as-yet-unknown products will arise to challenge Microsoft's desktop hegemony. Since the company has traditionally had practically 100% market share, any new product with any reasonable level of acceptance will only cut into its revenues. And 100% market share translates to near zero growth opportunities.

The market is ready for Microsoft. The cost of electronics keeps falling. A pretty decent 32-bit embedded computer is quite cheap today. I expect that as LCD screen prices fall, GUIs will be as common in embedded systems as they are on desktops. Will Windows CE fill that niche?

CE is a pretty good idea—a nice GUI with an API everyone knows. Perfect it ain't, but legions of competent Windows developers are ready to crowd out those of us who grew up on traditional embedded OSes like VxWorks.

Redmond cannot succeed in this market using the marketing tactics that worked so well in PCs. Their most important customers aren't consumers now; they're engineers. Most want technical elegance. Embedded apps demand reliability. Though the cost of transistors continues to plummet asymptotically towards zero, many, many embedded systems will remain in a domain where memory, power, and CPU cycles are all in short supply. Microsoft successfully gambled that code bloat would be eclipsed by cheap CPU cycles and disk space. But that won't work in the embedded space.

Moments before we went to press, Microsoft announced that all of CE's source code will be available under their “Shared Source Initiative.” The news is too new to interpret completely, but it appears that some or maybe all Microsoft customers will be able to get and modify the source. This is a great move, a necessary step in creating trust in CE's reliability. The company claims they will accept suggestions for improvements—a good thing, if not the community-embracing spirit of the GPL, which thrives through developer involvement. I think Microsoft has come very close to a decent business model for CE. I hope they form a community relations group that provokes, demands, and embraces changes and suggestions from us users.

Next, the company must make CE comply with the DO-178B safety-critical standard. Most other embedded RTOSes are headed that way. While DO-178B doesn't prove correctness and is itself a long way from the perfect certification tool, it does have a lot of support. Compliance is, at the very least, another way to demonstrate reliability. A “reliable computing initiative” that seems driven by the marketing department is not.

Despite all the hoopla about Linux, I think the future is far too hazy to predict its dominance in the embedded world. Linux, too, is big and resource intensive, which limits the domain of apps it's appropriate for. It has been used in flight software and other high-rel applications, but it could also profit from DO-178B certification.

Twenty years ago we couldn't even dream of where we are today; the next decade is just as hard to predict. But we're at a crossroads in the industry and Microsoft is facing challenges beyond any they've seen before. They saved the computer world, putting a lot of power on everyone's desks. But their old model won't scale to the new world order.

Still, I'd never predict their demise. With deep pockets and a lot of smart employees, there's little doubt they can, if they choose, produce the embedded framework that powers the future.

We live in interesting times!

Jack G. Ganssle is a lecturer and consultant on embedded development issues. He conducts seminars on embedded systems and helps companies with their embedded challenges. Contact him at .
