For the first time ever, many of the major desktop computer system manufacturers are experiencing negative growth.
Their tribulations have led to one of the largest potential mergers in the history of that market. Hewlett-Packard's proposed acquisition of Compaq would make the combined organization the second largest computer company in the world, only slightly smaller than IBM.
There have been dozens of reasons given for the continuing losses in the computer business: general market malaise, the emergence of alternative portable computing platforms, and simple market maturation. An equal number of strategies have been suggested to reverse the trend, including getting bigger through mergers.
A large part of the problem is that the underlying computing model is all wrong. A desktop computer for business and another for the home will be useful accoutrements for a long time to come, but the hardware and software architecture must undergo a radical change to survive.
What should that new desktop computing architecture look like? The answer came to me during the last few months as I have been writing about a new breed of embedded CPUs called network processors, whose main functions are data movement and flow, not data processing.
Within the switches and routers of the inter-network and within the Internet data centers, it's easy to see this shift. Elsewhere, such as on the desktop, it is a bit more subtle and less obvious to someone who isn't looking for it.
Writing on these topics has caused me to look at the desktop in my home office with new eyes, in the context of the applications I run, the kinds of processing I do, the amount of time I spend doing them, and what goes into my decision-making process when I am investing in a new piece of hardware or software.
As much writing and editing as I do as managing editor at EE Times, most of what I do really comes down to input and output. Maybe 20 percent of my time, if that much, is spent on word and data processing; the other 80 percent is spent on data movement: moving files in here and out there, and moving Web pages and files out there and in here. Even when I was limited to the 33-kbps bandwidth of standard phone lines, I was moving stories and graphics out on the Internet via e-mail; receiving stories back from the main editorial office; sending and receiving literally thousands of e-mails a month; and, in the off hours when there was less traffic, browsing the Web.
Now that my link to the Internet has increased to about 500 kbps, my use of the Web for browsing has increased 100-fold. Rather than run downstairs to my library to look up something, I search for it on the Web. I also participate in a lot of on-line audio and video conferences. The desktop is also my main conduit for broadband audio and video information for which I previously depended on the TV and radio.
All the decisions I now make about software and hardware upgrades to my desktop computer have to do with making it easier to upload and download a variety of text, graphics, audio and video. And the last thing on my list of upgrades is the main processor or the operating system.
To speed up my Web browsing I didn't buy a new processor; I increased the amount of DRAM so I could set aside some of it as a cache for faster access to frequently visited Web pages. When I wanted to improve my system's ability to handle video and audio I didn't upgrade my main processor; I upgraded the graphics subsystem with an onboard video processor optimized for network delivery of MPEG-2 and MPEG-4 streaming media. When my cable provider moves into the megabit-per-second delivery range, again I won't be considering upgrades to my CPU or to the operating system. I'll look for ways to boost the flow of data into and out of the desktop, including replacing the motherboard with one based on the new point-to-point switched-fabric successor to the shared PCI bus I now use.
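The idea behind that DRAM upgrade — setting aside memory as a cache keyed by URL so recently visited pages come back without a trip over the wire — can be sketched as a simple least-recently-used (LRU) cache. This is an illustrative sketch only; the class and parameter names are hypothetical, and real browsers manage their caches in more elaborate ways.

```python
from collections import OrderedDict

class PageCache:
    """A minimal in-memory LRU cache for Web pages, keyed by URL.

    Hypothetical sketch: holds up to max_pages entries and evicts
    the least recently used page when a new one would overflow it.
    """

    def __init__(self, max_pages=64):
        self.max_pages = max_pages
        self._pages = OrderedDict()  # URL -> page content

    def get(self, url):
        """Return a cached page, or None on a cache miss."""
        if url in self._pages:
            self._pages.move_to_end(url)  # mark as most recently used
            return self._pages[url]
        return None

    def put(self, url, content):
        """Store a fetched page, evicting the oldest entry if full."""
        self._pages[url] = content
        self._pages.move_to_end(url)
        if len(self._pages) > self.max_pages:
            self._pages.popitem(last=False)  # drop least recently used
```

The point of the sketch is where the speedup comes from: a hit is served out of local DRAM at memory speed, while only a miss pays the cost of the network link — which is why more memory, not a faster CPU, was the upgrade that mattered.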
I put the CPU and the operating system last on my list of upgrades because I don't see much there that will improve the performance of my desktop unless there is a major shift to more of a data flow or input/output processing architecture.
Whether in slow, cautious steps or all at once, the OS and CPU architecture of the new desktop computer will eventually emerge. To figure out what it will look like, all we have to do is recall the late '90s, when the so-called network computer was being viewed as the successor to the PC.
That was before the major PC companies pulled out all the stops and dropped prices so low that the network computer didn't emerge as a viable option. It still survives in the corporate environment, but mostly as a thin client, slightly more intelligent than the traditional mainframe- or server-based data entry terminal.
What interested me at the time was the architecture of the network computer, which can be described as a stateless I/O device focused on data movement and able to access a shared pool of computational resources on a server, over a high-speed interconnection fabric.
At the time, this required a dedicated corporate LAN capable of data rates anywhere from 1 Mbps to 100 Mbps. The network computer had as its main function moving data into and out of the desktop computing environment and driving a sophisticated graphical/multimedia interface. Most of the applications a user would run resided not on the local unit but on servers out on the network.
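The architecture described above — a stateless box that holds no applications of its own and only moves requests out and responses in — can be sketched in a few lines. Everything here is hypothetical and for illustration: a plain function call stands in for the interconnect fabric, and the "server application" is a trivial stand-in for the shared pool of server-side computation.

```python
def run_server_app(request):
    """Stand-in for an application living on a network server.

    All computation and state belong here, not on the client;
    this hypothetical app just returns a transformed result.
    """
    return {"body": request["query"].upper()}

class StatelessClient:
    """A minimal sketch of the network-computer model.

    The desktop box keeps no application state: its only job is to
    move a request out over the transport and move the response back
    in for display. `transport` is any callable that carries a request
    to a server and returns its response.
    """

    def __init__(self, transport):
        self.transport = transport  # the only thing the client holds

    def submit(self, query):
        # Move the request out, move the response in.
        response = self.transport({"query": query})
        return response["body"]  # a real client would render this
```

The design point is that the client object holds nothing but its link to the network: swap the server-side function and the same desktop box becomes a different "application" without any local upgrade.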
With the emergence of a wide-area networking environment, in which bandwidths into the home and small business are already reaching the 500 kbps rate, the time is ripe for the re-emergence of this lean and mean, incredibly low-cost and fast I/O machine as the mainstream alternative to the traditional PC as the average consumer's desktop computer.
If you're worried about the application environment in which such a machine would operate, don't. Major corporations such as Intel, IBM, Microsoft, and Sun are doing their best to build a Web services environment in which most computing will be done not on clients but on application servers of various sorts. If Web servers are where the majority of the applications run and where the data is generated and stored, that means that the operating system as Microsoft defines it and the CPU as Intel conceived it will become things of the past. For everyone, that is, except a few dinosaurs like me who like the idea of an independent, highly intelligent desktop under their own control.
But for the vast majority of potential computer users, who don't really care about the underlying technology, an extremely low-cost, I/O-oriented stateless personal computer, whose only job is the movement of data in and out, will probably be more than enough.
What does this all mean for the embedded market? For one thing, it is the '70s all over again. The PC of the early '80s owed its existence to tools, processors, and operating systems being used in the late '70s to build, debug, and maintain the processors that were replacing electromechanical processes in systems.
In a new, totally OS/CPU-agnostic desktop I/O machine environment, the processors, operating systems, tools and languages used will not necessarily emerge from the desktop environment. PCs didn't use the processors from the minicomputers; they turned instead to the 8080s and 6800s used in the embedded market. So, too, the likely candidates for the I/O machine's processors and operating systems are the ones being used in the embedded space for handling data flow in a variety of embedded communication systems.
What do you think is the future of the desktop? And are there any opportunities in the soon-to-be data flow I/O machines for embedded hardware and software?
Bernard Cole is the managing editor for embedded design and net-centric computing at EE Times. He welcomes contact; you can reach him at 520-525-9087.