Connecting The Dots

Good technical conferences always trigger in me a process I call “connecting the dots.” No doubt most of you go through something like it: looking at technology developments, assessing their connections to one another, and determining their impact on each other and on the market.

But it is not as easy as it used to be. In the early days of personal computing, this was a relatively linear process: applications drive software, which drives hardware, which drives applications, and so on. But with the imposition of the Web and Internet atop the mainstream of computing, things have gone nonlinear. Numerous hot spots are emerging all over the map, somewhat like the amorphous collection of dots in a child's drawing book: suggestive of a shape or shapes, but with no indication of which lines between the dots will be the right connections.

So, as I was walking down the hallway of the hotel in San Jose, CA, where the conference was being held, trying to make sense of the most recent announcements, and having a hard time of it, I heard a voice say, “So, Bernie, what is the next big trend in computing?” It was Rick Merritt, my editor-in-chief, and I started to respond automatically with my beliefs about “net-centric computing,” the process through which every aspect of computing is being transformed by virtually ubiquitous connectivity to the Internet, the Web, and networks in general.

I then realized that he was only half serious, and I struggled to come up with a suitably humorous and witty remark in response. But I was too late; Rick had already passed by me with a wave. The effort to be witty, though, threw me over, as it always does, into that nonlinear and tangential mode of thinking I have always found conducive to “connecting the dots.”

Out of that emerged the following, which should give you some idea of where I will be going with this column. So, what's the next big thing in computing? Some possibilities:

1. Net-centric computing, of course. There will always be segments of the embedded market that will be dealing with hard real time and determinism, closed systems, and a high degree of reliability. And there will be segments that will put up with the relatively non-deterministic, not-so-reliable desktop. But the mainstream of net-centric computing, as bandwidths go up and as we move toward interactive, networked multimedia, will be more like the embedded market in its need for a high degree of reliability, fast response and interrupt times, and a reasonable level of predictability and determinism. If you define “embedded” as anything an embedded developer does, the mainstream of net-centric computing is embedded.

2. Truly personal computing. It has always irked me that the desktop computer came to be termed the “personal computer,” or the “home computer,” to make it sound, no doubt, warmer and fuzzier than it really is. For most of its life it has been neither of those, but a corporate desktop, a business computer. Only recently have children and students begun using them, and while some play games and others surf the Web, most of the activity has to do with work: homework. I suppose, broadly speaking, because it IS in the home, it could be called a home computer. But the real personal and home computers are the many embedded 4-, 8-, and 16-bit microcontrollers in the home and consumer appliances that are part of our everyday life. They are the truly personal computers, and they will become more so as they migrate to 32 bits and become more net-connected.

3. Reliable computing. As computers, large and small, become more and more a part of our lives and our economy as the means by which we connect to the information superhighway, reliability will move to the top of the computing agenda: uptime, fault tolerance and fault resistance, code correctness, security, and the ability to sniff out, quickly diagnose, and correct problems remotely, in the field.

4. I/O and/or dataflow computing. As more applications and data reside “out there” on the network rather than “in here” on the desktop and information appliance, the ability of computers at all levels of the network infrastructure to shove data in, through, and out will become much more important than the ability of any single processor to run at a higher internal data rate or clock frequency.

5. Adaptive or configurable computing. The nature of the post-PC computing market is that many segments are emerging that will eventually be as large as or larger than the PC market: wireless Internet and information appliances, digital set-top boxes, digital TV, home networking, and interactive networked multimedia. Each has cost, size, performance, power, feature, and instruction-set requirements unique to its particular application. This dictates a much more flexible, adaptive, or configurable computing environment, in which the IP is implemented in software on a high-performance CPU, in a system-on-chip, or in one of the new processors or PLDs that allow a high degree of fine-grained customization.

6. Novice-proof computing. The graphical user interface has gotten a better rap all these years than it deserves. As processors have become more powerful, Microsoft and its minions have taken a relatively simple user environment and overburdened it with features, behind a GUI that has not kept up. Those of us who have lived with the GUI simply go with the flow, but to the average non-computer user the GUI is pretty intimidating and, in my opinion, non-intuitive and difficult to use. If the feature-rich GUI is to remain the norm, more work will have to be done to make the computers we use more expert in how we work, rather than requiring us to become more expert in operating them.

Still left on my list are intuitive computing, hypermedia computing, associative computing, natural language computing, multiprocessor and multithreaded computing, immersive computing, and, I hope, more open computing.

But what they all will share is an environment that is much more like the embedded space than the traditional computing environment: a small footprint, real-time response, determinism, dedicated functionality, and high reliability. All these areas of computing will require the skills, services, tools, and software that the embedded market provides.

Whether focused on some aspect of technology or on some business development, these columns will all have one thing in common: the attempt to assess what is going on now and determine how it will affect things in the future. Connecting the dots. What has always interested me as much as the “news” is what is “new.” The two are not always the same.

For that I will need your help. I want this column to be a dialog between us, a participatory process, with topics triggered by, or based on, comments from you. And like all conversations, my opinions and views, as well as yours, may change. Sometimes as part of the dialog I will convince you and sometimes you will convince me. So call me any time or send me an e-mail. Oh, did I ever come up with a suitably witty response to Rick's passing question? Yeah, I think so. “Ego-centric computing.” Funny, huh?

I'd like to hear from you. That is one reason I am writing this column. Send me an e-mail, or call me at 520-525-9087.

EETimes embedded/net-centric computing managing editor Bernard Cole is the author of “The Emergence of Net-Centric Computing” (Prentice-Hall PTR).
