The Internet: maybe you've heard of it. It's the great equalizer and the great enabler. Part of its beauty is that it's global, it's egalitarian, and it's free. Internet standards like HTML, POP, and SSL were designed to be hardware-, software-, language-, OS-, and byte-ordering neutral. It doesn't matter what computer you're using or what operating system you prefer. Internet standards don't care if you're a PC or Mac user, big-endian or little-endian, using a fast new machine or an old slow one.
That glorious impartiality comes at a price, though. It's hard to imagine a less-efficient way to transport data than with 7-bit ASCII-encoded HTML. But efficiency wasn't the goal in defining the Internet's underlying standards; universality was. We're willing to give up a little (okay, a lot) in transport efficiency in exchange for a single global network that everyone can share. Or can they?
The answer, as with so many things, is yes and no. Although the Internet's network plumbing is hardware-agnostic, most Web content is not.
Web content has become less portable and more client-specific over time. This didn't happen by accident. It's the result of both commercial forces and simple human nature. The upshot is that today's Web is becoming an Intel- and Microsoft-centric medium, just like PC software is. And there's no reason why this situation will change any time soon. Whether that's a good thing or a bad thing depends mostly on your job.
Getting there from here
First things first: is the Web really platform-neutral or has it become PC-centric? Anyone who's spent an hour surfing the Web on two different computers has seen how pages often look different on different machines. Folks with a PC at work and a Mac at home are familiar with this effect. Even changing browsers on the same computer (from Internet Explorer to Opera, for example) can highlight dozens of differences in what was supposed to be a standard presentation format. Anything more exotic than an all-text page with blue underlined links is liable to look and behave differently on different clients. Viewing a favorite site on your BlackBerry or iPhone is also likely to lead to frustration, or amazement that it works at all.
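One common way sites end up client-specific is by branching on the browser's User-Agent header and serving different content to different clients. The sketch below is a hypothetical illustration of that kind of "browser sniffing"; the categories and substrings are invented for the example, not drawn from any real site.

```python
# Hypothetical sketch of User-Agent "sniffing": a server classifies the
# client and serves a tailored page variant. The substrings checked here
# are illustrative, not a complete or authoritative list.

def classify_client(user_agent: str) -> str:
    """Return a rough client category for a User-Agent header string."""
    ua = user_agent.lower()
    # Check mobile devices first: iPhone UAs also mention "Mac OS X".
    if "iphone" in ua or "blackberry" in ua:
        return "mobile"
    if "macintosh" in ua or "mac os" in ua:
        return "mac"
    if "windows" in ua:
        return "pc"
    return "other"

# A server could then branch on the result, e.g. omitting Flash content
# for clients known not to support it.
print(classify_client("Mozilla/5.0 (Windows; U; MSIE 7.0)"))  # pc
print(classify_client("Mozilla/5.0 (iPhone; CPU like Mac OS X)"))  # mobile
```

The ordering matters: the mobile check runs first because an iPhone's User-Agent string also contains Mac-like tokens, which is exactly the sort of fragility that makes sniffing-based sites behave unpredictably on new clients.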
How did Web content get this way? It's a combination of simple human nature mixed in with a fair bit of commercial self-interest. After the initial “gee whiz” wore off around 1994, most of us got bored with plain-text Web pages. Some Webmasters studded their pages with blinking text or flashing borders. That was about as creative as early browsers would allow. Remember, HTML wasn't created with today's slick e-commerce economy in mind. It was intended for academics.
Today, any decent commercial Web site uses Flash, Java, QuickTime, PostScript, RealAudio, and any number of other semi-standard extensions to the basic HTML underpinnings. These extensions have all become part of the Web's lingua franca: the language of global Internet commerce. Increasingly, you can't even read a site's basic content–never mind browse the catalog or log in–without installing the requisite helper applications to read PDF or Flash files.
Here's where the problem starts. Like any program, these helper applications had to be written for (or ported to) each distinct combination of processor and operating system. The Adobe PDF reader for Internet Explorer 7 running on Windows XP running on an Intel Core 2 Duo processor isn't the same as the PDF reader for Safari running on Mac OS running on a PowerPC chip, and so on. Just like any other team of programmers, the folks who develop these helper apps must weigh the costs against the benefits. Is it worth the effort to port this program to Platform X if said platform accounts for only 5% of the market? Do we really need to write, debug, and support a RealAudio decoder for the Commodore 64, Silicon Graphics Indigo, or Cray Y-MP if hardly anybody will ever use it?
A good example is Flash and the iPhone. Millions of happy iPhone users know that their favorite toy can't render Flash animation. There just isn't a Flash helper application for iPhone and, until recently, no hope of getting one. Apple has since released an iPhone software-development kit, so Adobe will presumably be quick to fill this gap in its product coverage. But in the meantime, no iFlash.
So browsers and their helper apps behave just like any other software. They must be ported and supported, and that means the most popular processors and operating systems get supported first. The less-popular platforms catch up later, if at all. Your Amiga might be able to load Google's basic home page, but good luck getting YouTube to work.
That's the practical side. There's a commercial side at work, too. Microsoft, like any good software company, wants to protect and extend its customer base. It has done this in part by encouraging a number of "non-standard" extensions to Web content. Active Server Pages (ASP), ActiveX, Exchange Server, and .NET are a few examples of Microsoft's contribution to Web content cacophony. Oracle, RealNetworks, and other companies have all done likewise, promoting extensions in which they had a commercial interest. Some of these efforts have been more successful than others, but all of them erode the independence and client-agnosticism that was inherent in the early Internet. The more bells and whistles we add, the more locked-in we become.
Meet the new boss
We now have a situation where Web content is developed for the PC first, with Macintosh, Linux, and other clients coming later. In contrast, few Web sites are deliberately Mac-specific; not even Apple's is. If your site doesn't work on a PC running Windows and Microsoft's latest browser, it's effectively broken. The Intel/Microsoft hegemony rules the Internet just as it does the desktop, and certainly not by coincidence.
But what about mobile platforms, you say? Intel's footprint is very small in handheld devices, where ARM is considered the hands-down winner. Is the mobile Web going to be ARM-specific? Quite probably.
It's fair to say that ARM-based chips are the most popular processors for smart phone designs, just as Intel's chips dominate the desktop/laptop/server business. But ARM's popularity doesn't make all smart phones identical. A CPU instruction set does not a standard make. The handheld software market is too fragmented to consider it one platform. Programmers working on mobile applications must navigate through a forest of choices. But regardless of operating system and other software concerns, we can guess what kind of compiler they'll use.
But wait, there's more
Deeply embedded systems are the exceptions that prove the rule. The proverbial Internet-enabled Coke machine and plenty of other embedded systems now sport full- or part-time Internet connections for monitoring, feedback, or content delivery. Then there are the routers, packet inspectors, encryption appliances, firewalls, and all the other embedded-systems plumbing that makes the network work. None of these needs a traditional browser or any helper applications. Coke machines don't need to render Flash animation (yet) and file servers don't need an Acrobat reader.
So although these more deeply embedded systems greatly outnumber the PC-type clients, they have little effect on the actual Web content. Routers route data regardless of content, for the most part. Unless they're sniffing packets for viruses or doing QoS massaging, embedded network boxes generally don't care what the traffic is carrying. They don't affect what is on the Web, only how it's delivered.
Déjà vu all over again
This whole situation puts programmers in a strong negotiating position. Their helper applications are what make the Web interesting. Without the full assortment of helper apps, new client hardware is nearly useless. Witness the dismal failure of WebTV and other “network appliances.” Sure, they could render basic HTML, but without all the extra content bells and whistles, they weren't much fun. And isn't that what the Web is all about?
Web content relies on specific helper applications just like desktop PCs rely on productivity applications. Without the full array of browser plug-ins, users are disappointed and Web publishers aren't happy. The same commercial and technical issues that shaped the PC market have conspired to make the Web processor-specific.
Jim Turley is the founder of Silicon Insider and an authority on microprocessors, embedded systems, and semiconductor IP. He is the author of seven books, the former editor of Embedded Systems Design and Microprocessor Report, and previous host of the Embedded Systems Conference and Microprocessor Forum events. Contact Jim Turley at .