The road to wireless is bumpy, but early planning can prevent disaster
With companies recognizing the freedom and flexibility that come from untethering their systems, designers are being pushed to accelerate their migration to wireless connectivity as a means both to add value and to simplify the installation and end-user experience for customers in industrial, commercial, or consumer applications. However, as they do so, they're rediscovering why their predecessors didn't go wireless in the first place: it's a hard and rocky path fraught with pitfalls for the naive and uninitiated.
While the pervasiveness of proprietary and industry standards for cellular, wireless local area networks (WLANs), wireless personal area networks (WPANs), and wireless sensor networks would at first glance seem to smooth that path, their very multiplicity causes problems. The most obvious is the confusion that comes from choice. More harmful, however, is the assumption that because a technology is pervasive, it must be relatively easy to get up and running with it. As increasing numbers of designers are finding out, that's not the case. Wireless is not shrink-wrapped, at least not yet.
Like a fairground game of “Whack-a-Mole,” the problems keep popping up, starting with the new ground rules for certification by the Federal Communications Commission (FCC). While all electronic systems must go through some kind of FCC approval process, type approval for wirelessly enabled systems is particularly onerous, expensive, and time-consuming. The complication arises from the fact that the system is moving from a nonintentional emitter of electromagnetic radiation to an intentional emitter. As such, system emissions that were once acceptable must now meet even stricter limits, regardless of whether the wireless subsystem is turned on. Regulatory issues compound if the system is intended for a global market, where regulations vary dramatically.
The remaining design and implementation issues that crop up mostly derive from a basic lack of understanding of the subtleties of wireless propagation and the tradeoffs associated with its implementation in terms of power consumption, range, data rates, cost, size, security, link reliability and robustness, and software integration. The latter is particularly important. While software and code optimization can streamline a design and reduce processing requirements, too little software on the user-interface and connectivity layers can send ease of use, and by extension the system itself, into a death spiral.
A number of trends are also exacerbating what already seem at times to be intractable problems. These include the use of multi-gigahertz processors and the interference caused by their sub-harmonics; the proliferation of wireless technologies that's leading to mutual interference and coexistence issues, especially with the trend toward multiple wireless interfaces co-located in the same system; shrinking form factors that are pushing the RF/mixed-signal and digital circuits closer together; variations in power-supply rails due to process-node differences; and last, but not least, the functionality creep in mobile devices that continues to force down the wireless subsystem's power-consumption limits.
The subtleties of wireless propagation are numerous. For example, almost all surfaces either absorb or reflect electromagnetic radiation, with the human body itself being the most notorious example of an absorber. While absorption obviously attenuates a signal, reflections can cause a signal to get routed to a destination via multiple paths, any two or more of which can mutually amplify or cancel each other out. Over longer ranges, blooming trees can block a signal that was once strong, giving rise to the truism that sometimes the best microwave engineering is accomplished with a chainsaw.
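The propagation effects above can be roughed out numerically. The sketch below is illustrative only (the reflection coefficient and distances are assumed values, not measurements from the article): it computes free-space path loss and shows how a single reflected path can reinforce or cancel the direct ray depending on the phase difference between them.

```python
import math

C = 3.0e8  # speed of light, m/s


def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB between isotropic antennas."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)


def two_ray_gain_db(direct_m, reflected_m, freq_hz, reflection_coeff=-0.7):
    """Relative gain (dB) when a direct ray combines with one reflected ray.

    The path-length difference sets the phase offset; depending on that
    phase, the two rays mutually amplify or cancel. The reflection
    coefficient here is an assumed, typical-looking value.
    """
    wavelength = C / freq_hz
    phase = 2 * math.pi * (reflected_m - direct_m) / wavelength
    combined = abs(1 + reflection_coeff * complex(math.cos(phase), math.sin(phase)))
    return 20 * math.log10(combined)
```

At 2.45GHz over 10 meters, `fspl_db` gives roughly 60dB of loss before any multipath effects, and shifting the reflected path by a half wavelength flips `two_ray_gain_db` between cancellation and reinforcement.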
For the embedded designer, the propagation characteristics of wireless signals greatly influence antenna placement. While this can be accounted for in new designs, wireless upgrades to current systems are a different matter. These can require modifications to the boards and chassis in order to avoid interference from on-board processors and provide the shortest path possible from antenna to chip. In addition, care must be taken to avoid nearby aluminum framing that can detune the antenna. Attention must also be paid to protective metallic chassis coatings. While these coatings were initially intended to prevent electromagnetic interference (EMI), they now attenuate the deliberate transmission and reception of signals, so openings must be made that might also affect the rest of the system. These openings can proceed to further complicate FCC type approval and certification as they now compromise the original design's EMI integrity.
While some noise-reduction solutions seem sensible, such as moving the antenna to a laptop's lid to take it as far as possible from the main-board electronics while maximizing transmission and reception power levels, that solution too can run into problems with noise from high-frequency display drivers. In effect, no matter where a radio is located, it habitually discovers noise sources all on its own. Insidiously, these sources, innocuous in and of themselves, can combine to create intermodulation effects that drop a spur right on the third harmonic of the signal of interest. That's not an easy problem to isolate.
The AVS 802.11 wireless LAN development platform from AbsoluteValue Systems has everything needed to get up and running with an embedded design. The Linux-based kit includes embedded source code, development tools, documentation, and reference hardware.
Despite the myriad traps and “gotchas,” the siren call of wireless connectivity continues to entrance the adventurous. On the way, though, all will quickly identify their own pain threshold. While only masochists and those with lots of spare time will start from scratch and design their own wireless scheme, some brave souls will take the difficult path of learning all about wireless, identifying “best of breed” parts along the signal chain, and integrating them on a board with all the associated software using off-the-shelf design tools. These two paths sit at the very top of the ladder of difficulty, a ladder few embedded designers are willing to climb.
In reality, the optimum path for embedded designers looking to get wirelessly enabled is to learn enough about it to carry on an intelligent conversation with: (A) an experienced RF design consultant that they should hire and/or (B) a major OEM such as Philips, Atmel, Analog Devices, Texas Instruments, or STMicroelectronics, or an ODM such as Flextronics, that they can rely upon to guide them through to final product. Viable implementation options along the way include going with tested chipsets, full reference designs with layouts and antenna-placement recommendations, or complete modules from the likes of Murata or Taiyo Yuden. The latter can even come FCC-certified, though that doesn't obviate the need for full system certification once the design is completed.
After the wireless consultant or partner OEM/chip vendor has been selected, the first piece of sage advice they should offer is to budget 50% higher than initial estimates in both time and material. Even then, the process can't begin until the designer has stepped back and clearly identified the usage model for the particular system in terms of data rates, range, security needs, mobility requirements and life expectancy. This is the first step toward narrowing the field of options, with the optimum choice derived from an analysis of cost, power consumption, space requirements, complexity, reliability, and availability.
As a general rule of thumb, the wireless link itself won't add value or differentiation, so a link that does just what's required, and no more, is the way to go. Anything beyond that unnecessarily raises complexity and cost and wastes time.
The Sensor Application Reference Design (SARD) board is one of two such boards that come as part of Freescale Semiconductor's ZigBee-capable development kit. The kit contains everything needed to create proprietary and standards based peer-to-peer and star networks.
After the basic usage model of the system and the requirements of the link have been decided, the next phase is technology selection. For anyonebeginners or expertssticking to standard and mature technologies is advisable, especially if time-to-market pressures are mounting on an 18-month-or-shorter design cycle. An exception can be made if a company is looking to establish itself through differentiation or is working on next-generation products. These sometimes require more implementation risk. In addition, once the application's maximum data rate has been established, it's advisable to select a wireless technology that is specified for rates of at least twice that, in order to leave tolerance for the vagaries of the connection.
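The twice-the-data-rate rule lends itself to a simple screening step. The candidate list below is illustrative, with nominal peak rates drawn loosely from figures cited elsewhere in this article; a real shortlist would of course also weigh cost, power, range, and availability.

```python
# Nominal peak rates in Mbits/s (illustrative values, circa this article)
CANDIDATES = {
    "Bluetooth": 0.72,
    "ZigBee": 0.25,
    "802.11b": 11.0,
    "802.11g": 54.0,
}


def viable_links(app_rate_mbps, margin=2.0):
    """Keep only technologies whose nominal rate is at least `margin`
    times the application's peak data rate, leaving headroom for the
    vagaries of the connection."""
    return sorted(t for t, r in CANDIDATES.items() if r >= margin * app_rate_mbps)
```

For a 1Mbit/s application, the 2x margin rules out Bluetooth and ZigBee and leaves only the Wi-Fi options on this illustrative list.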
For long-range applications such as remote data acquisition, sensor networks, or on-site vending machines, cellular standards such as GSM/GPRS/EDGE and CDMA/CDMA2000 are obvious options. Ranges of up to several miles can be achieved, with data rates into the hundreds of kilobits per second (EDGE has a maximum of 384Kbits/s, for example).
However, range and rates depend almost entirely on the number of subscribers and the type of environment. As a result, care must be taken to ensure adequate carrier coverage in the intended area of system use, though few of these applications require high data rates.
In addition, if a system such as a vending machine is to be installed inside a building, it may be necessary to install a roof antenna and run a coax cable down to the machine. This alternative offsets the advantage of using a wireless option and also may not deliver the required reception enhancement due to losses in the cable itself.
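A quick budget shows why the coax run can eat up the roof antenna's benefit. The gain, loss, and length figures below are illustrative assumptions, not measured or vendor-quoted values:

```python
def net_gain_db(antenna_gain_db, cable_loss_db_per_m, cable_length_m,
                connector_loss_db=0.5, n_connectors=2):
    """Net link improvement from a remote roof antenna after cable and
    connector losses are subtracted. All figures are assumptions for
    illustration; real cable loss varies with cable type and frequency."""
    total_loss = cable_loss_db_per_m * cable_length_m + connector_loss_db * n_connectors
    return antenna_gain_db - total_loss
```

With an assumed 6dB-gain antenna and thin coax losing roughly 0.6dB per meter, a 15-meter run plus two connectors turns the installation into a net loss, while a short 5-meter run still comes out ahead.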
Finally, many remote systems tend to be in the field for some time, so reliability and longevity are paramount. Choosing a low-cost cellular interface that may be phased out in a year or two is counterproductive. A modular approach to the design is advisable so the air-interface can be swapped out if need be.
For shorter ranges where maximum data throughput is a priority, Wi-Fi WLANs based on the many IEEE 802.11 variants are the most obvious choice. Ranges of 300 meters or more can be achieved, at rates into the multiple megabits per second. The standards specify maximum rates from 11Mbits/s (802.11b) to 54Mbits/s (802.11g in the 2.4GHz band and 802.11a in the 5GHz band). Proprietary enhancements extend these rates to over 108Mbits/s, but these are specific, closed-loop implementations, so those maximum rates will not be achieved with radios from other Wi-Fi vendors.
Thanks to pending and recently ratified extensions to the base 802.11 standard that enhance quality of service and security, Wi-Fi networks are accelerating their penetration of both the enterprise and the home. Applications are migrating from basic high-speed data to voice over Internet Protocol (VoIP) and wireless audio/video (A/V) and multimedia distribution for the home. With high chip volumes have come lower costs and even more applications, in a self-perpetuating cycle. As a result, the rush is on to embed Wi-Fi capability into everything from smartphones and PDAs to digital camcorders and cameras, and now set-top boxes, TVs, and DVD and CD players, both fixed and mobile.
The key takeaway for embedded designers is that with proliferation comes lower cost. While the lowest-cost implementation has historically meant moving to chip- or system-on-chip designs, the cost of 802.11b chips, for example, is falling so fast that for many embedded applications where space isn't too constrained, it's wiser to embed a low-cost 802.11b PCMCIA card. You take a hit on the enclosure, but you get to market rapidly with a prequalified and Wi-Fi-certified design.
While 802.11b radios are cheap, a number of vendor studies have shown that for mobile designs the power consumed in transferring a given amount of data is prohibitive relative to higher-speed 802.11g/a radios. As a result, many recommend the use of an .11g or .11a radio.
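The reasoning behind those vendor studies is energy per bit rather than raw power draw. The sketch below uses made-up but plausible figures (active power and effective throughput are assumptions, not vendor data) to show how a faster radio that draws more power can still win, because it finishes the transfer and returns to sleep sooner:

```python
def joules_per_megabit(active_power_w, effective_throughput_mbps):
    """Energy cost of moving one megabit: active power divided by
    effective (not nominal) throughput."""
    return active_power_w / effective_throughput_mbps


# Assumed, illustrative figures only:
# 802.11b at ~6 Mbits/s effective, drawing ~1.5 W while active
# 802.11g at ~22 Mbits/s effective, drawing ~1.8 W while active
cost_b = joules_per_megabit(1.5, 6.0)
cost_g = joules_per_megabit(1.8, 22.0)
```

Under these assumptions the .11g radio spends roughly a third as much energy per megabit despite its higher active power, which is the crux of the recommendation.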
Also, with the commoditization of Wi-Fi chipsets, the emphasis has moved up to the system level in terms of how it handles dropped connections, prioritization, location and roaming and security. That is where system differentiation can be achieved with standard commodity chipsets.
Further down the power, range, cost, and data-rate curve is Bluetooth, with a typical range of 10m and a baseline data rate of 720Kbits/s; more recent enhancements will extend that to 3Mbits/s (with a payload of 2.1Mbits/s). Bluetooth is finding a home in headsets for cellphones and in human-interface devices such as keyboards and mice. Latencies below 15ms also make it useful for gaming. Unlike Wi-Fi, which defines only the transport mechanism, Bluetooth's control extends all the way up the protocol stack to the application. While this level of control gives it interoperability, it also touches parts of the embedded application that other wireless technologies don't.
For industrial control and home automation, ZigBee should finally be emerging as a specification next month (October). Using the IEEE 802.15.4 standard as its foundation, ZigBee adds security, networking, and application-specific layers toward an end goal of interoperable, pervasive, low-cost wirelessly enabled devices. Rates range from 20Kbits/s to 250Kbits/s, and its low duty cycle and modest performance requirements are intended to make it a shoo-in for battery-powered applications such as light switches and sensors. In a switch, for example, a ZigBee radio can run off a lithium battery for its specified shelf life of 10 years. Though 802.15.4 chips have been available for some time from the likes of Freescale, Atmel, Oki, and Chipcon, fully ZigBee-compliant devices won't be available until interoperability testing proceeds over the next few months.
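The claim of multi-year battery life follows directly from the duty cycle. Here's a back-of-the-envelope sketch with assumed figures (battery capacity and currents are illustrative, not datasheet values):

```python
HOURS_PER_YEAR = 24 * 365


def battery_life_years(capacity_mah, active_ma, sleep_ma, duty_cycle):
    """Estimated battery life for a node that is active only a tiny
    fraction of the time. Average current is the duty-cycle-weighted
    blend of active and sleep currents. All inputs here are assumed,
    illustrative figures."""
    avg_ma = duty_cycle * active_ma + (1 - duty_cycle) * sleep_ma
    return capacity_mah / avg_ma / HOURS_PER_YEAR
```

With an assumed 1000mAh lithium cell, 30mA active current, 1uA sleep current, and a 0.02% duty cycle, the estimate comfortably exceeds ten years; push the duty cycle toward Wi-Fi-like activity levels and the same battery dies in weeks, which is exactly why ZigBee targets rarely-transmitting devices like switches and sensors.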
The MC13191/92 developer's starter kit is intended for wireless network designs based on the IEEE 802.15.4 standard. Though it also supports ZigBee, the kit allows development of both standard and proprietary peer-to-peer and star networks.
Standard vs. proprietary
Some companies, such as Cypress Semiconductor, will push proprietary or “de facto standard” technologies such as WirelessUSB for applications they believe aren't being met by established technologies. These include ultra-low-power human-interface, remote-control, and gaming applications. The argument can be compelling when the features of the chipset exactly meet a specific corner case, in this instance in terms of low power, low protocol overhead, and the ability to dynamically change frequency to avoid interference. However, the counterarguments, that a customer could be left hanging if the technology fails and that proprietary schemes miss the cost savings of a mass-market, high-volume technology such as Bluetooth or ZigBee, are even more compelling. To its credit, Cypress has recently signed up Atmel as a second source for its technology. Another proprietary scheme of note is the modulated-magnetic-field technology from Aura Communications. Primarily aimed at headsets, it is claimed by the company to overcome the power and interference issues associated with Bluetooth.
Neophytes are generally advised to shy away from newer technologies such as WiMax on the last-mile and metro high-speed data side and ultrawideband (UWB) on the short-range end. However, these technologies are emerging rapidly, with silicon for both expected by year's end. Freescale, which is pushing a direct-sequence form of UWB that can reach rates of up to 1Gbit/s over a few inches, says it has a mini-PCI-card implementation of its design. Reference designs are also available on a 4 x 4 board with full drivers and host software for Linux. The cost is $20,000.
After choosing a technology, the next stage is the evaluation of an ODM that can take a chipset, reference design, or module and work directly with you or with the vendor to realize the final product. ODMs vary in their RF expertise from not much to quite sophisticated. Though they might cost more up front, the latter are obviously preferred, as they can pick off problems as they crop up. Also, for custom work and form factors, a good ODM will know where to cut corners, literally, on a design to make it fit. For example, simply cutting the length of a board to make it fit can destroy the impedance matching between antenna and chip and throw off the transfer of power. In addition, cutting cost through cheaper components can result in a capacitor that was originally vetted for a 10MHz design being used in a 2.45GHz radio, where it behaves like an inductor. Other issues that can throw off a production line include inductors that are reversed in the pick-and-place operation. The change in resonance frequency won't prevent operation but can reduce output power or sensitivity.
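The capacitor-turned-inductor effect comes from the part's self-resonant frequency: every real capacitor has parasitic lead inductance, and above the resulting resonance it presents an inductive impedance. A quick sketch, with an assumed 1nH of parasitic inductance (a ballpark figure, not from any datasheet):

```python
import math


def self_resonant_freq_hz(capacitance_f, parasitic_inductance_h):
    """Series self-resonant frequency of a real capacitor: above this,
    the part behaves inductively."""
    return 1 / (2 * math.pi * math.sqrt(capacitance_f * parasitic_inductance_h))


def is_capacitive_at(freq_hz, capacitance_f, parasitic_inductance_h=1e-9):
    """True if the part still behaves as a capacitor at freq_hz, under
    the assumed parasitic inductance."""
    return freq_hz < self_resonant_freq_hz(capacitance_f, parasitic_inductance_h)
```

A 100nF capacitor with 1nH of parasitics resonates around 16MHz: fine for a 10MHz design, but at 2.45GHz it is operating two decades above resonance, which is why the "same" part misbehaves in the radio.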
When finalized later this year, ZigBee is expected to wirelessly enable everything from light switches in the home to factory-floor sensors. But no one size fits all, so it is but one of a wide array of technologies a designer must evaluate carefully on the path to wirelessly enabling their embedded system.
Patrick Mannion is senior editor for wireless and DSP at EE Times. Mannion writes news and edits EE Times' CommWeek and In Focus sections. He holds an engineering degree and was formerly editor in chief of Communication Systems Design and communications editor at Electronic Design Magazine.
Resources and companies that contributed to this article
Analog Devices Inc
Atheros Communications Inc.
Cambridge Silicon Radio (CSR) Plc.
Cypress Semiconductor Corp.
Freescale Semiconductor Inc.
IceFyre Semiconductor Inc.
Texas Instruments Inc.