Embedded test offers unique value for serial I/O - Embedded.com

Although incorporating high-speed serial buses into embedded systems solves many problems, the design and validation processes differ and aren't well understood.

As technology progresses, the electronics industry continually reinvents itself. Embedded systems designers know this story well, many having developed applications across generations of evolving electronics technology and microprocessors.

Along the way, as basic hardware and software have evolved, so too have the methods for developing and debugging systems. Today, most microprocessors incorporate on-chip debug resources that enable the use of a low-cost hardware interface for development and testing. This type of debugging, called embedded test, is significantly aiding the growth of embedded systems and will make designing systems with high-speed serial I/O more efficient.

The economics of silicon is now making it possible for the electronics industry to take advantage of some of the advances made in the communications industry over the past 30 years, specifically the use of serial interfaces. As digital systems struggle to keep pace with the bandwidth of optical systems for large-scale, high-speed data transmission, the ever-increasing need for speed and overall processing throughput has driven the evolution of parallel-bus structures to their practical limits. To gain more processing bandwidth, the PC industry is looking at high-speed serial interfaces, evidenced by the rapid growth of bus standards like PCI-Express.

As the PC industry adopts serial interfaces, these technologies are becoming more accepted and entrenched. Implementation costs start dropping, which means serial interfaces are now making inroads into lower-cost PC products and mainstream digital products, in other words, embedded systems. Once again, we see that evolutionary process: as embedded systems and their associated microprocessors pick up the new technology, design teams must adopt new development and debug methods to take advantage of high-speed serial interfaces.

Adopting new methods
Most of today's digital designers are still accustomed to working with parallel buses and system clock speeds around 100 to 200 MHz. Well-developed standards, practices, and tools support such choices. However, high-speed (multi-gigabit) serial is another matter altogether. Design teams that successfully deploy high-speed serial now often employ engineers with specialized skill sets focused on the physics of high-speed signal transmission (signal integrity). While this approach helps get products to market, further changes to the development team are needed to incorporate this advanced digital technology into designs destined for the mainstream digital electronics market. Teams need more knowledgeable designers along with the tools and methods to handle this very different type of design problem.

The first step is to understand the design problem. How does designing a digital high-speed serial interface differ? Perhaps the most significant difference is signal integrity. As the signal rates of key interfaces move into the gigabit range, behavior emerges that has typically been the realm of the analog (or, more likely, RF/microwave) designer. Rather than being concerned with signal-timing parameters such as setup-and-hold and rise time, designers must deal with parameters such as eye opening, bit-error ratio, and jitter.

Also different is the ability to probe the signal one wishes to observe. This is a function of both the high integration levels seen in today's silicon as well as the need to manage the integrity of the signal path very carefully. As speeds rise above 3 Gbits/s, it becomes necessary to apply pre-transmission conditioning to the signal to compensate for the transmission medium's lossy effects; handling the signal at the receiver thus requires the corresponding filtering to accurately recover the signal. Also, because these signals often operate in the low-power environments of sub-micron digital silicon, voltage swing is small. This means that simply attaching a physical probe, in traditional test and measurement fashion, becomes virtually impossible because the probe itself significantly disrupts the signal.
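To make the effect of that transmit conditioning concrete, the sketch below models a simple two-tap pre-emphasis filter of the general kind used to compensate a lossy channel. This is an illustrative toy, not any vendor's actual equalizer: transition bits are sent at full swing while repeated bits are de-emphasized, which is why the waveform observed at the pins no longer looks like clean NRZ data.

```python
# Illustrative 2-tap transmit pre-emphasis (a made-up model, not a real
# transceiver's equalizer). Transition bits go out at full swing;
# repeated bits are attenuated by the de-emphasis factor.

def pre_emphasize(bits, de_emphasis=0.5):
    """Map a 0/1 bit stream to signed transmit levels.

    Returns +/-1.0 on a transition and +/-de_emphasis when the bit
    repeats, mimicking how pre-emphasis boosts high-frequency content.
    """
    levels = []
    prev = None
    for b in bits:
        v = 1.0 if b else -1.0
        if prev is not None and b == prev:
            v *= de_emphasis  # attenuate repeated (low-frequency) bits
        levels.append(v)
        prev = b
    return levels

print(pre_emphasize([0, 1, 1, 1, 0, 0]))
# -> [-1.0, 1.0, 0.5, 0.5, -1.0, -0.5]
```

A run of identical bits comes out at half amplitude here, so a scope at the pins would show a stepped, "distorted" waveform even though the link is healthy; the channel's loss restores the levels by the time the signal reaches the receiver.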

Testing and debugging these interfaces must allow for the real-world effects these factors create. The need to focus on signal integrity means that digital designers must incorporate new measurement types (and tools) into the standard toolbox they use to validate these designs. Sophisticated tools that measure signal integrity and characterize eye metrics, bit-error ratio (BER), and jitter tolerance are becoming more common and must evolve from their once-specialized role into mainstream offerings. Probing approaches must also evolve to accommodate these sensitive signals and the high integration levels seen in today's silicon implementations.

Embedded test is the answer
As with the emergence of on-chip debug tools and techniques in the microprocessor world, the answer, at least to the probing problem, is to implement more of the test functions in the silicon itself. Because the signal path is by definition managed carefully by the chip developers, incorporating the ability for the application designer to make key measurements and observe the serial interface's behavior can best be handled in this way. This method, called embedded test, eliminates the need to attach an external probe (with its associated problems) and allows access to information about the signal (such as the actual eye metrics recovered by the receiver) that wouldn't be available externally.

A real example is shown in Figure 1. Here, measurements made on a serial link operating at 6.25 Gbits/s show that even if the physical probing limitations can be overcome, observing the signal at the device's pins yields confusing information because of the pre-transmission signal conditioning. Looking at just this information, one might conclude that the link isn't operating, because no signal eye can be observed. However, by incorporating on-chip measurement, as seen in the view on the right of Figure 1, engineers can determine that a signal is indeed being recovered by the receiver.


The FPGA's role
As serial technology emerges in the realm of embedded systems, the FPGA plays a significant role. FPGAs have long been a key implementation technology for embedded systems designers, and the role of FPGAs is increasing as they evolve to new price-performance levels. FPGAs are increasingly becoming an integration platform with system-on-chip (SoC)-like capability implemented in a programmable infrastructure. This gives embedded systems designers lots of flexibility and a relatively low-cost approach for achieving high levels of integration in their designs.

FPGA vendors also recognize the trend toward serial interfaces and are working to bring usable high-speed serial technology into the hands of more developers. Most high-end FPGA families now incorporate multi-gigabit serial I/O capability, a feature that's making its way into some of the lower-cost device families offered by FPGA suppliers. The FPGA's inherent reprogrammability also provides a unique opportunity for tools that implement test functions. Development and test tools are emerging that give designers new ways to gain insight into the behavior and quality of their serial interfaces. These tools use measurement types suited to high-speed serial requirements (such as BER measurement), making them available to entire classes of designers who previously may not have considered them because of the domain knowledge required and the cost of the associated instruments.

Testing high-speed serial I/O in FPGAs
Tools are available to allow FPGA developers to measure serial I/O. A block diagram of such a tool is shown in Figure 2.

The tool consists of three basic components:

1. A test core that implements on-chip test pattern generation, BER measurement, and access to transmitter and receiver control registers;

2. Measurement software; and

3. A simple hardware interface, in this case implemented with the JTAG programming cable.
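As a concrete illustration of the first component, the sketch below generates a PRBS-7 test pattern of the kind commonly used for serial-link testing (this is a generic textbook implementation, an assumption about what such a core does, not the actual core's design). A seven-bit LFSR built on the primitive polynomial x^7 + x^6 + 1 produces a maximal-length sequence that repeats every 127 bits.

```python
# Generic PRBS-7 pattern generator: a 7-bit linear-feedback shift
# register with feedback taps for x^7 + x^6 + 1. A hardware test core
# runs the same structure at line rate; the receiver regenerates the
# sequence locally and counts mismatches to measure BER.

def prbs7(seed=0x7F, n=127):
    """Return n bits of the PRBS-7 sequence from a nonzero 7-bit seed."""
    state = seed & 0x7F
    out = []
    for _ in range(n):
        new = ((state >> 6) ^ (state >> 5)) & 1  # taps: stages 7 and 6
        out.append(state & 1)                     # emit one output bit
        state = ((state << 1) | new) & 0x7F       # shift in feedback
    return out

seq = prbs7(n=254)
print(seq[:127] == seq[127:])   # True: the pattern repeats every 127 bits
```

Because the pattern is deterministic, the receive side needs no reference channel: it seeds an identical LFSR from the incoming bits and flags every disagreement as a bit error.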


With this architecture, configuring the three components properly sets up a measurement that gives insight into the operation of a high-speed serial link implemented with Xilinx FPGAs.

Such a tool provides the ability to make three basic link measurements, all based on BER, which is widely accepted as the definitive measure of high-speed serial interface operation. At its simplest, the tool can provide link BER measurement. This measurement is made internally and reflects the actual conditions seen by the receiver inside the FPGA, rather than relying on a probe attached at the device's pins as a more conventional measurement would.
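The arithmetic behind such a measurement is straightforward, and it also dictates how long a test must run. The sketch below (an illustration of the standard statistics, not the tool's actual software) computes a measured BER and, for the common zero-error case, the number of bits that must be transmitted to claim a target BER at a given confidence level, using the relation BER <= -ln(1 - CL) / N.

```python
import math

# Standard BER bookkeeping (illustrative; not the actual measurement
# software). With zero observed errors over N bits, one can claim
# BER < target at confidence CL once N >= -ln(1 - CL) / target.

def measured_ber(errors, bits):
    """Measured bit-error ratio: errors observed per bit transmitted."""
    return errors / bits

def bits_for_confidence(target_ber, confidence=0.95):
    """Bits to transmit error-free to claim BER < target_ber at CL."""
    return math.ceil(-math.log(1.0 - confidence) / target_ber)

print(measured_ber(3, 1_000_000_000))   # 3e-09
print(bits_for_confidence(1e-12))       # roughly 3e12 bits
```

The second figure is why an internal, at-speed measurement matters: verifying 1e-12 at 95% confidence takes about 3 trillion bits, under ten minutes at 6.25 Gbits/s but impractical with anything slower than the live link itself.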

Another measurement of interest is eye mapping (shown in Figure 3), which provides a simple way to quickly assess link margin. By making repeated BER measurements across the unit interval of the data eye, the tool presents the user with a graph of BER versus position in the eye. Finally, combining the eye-mapping capability with access to the transmitter and receiver control registers makes it possible to actively tune the link for optimal BER.
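The eye-mapping procedure described above can be sketched in a few lines: step the receiver's sampling point across the unit interval (UI), record the BER at each position, and select the phase with the lowest BER. The channel model below is a made-up stand-in for the on-chip measurement, used only to show the scan-and-tune loop.

```python
# Eye-mapping sketch: sweep the sampling phase across one unit interval,
# measure BER at each point, and tune to the best phase. ber_at_phase is
# a toy channel model (an assumption), not real silicon behavior.

def ber_at_phase(phase):
    """Toy model: BER is lowest at the eye center (phase = 0.5 UI)."""
    return min(1.0, 10.0 ** (-12.0 * (1.0 - abs(phase - 0.5) * 2.0)))

def eye_map(steps=32):
    """Return (phase, BER) samples across one unit interval."""
    return [(i / steps, ber_at_phase(i / steps)) for i in range(steps + 1)]

def best_phase(samples):
    """Sampling phase with the lowest measured BER (the tuning step)."""
    return min(samples, key=lambda s: s[1])[0]

samples = eye_map()
print(best_phase(samples))   # 0.5, the eye center in this model
```

In a real tool the sweep also covers voltage offset, producing the two-dimensional BER contour of Figure 3, and the tuning step writes transmitter and receiver control registers instead of picking from a simulated list.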


Implications of embedded test
The adoption of serial I/O in embedded systems will have an impact on both how the design teams are formed and what tools they use. I believe the specific application of embedded test to that technology will be a valuable offering for developers. Many additional possibilities are associated with this concept within the realm of high-speed serial and beyond. It's clear that as silicon technology continues to progress in complexity and capacity as well as speed, embedded test approaches will offer unique opportunities to provide insight for system designers, both hardware and software.

Embracing embedded test requires the flexibility to incorporate new test topologies and new approaches to creating tests that span silicon providers and test and measurement providers. Although these collaborations represent a challenge for the industry, the potential to bring new and highly valuable measurement capability to designers at compelling price points is an undeniable economic force.

Currently responsible for marketing and strategic planning for Agilent's embedded test products, Bill Schulze has held various marketing and product management positions throughout his 20-year career in the electronics industry. Schulze holds a BSEE degree from the University of Missouri–Rolla and has completed post-graduate coursework at Harvard Business School and the University of Michigan.
