Maximize high-speed signal integrity with the right choice of cables, layout, and equalizer ICs (Part 1)

Eric Sweetman, Senior Applications Engineer, Vitesse Semiconductor Corp.

January 27, 2009


The widespread growth of high-speed networking has introduced a new level of signal integrity issues for the system designer. Signal integrity engineering is now a required component of any successful high-speed design. What were once simple wire paths are now complex circuits of parasitic elements created at every transition the wire makes, from connectors to vias to solder pads. Even the wire itself becomes a filter as signal harmonics reach into the range of skin-effect losses, an effect that is only compounded as wire pitch is compressed.
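To put the skin-effect point in more concrete terms, here is a minimal sketch (Python, not from the article) of how the skin depth in copper shrinks with frequency; the resistivity and permeability constants are standard textbook values. As conduction is squeezed toward the conductor surface, the effective trace resistance rises roughly with the square root of frequency.

# Minimal sketch: skin depth in copper vs. frequency (textbook constants).
import math

RHO_CU = 1.68e-8            # ohm-m, copper resistivity
MU_0 = 4 * math.pi * 1e-7   # H/m, permeability (copper is non-magnetic)

def skin_depth_m(freq_hz):
    """Skin depth: delta = sqrt(rho / (pi * f * mu))."""
    return math.sqrt(RHO_CU / (math.pi * freq_hz * MU_0))

for f in (1e6, 100e6, 1e9, 5e9):
    print(f"{f/1e9:6.3f} GHz: skin depth = {skin_depth_m(f)*1e6:6.2f} um")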

The good news is that there are relatively simple semiconductor solutions that can address the bulk of the signal integrity engineering issues a designer faces today. These semiconductor solutions represent an economical alternative to the more expensive choices of PCB materials and connectors. In many cases, these IC solutions can allow an older generation design to step up in performance, or allow a new design to attain even better levels of packing density.

There are several types of silicon-based equalization solutions that address the entire range of signal integrity barriers. Most applications can be addressed by very simple asynchronous equalizers that come in compact form factors intended for easy inclusion in a board layout. Their asynchronous operation means that these components only require power and a few control pins to operate. Asynchronous equalizers also feature low latency and transparency to data rates and protocols, making them a near-universal solution.

The material presented in this article was compiled through years of information-gathering from engineers on common system-design issues and design decisions that affected the overall high-speed performance of end-products. The perspective of a semiconductor supplier is important, as high-speed designs were (in the past) completely dependent on the semiconductor performance. However, with increasing speeds, a more collaborative approach is needed, as some of the barriers at higher speeds are outside the space of the semiconductor. Thus, a common approach helps both semiconductor suppliers and system designers make the appropriate decisions in breaking down and solving problems.

The problem: high speeds and getting higher
Communications design requirements are driven by industry trends, such as consumer, government, and business demand for video, Internet TV, and new geographic markets. As the demand for bandwidth continues to grow at exponential rates, networking and storage equipment continue to evolve to meet this growth. This includes faster standards and protocols for inter-operability, such as newer SAS/SATA standards in storage, a transition to 10 Gbps in Ethernet, HDTV moving to 3 Gbps 1080p, and faster PCIe protocols in server equipment.

The backbone to support these higher-rate data transfers is the copper interconnect (or backplane) found in a majority of communications systems. The bulk of what used to be considered "high-speed" backplanes operate in the 1 to 3 Gbps range, which was largely driven by the technology available at that time. Today, a migration is underway to move these data rates to 10 Gbps and beyond, placing a tougher set of restrictions on both legacy and new backplanes.

Going faster is not just an IC problem anymore. As protocol data rates and silicon serial speeds increase beyond 5 Gbps, ICs are now outstripping the PCB capabilities to handle these speeds. As an industry, semiconductor vendors tend to design up to the point of their own package pins. This places the burden on board layout engineers to successfully cross over to the next technology level.

Things go haywire at high speeds
We start with a simple modeling example to frame the discussion; even a simple model reflects the compounded complexity of high-speed signal transmission. A model of a transmission path at low speeds looks like Figure 1.1:


Figure 1.1: Transmission path at low speeds


Increasing the frequency, and modeling the associated effects, changes the same path to the one shown in Figure 1.2:


Figure 1.2: Same Transmission path modeled at higher frequency



Figure 1.2 shows the level of complexity involved when moving to multi-Gbps speeds. In addition, there are several "cliffs" in the loss characteristics that occur beyond 5 Gbps. Plotting signal loss versus frequency reveals fairly dramatic loss effects in the Gbps range that simply do not exist in the Mbps range, Figure 1.3.


Figure 1.3: Signal Loss vs. Frequency
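The overall shape of a curve like the one in Figure 1.3 can be approximated with a first-order model in which skin-effect loss grows with the square root of frequency and dielectric loss grows linearly with frequency. The sketch below is illustrative only; the two coefficients are hypothetical placeholders, not measured values for any particular material.

# First-order trace-loss model (illustrative; coefficients are assumed placeholders).
import math

K_SKIN = 0.05   # dB per inch at 1 GHz from skin effect (assumed)
K_DIEL = 0.10   # dB per inch at 1 GHz from dielectric loss (assumed)

def loss_db_per_inch(f_ghz):
    return K_SKIN * math.sqrt(f_ghz) + K_DIEL * f_ghz

for f in (0.1, 0.5, 1.0, 2.5, 5.0, 10.0):
    print(f"{f:5.1f} GHz: {loss_db_per_inch(f):5.2f} dB/inch")

Because the dielectric term scales linearly with frequency, it eventually dominates, which is one reason the loss curve steepens so sharply in the multi-GHz range.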

To understand the effects further, we take a look at characteristics and loss effects of various media commonly used to transport multi-Gbps signals.

Material issues: PCBs
An obvious starting point is printed circuit board (PCB) material. This is the base material used to carry electronic signals. Of note are the dramatic loss effects that occur with PCBs when they are transmitting higher-frequency signals. Even moderate PCB trace lengths create signal integrity problems as data rates increase, Figure 1.4.


Figure 1.4: PCB loss effects at high frequencies

Figure 1.4 shows a usable eye opening, but the signal does pick up a noticeable amount of noise over a moderate trace length. Once a signal is compromised in this way, it becomes even more susceptible to additional noise.
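As a back-of-the-envelope check on why even a moderate trace length matters, the sketch below converts a per-inch loss at the Nyquist frequency into the fraction of launch amplitude that survives. The 0.5 dB/inch and 20-inch figures are assumptions chosen for illustration, not values taken from Figure 1.4.

# Hypothetical loss-budget estimate for a lossy PCB trace (values are assumptions).
def received_fraction(loss_db_per_inch_at_nyquist, length_inches):
    """Fraction of launch amplitude remaining at the Nyquist frequency."""
    total_db = loss_db_per_inch_at_nyquist * length_inches
    return 10 ** (-total_db / 20.0)

# Example: 0.5 dB/inch at the 5 GHz Nyquist of a 10 Gbps signal, over 20 inches.
frac = received_fraction(0.5, 20)
print(f"Remaining amplitude: {frac:.2f} of launch ({0.5 * 20:.0f} dB of loss)")

Ten dB of loss leaves roughly a third of the launch amplitude at the Nyquist frequency, which helps explain why a nominally usable eye still picks up noticeable degradation.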

Material issues: cables
Copper cabling is the primary means of bridging signals between computing or communication platforms. As the protocols and speeds carried by these cables increase, a growing number of loss and signal integrity issues need to be taken into account. Good, properly shielded cables are better than bad cables, but without any equalization aids they are still not good enough for multi-Gbps signal transmission. Most of the improvements in better cables relate to EMI and impedance control, whereas the underlying physics of a copper wire is largely unchanged.

Figure 1.5 shows a real-life example of high-performance cable degradation. A 12 m run of high-performance Twinax cable was driven with an input stimulus swept from 250 Mbps to 10 Gbps. Scope shots were taken at the various steps and then spliced together. The result shows that a reasonable eye opening survives up to about 1 Gbps and then closes up very fast; on its own, the cable is not very useful much beyond 1 Gbps.


Figure 1.5: Signal Degradation in copper cables

As stated earlier, most protocols in networking, storage, and video run in the 3 to 10 Gbps range and above, so this cable would not be of much use at those data rates. Beyond the obvious loss characteristics of copper cable, it is important to note that the much higher price premium paid for so-called "higher-quality" cables yields only marginal performance improvements.

Since there is only a limited range of available low-loss dielectric materials that can provide a significant gain, this leaves the basic carrier material, copper, to do the heavy lifting. However, the base copper material does not change much, if at all, between lower-quality and "higher-quality" cables. Achieving lower loss therefore means adding much more material at much higher cost, for marginal results. In the end, a purely passive solution can be very expensive. Shown later will be methods to augment "lossy" and relatively "cheap" copper cabling with cost-effective electronics and equalization (EQ) techniques to achieve the higher performance levels.
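As a conceptual preview of that approach, the sketch below pairs an assumed cable-loss model with an equalizer boost capped at a practical limit; the loss coefficients and the 12 dB ceiling are hypothetical, not the numbers of any particular cable or device.

# Conceptual illustration (hypothetical coefficients) of pairing cheap, lossy
# cable with an equalizer: the EQ applies a high-frequency boost, up to a
# practical limit, to offset the cable's roll-off.
import math

def cable_loss_db(f_ghz, length_m, k_skin=2.0, k_diel=0.5):
    """Assumed loss model: dB per meter ~ k_skin*sqrt(f) + k_diel*f."""
    return (k_skin * math.sqrt(f_ghz) + k_diel * f_ghz) * length_m

def equalized_response_db(f_ghz, length_m, max_boost_db=12.0):
    loss = cable_loss_db(f_ghz, length_m)
    return -loss + min(loss, max_boost_db)

for f in (0.5, 1.0, 2.5, 5.0):
    print(f"{f:4.1f} GHz: raw {-cable_loss_db(f, 5):6.1f} dB, "
          f"equalized {equalized_response_db(f, 5):6.1f} dB")

The equalizer flattens the response up to its boost limit, which is the trade that makes inexpensive copper plus silicon more economical than premium cable alone.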

In Figure 1.6, the vertical axis is an arbitrary metric of usable quality, where higher means good, and lower means not as good.


Figure 1.6: Comparison of cable performance and cost

The horizontal axis shows the cost of that material per unit length. As expected, Cat-5 cable is inexpensive with minimal quality, and moving up and to the right shows better quality at much higher cost. Since the chart uses a log scale, it is important to note that getting a 2x or 3x improvement in cable performance comes at roughly a 10x cost premium.

Complexities of backplanes
Next, we follow some signal paths down a typical backplane to establish a few observations on backplane complexity. This will help frame the discussion of addressing issues through EQ techniques and best design practices later.

Rise Time Degradation
A lab setup was used to demonstrate the damaging effect of discontinuities on rise time. In this example, a test backplane intended for Gbps rates was used, Figure 1.7.


Figure 1.7: Backplane TDR test set up

A TDR (time-domain reflectometry) test was performed using pulses generated by a scope to measure reflections along a signal path, with the pulse probed manually at selected points along that path, Figure 1.8.


Figure 1.8: Illustration of TDR Test parameters

A low-capacitance, high-frequency scope probe captures the pulse transmitted from the oscilloscope into the backplane, along with the reflected pulse. Stepping through the path, we can analyze what happens to the signal. Starting with the initial launch, the waveforms below show pulses as they arrive at the card from the backplane.

At this point, a very short, very steep discontinuity is seen, resulting from a via with a short time constant that is interacting with the sharp edge of the input pulse, Figure 1.9.


Figure 1.9: Backplane test--Reflected pulse from via
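The size of a TDR reflection maps directly to the impedance at the discontinuity. A minimal sketch of that relationship is below; the 50-ohm reference and the example impedances are assumptions chosen only for illustration.

# TDR basics: reflection coefficient and the impedance it implies (illustrative).
Z0 = 50.0  # ohms, assumed reference/system impedance

def reflection_coefficient(z_discontinuity):
    """Gamma = (Z - Z0) / (Z + Z0); 0 means a perfect match."""
    return (z_discontinuity - Z0) / (z_discontinuity + Z0)

def impedance_from_gamma(gamma):
    """Invert the relationship to recover impedance from a measured reflection."""
    return Z0 * (1 + gamma) / (1 - gamma)

for z in (40.0, 50.0, 65.0):   # e.g., a via dip, a matched segment, an inductive bump
    g = reflection_coefficient(z)
    print(f"Z = {z:5.1f} ohm -> Gamma = {g:+.3f} -> recovered Z = {impedance_from_gamma(g):5.1f} ohm")

A negative Gamma (a dip in the TDR trace) indicates a lower-impedance, typically capacitive, discontinuity such as a via; a positive Gamma indicates a higher-impedance, typically inductive, one.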

As the pulse hits more discontinuities, its amplitude gradually degrades on the way to the end of the backplane, Figure 1.10.


Figure 1.10: Backplane test--Launch into backplane

Launching into the line card produces a tilt relative to the originating pulse. Similar effects are exhibited when the ASICs and FPGAs generating the signals are moved further away from the connector. We now move the probe tip further along the path, to the other side of the backplane connector, where we see comparable losses as the PCB trace tilts the signal further, Figure 1.11.


Figure 1.11: Backplane test--Back side of connector

The next step is to cross to the other side of the backplane over a span of about two feet of copper trace. The signal is measured just before it reaches the connector, Figure 1.12. The edge rate is now severely impaired, with a pronounced tilt, due to resistive losses.


Figure 1.12: Backplane test--Results after spanning backplane

The signal is finally terminated on the switch card on the other side of the backplane. After completing this journey, the waveform's tilt and rise time are severely degraded.
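A common first-order way to budget this kind of rise-time degradation is to root-sum-square the rise-time contribution of each element in the path, and to relate the result to bandwidth with the 0.35/rise-time rule of thumb. The individual contributions below are placeholders, not measurements from this backplane.

# Rise-time budget using a root-sum-of-squares combination, a common
# first-order approximation. Contributions (in picoseconds) are assumed examples.
import math

def combined_rise_time_ps(*tr_ps):
    return math.sqrt(sum(t * t for t in tr_ps))

driver = 50.0      # edge at the driver package pin (assumed)
connector = 30.0   # backplane connector contribution (assumed)
trace = 120.0      # roughly two feet of lossy copper trace (assumed)

tr = combined_rise_time_ps(driver, connector, trace)
print(f"Combined 10-90% rise time ~ {tr:.0f} ps")
print(f"Approximate usable bandwidth (0.35 / rise time): {0.35 / (tr * 1e-12) / 1e9:.1f} GHz")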

Another place to look for discontinuities is cable launches. Figure 1.13 shows that a high edge rate can trigger ringing and is sensitive to even minor impedance discontinuities.


Figure 1.13: Backplane test--Combined results of multiple discontinuities
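Ringing at a launch can usually be traced to parasitic inductance and capacitance at the transition, which ring at roughly f = 1/(2*pi*sqrt(L*C)). The component values in the sketch below are purely illustrative assumptions.

# Ring-frequency estimate for a parasitic L-C at a cable or connector launch.
# Component values are illustrative assumptions, not measurements.
import math

def ring_frequency_hz(l_henries, c_farads):
    return 1.0 / (2.0 * math.pi * math.sqrt(l_henries * c_farads))

L_PARASITIC = 5e-9    # 5 nH of launch/lead inductance (assumed)
C_PARASITIC = 1e-12   # 1 pF of pad/connector capacitance (assumed)
print(f"Ring frequency ~ {ring_frequency_hz(L_PARASITIC, C_PARASITIC)/1e9:.1f} GHz")

A faster edge contains more energy at and above this frequency, which is why a high edge rate excites ringing that a slower edge would not.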

Crosstalk
Crosstalk should also be considered, since energy reflected off discontinuities spills into neighboring traces. Figure 1.14 shows the same backplane, with crosstalk between adjacent pairs.


Figure 1.14: Backplane with crosstalk

With hundreds of signals common in some backplanes, the resulting environment can become quite challenging. An important point to remember is that a high-performance equalizer can also overcome this issue.
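For adjacent pairs, a widely used first-order estimate of the saturated near-end (backward) crosstalk is one quarter of the sum of the mutual-to-self inductance and capacitance ratios. The coupling ratios below are hypothetical; real values come from a field solver or measurement.

# First-order near-end (backward) crosstalk estimate for coupled traces.
# Coupling ratios are hypothetical; real values come from a 2-D field solver.
def next_fraction(lm_over_l, cm_over_c):
    """Saturated backward-crosstalk coefficient ~ (Lm/L + Cm/C) / 4."""
    return 0.25 * (lm_over_l + cm_over_c)

aggressor_swing_mv = 800.0
kb = next_fraction(0.05, 0.04)   # assumed 5% inductive, 4% capacitive coupling
print(f"Kb ~ {kb:.3f} -> ~{kb * aggressor_swing_mv:.0f} mV of near-end noise "
      f"per 800 mV aggressor")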

Part 2 of this article will discuss PCB design guidelines, managing discontinuities, choosing PCB material, transmission line design, ground via spacing, via design, and related issues.

About the author
Eric Sweetman, Ph.D., is a Senior Applications Engineer in the Serial Data Solutions Product group at Vitesse Semiconductor Corporation. Eric has worked in various research and development roles covering signal integrity, radio frequency identification (RFID), and interconnect technologies. Prior to joining Vitesse in 2003, he held signal integrity and RFID R&D positions at Accelerant Networks and Lucent Technologies. He holds several patents in RFID technology, along with PhD and MS degrees in physics from the University of Michigan and an SB in physics from MIT.
