Basics of real-time measurement, control, and communication using IEEE 1588: Part 5
Applying IEEE 1588 to Wireless Networks
The synchronization of wireless networks is one of the most compelling
and most difficult applications of IEEE 1588 in telecommunications. All
wireless cellular protocols require either frequency or epoch
synchronization to prevent frequency interference and dropped calls due
to handover failures. Table 9.1 below
lists the requirements for several wireless protocols in current use.
Table 9.1. Timing requirements for selected wireless telecommunication protocols
The W-CDMA and WiMAX protocols have both frequency division duplex (FDD) and time division duplex (TDD) modes of operation.

There are two synchronization techniques used in today's wireless networks.
The first is to install a GPS receiver at each cell site. The GPS receiver, often in combination with a rubidium clock, is used to establish the needed frequency and epoch timing.
A GPS system has a substantial installation and equipment cost, and is subject to lightning strikes. In many cases, the installation must be located where there is not a clear view of the sky, which further increases the installation cost because lengthy cables must be run to the GPS antenna.
GPS signals require averaging over several minutes to achieve accuracies in the range of one part in 10⁸. There is also considerable reluctance, particularly among non-American telecommunications companies, to rely on the GPS system. GPS synchronization is widely used in CDMA2000 base station applications.
The second technique, widely used for GSM and W-CDMA (FDD mode) applications, is to provide synchronization over the backhaul links to the base station controllers. These links may be T1 or E1 links, microwave links, or, increasingly, some form of Ethernet.
If IEEE 1588 can adequately provide timing information over Ethernet links, then there will be considerable savings due to the higher tariffs on the other forms of backhaul links. Since a backhaul is always present, a successful application of IEEE 1588 could eliminate the need for a GPS system at each base station.
Instead, the timing would be derived from a GPS installed at one of the base stations or at the base station controller, and distributed throughout the cluster using IEEE 1588. Alternatively, the timing could be derived from the network terminating at the controller, and distributed to the base stations.
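In any of these schemes, each base station's clock would be steered by the IEEE 1588 delay request-response exchange with its grandmaster. The arithmetic at the heart of that exchange can be sketched as follows (the function name and example timestamps are illustrative, not taken from the standard's formal notation):

```python
def ptp_offset_and_delay(t1, t2, t3, t4):
    """Offset and mean path delay from IEEE 1588's delay
    request-response exchange:
      t1 -- master sends Sync            (master timescale)
      t2 -- slave receives Sync          (slave timescale)
      t3 -- slave sends Delay_Req        (slave timescale)
      t4 -- master receives Delay_Req    (master timescale)
    Assumes a symmetric path; any path asymmetry shows up
    directly as an error in the computed offset.
    """
    mean_path_delay = ((t2 - t1) + (t4 - t3)) / 2.0
    offset_from_master = ((t2 - t1) - (t4 - t3)) / 2.0
    return offset_from_master, mean_path_delay

# Example: slave clock 1.5 us ahead of master, 10 us one-way delay
offset, delay = ptp_offset_and_delay(t1=0.0, t2=11.5e-6,
                                     t3=20.0e-6, t4=28.5e-6)
```

The symmetric-path assumption is exactly what makes multi-hop backhaul links challenging: queuing in intermediate switches makes the two directions unequal, which is why boundary clocks are shown at each hop in the figures.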
Figure 9.2. IEEE 1588 timing distribution in wireless base station boundary clusters (courtesy of Symmetricom)
These alternatives are illustrated in Figure 9.2 above, adapted from drawings provided by Doug Arnold of Symmetricom. Figures 9.2A and 9.2B illustrate the cases in which the GPS installation is at the base station controller and at the base station itself, respectively. Figure 9.2C illustrates the case in which the IEEE 1588 timing information is derived from the larger network, as shown in Figure 9.1.
Rodrigues [87] notes that the backhaul links between base stations and base station controller are generally not simple single-hop links, but usually involve several links in the packet-based networks, as illustrated in Figure 9.3 below. While currently a variety of protocols are used in backhaul links, future installations will increasingly use Ethernet links, as illustrated.
The use of IEEE 1588 to provide the timing references for the base stations using any of the schemes of Figure 9.2 remains to be demonstrated. The data presented later in this article are encouraging, particularly for applications such as GSM and W-CDMA (FDD). Timing for CDMA2000 and other applications that require epoch and frequency is yet to be demonstrated in a telecommunications environment.
Figure 9.3. IEEE 1588 in wireless backhaul boundary links (courtesy of Zarlink Semiconductor)
Using 1588 to link SONET Rings via Ethernet
This section is based on material provided by Doug Arnold of
Symmetricom. As noted earlier, telecommunications companies are
increasingly using Ethernet backbones to replace existing network
links. However, it will be some time before all of the existing
backbone links, typically implemented with SONET rings, will be
replaced.
As a result, there will be cases in which two SONET rings are joined by an Ethernet link. To preserve the timing, the two SONET rings must agree in frequency to one part in 10¹¹. It is possible that IEEE 1588 can be used to convey the timing information between the two rings.
Both rings will be equipped with high-quality oscillators, so that long averaging times may be used to reduce the timing fluctuations introduced by the Ethernet link that would otherwise preclude the use of IEEE 1588. This, plus careful design of the Ethernet links, will no doubt be necessary to achieve one part in 10¹¹.
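Verifying agreement at this level amounts to estimating a fractional frequency offset from a long run of paired timestamps. A minimal sketch of such an estimate, using a least-squares slope fit over synthetic data (the function name and sample values are illustrative; a real system would use hardware timestamps and far longer averaging):

```python
def fractional_frequency_offset(local_times, reference_times):
    """Least-squares estimate of a local oscillator's fractional
    frequency offset against a reference, from paired timestamp
    readings.  Returns (f_local - f_reference) / f_reference.
    """
    n = len(reference_times)
    mean_x = sum(reference_times) / n
    mean_y = sum(local_times) / n
    sxy = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(reference_times, local_times))
    sxx = sum((x - mean_x) ** 2 for x in reference_times)
    return sxy / sxx - 1.0  # slope of local vs. reference time, minus 1

# Synthetic example: a local clock running fast by 5 parts in 10^11,
# sampled once per second for 1000 s
ref = [float(i) for i in range(1000)]
loc = [t * (1.0 + 5e-11) for t in ref]
est = fractional_frequency_offset(loc, ref)
```

The slope fit illustrates why the long averaging times mentioned above matter: the estimate's resolution improves with the span of the observation window, so resolving parts in 10¹¹ through Ethernet-induced timestamp noise demands both stable oscillators and patience.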
How IEEE 1588 benefits the Cable TV Infrastructure
The current architecture of a cable TV distribution system is
illustrated in Figure 9.4 below.
Shown is a typical head-end installation. Digital data from the
internet are received from a wide-area network (WAN) connection to the
head-end, combined with VoIP traffic from the public-switched telephone
network (PSTN), and transferred to the cable modem termination system
(CMTS) via the head-end switch.
Figure 9.4. Current cable TV architecture (courtesy of Zarlink Semiconductor)
It is possible for the CMTS to be located in a master head-end if the CMTS functions can be shared by users connected to hubs in the normal head-end locations.
Currently, the digital link between a customer's cable modem and the head-end terminates at the CMTS. Digital data to be placed on the link are transmitted via the quadrature amplitude modulator (QAM), and are combined with video traffic by the distribution hub.
The physical link is either coaxial cable or hybrid fiber-coax. These links use one of two modulation techniques specified in the CableLabs Data-Over-Cable Service Interface Specification (DOCSIS) 2.0, each with a requirement on timing jitter at the output of the downstream transmission convergence sublayer.
For the Advanced Time Division Multiple Access (A-TDMA) technique, the jitter must be less than 500 ns peak-to-peak, whereas for Synchronous Code Division Multiple Access (S-CDMA) the jitter must be less than 2 ns peak-to-peak.
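These limits can be checked directly against a record of observed timing errors. A small illustrative sketch (the names and sample values are hypothetical; the limits are the DOCSIS 2.0 figures quoted above):

```python
# DOCSIS 2.0 peak-to-peak jitter limits at the output of the
# downstream transmission convergence sublayer (seconds)
JITTER_LIMIT_PP = {"A-TDMA": 500e-9, "S-CDMA": 2e-9}

def peak_to_peak_jitter(timing_errors):
    """Peak-to-peak jitter of a record of timing errors (seconds)."""
    return max(timing_errors) - min(timing_errors)

def meets_requirement(timing_errors, mode):
    """True if the observed jitter satisfies the given mode's limit."""
    return peak_to_peak_jitter(timing_errors) <= JITTER_LIMIT_PP[mode]

# Hypothetical record with 120 ns peak-to-peak jitter:
errors = [0.0, 40e-9, -80e-9, 25e-9]
# 120 ns passes the A-TDMA limit but not the far tighter S-CDMA limit
```

The two-orders-of-magnitude gap between the limits is the crux: meeting the S-CDMA figure over packet links is a much harder proposition than meeting the A-TDMA one.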
Both coding schemes require compensation for the latency between the CMTS and the individual modems. This is accomplished by ranging algorithms that calibrate and analyze both the downstream and upstream channels. In the current architecture, all of these functions are implemented on the line cards of the CMTS.
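At its core, the ranging step is a feedback correction on each modem's transmit timing. A greatly simplified sketch (all names and numbers here are illustrative; real DOCSIS ranging also adjusts transmit power and frequency, and iterates until the error converges):

```python
def ranging_adjustment(expected_arrival, measured_arrival,
                       current_advance=0.0):
    """One correction step of a greatly simplified DOCSIS-style
    ranging loop.  The CMTS compares when a modem's ranging burst
    actually arrived against when it was scheduled to arrive, and
    returns the updated timing advance the modem should apply so
    that its upstream bursts land in their assigned slots.
    All times in seconds; a larger advance means 'transmit earlier'.
    """
    late_by = measured_arrival - expected_arrival
    return current_advance + late_by

# A distant modem's burst arrives 50 us late, so the CMTS tells it
# to advance its transmit timing by 50 us:
advance = ranging_adjustment(expected_arrival=100e-6,
                             measured_arrival=150e-6)
```

Because this loop terminates at the CMTS line cards in the current architecture, moving the CMTS away from the modems (as in the next section) forces the timing reference to be carried across the intervening network instead.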
To simplify installation and reduce costs, the cable operators would like to combine expensive functions of the head-end into fewer installations, and replace the current local head-end installations with simpler devices.
The operators also want to utilize their cables to provide so-called triple-play service that combines voice, video, and data, and to make use of lower-cost Ethernet links wherever possible. To accomplish both these goals requires a change in the current architecture to an architecture closer to the one illustrated in Figure 9.5 below.
Figure 9.5. Future cable TV architecture (courtesy of Zarlink Semiconductor)
In this future architecture, the CMTS functions are located in regional head-end installations. Combined voice, video, and data are delivered to local edge-QAM devices via gigabit Ethernet (GigE) links, and then to the home via hybrid fiber-coax links.
The uplink for data is provided via a separate Ethernet network directly from the modems to the CMTS. The synchronization requirements remain as before, but are now split across the CMTS-to-QAM link, the QAM-to-modem link, and the uplink. Synchronization of the Ethernet portions of these links is an application addressable by IEEE 1588.