It is widely recognized that advanced driver assistance systems (ADAS) and autonomous driving (AD) can succeed only with effective sensing of the environment surrounding the vehicle feeding into the algorithms that enable autonomous navigation. Given the absolute reliance on sensing in life-critical situations, multiple sensor modalities are used, with their data fused to augment one another and provide redundancy. This allows each technology to play to its strengths and deliver a better combined solution.
The three modalities that will be prominent among the sensors used in vehicles for ADAS and AD going forward are image sensors, radar, and LiDAR (Light Detection and Ranging). Each of these sensors has its own strengths, and together they can form a complete sensor suite delivering data that enables the autonomous perception algorithms to make decisions with sensor fusion: the ability to provide color, intensity, velocity, and depth for every point or kernel in the scene.
Figure 1: Sensor fusion takes advantage of the strengths of each modality to provide complete information about the vehicle’s surroundings.
Of these three principal modalities, LiDAR is the most nascent technology to be commercialized for mass-market use, even though the concept of using light to measure distance goes back decades. The market for automotive LiDAR is set for spectacular growth, rising from $39 million in 2020 to a projected $1.75 billion in 2025, according to Yole Développement, driven by the proliferation of autonomous systems requiring the complete sensor suite. The opportunity is so large that well over 100 companies are working on LiDAR technology, with cumulative investments in these companies exceeding $1.5 billion by 2020, prior to the deluge of SPAC-driven initial public offerings by more than a handful of LiDAR companies that began in late 2020. But when so many companies work on a single technology, with some approaches differing as fundamentally as the wavelength of light used (905nm and 1550nm being the prominent examples), it is inevitable that there will be a winning technology and consolidation, as has been seen time and again, whether with Ethernet for networking or VHS for video.
When one looks at the users of LiDAR technology (the automotive vehicle manufacturers, along with the companies that design and build autonomous robotic vehicles for transporting people and goods), what matters most is that their requirements are met. Ultimately, these companies want suppliers to provide LiDAR sensors that are low-cost and highly reliable while meeting the performance specifications for ranging and for detection of low-reflectivity objects. Though all engineers have strong viewpoints, these companies are likely to be agnostic to the implementation of the technology if the supplier can meet the performance and reliability requirements at the right cost. And that leads to the fundamental debate this article aims to help settle: which wavelength will prevail for automotive LiDAR applications?
To begin to address this question, it is necessary to understand the anatomy of a LiDAR system, of which there are different architectures. Coherent LiDAR, one type of which is referred to as frequency-modulated continuous wave (FMCW), mixes a transmitted laser signal with the reflected light to compute the range and velocity of objects. FMCW offers some advantages, but it remains relatively uncommon compared with the most common LiDAR approach, direct time-of-flight (dToF). This implementation measures distance to an object by timing how long a very short pulse of light from the illumination source takes to reflect off an object and return to the detector, then uses the simple relation distance = (speed of light × round-trip time) / 2. A typical dToF LiDAR system has six major hardware functions, although the choice of wavelength mostly impacts the transmit and receive functions.
Figure 2: A block diagram of a typical dToF system with green portions representing some focus areas of ON Semiconductor products.
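As a concrete illustration of the dToF principle described above, the round-trip timing maps directly to range; a minimal sketch (the timing value below is hypothetical):

```python
# Direct time-of-flight: the system timestamps the laser pulse's departure
# and the arrival of its reflection, then converts round-trip time to range.
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def dtof_range_m(round_trip_time_s: float) -> float:
    # The pulse travels out to the object and back, so the one-way
    # distance is (speed of light x round-trip time) / 2.
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2.0

# A reflection arriving ~667 ns after emission corresponds to a target
# roughly 100 m away.
print(dtof_range_m(667e-9))
```

This also shows why long-range LiDAR demands picosecond-class timing precision: at the speed of light, one nanosecond of timing error corresponds to about 15 cm of range error.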
Table 1 lists various LiDAR manufacturers, ranging from established automotive Tier-1s to startups across all regions of the globe. Based on market reports and public information, the vast majority of these companies operate their LiDARs at near-infrared (NIR) wavelengths rather than short-wave infrared (SWIR) wavelengths. Furthermore, while the SWIR-focused suppliers working on FMCW are restricted to those wavelengths, most of those with a direct time-of-flight implementation have a path to building a NIR system should they choose, while leveraging much of their existing IP around functions such as beam steering and signal processing.
Given that the majority, but not all, of these manufacturers have chosen NIR wavelengths, it is worth considering how they came to this decision and what the implications are. At the heart of the discussion is some basic physics related to the properties of light and of the semiconductor materials making up the components used in LiDAR.
Photons fired by the laser in a LiDAR system, which are intended to bounce off objects and return to the detector, have to compete with ambient photons coming from the sun. Looking at the spectrum of solar radiation and taking atmospheric absorption into account, there are "dips" in the irradiance at certain wavelengths that reduce the number of photons present as noise for the system. At 905nm, solar irradiance is about 3x higher than at 1550nm, meaning a NIR system has to contend with more noise that can interfere with the sensor. But this is just one of the factors to take into account when choosing a wavelength for a LiDAR system.
Figure 3: Atmospheric absorption of light results in clear dips in solar irradiance at certain wavelengths.
The components responsible for sensing the photons in a LiDAR system are photodetectors of various types, so it is important to explain why they may be made of different semiconductor materials depending on the wavelength to be detected. In a semiconductor, a band gap separates the valence and conduction bands. Photons provide the energy that helps electrons overcome that band gap and make the semiconductor conductive, thus creating a photocurrent. Every photon's energy is inversely related to its wavelength, and a semiconductor's band gap sets the minimum photon energy, and hence the maximum wavelength, it can detect; this is why different semiconductor materials are needed depending on the wavelength of light to be detected. Silicon, the most common and cheapest semiconductor to manufacture, is responsive to visible and NIR wavelengths up to about 1000nm. To detect wavelengths beyond that, in the SWIR range, more exotic group III/V semiconductors can be alloyed to make materials like InGaAs, capable of detecting light from 1000nm to 2500nm.
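The band-gap limit can be made concrete with the cutoff-wavelength relation λ_max = hc/E_gap. Below is a short sketch using typical textbook band-gap values; note that real detector responsivity falls off well before the theoretical cutoff, which is consistent with silicon being quoted as useful only to about 1000nm:

```python
# Longest detectable wavelength for a photodetector material: a photon is
# absorbed only if its energy h*c/lambda exceeds the band gap, so
# lambda_max = h*c / E_gap. Band-gap values are typical textbook figures.
H_PLANCK_EV_S = 4.135667e-15   # Planck constant, eV*s
C_M_PER_S = 299_792_458.0      # speed of light, m/s

def cutoff_wavelength_nm(band_gap_ev: float) -> float:
    return H_PLANCK_EV_S * C_M_PER_S / band_gap_ev * 1e9

print(cutoff_wavelength_nm(1.12))  # silicon (~1.12 eV): ~1107 nm
print(cutoff_wavelength_nm(0.75))  # lattice-matched InGaAs (~0.75 eV): ~1653 nm
```

The numbers show why 905nm falls comfortably within silicon's reach, while 1550nm requires an InGaAs-class material.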
Early LiDARs used PIN photodiodes as sensors. PIN photodiodes have no inherent gain and, as a result, cannot easily detect weak signals. Avalanche photodiodes (APDs), the most prominent type of sensor used in LiDAR today, provide a moderate amount of gain. However, APDs must operate in linear mode like PIN photodiodes, integrating signal from photon arrivals, and they also suffer from poor part-to-part uniformity while requiring very high bias voltages. The newest types of sensors increasingly being used in LiDARs are built on single-photon avalanche diodes (SPADs), which have very large gain and can produce a measurable current output from every single photon detected. Silicon photomultipliers (SiPMs) are arrays of silicon-based SPADs with the added advantage of being able to distinguish single photons from multiple photons by the amplitude of the generated signal.
Figure 4: Different types of photodetectors used to detect signals in a LiDAR
Circling back to the topic of wavelengths, all of these types of photodetectors can be built on silicon (for NIR detection) or on III/V semiconductors (for SWIR detection). Manufacturability and cost, however, are key to the technology's viability, and CMOS silicon foundries allow high-volume, low-cost manufacturing of such sensors. This is a primary reason why SiPMs are being increasingly adopted for LiDAR, on top of their higher performance. While APDs and SPADs for SWIR exist, they are difficult to integrate with readout logic because the processes are not silicon-based. Lastly, III/V-based SPAD arrays and photomultipliers (analogous to SiPMs) for SWIR have not yet been commercialized, so ecosystem availability favors the NIR wavelengths.
Generating photons involves an entirely different process. A laser can be made using a semiconductor p-n junction as the gain medium: pumping current through the junction causes stimulated emission of photons as electrons transition to lower energy states, resulting in a coherent laser beam output. Semiconductor lasers are based on direct band gap materials like GaAs and InP, which are efficient at generating photons during these transitions, unlike indirect band gap materials such as silicon.
There are two main types of lasers used in LiDAR: the edge-emitting laser (EEL) and the vertical-cavity surface-emitting laser (VCSEL). EELs are more widely used today, owing to their lower cost and higher output efficiency compared with VCSELs. However, EELs are more difficult to package and build into arrays, and they suffer from a wavelength shift over temperature that forces the detector to look for a wider band of photon wavelengths, letting in more ambient photons as noise. Despite its currently higher cost and lower power efficiency, the newer VCSEL technology has the advantage of easy and efficient packaging, since the beam is emitted from the top of the die. Market adoption of VCSELs is increasing as their cost continues to decrease significantly and their power efficiency improves. EELs and VCSELs exist for both NIR and SWIR wavelengths, with a key difference: NIR wavelengths can be generated with GaAs, while SWIR wavelengths require the use of InGaAsP. GaAs lasers can be made in foundries with larger wafer sizes, leading to lower cost, again pointing to an ecosystem advantage for NIR LiDAR manufacturers from both a cost and a supply chain security perspective.
Figure 5: Different types of lasers used in a LiDAR.
Laser Power and Eye Safety
In the wavelength debate, it is imperative to consider the eye safety implications of a LiDAR system. dToF LiDAR emits short laser pulses with high peak power over a certain angle of view into the scene. A pedestrian standing in the path of a LiDAR's emission needs assurance that their eyes will not be damaged by a laser fired in their direction, and the IEC-60825 specification dictates the maximum permissible exposure across the different wavelengths of light. While NIR light, like visible light, passes through the cornea and reaches the retina of the human eye, SWIR light is mostly absorbed within the cornea and can therefore be emitted at higher exposure levels.
Figure 6: IEC-60825 specification for eye-safe laser exposure.
Being able to output multiple orders of magnitude more laser power is a performance advantage for a 1550nm-based system, as it allows more photons to be sent out and thus returned for detection. Higher laser power comes with a thermal tradeoff, though. It should be noted that proper eye-safe design must be done regardless of wavelength, explicitly taking into account the energy per pulse and the size of the laser aperture. With a 905nm-based LiDAR, eye-safe peak power can be increased by adjusting either of these factors, as shown in Figure 7.
Figure 7: Eye-safe laser design for a NIR LiDAR given different optics and laser parameters.
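The two levers mentioned above, energy per pulse and aperture size, can be sketched with basic relations. The values below are hypothetical, and this sketch is in no way a substitute for a proper IEC-60825 eye-safety analysis:

```python
# Illustrative relationships between pulse energy, pulse width, peak power,
# and emitting aperture size. All numbers are hypothetical examples; real
# eye-safety limits must come from an IEC-60825 analysis.
import math

def peak_power_w(pulse_energy_j: float, pulse_width_s: float) -> float:
    # For a roughly rectangular pulse, peak power is energy over duration.
    return pulse_energy_j / pulse_width_s

def exit_irradiance_w_per_m2(power_w: float, aperture_diameter_m: float) -> float:
    # Spreading the same power over a larger emitting aperture lowers the
    # irradiance that any one spot (such as an eye pupil) can intercept.
    area_m2 = math.pi * (aperture_diameter_m / 2.0) ** 2
    return power_w / area_m2

p = peak_power_w(100e-9, 1e-9)  # hypothetical 100 nJ pulse in 1 ns -> 100 W peak
print(p)
print(exit_irradiance_w_per_m2(p, 22e-3))  # doubling the aperture quarters this
```

This is why a larger laser aperture permits higher peak power at the same exposure level, the tradeoff Figure 7 illustrates for a NIR design.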
Comparison of NIR and SWIR LiDAR Systems
The above focus on how much laser power can be output brings us back to the sensors being used. A higher-performance sensor able to detect weaker signals clearly benefits the system in multiple ways: it can achieve longer range, or achieve the same range with less laser power. ON Semiconductor has developed a series of SiPMs for NIR LiDAR, driving the photon detection efficiency (PDE), a key parameter indicating sensitivity, to a market-leading 18% with its latest RDM-Series sensors.
Figure 8: Process roadmap of ON Semiconductor SiPMs.
To compare the performance of a NIR dToF LiDAR with a SWIR dToF LiDAR, we performed system modeling for identical LiDAR architectures and environmental conditions with differing parameters for the lasers and sensors. The LiDAR architecture is a coaxial system with a 16-channel detector array and a scanning mechanism to spread across the field of view, as shown in Figure 9. This system model has been validated with hardware and allows us to accurately estimate the performance of LiDAR systems.
Figure 9: System model for a dToF LiDAR sensor.
Table 2: LiDAR sensor and laser parameters for NIR and SWIR system model simulation.
The 1550nm system uses higher laser power as well as a higher-PDE sensor owing to its use of InGaAs alloys, which should yield better ranging performance in our system simulation. Using system-level parameters of 100klux ambient light filtered by a 50nm bandpass filter on the sensor lens (centered around 905nm and 1550nm, respectively), a 0.1° x 5° angle of view scanned over 80° horizontally at 30fps, a 500kHz laser repetition rate with 1ns pulse width, and a 22mm lens diameter, the results are shown in Figure 10.
Figure 10: Simulation results for similar LiDAR systems based on 905nm and 1550nm.
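To build intuition for why laser energy, lens size, and sensor PDE drive ranging performance, a back-of-envelope photon budget can be computed with the standard Lambertian-target range equation. This is a hedged sketch with illustrative parameters, not the article's validated simulation settings:

```python
# Back-of-envelope photon budget for a single dToF pulse against a diffuse
# (Lambertian) target. Parameters are illustrative assumptions.
import math

H = 6.62607015e-34  # Planck constant, J*s
C = 299_792_458.0   # speed of light, m/s

def detected_photons(pulse_energy_j, wavelength_m, reflectivity,
                     range_m, lens_diameter_m, optics_and_pde_eff):
    # Number of photons transmitted = pulse energy / energy per photon.
    photons_tx = pulse_energy_j / (H * C / wavelength_m)
    lens_area = math.pi * (lens_diameter_m / 2.0) ** 2
    # A Lambertian target scatters into a hemisphere; the receive lens
    # captures the fraction reflectivity * A_lens / (pi * R^2).
    capture = reflectivity * lens_area / (math.pi * range_m ** 2)
    return photons_tx * capture * optics_and_pde_eff

# Hypothetical 100 nJ pulse at 905 nm, 10%-reflective target at 200 m,
# 22 mm lens, 10% combined optics transmission and sensor PDE.
print(detected_photons(100e-9, 905e-9, 0.10, 200.0, 22e-3, 0.10))
```

Only a handful of photons survive the round trip at long range, which is why single-photon-sensitive detectors such as SiPMs, and every extra point of PDE, matter so much.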
As expected, the 1550nm system can range further on a low-reflectivity object, reaching 500m with 99% ranging probability. However, the 905nm-based system still achieves well over 200m of ranging, showing that both types of systems can meet automotive long-range LiDAR requirements in typical environmental conditions. When poor environmental conditions such as rain or fog are introduced, the water-absorption properties of SWIR light cause its performance to degrade more rapidly than that of a NIR-based system, which is another factor to take into account.
Having looked extensively at the technology behind LiDAR systems and the implications of using different wavelengths, we now return to cost considerations. We explained earlier that the sensors used in NIR-based LiDARs come from native CMOS silicon foundry processes, which enable the lowest possible cost for semiconductors. They also enable integration of CMOS readout logic with the sensor into one chip through stacked-die technology, which is readily available at foundries today, further collapsing the signal chain and reducing cost. Conversely, SWIR sensors rely on higher-cost III/V semiconductor foundry processes such as InGaAs. Newer hybrid Ge-Si technology may enable lower-cost SWIR sensors and makes integration with readout logic easier, but it is still estimated to be more than 5x more expensive than traditional CMOS silicon even after reaching maturity. On the laser side, the difference in wafer size between the GaAs wafers used for laser chips in NIR systems and the InGaAs wafers used for laser chips in SWIR systems again leads to a cost disparity, and the fact that NIR systems have a path to using VCSELs, with a much more readily available supplier base, also enables lower-cost integration.
The sum of the above factors is reflected in an analysis by IHS Markit (Amsrud, 2019), which showed that for the same type of component (sensor or laser), the cost for a SWIR system would be 10 to 100 times higher than for a NIR system. The average combined component cost of the sensor and laser for a NIR system was estimated at $4 to $20 per channel in 2019, decreasing to $2 to $10 by 2025. By contrast, the equivalent average component cost for a SWIR system was estimated at $275 per channel in 2019, decreasing to $155 per channel by 2025. That is a tremendous cost difference considering that LiDAR systems contain multiple channels, even with a 1D-scanning approach, since a vertical array of single-point channels is still required.
Table 3: Summary of cost considerations. (Image source: IHS Markit)
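Scaling the per-channel estimates cited above to a whole sensor shows how quickly the gap compounds; the channel count below is an illustrative assumption, not a figure from the IHS Markit analysis:

```python
# System-level component cost from the per-channel 2025 estimates quoted
# in the article. The 16-channel count is an illustrative assumption,
# e.g., a vertical array in a 1D-scanning design.
CHANNELS = 16

nir_2025_low, nir_2025_high = 2, 10  # $/channel, NIR sensor + laser (2025 est.)
swir_2025 = 155                      # $/channel, SWIR sensor + laser (2025 est.)

print(CHANNELS * nir_2025_low, CHANNELS * nir_2025_high)  # $32 to $160 per system
print(CHANNELS * swir_2025)                               # $2480 per system
```

Even under this modest channel count, the SWIR component bill is more than an order of magnitude higher, before adding optics, beam steering, and processing.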
The LiDAR market dynamics also do not favor the SWIR camp. The autonomous driving market has not ramped as quickly as was expected five years ago, and Level 4 and Level 5 autonomy systems, for which LiDAR is a must, are years away from widespread mass deployment. Meanwhile, the industrial and robotics markets using LiDAR are even more cost-conscious and have no need for the ultra-high-performance advantages of a SWIR system, so SWIR manufacturers have no interim path to bring component costs down by ramping volume, as is often claimed. There is a chicken-and-egg problem: costs come down when volume ramps, but volume requires lower costs.
After this deep dive into the technology and the differences between NIR and SWIR systems, it is clear why the vast majority of LiDAR systems in existence today use NIR wavelengths. While the outlook for the future is never 100% certain, the cost and availability of ecosystem suppliers are clearly key factors, and NIR-based systems will remain cheaper due to the technology advantage and economies of scale of CMOS silicon. And while SWIR does allow for a longer-ranging LiDAR system, NIR-based LiDARs can also achieve the desired automotive long-range requirements, while performing extremely well in the short- to medium-range configurations also needed for ADAS and AD. The existence of NIR-based LiDARs in mass production for the automotive market today shows the technology has been commercialized and proven out, but it will still take some time for consolidation to happen and for the winners and losers to shake out. After all, the automobile industry at the turn of the 20th century contained 30 different manufacturers, a number that grew to nearly 500 over the next ten years, yet only a few years later most of them had disappeared. A similar dynamic can be expected among LiDAR manufacturers by the end of this decade.
Yole Développement (2020). LiDAR for Automotive and Industrial Applications – Market and Technology Report 2020
Amsrud, P. (2019 September 25). The race to a low-cost LIDAR system [Conference Presentation]. Automotive LIDAR 2019, Detroit, MI, United States. IHS Markit.
— Bahman Hadji, director of business development, automotive sensing division, ON Semiconductor
>> This article was originally published on our sister site, EE Times.