
Lidar tech features long-range detection

AEye’s ability to detect tiny objects at a distance of 120 meters using multiple measurement points is crucial for autonomous cars and trucks.

Along with cameras and radar, LiDAR sensors are an important technology for the development of autonomous driving. AEye, based in Dublin, California, has created a long-range LiDAR system that combines an amplifiable 1550-nm laser with a proprietary microelectromechanical-system (MEMS) scanner. The system can be customized and optimized for specific vehicles and applications through software. Indu Vijayan, AEye’s head of product management for ADAS solutions, answered key questions about the prospects for autonomous vehicles in an interview with EE Times Europe, following her recent keynote presentation at DesignCon 2021, held in Silicon Valley in August.


AEye’s Indu Vijayan

AEye claims its LiDAR can detect vehicles at a distance of 1,000 meters and people at a distance of up to 200 meters. And its ability to detect tiny objects (such as bricks) at a distance of 120 meters using multiple measurement points is crucial for autonomous cars and trucks.

Because of their large mass and the longer distances they need to stop, commercial vehicles face particular challenges in providing safe autonomous driving. Automation for these vehicles will require a high-performance, long-range sensor to ensure sufficient processing time for automated decisions and actions.
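To see why long range matters for trucks, consider a rough back-of-the-envelope stopping-distance estimate. The figures below (reaction latency, deceleration for a loaded truck) are typical textbook assumptions, not AEye's or any manufacturer's measured data:

```python
# Illustrative stopping-distance estimate: distance covered during
# perception/reaction, plus braking distance, for a heavy truck.
# Assumed values are generic textbook figures, not measured data.

def required_detection_range(speed_kmh: float,
                             reaction_time_s: float = 2.0,
                             decel_ms2: float = 3.0) -> float:
    """Total distance needed to stop from a given speed.

    reaction_time_s covers sensing, compute, and actuation latency;
    decel_ms2 is a conservative deceleration for a loaded truck.
    """
    v = speed_kmh / 3.6                      # km/h -> m/s
    reaction_distance = v * reaction_time_s  # traveled before brakes engage
    braking_distance = v ** 2 / (2 * decel_ms2)
    return reaction_distance + braking_distance

# A loaded truck at 90 km/h needs roughly 150 m to come to a halt
print(round(required_detection_range(90.0), 1))  # → 154.2
```

Under these assumptions, an obstacle first seen at 120 m leaves almost no margin at highway speed, which is why detection well beyond the stopping distance is valuable.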

EE Times Europe: What is the current opportunity for the autonomous vehicle market and what are the growth factors?

Vijayan: If we’re talking about automotive, the current opportunity is in ADAS, specifically delivering next-level advanced safety features to OEMs. We’re all familiar with ADAS features like cruise control, emergency braking and lane-keep assist. These features, which have historically leveraged radar and/or camera sensors, increase a car’s safety rating while enabling OEMs to deliver value-add to their customers.

Automakers are now looking to deploy more advanced ADAS features, and to do so safely, they need LiDAR. Cameras have great resolution and color information but are limited in certain daytime lighting conditions, don’t work well at night, and can only estimate the distance and placement of objects. Meanwhile, radar performs well in poor weather but doesn’t provide sufficient resolution at range and struggles to localize objects reliably because of multipath effects. LiDAR fills in these performance gaps and is the only deterministic sensor that can provide absolute certainty that an object is in your way, so that the car’s path-planning system can make the safest driving decision.

AEye’s LiDAR specifically is ideal for applications like highway autopilot and hub-to-hub autonomous trucking, which require long-range, small-object detection at speed. Our use of an amplifiable 1550-nm wavelength incorporated into our novel architecture allows AEye’s LiDAR system to achieve industry-leading performance at range – seeing objects like vehicles and road signs at a thousand meters. We project the automotive ADAS market for long-range LiDAR to reach $3 billion by 2025 and to keep growing at a CAGR of 43% from 2025 through 2030, while we anticipate the mobility market to grow even faster, with a projected CAGR of 72% over the same period.

EE Times Europe: Infrastructure and regulation are two hurdles that need to be addressed in order to roll out autonomous vehicles. Where do we stand with regard to these obstacles?

Vijayan: On the regulation front, the U.S. has been criticized for the lack of a federal regulatory framework to address AV testing and deployment. To date, regulation is done at the state level, without uniformity. The federal government appears to be working toward clear regulations to govern this industry, and in the meantime, public and private coalitions are stepping in. Meanwhile, countries like Germany are taking a more aggressive stance toward autonomy – adopting legislation that will allow Level 4 autonomous driving (driverless vehicles) on public roads by 2022, without human safety operators.

On the infrastructure front, we are seeing the digitalization of infrastructure – such as collision-avoidance cameras, intelligent streetlights, RFID-equipped lane markers and signs, curb sensors, and advanced traffic management systems, as well as experimentation with 5G and other infrastructure-based technologies to connect with intelligent vehicles. Still, a lot of work remains on the infrastructure side to reduce the “Roughness Index.” Potholes, striping errors, and lack of regular maintenance (e.g., fading lane markers) can be a major cause of accidents, especially for trucks. And minute alterations on roads can have a huge impact on the ability to scale AV rollouts on public roads.

Smart infrastructure, connectivity, and autonomous vehicles fitted with smarter, edge-intelligent sensors will combine to provide AVs the information needed to make smart decisions. The data collected can also be used to learn and adapt the AI running on AVs, which can then be applied as software upgrades to enable the AVs to react to dynamic situations.


AEye‘s System

EE Times Europe: A primary challenge that auto manufacturers face is keeping up with sensor and data development. What is AEye doing to address this challenge?

Vijayan: AEye has developed a software-definable system, which means it is flexible and adaptable. Other LiDAR sensors are anchored in the hardware-first way of thinking. AEye moves the complexity from the hardware to the software. In our system approach, hardware and software have an iterative, adaptable relationship that can be continuously optimized through built-in feedback loops. The world isn’t static, and AEye’s sensor provides the ability to dynamically adapt from one situation to the next in real-time. We believe our software-driven hardware platform will enhance any automotive system.
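One way to picture a software-definable LiDAR is as a scan policy that software can reshape at runtime – dense, frequent scanning where it matters and sparse scanning elsewhere, adjusted by feedback from detections. The sketch below is purely hypothetical: the class names, fields, and profile values are illustrative, not AEye's actual API or parameters:

```python
# Hypothetical sketch of a software-definable scan policy.
# All names and values are illustrative; this is not AEye's API.
from dataclasses import dataclass

@dataclass
class ScanRegion:
    azimuth_deg: tuple     # (min, max) horizontal field-of-view slice
    elevation_deg: tuple   # (min, max) vertical field-of-view slice
    point_density: float   # relative measurement density, 0..1
    revisit_hz: float      # how often this region is rescanned

# Highway profile: concentrate points on the road far ahead and revisit
# it often, while sweeping the periphery sparsely.
highway_profile = [
    ScanRegion(azimuth_deg=(-5, 5), elevation_deg=(-2, 2),
               point_density=1.0, revisit_hz=30.0),   # dense: far road
    ScanRegion(azimuth_deg=(-60, 60), elevation_deg=(-15, 5),
               point_density=0.2, revisit_hz=10.0),   # sparse: periphery
]

def focus_on(regions, azimuth_deg):
    """Feedback step: add a dense, fast-revisit region where a
    detection appeared, leaving the base profile untouched."""
    return regions + [ScanRegion(azimuth_deg=(azimuth_deg - 2, azimuth_deg + 2),
                                 elevation_deg=(-2, 2),
                                 point_density=1.0, revisit_hz=60.0)]

# e.g. something flagged 20 degrees to the right gets a closer look
adapted = focus_on(highway_profile, 20.0)
print(len(adapted))  # → 3
```

The point of the sketch is the feedback loop: detections feed back into the scan policy in software, with no hardware change.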

Another challenge auto manufacturers face is sourcing automotive-grade sensors that can stand the test of time in their vehicles for 10-15 years. We license our reference architecture to Tier 1s, enabling them to manufacture their own custom products and sell these solutions to their automotive OEM customers. This ensures OEMs receive high-quality, reliable products that meet their unique specifications at the lowest possible cost from proven automotive suppliers with whom they have long, well-established relationships.

EE Times Europe: What other challenges are preventing the widespread production and adoption of autonomous vehicles?

Vijayan: A challenge historically has been price, but LiDAR has seen a massive price drop in the past three years. As the automotive market ramps up volume production, we’ll continue to see cost reductions and economies of scale paralleling those of radar, with costs dropping to $100–$1,000 for ADAS deployments.

We’ve also seen that it takes technology maturation, a mature business model, and an established automotive supply chain to bring self-driving to market. We will see autonomy roll out gradually as Tier 1 automotive suppliers introduce LiDAR-powered advanced safety features to the market.

>> This article was originally published on our sister site, EE Times Europe.


