Lumotive, a developer of solid-state lidar systems, has introduced an open lidar API (application programming interface) to accelerate adoption of what it calls Lidar 2.0: software-defined lidar enabling scalable, ubiquitous, and lower-cost 3D sensing.
With multiple lidar technology and product announcements at CES 2022, we talked to Lumotive's CEO, Sam Heidari, to understand more. He said the lidar landscape is set to change, with much of it going solid state. He added, "Also, software-defined lidar will enable lots of things that can't be done, such as having different frame rates for different portions of the field of view (FoV)." He said industry experts and observers consider Lidar 2.0 to be the inevitable future of lidar: a next-generation 3D sensing technology that is software-defined, solid-state, and scalable, helping to make lidar ubiquitous by lowering costs, speeding innovation, and improving user experience.
For the industry to harness the benefits of Lidar 2.0, Lumotive said it is leading the effort to define an open lidar API that gives application developers, perception software developers, and lidar hardware developers a common, future-proof interface for software-defined lidar.
At the core of the open lidar API is the ability to control Lidar 2.0 hardware in real time with ultra-low latency, creating regions of interest in the field of view, each with a user-defined frame rate, resolution, range, and other parameters. The output of multiple regions of interest from a single physical Lidar 2.0 sensor can be virtualized as multiple independent sensors. This flexibility gives solution developers access to improved performance through real-time object tracking and application-specific scan modes, while virtualized sensors reduce system cost and complexity.
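The article does not show the Open Lidar API itself, but the region-of-interest and sensor-virtualization concepts it describes can be sketched in code. Everything below is an illustrative assumption: the class names, fields, and method signatures are invented for this example and are not the actual Lumotive interface.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class RegionOfInterest:
    """One user-defined region within the sensor's field of view.
    Each ROI carries its own frame rate, resolution, and range,
    as the article describes (hypothetical field layout)."""
    name: str
    azimuth_deg: Tuple[float, float]   # horizontal extent within the FoV
    frame_rate_hz: float
    resolution: Tuple[int, int]        # (horizontal, vertical) points
    max_range_m: float

@dataclass
class VirtualSensor:
    """An ROI exposed as if it were an independent sensor."""
    roi: RegionOfInterest

class SoftwareDefinedLidar:
    """Sketch of a single physical sensor that virtualizes each ROI
    as a separate logical sensor (assumed behavior, not real API)."""
    def __init__(self) -> None:
        self.virtual_sensors: List[VirtualSensor] = []

    def add_roi(self, roi: RegionOfInterest) -> VirtualSensor:
        vs = VirtualSensor(roi)
        self.virtual_sensors.append(vs)
        return vs

# One physical device, two ROIs with different frame rates: a wide,
# slow scan of the full FoV plus a narrow, fast window for tracking.
lidar = SoftwareDefinedLidar()
wide = lidar.add_roi(
    RegionOfInterest("wide_scan", (-60.0, 60.0), 10.0, (640, 480), 20.0))
track = lidar.add_roi(
    RegionOfInterest("tracking", (-5.0, 5.0), 60.0, (128, 128), 20.0))
```

Downstream perception software could then subscribe to `wide` and `track` independently, exactly as if they were two physical sensors, which is the virtualization benefit the article highlights.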
The open lidar API also provides a common format interface for point cloud data and support for basic lidar control functions of legacy lidar hardware. In addition, Lumotive said it is committed to having all its current and future products compatible with the open lidar API.
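The source states that the API defines a common point cloud format but gives no field layout, so the packed record below is purely an illustrative assumption (x, y, z in meters plus a 16-bit intensity) showing what a common binary interchange format for lidar points might look like.

```python
import struct

# Hypothetical point record: three little-endian float32 coordinates
# plus an unsigned 16-bit intensity. The real Open Lidar API format
# is not documented in the article.
POINT_FMT = "<fffH"
POINT_SIZE = struct.calcsize(POINT_FMT)  # 14 bytes per point

def pack_point(x: float, y: float, z: float, intensity: int) -> bytes:
    """Serialize one point into the assumed common format."""
    return struct.pack(POINT_FMT, x, y, z, intensity)

def unpack_points(buf: bytes):
    """Deserialize a buffer of packed points back into tuples."""
    return [struct.unpack_from(POINT_FMT, buf, offset)
            for offset in range(0, len(buf), POINT_SIZE)]

# Round-trip two points through the assumed wire format.
buf = pack_point(1.0, 2.0, 0.5, 100) + pack_point(-0.5, 3.25, 1.0, 42)
points = unpack_points(buf)
```

A shared layout like this is what would let perception stacks consume data from Lidar 2.0 devices and legacy hardware alike without per-vendor parsers.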
In conjunction with the interface release, Lumotive also announced close partnerships with three sensing product development companies, the first members of the Open Lidar API ecosystem. Adaptive 3D edge perception platform company Cron AI, environmental perception safety solution provider LAKE FUSION Technologies, and 3D computer vision developer Seoul Robotics will, together with Lumotive, jointly leverage the Open Lidar API to include advanced Lidar 2.0 features and ensure portability across products and industries.
“Providing the Open Lidar API through our Lidar 2.0 solution will enable advanced solutions which are easily software customized for distinct applications,” said Sam Heidari, CEO of Lumotive. “Not only can our customers take full advantage of all the powerful software features of our Meta-Lidar platform, but they also get access to an ecosystem of software partners developing differentiated 3D sensing solutions where their ease of portability across different hardware platforms will provide significant future leverage.”
“The availability of the Open Lidar API with more access to controlling operating modes and data streams is extraordinary,” said Christian Meyer, CEO of LAKE FUSION Technologies. “The interaction with LFT’s deterministic perception software forms an excellent basis for a highly attractive mass market introduction, especially for environmental safety applications.”
“Open Lidar API is going to be a real game changer,” said Neil Huntingdon, CSO of Cron AI. “It opens up the opportunity for new and innovative adaptive perception strategies where Lumotive’s Meta-Lidar and Cron AI’s SenseEdge deep-learning-first perception processing platform combine as a system that can adapt in real time across the constantly changing real world of contexts to deliver consistently reliable and accurate object data for smart infrastructure, intelligent transport and autonomous mobility.”
“Until now, we have been limited by proprietary lidar APIs,” said HanBin Lee, CEO of Seoul Robotics. “With Lumotive’s Open Lidar API, we have more control over the operation of the lidar-based systems we are developing, and we can more easily port our leading lidar 3D perception stack to compatible hardware platforms.”
Lidar integrated in ZKW headlight
Meanwhile, in a separate announcement, Lumotive also said it was demonstrating a functional implementation of its lidar technology integrated in a vehicle headlight from ZKW Group. The ZKW integration features a prototype of the Lumotive M30 module, the workhorse of the Meta-Lidar platform, which uses pulsed laser beams to measure distances between the sensors and objects around the vehicle. The Meta-Lidar platform generates accurate and precise spatial data that can be used by a driver to avoid collisions or to further automate driving scenarios.
“Integrating lidar technology with vehicle lighting systems is the ideal application for Lumotive’s tiny form-factor lidar module, and this collaboration between Lumotive and ZKW enables a new and exciting era in lidar use cases,” said Alexis Debray, senior technology and market analyst, emerging technologies at Yole Développement (Yole). “The market for automotive lidar is expected to reach $2.3 billion in 2026 and then $6.2 billion in 2032, including lidar integrated in next-generation lighting systems that incorporate 3D sensing capabilities.”
Lumotive’s products are based on the company’s light control metasurface (LCM) solid-state beam steering chips, which significantly reduce the complexity, cost and size of lidar systems while improving performance and reliably operating in the challenging headlamp environment. Manufactured in proven and scalable CMOS semiconductor processes, LCM chips eliminate the need for bulky mechanical moving parts that challenge the cost and reliability of traditional lidar devices while delivering new levels of perception, detection and navigation in autonomous systems.
In addition, LCM technology enables software-defined lidar capability, allowing the lidar scan pattern, frame rate and resolution to be customized for specific use cases in real time. The company’s M30 lidar has a range of up to 20 meters, high resolution and a wide field of view, in a golf ball-sized form factor suited for integration around vehicles for autonomy and safety cocoon applications. The M30 is the first in a series of Lumotive products based on the Meta-Lidar Platform enabling a diversity of applications requiring sensing ranges up to 200 meters, such as autonomous driving, or form factors less than 1 cm³ for integration into smartphones, AR devices, and wearables.
“Our collaboration with ZKW is a true technological breakthrough that brings the power and usefulness of scalable, solid-state lidar to world-class vehicular lighting solutions,” said Axel Fuchs, VP business development at Lumotive. “Our Meta-Lidar Platform is the most cost-effective approach that provides the performance and flexibility needed for volume production and mass adoption of sensor-enabled headlights, and we look forward to continuing to work closely with ZKW to commercialize the solution.”