Industry partners unveil cloud-to-car reference architecture - Embedded.com

When carmakers envision new-generation vehicles these days, they picture a car-to-AI model in which a vehicle can progressively scale from Level 2+ to Level 4, its autonomy features driven by machine learning models that improve and update dynamically throughout the vehicle's lifecycle.

This is, in essence, a “connected, software-defined vehicle” — a template pioneered by Tesla.

But to design a connected vehicle architecture that follows this evolutionary, upgradable path from Level 2+ to Level 4, “There is a big gap” to fill, said Daniel Richart, co-founder and CEO of Teraki. He notes the huge difference between “a concept model that works in a lab” and a connected software-defined model “that can work on flexible, scalable E/E architectures in lower-cost production models.”

Teraki on Wednesday unveiled its Fusion Project, which it has been developing with partners Airbiquity, Cloudera, NXP Semiconductors and Wind River for almost a year. The five companies have come up with a “pre-integrated hardware and software solution” to enable carmakers to “efficiently collect, analyze, and manage connected vehicle data for continuous feature development, deployment, and evolution.”

Phil Magney, founder and president of VSI Labs, described the Fusion Project as “a good reference architecture for the cloud-to-car technologies involved in developing, deploying, and maintaining AI-based ADAS and AD driving applications.”

He stressed, “The key here is that development is never complete with AI-based applications. The machine is always learning. But managing the data cycle requires a vast collection of technologies, from the curation of event-based sensor data, to the transport of the data, to training the model, and to the deployment of new algorithms.”
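The data cycle Magney describes — curating event-based sensor data, transporting it, training the model, deploying new algorithms — can be sketched as one loop. All names below are illustrative placeholders, not part of any Fusion Project API:

```python
# Illustrative sketch of the continuous AI data lifecycle Magney describes.
# Every function here is a hypothetical stand-in, not a Fusion Project call.

def curate_events(sensor_frames, is_interesting):
    """Keep only event-triggered frames (e.g. lane changes) worth uploading."""
    return [f for f in sensor_frames if is_interesting(f)]

def transport_to_cloud(events):
    """Stand-in for the telematics uplink; returns the uploaded batch."""
    return list(events)

def train_model(model, batch):
    """Stand-in for cloud-side retraining; here it just counts samples."""
    model["samples_seen"] += len(batch)
    return model

def deploy_ota(model):
    """Stand-in for pushing the updated model back to the fleet over the air."""
    return {"deployed_version": model["samples_seen"]}

# One turn of the loop: curate -> transport -> train -> deploy.
frames = [{"steering_delta": d} for d in (0.01, 0.4, 0.02, 0.5)]
events = curate_events(frames, lambda f: f["steering_delta"] > 0.1)
model = train_model({"samples_seen": 0}, transport_to_cloud(events))
print(deploy_ota(model))  # the loop then repeats with the updated model
```

The point of the structure is Magney's: the loop never terminates, because development of an AI-based application is never complete.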


(Source: Teraki)

Five partners

In the Fusion platform, Airbiquity is responsible for over-the-air (OTA) software management. Cloudera contributes cloud-independent machine learning workbench tools; NXP Semiconductors provides vehicle processing platforms (BlueBox and GoldBox); Teraki supplies edge data AI. Wind River’s role is intelligent edge system software.

The goal is an efficient data lifecycle platform that does everything from data ingestion to machine learning model updates, “without degrading data, while maximizing AI accuracy,” explained Richart.

One of the big challenges facing carmakers today is handling the data from the growing number of sensors inside an autonomous vehicle, an array that generates 5 to 20 terabytes of data per day, per vehicle. The limited ability to ingest real-time data from vehicles creates an obvious problem. The inability to combine all types of data to build machine learning models is another roadblock. More important, data management across the machine learning lifecycle is fragmented among the ingestion, learning and deployment phases.

Teraki’s edge solutions are designed to manage ML-based data handling requirements. Magney explained, “It starts with knowing what to look for and curating the associated sensor data to support the learning. The Teraki solution also purports efficiency and can aggregate data packages into a smaller footprint that can be sent to the cloud for ML-training.”
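The edge-side reduction Magney alludes to — keep only event-relevant samples and pack them into a smaller footprint before uplink — can be sketched in a few lines. The threshold, scale, and 16-bit quantization below are assumptions chosen for illustration, not Teraki's actual encoding:

```python
# Hypothetical sketch of edge-side data reduction: select samples around an
# event threshold and quantize them so a much smaller package goes to the
# cloud for ML training. Parameters are illustrative, not Teraki's.
import struct

def reduce_for_upload(samples, threshold=0.1, scale=100):
    """Keep samples above an event threshold, quantized to 16-bit ints."""
    kept = [s for s in samples if abs(s) > threshold]
    return struct.pack(f"{len(kept)}h", *(int(s * scale) for s in kept))

raw = [0.01, 0.02, 0.35, -0.42, 0.03, 0.5]  # e.g. steering-angle deltas
packed = reduce_for_upload(raw)
# Six float64 samples (48 bytes) shrink to three 2-byte ints (6 bytes).
print(len(packed))
```

At fleet scale, this kind of curation is what turns terabytes of raw sensor output into a transportable training set.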

A year ago, when EE Times first met with Teraki at the 2020 Consumer Electronics Show, Richart said his company’s software technology would focus on car OEMs’ biggest current problem: the lack of a vehicle CPU powerful enough to process and send the exploding volume of data to the cloud for AI training.

Making good on that mission in a year, Teraki, based in Berlin, has pulled together an ecosystem of partners and assembled the technologies necessary to develop a car-to-cloud platform for ML-based solutions. Their efforts attacked “edge processing to handle sensor data ingestion, event curation and then packaging for transport,” Magney observed. “There are also network challenges such as bandwidth and latency that are addressed with Fusion Project partner, NXP.”

In Richart’s mind, the biggest achievement is that the Fusion Project has proven that its first AI algorithm — for lane-change detection — can continuously improve through its car-to-cloud platform. “We trained the AI model that initially achieved a 90- to 95% accuracy. Then, we retrained it to 98% accuracy.”

He stressed, “The whole point of this exercise is that OEMs can now take that accuracy up to 99 percent plus, by continuously training and using a set of sensors they prefer. They can apply this for any models they want to try. We’ve just made it feasible for OEMs to do the AI training quickly, while they can also use it and implement it on production-grade hardware, not on specific expensive lab cars.”
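The retrain-until-target workflow Richart describes can be illustrated with a toy loop. The accuracy model below is purely simulated for the sake of the sketch — it is not measured Fusion Project data, and real convergence depends on the data and model involved:

```python
# Toy illustration of continuous retraining toward an accuracy target.
# The "gain" model is simulated, not a measured Fusion Project result.

def retrain(accuracy, gain=0.5):
    """Each retraining round closes part of the gap to 100% (simulated)."""
    return accuracy + (1.0 - accuracy) * gain

accuracy, rounds = 0.90, 0       # start near the initial 90-95% figure
while accuracy < 0.99:           # OEM-chosen target, e.g. "99 percent plus"
    accuracy = retrain(accuracy)
    rounds += 1
print(rounds, round(accuracy, 3))
```

The shape of the curve, not the numbers, is the point: each pass through the car-to-cloud loop shrinks the remaining error.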


(Source: Teraki)

Geert-Jan van Nunen, CCO at Teraki, added that the Fusion Project provides an opportunity for OEMs to train their own AI models and “get ownership of IP again,” rather than depending on others. Take Mobileye, he said. Noting that Mobileye supplies a black-box solution that gives OEMs only high-level information, van Nunen said, “We provide an open system where we are bringing back IP in the hands of OEMs.”

Is the Fusion Project meant to let every OEM develop its own AV stack? “From a development standpoint, OEMs have a lot of options in terms of developing their own AV stack,” Magney said. However, he cautioned, “The field is getting more crowded as there is growing interest in optimized solutions that can scale up from ADAS to AD.”

Magney stressed, “But the issue at hand here has less to do with the AV stack than the management of that AV stack, particularly for AI-based solutions. The Fusion project purports to assemble the technologies necessary to support this.”

>> This article was originally published on our sister site, EE Times.

