AI at the edge fits many needs

As the adoption of artificial intelligence (AI) and machine learning (ML) grows, the ability to process large amounts of data through these algorithms becomes increasingly important.

To help make the expanding use of data applications across billions of connected devices more efficient and valuable, there is growing momentum to migrate processing from centralized third-party cloud servers to decentralized, localized on-device processing, commonly referred to as edge computing. According to SAR Insight & Consulting’s latest artificial intelligence/machine learning embedded chips database, the global number of AI-enabled devices with edge computing will grow at a CAGR of 64.2% over the 2019–2024 period.

Data computation at the edge, no network needed

Edge AI runs the algorithms and processes the data as close as possible to the physical system, in this case, locally on the hardware device. The advantage is that processing the data does not require a network connection. The computation happens near the ‘edge’ of the network, where the data is being generated, instead of in a centralized data-processing center. Determining the right balance between how much processing can and should be done at the edge will become one of the most important decisions for device, technology and component providers.
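To make the idea concrete, here is a minimal sketch of on-device inference using the TensorFlow Lite runtime in Python. The model file name and the placeholder input frame are illustrative assumptions rather than details from any specific product; the point is simply that the data never leaves the device.

```python
# Minimal sketch of local (edge) inference with the TensorFlow Lite runtime.
# Assumptions: a model file "model.tflite" is already deployed on the device,
# and the placeholder input frame stands in for a real sensor reading.
import numpy as np
import tflite_runtime.interpreter as tflite

interpreter = tflite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Placeholder sensor frame shaped to whatever the model expects.
frame = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])

interpreter.set_tensor(input_details[0]["index"], frame)
interpreter.invoke()  # inference runs entirely on the local CPU or accelerator
prediction = interpreter.get_tensor(output_details[0]["index"])
print("Top class:", int(np.argmax(prediction)))
```

Because the interpreter, model and data all live on the device, the round trip to a cloud API simply drops out of the latency budget.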

To run the training and inferencing engines that produce deep-learning predictive models, edge processing usually requires an x86 or Arm processor from suppliers such as Intel, Qualcomm, Nvidia and Google, an AI accelerator, and the ability to handle clock speeds of up to 2.5 GHz across 10–14 cores.

Getting real-time results for time-sensitive applications

Given expanding markets and the growing service and application demands placed on compute power and data, several factors and benefits are driving the growth of edge computing. Because of the shifting need for reliable, adaptable and contextual information, much of the data processing is migrating locally to the device, resulting in faster performance and response times (a few milliseconds or less), lower latency, better power efficiency, improved security by retaining data on the device, and cost savings by minimizing transfers to data centers.

One of the biggest benefits of edge computing is the ability to deliver real-time results for time-sensitive needs. In many cases, sensor data can be collected, analyzed and communicated straightaway, without a round trip to a remote cloud data center.
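The pattern behind that claim is a tight sense–infer–act loop that closes entirely on the device. The sketch below illustrates it with hypothetical placeholders (read_sensor, on_anomaly, a trivial threshold standing in for a trained model, and an illustrative 10 ms budget); none of these come from a specific product.

```python
# Hedged sketch of an on-device sense -> infer -> act loop.
# read_sensor() and on_anomaly() are hypothetical stand-ins for device I/O;
# the threshold and 10 ms budget are illustrative figures only.
import random
import time

LATENCY_BUDGET_S = 0.010  # illustrative per-decision latency budget


def read_sensor() -> float:
    # Placeholder: a real device would sample an ADC, camera, microphone, etc.
    return random.gauss(0.0, 1.0)


def on_anomaly(value: float) -> None:
    # Placeholder: a real device might raise an alarm or actuate locally.
    print(f"Anomaly handled locally: {value:.2f}")


for _ in range(100):
    start = time.perf_counter()
    reading = read_sensor()
    if abs(reading) > 3.0:        # trivial stand-in for a learned model
        on_anomaly(reading)       # act immediately, no cloud round trip
    if time.perf_counter() - start > LATENCY_BUDGET_S:
        print("Warning: decision exceeded the local latency budget")
```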

Scalability across the full range of edge devices, to help speed local decision-making, is fundamental. The ability to provide immediate and reliable data builds confidence, increases customer engagement and, in many cases, saves lives. Just think of all of the industries — home security, aerospace, automotive, smart cities, healthcare — where the immediate interpretation of diagnostics and equipment performance is critical.

AI edge developments

Innovative organizations such as Amazon, Google, Apple, BMW, Volkswagen, Tesla, Airbus, Fraunhofer, Vodafone, Deutsche Telekom, Ericsson and Harting are now embracing AI at the edge and placing their bets on it. A number of these companies are forming trade associations such as the European Edge Computing Consortium (EECC) to help educate and motivate small, medium-sized and large enterprises to drive the adoption of edge computing within manufacturing and other industrial markets.

The goals of the EECC initiative include the specification of a reference architecture model for edge computing, the development of reference technology stacks (EECC edge nodes), the identification of gaps and recommendation of best practices by evaluating approaches within multiple scenarios, and synchronization with related initiatives/standardization organizations.


Proposed reference architecture model for edge computing (Image: European Edge Computing Consortium).

Looking over the edge

The advancement of AI and machine learning is providing numerous opportunities to create smart devices that are contextually aware of their environment. Smart machines will benefit from the growth in multi-sensory data that can be processed with greater precision and performance. Edge computing provides an opportunity to turn AI data into ‘real-time’ value across almost every industry. The intelligent edge is the next stage in the evolution and success of AI technology.

>> This article was originally published on our sister site, EE Times.


Dennis Goldenson is director, artificial intelligence & machine learning, at SAR Insight & Consulting.

 
