Almost 18 months on from the initial announcement of its Cloud AI 100 accelerator, Qualcomm has released further details of the form factors in which the chip will be available, along with performance figures for those cards. The mobile silicon giant revealed that Cloud AI 100 final silicon is in production and will ship in the first half of 2021.
The Cloud AI 100 was launched almost 18 months ago in April 2019. While Qualcomm dominates the smartphone processor market with its Snapdragon line, Qualcomm’s offerings for servers had previously not taken off — its Arm-based Centriq line was pulled in 2018 just a year after launch. The mobile silicon maker therefore sees the Cloud AI 100 as a way into the edge server market.
Qualcomm’s Cloud AI 100 is targeted at near-edge applications including enterprise data centers and 5G infrastructure (Image: Qualcomm)
While many details of the Cloud AI 100 chip are still under wraps, we now know that it will initially be available on three types of card: a dual M.2 edge (DM.2e) form factor offering more than 50 TOPS at 15 Watts, a dual M.2 (DM.2) card configured for 200 TOPS at 25 Watts, and a PCIe card delivering around 400 TOPS at 75 Watts. (Qualcomm noted that these are "raw" TOPS figures, the theoretical maximum, and don't represent the compute that might be achieved in a real application.)
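Those figures imply quite different efficiency points for the three cards. A quick sketch, using only the raw TOPS and power numbers quoted above (and taking the ">50" and "around 400" figures at face value):

```python
# Rough TOPS-per-watt comparison of the three announced cards,
# using Qualcomm's "raw" (theoretical peak) figures quoted above.
cards = {
    "DM.2e": (50, 15),   # >50 TOPS at 15 Watts
    "DM.2":  (200, 25),  # 200 TOPS at 25 Watts
    "PCIe":  (400, 75),  # ~400 TOPS at 75 Watts
}

for name, (tops, watts) in cards.items():
    print(f"{name}: {tops / watts:.1f} TOPS/W")
```

By this crude measure the mid-range DM.2 card is the efficiency sweet spot at 8 TOPS/W, with the DM.2e and PCIe cards at roughly 3.3 and 5.3 TOPS/W respectively; real-world efficiency will of course depend on achieved (not raw) throughput.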
Qualcomm seems to have transferred its skill in low-power processor design from the mobile space to the edge AI accelerator market. While the Cloud AI 100 design is not based on the AI acceleration blocks found in Snapdragon processors, the performance per Watt figures seem impressive. Slides shown by Qualcomm during its media briefing contained a graph that had the PCIe card version of the Cloud AI 100 outperforming many of the industry’s most popular solutions while using only a fraction of the power. These are measured performance numbers compared to publicly noted numbers from other suppliers, according to Qualcomm.
Resnet-50 AI inferences per second for popular AI inference hardware versus power consumption, according to Qualcomm’s figures. Qualcomm’s cards appear to perform particularly well when it comes to power efficiency. Batch size is 8 for all points on the graph except Nvidia A100. (Image: Qualcomm)
“At Qualcomm, we have a long heritage of AI R&D,” said John Kehrli, senior director of product management for Qualcomm. “We’re actually in our fifth-generation solution from the mobile side, we have over 11 years of very active R&D. So we are leveraging that knowledge, that industry expertise, but this is a different AI core, it’s not the same as mobile, but we are leveraging from that space.”
The Cloud AI 100 is an inference accelerator with up to 16 AI processor cores supporting INT8, INT16, FP16 and FP32. It's built on a 7nm FinFET process technology. There is up to 144 MB of on-die SRAM: 128 MB is shared across the cores, with each core having an additional 1 MB. The card supports up to 32 GB of DRAM at the card level, fed by four 64-bit LPDDR4x channels running at up to 2.1 GHz.
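A back-of-the-envelope estimate of what that memory interface delivers, assuming LPDDR4x's double-data-rate signaling (two transfers per clock); this is an estimate from the numbers above, not a figure Qualcomm has published:

```python
# Estimated peak DRAM bandwidth for the card-level memory interface
# described above: four 64-bit LPDDR4x channels at up to 2.1 GHz.
# Assumes DDR signaling (two transfers per clock cycle).
channels = 4
bus_width_bits = 64
clock_hz = 2.1e9
transfers_per_clock = 2  # double data rate

bytes_per_sec = channels * (bus_width_bits / 8) * clock_hz * transfers_per_clock
print(f"Peak bandwidth ~ {bytes_per_sec / 1e9:.1f} GB/s")
```

That works out to roughly 134 GB/s of theoretical peak bandwidth; sustained bandwidth in practice would be lower.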
The Cloud AI 100 can be used for computer vision, speech, autonomous driving, language translation and recommendation systems. Today, Qualcomm is positioning its Cloud AI 100 silicon for AI inference in four key markets. They are data centers outside the cloud, ADAS, 5G edge boxes and 5G infrastructure.
AI inference in data centers at the edge of the cloud powers everything from the recommendation engines used to serve ads and personalize newsfeeds to more specialized, compute-hungry AI workloads.
What Qualcomm calls "5G edge boxes" are more like on-premise embedded standalone devices. These might be deployed on telegraph poles as part of city infrastructure, or in enterprises, and used to drive smart city, public safety and traffic management applications.
A derivative of the Cloud AI 100 is used to power AI processing for autonomous driving in the Qualcomm Snapdragon Ride platform (this derivative shares a common architecture and software tool chain, Kehrli said).
The Cloud AI 100’s application in 5G infrastructure would be to accelerate complex algorithms such as beam-forming, which now use AI to efficiently manage 5G base stations.
Qualcomm also announced its new Cloud AI 100 development kit. This includes a reference design for a 5G edge box powered by the Cloud AI 100 alongside a Snapdragon 865 application and video processor as a host processor (which provides a full video pipeline supporting up to 24 streams of full HD video decode, with headroom for customer app development). The reference design also uses a pre-certified 5G module based on a Snapdragon X55 5G modem.
“This is really a greenfield opportunity for us that we’re very excited about,” said Kehrli. “The objective [for the development kit] is for customers that are interested in this space to quickly and easily run a demo app right out of the box – it even comes with a pre-compiled ResNet-50 as a demo.”
A 5G edge box like this might be doing on-premise analytics for security video streaming in a shopping mall or a safety application in a manufacturing facility, Kehrli said.
The first customers to adopt the Cloud AI 100 will most likely be at the edge, in smart city, retail and manufacturing, Kehrli said.
“I expect our first commercial deployments to be more on the edge side than on the data center side, where there’s a much, much longer cycle to get it into production,” he said. “That’s not to say we don’t have significant traction and opportunities there, but I expect more that 5G edge deployments will be much faster and things like our pre-certified 5G module makes that much easier. A lot of these customers that we work with are not your traditional mobile customers. So providing them a solution that’s already pre-certified, they can quickly go to market. So [applications] in that space will come up faster.”
Final silicon for the Qualcomm Cloud AI 100 has gone to production, is sampling now to multiple customers and will ship during the first half of 2021. The edge development kit will sample next month but to select customers only.
>> This article was originally published on our sister site, EE Times.