ARM faces AI chip competition

April 16, 2018

By Rick Merritt

SAN JOSE, Calif. — Nearly a dozen processor cores for accelerating machine-learning jobs on clients are racing for spots in SoCs, with some already designed into smartphones. They aim to get a time-to-market advantage over processor-IP giant Arm, which is expected to announce its own core soon.

The competition shows that much of the action in machine-learning silicon is shifting to low-power client blocks, according to market watcher Linley Gwennap. However, a race among high-performance chips for the data center is still in its early stages, he told EE Times in a preview of his April 11 keynote for the Linley Processor Conference.

“Arm has dominated the IP landscape for CPUs and taken over for GPUs as well, but this AI engine creates a whole new market for cores, and other companies are getting a head start,” said Gwennap.

The new players getting traction include:

  • Apple’s Bionic neural engine in the A11 SoC in its iPhone
  • The DeePhi block in Samsung’s Exynos 9810, used in the Galaxy S9
  • The neural engine from China’s Cambricon in Huawei’s Kirin 970 handset processor
  • The Cadence P5 for vision and AI acceleration in MediaTek’s P30 SoC
  • Possible use of the Movidius accelerator in Intel’s future PC chipsets

The existing design wins have locked up many of the sockets in premium smartphones that represent about a third of the overall handset market. Gwennap expects that AI acceleration will filter down to the rest of the handset market over the next two to three years.

Beyond smartphones, cars are an increasingly large market for AI chips. PCs, tablets, and IoT devices will round out the market.

To keep pace, Arm announced in February a blanket effort that it calls Project Trillium. But “what they need to be competitive is some specific hardware accelerator to optimize power efficiency,” said Gwennap.

“Arm is developing that kind of accelerator and plans to release its first product this summer... The fact is that they are behind, which has created an opportunity for the newer companies to jump in.”

Last October, Arm announced it had formed a machine-learning group. In February, it provided a few details of its plans.

Arm is likely to provide product details at its annual October event in Silicon Valley. But there’s no guarantee that Arm will make up lost ground because there’s not necessarily a close tie between neural net engines and CPUs.


Raw performance numbers of client inference accelerators announced so far are just part of the story. (Chart: The Linley Group)


Continue reading page two on Embedded's sister site, EE Times: "ARM under attack in AI."
