
New chips look to accelerate AI in sensor devices

By Rick Merritt

February 11, 2019

SAN JOSE, Calif. — With a $13.2 million Series A round, startup AIStorm publicly enters an already crowded market for machine learning at the edge. It claims that its novel technique of processing neural-network tasks directly at the level of sensor signals will undercut rivals in power and cost.

The startup already has working 65-nm silicon for analog front ends (AFEs) and aims to sample chips addressing as many as six markets by the end of the year. It claims that its 7 × 7-mm devices will deliver 2.5 TOPS and 11.1 TOPS/W.
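Taken at face value, those two headline numbers imply a power budget of roughly a quarter of a watt at peak throughput. The quick back-of-the-envelope check below is our own arithmetic on the claimed figures, not data published by AIStorm.

```python
# Back-of-the-envelope check of the claimed figures (not AIStorm measurements):
# 2.5 TOPS at 11.1 TOPS/W implies roughly 225 mW for the whole device at peak.
throughput_tops = 2.5          # claimed peak throughput, tera-operations per second
efficiency_tops_per_w = 11.1   # claimed efficiency, TOPS per watt

implied_power_w = throughput_tops / efficiency_tops_per_w
print(f"Implied power at peak: {implied_power_w * 1000:.0f} mW")  # ~225 mW
```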

Virtually every vendor of embedded processors has ramped up an AI strategy to ride an expected wave of machine-learning applications, many with their first products already available.

AIStorm hopes to leverage investors who will also act as partners and customers to take its products to market in volume in 2020. However, this year, it will have to build and demonstrate programming techniques for its new approach.

“Because they were available when the AI math emerged, the industry lurched into GPUs, but we want to eliminate GPUs at the edge because we believe we can deliver the equivalent to GHz+ digital processing at lower power levels using much older process technologies,” said David Schie, an analog chip veteran who is chief executive and co-founder of AIStorm.

To achieve that result, the startup works in what Schie calls the charge domain, processing raw signals from sensors such as CMOS imagers before they are digitized.

“We are working at the electron level, multiplying electrons,” while digital rivals convert signals for processing in steps that “add noise, power, and latency,” he said. But AIStorm is not using a “conventional analog multiplier, it’s a different way to do it” than that used by analog rivals Mythic and Syntiant, which do analog processing in flash arrays.

For example, the startup’s chips work with CMOS sensors from investor TowerJazz. “The TowerJazz pixel is part of our input layer, so the charge comes from sensors, they produce electrons, and we multiply and move them,” Schie added.
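To make the contrast concrete, here is a purely illustrative toy model. It assumes nothing about AIStorm's actual circuits: it simply compares a multiply-accumulate carried out on continuous sensor values (a stand-in for weighting charge packets in the analog domain) with a conventional pipeline that quantizes the signal through an 8-bit ADC before a digital multiply-accumulate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy model, not AIStorm's design: compare a multiply-accumulate
# on raw (continuous) sensor values with one that first quantizes through an ADC.
pixels = rng.uniform(0.0, 1.0, size=9)   # stand-in for charge from a 3x3 pixel patch
weights = rng.normal(0.0, 1.0, size=9)   # one neuron's input weights

# "Charge domain" path: weight and sum the analog values directly.
analog_mac = np.dot(pixels, weights)

# Conventional path: quantize to 8 bits (the ADC step), then do the digital MAC.
adc_bits = 8
levels = 2 ** adc_bits - 1
digitized_pixels = np.round(pixels * levels) / levels
digital_mac = np.dot(digitized_pixels, weights)

print(f"analog-domain MAC : {analog_mac:.6f}")
print(f"digitized MAC     : {digital_mac:.6f}")
print(f"quantization error: {abs(analog_mac - digital_mac):.2e}")
```

The sketch only shows the quantization step that the digital pipeline adds; it does not model the noise, power, or latency trade-offs Schie describes.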

>> Continue reading page two of this article on our sister site, EE Times: "Startup Accelerates AI at the Sensor."
