
AI moving slowly to factory floor

By Rick Merritt
November 14, 2018

SAN JOSE, Calif. — Deep neural networks are crawling toward the factory floor.

For several early adopters, neural nets are the new intelligence embedded behind the eyes of computer-vision cameras. Ultimately, the networks will snake their way into robotic arms, sensor gateways, and controllers, transforming industrial automation. But the change is coming slowly.

“We’re still in the early phases of what’s likely to be a multi-decade era of advances and next-generation machine learning algorithms, but I think we’ll see enormous progress in the next few years,” said Rob High, chief technology officer for IBM Watson.

Neural networks will nest in growing numbers of Linux-capable, multicore x86 gateways and controllers appearing on and around the factory floor. Emerging 5G cellular networks will one day give neural nets ready access to remote data centers, said High.

Auto and aircraft makers and health-care providers are among those taking early steps, mainly with smart cameras. Canon is embedding Nvidia Jetson boards in its industrial cameras to switch on deep learning. Industrial camera vendor Cognex Corp. is ramping up its own offerings. And China startup Horizon Robotics is already shipping surveillance cameras that embed its deep-learning inference accelerators.
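For a sense of what embedding deep learning in a camera involves, the sketch below shows the shape of the inference loop such a device runs. It is a minimal illustration in PyTorch, not Canon's or Cognex's code: the MobileNetV2 backbone, the two-class good/defect head, and the image file are all assumptions made for the example.

```python
# Minimal edge-inference sketch: classify camera frames as good or defective.
# The model, class labels, and image source are illustrative assumptions.
import torch
import torchvision.transforms as T
from torchvision.models import mobilenet_v2
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"

# Hypothetical two-class head on a MobileNetV2 backbone; in practice the
# weights would come from fine-tuning on labeled factory images.
model = mobilenet_v2(num_classes=2).to(device).eval()

preprocess = T.Compose([
    T.Resize(256),
    T.CenterCrop(224),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def classify(frame: Image.Image) -> str:
    """Return 'defect' or 'good' for a single camera frame."""
    x = preprocess(frame).unsqueeze(0).to(device)
    with torch.no_grad():
        logits = model(x)
    return ["good", "defect"][logits.argmax(dim=1).item()]

# Example: score one captured image from disk.
print(classify(Image.open("frame_0001.png").convert("RGB")))
```

On a Jetson-class board, the same model would typically be exported to TensorRT or ONNX so the loop runs on the GPU, but the structure stays the same.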

“All the early adopters have deployed deep learning for visual perception, and others are starting to notice them,” said Deepu Talla, general manager of autonomous machines at Nvidia. “Perception is reasonably easy to do, and researchers see it as a solved problem.

“Now the big problems are in using AI for interaction with humans and more detailed actuation — these are 10-year research problems. In areas such as drone and robot navigation, we are more in the stage of prototypes.”

Talla calls robotics “the intersection of computers and AI,” but many industrial uses of deep learning will be less glamorous — and will arrive sooner.

Factory robots are not using AI yet, said Doug Olsen, chief executive of Harmonic Drive LLC, a leading supplier of robotic components. In the short term, don’t watch for smart robotic arms so much as for embedded “machines on the factory floor that can predict failures, gathering data about daily use to determine when systems need preventative maintenance,” said Olsen. “That’s where AI can take hold first.”

Some big chipmakers agree. Renesas started experimenting three years ago, putting microcontrollers supporting AI at end nodes to detect faults and predict maintenance needs in production systems at one of its semiconductor fabs.

In October, the Japanese chip giant rolled out its first MCUs with dynamically reconfigurable processor blocks for real-time image processing. It aims to follow up with controllers that can support real-time cognition in 2020 and incremental learning in 2022.

Rival STMicroelectronics is taking a similar approach with its STM32 chips. In February, it announced a deep-learning system-on-chip and an accelerator under development, aimed in part at fault detection on the factory floor.
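The fault-prediction workloads that Olsen, Renesas, and ST describe generally come down to spotting anomalies in streams of sensor readings before a machine fails outright. The sketch below illustrates that idea with scikit-learn's IsolationForest; the vibration and temperature features, window sizes, and thresholds are assumptions for the example, not any vendor's implementation.

```python
# Sketch of predictive maintenance as anomaly detection on sensor windows.
# Feature choices (RMS vibration, mean temperature) are illustrative.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

def features(window: np.ndarray) -> np.ndarray:
    """Summarize one window of raw samples: [RMS vibration, mean temperature]."""
    vib, temp = window[:, 0], window[:, 1]
    return np.array([np.sqrt(np.mean(vib ** 2)), temp.mean()])

# Train only on windows collected while the machine is known to be healthy.
healthy = np.stack([
    features(rng.normal([0.0, 60.0], [0.2, 1.0], size=(256, 2)))
    for _ in range(500)
])
detector = IsolationForest(contamination=0.01, random_state=0).fit(healthy)

# Score a new window; -1 means "anomalous, schedule maintenance."
worn_bearing = features(rng.normal([0.0, 65.0], [0.8, 1.0], size=(256, 2)))
if detector.predict(worn_bearing.reshape(1, -1))[0] == -1:
    print("anomaly detected: flag machine for preventive maintenance")
```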

The smart robots will come eventually. Startup covariant.ai, for one, is working to enable them with reinforcement learning. “Equipping robots to see and act on what they see will be one of the biggest differences that deep learning will make in the next few years,” said Pieter Abbeel, an AI researcher who founded covariant and runs a robotics lab at the University of California at Berkeley.

Abbeel shows jaw-dropping simulations of robots learning to run using neural-net techniques, but it’s still early days. “In fact, we started covariant in part because the industrial AI space is not that crowded yet,” he said.
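Covariant has not detailed its methods, but the trial-and-error loop at the core of reinforcement learning is compact enough to show. The sketch below is tabular Q-learning on a toy chain-walking task; the environment, reward, and hyperparameters are purely illustrative and far simpler than anything a warehouse robot would need.

```python
# Tabular Q-learning on a toy "move right to reach the goal" chain task.
# The environment and hyperparameters are illustrative, not covariant's setup.
import numpy as np

n_states, n_actions = 6, 2           # states 0..5; actions: 0 = left, 1 = right
alpha, gamma, epsilon = 0.1, 0.95, 0.1
Q = np.zeros((n_states, n_actions))
rng = np.random.default_rng(0)

def step(state: int, action: int):
    """Move along the chain; reward 1.0 only for reaching the last state."""
    nxt = min(state + 1, n_states - 1) if action == 1 else max(state - 1, 0)
    done = nxt == n_states - 1
    return nxt, (1.0 if done else 0.0), done

for _ in range(2000):                # episodes of trial and error
    state, done = 0, False
    while not done:
        # epsilon-greedy exploration
        action = rng.integers(n_actions) if rng.random() < epsilon else int(Q[state].argmax())
        nxt, reward, done = step(state, action)
        # Q-learning update: nudge the estimate toward reward + discounted future value
        Q[state, action] += alpha * (reward + gamma * Q[nxt].max() - Q[state, action])
        state = nxt

print(Q.argmax(axis=1))              # greedy action per state; learns to move right toward the goal
```

Deep reinforcement learning, the variant behind simulations like Abbeel's, replaces the table with a neural network, which is what lets the same loop scale to camera images and robot joints.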

>> Continue reading page two of this article on our sister site, EE Times: "AI Edges to Factory Floor."
