SAN FRANCISCO — The AI revolution is just getting started and it will need a wide variety of much more powerful semiconductors, a pioneer of the field told 3,000 chip designers in a keynote opening the International Solid-State Circuits Conference here.
Today’s supervised neural networks are getting wide use but are limited to processes their human creators set in motion. “In my opinion, the future of AI is self-supervised learning,” said Yann LeCun, who is considered the father of convolutional neural nets (CNNs) now used widely in computer vision and other systems.
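As a rough illustration of the idea (not an example from the talk), "self-supervised" means the training targets come from the data itself rather than from human annotators. The tiny sketch below, with invented names, fits a next-step predictor to a sequence: the "labels" are just the sequence shifted by one.

```python
# Hypothetical sketch: the model learns to predict the next value of a
# sequence, so the targets are derived from the data -- no human labels.

def fit_next_step_predictor(seq):
    """Fit y = a*x + b by least squares, where each input x is seq[i]
    and each target y is seq[i+1] (a label derived from the data)."""
    xs = seq[:-1]
    ys = seq[1:]  # the "labels" are just the shifted sequence
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

# An arithmetic progression: each value is the previous plus 2.
seq = [1, 3, 5, 7, 9, 11]
a, b = fit_next_step_predictor(seq)
print(round(a * 11 + b, 6))  # predicts the next value: 13.0
```

Real self-supervised systems predict masked or future portions of images, video, or text with deep networks, but the supervision signal is constructed the same way: from the data itself.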
Generative adversarial networks (GANs) are showing promise as one technique to let systems make their own predictions. LeCun showed examples of GANs used for designing fashionable clothes and guiding self-driving cars.
Future algorithms will demand much larger models, requiring more horsepower than today’s silicon can deliver. Tomorrow’s neural nets also will be more dynamic and sparse, using new kinds of basic primitives such as dynamic, irregular graphs.
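To make "dynamic, irregular graphs" concrete, here is a minimal sketch (the graph and function names are invented): one round of neighbor aggregation, where each node touches a different, data-dependent set of neighbors, unlike the fixed, regular access pattern of a dense tensor operation.

```python
# Illustrative sketch: neighbor aggregation over an irregular graph.
# Work per node varies with its degree, so the access pattern is
# data-dependent rather than a fixed dense-array sweep.

def aggregate(adj, features):
    """For each node, sum its own feature with its neighbors' features."""
    out = {}
    for node, neighbors in adj.items():
        out[node] = features[node] + sum(features[n] for n in neighbors)
    return out

# A small irregular graph: node degrees vary from 1 to 3.
adj = {
    "a": ["b", "c", "d"],
    "b": ["a"],
    "c": ["a", "d"],
    "d": ["a", "c"],
}
features = {"a": 1.0, "b": 2.0, "c": 3.0, "d": 4.0}
print(aggregate(adj, features)["a"])  # 1 + 2 + 3 + 4 = 10.0
```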
Today’s chips, which use multiply-accumulate arrays to process tensors, likely won’t suit the new kinds of operations algorithm designers are developing. The good news, he predicted, is that a variety of low-cost, low-power inference accelerators for embedded systems will be the biggest opportunity.
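The multiply-accumulate (MAC) pattern those arrays implement can be sketched in a few lines: a matrix multiply, the workhorse tensor operation, is nothing but nested MAC loops. (A minimal illustration, not a description of any particular chip.)

```python
# Minimal sketch of the multiply-accumulate (MAC) pattern behind today's
# tensor accelerators: each inner-loop step is one multiply plus one add,
# which MAC arrays perform in parallel across many elements.

def matmul_mac(A, B):
    rows, inner, cols = len(A), len(B), len(B[0])
    C = [[0.0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            acc = 0.0
            for k in range(inner):
                acc += A[i][k] * B[k][j]  # one multiply-accumulate step
            C[i][j] = acc
    return C

A = [[1.0, 2.0],
     [3.0, 4.0]]
B = [[5.0, 6.0],
     [7.0, 8.0]]
print(matmul_mac(A, B))  # [[19.0, 22.0], [43.0, 50.0]]
```

Dynamic, sparse workloads break this pattern because the useful multiplies are scattered unpredictably, leaving a fixed MAC array mostly idle.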
An algorithm LeCun and others presented last year uses predictors at each feature level. (Source: ISSCC)
LeCun got his start in neural networks designing systems at AT&T Bell Labs in 1988, leading to a widely used banking system for reading checks. He wrote one of the first papers on CNNs in 1989. “Now CNNs will be everywhere,” including cars, cameras, and robots, said LeCun, who now conducts AI research at Facebook.
Since its start in the 1950s, the field has endured two neural network winters. In the wake of the last one, LeCun helped design an FPGA-based guidance system for a robot; the work was rejected for presentation at a 2011 conference.
“Most people did not believe a system they never heard of could work so well,” he quipped.
>> This article originally appeared on our sister site, EE Times: “AI Pioneer Sees Chip Renaissance.”