
GDDR memory finds growing use in AI

Gary Hilson, July 31, 2018

TORONTO — The “G” still stands for “graphics,” but new use cases driving the need for GDDR memory technology have nothing to do with pixels.

In fact, applications such as artificial intelligence (AI) and machine learning, which need ultra-fast memories, have left gamers short of GDDR supply, so it’s probably a good idea that makers of the technology are ramping up delivery. Micron Technology recently began volume production of its 8-Gb GDDR6 memory, which is aimed at the graphics market but also at the automotive and networking segments.

Some of the emerging use cases for GDDR memory are still graphics-driven. In the growing automotive memory market, GDDR supports increasingly visual dashboards and advanced driver assistance systems (ADAS) that must respond to a driver’s actions immediately, while autonomous vehicles need high-performance memory to process vast amounts of real-time data. Other emerging applications include augmented reality (AR) and virtual reality (VR). Finally, video is always hungry for memory as 4K becomes more widely adopted and 8K nips at its heels.

Andreas Schlapka, director of Micron’s compute networking business unit, said that in the last three years, traditional GDDR applications such as graphics cards and game consoles have been joined by more networking applications and autonomous driving. In modern vehicles, high-performance memory must deal with large amounts of data generated by sensors and cameras. Similarly, advanced networking technologies use GDDR to power network interface cards (NICs), he said. “Five years ago, the biggest traffic in data centers was in-and-out traffic. Something was done with your information on a specific node, and it came back to your terminal.” Now there’s much more traffic within the data center from one node to another, which drives enormous needs in terms of throughput.


GDDR memory, including Micron’s, has consistently improved from generation to generation: more memory fits into the same package, and it runs faster while using less power.


Speaking of throughput, there are cryptocurrency applications, said Schlapka, as cryptocurrencies and crypto-mining have become more mainstream. A lot of data must be run through quickly, and the memory has to keep up with either an ASIC or a GPU. Similarly, high-performance computing, especially AI, benefits from the bandwidth of GDDR, he said. “If you train a neural network, you run through as much data as you can, ideally petabytes. And you need to load them into memory and then compute them pretty fast. I think that’s when GDDR becomes a big deal and a very good solution.”
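A rough sense of the bandwidth arithmetic behind that point (the figures below are typical of GDDR6-class parts, not numbers cited in the article): a device’s bandwidth is its per-pin data rate times its interface width, and a card’s bandwidth scales with the number of devices on the bus. A minimal Python sketch under those assumptions:

# Back-of-envelope GDDR6 bandwidth estimate (illustrative figures, not from the article).

def device_bandwidth_gbs(pin_rate_gbps: float, bus_width_bits: int) -> float:
    """Bandwidth of one memory device in GB/s: per-pin rate (Gb/s) x interface width (bits) / 8."""
    return pin_rate_gbps * bus_width_bits / 8

per_device = device_bandwidth_gbs(14, 32)   # e.g., 14 Gb/s per pin on a 32-bit interface -> 56 GB/s per chip
per_card = per_device * 8                   # e.g., 8 chips on a 256-bit bus -> 448 GB/s aggregate

# Time to stream a hypothetical 1 TB training set once at that rate, ignoring compute and I/O:
seconds_per_pass = 1000 / per_card
print(f"{per_device:.0f} GB/s per device, {per_card:.0f} GB/s per card, "
      f"~{seconds_per_pass:.1f} s per 1 TB pass")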

GDDR has seen a steady evolution with faster speed per pin, said Tien Shiah, specialty DRAM marketing manager for Samsung Semiconductor, Inc., and is traditionally used alongside GPUs for graphics applications. “Now the GPUs are finding their way, especially the high-end ones, into machine-learning-type applications because those types of AI algorithms are very well-suited for the parallel architecture of a GPU.” However, he said, some applications need to go further and use high-bandwidth memory (HBM). Samsung began production of 16-Gigabit GDDR6 for advanced graphics systems at the beginning of the year.
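The trade-off Shiah describes can be sketched with the same arithmetic: GDDR runs a narrow per-device interface at a very high per-pin rate, while HBM runs a very wide interface at a lower per-pin rate, so a single HBM stack delivers several times the bandwidth of one GDDR chip when an application needs it. The figures below are ballpark values for GDDR6- and HBM2-class parts, not numbers from the article.

# Ballpark per-package comparison of GDDR6 vs. HBM2 (illustrative figures, not from the article).

def bandwidth_gbs(pin_rate_gbps: float, bus_width_bits: int) -> float:
    """GB/s = per-pin rate (Gb/s) x interface width (bits) / 8."""
    return pin_rate_gbps * bus_width_bits / 8

gddr6_chip = bandwidth_gbs(16, 32)      # narrow and fast: 16 Gb/s per pin x 32 bits -> 64 GB/s per chip
hbm2_stack = bandwidth_gbs(2.4, 1024)   # wide and slower per pin: 2.4 Gb/s x 1024 bits -> ~307 GB/s per stack

print(f"GDDR6 chip: {gddr6_chip:.0f} GB/s | HBM2 stack: {hbm2_stack:.0f} GB/s")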

>> Continue reading page two of this article on our sister site, EE Times: "New uses vie for GDDR6 supply."

 
