A memory equipped with its own compute units, for systems that are up to twice as fast and consume 70% less energy.
Samsung is developing a new type of HBM memory that integrates AI processing units: HBM-PIM (Processing-In-Memory). This memory can therefore carry out certain calculations itself that are traditionally handled by CPUs, GPUs or FPGAs; it delivers roughly 1.2 TFLOPS of FP16 compute. The architecture is said to double performance compared with a system using “traditional” HBM memory while cutting power consumption by 70%.
Each bank embeds a PCU (Programmable Computing Unit) clocked at 300 MHz; in all, there are 32 PCUs per die. Naturally, these PCUs take up space: they cut the capacity in half (4 Gb instead of 8 Gb for an HBM2 die without PCUs). Samsung partially remedies this problem by combining four 4 Gb dies with PCUs and four 8 Gb dies without PCUs, for a total capacity of 6 GB per stack. You may also have noticed that the technical documentation refers to FIMDRAM (Function-In-Memory DRAM); this is the internal name, and the final name for this memory is indeed HBM-PIM.
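As a quick sanity check, here is a minimal Python sketch of the capacity arithmetic described above, plus a hypothetical reconstruction of the 1.2 TFLOPS figure; the SIMD width and FLOPs-per-cycle values are illustrative assumptions, not confirmed Samsung specifications.

```python
# Capacity arithmetic from the article: a stack mixes four 4 Gb dies
# (with PCUs) and four 8 Gb dies (without PCUs).
dies_with_pcu = 4          # 4 Gb each, half capacity because PCUs take die area
dies_without_pcu = 4       # 8 Gb each, standard HBM2 dies
total_gbit = dies_with_pcu * 4 + dies_without_pcu * 8
print(total_gbit / 8, "GB per stack")   # -> 6.0 GB, matching the article

# Hypothetical reconstruction of the 1.2 TFLOPS FP16 figure.
# The 16-lane FP16 SIMD width and the fused multiply-add (2 FLOPs per lane
# per cycle) are assumptions for illustration only.
pcus_per_die = 32
pcu_clock_hz = 300e6
simd_lanes = 16            # assumption
flops_per_lane_cycle = 2   # assumption: one FMA counted as 2 FLOPs
tflops = (dies_with_pcu * pcus_per_die * pcu_clock_hz
          * simd_lanes * flops_per_lane_cycle) / 1e12
print(round(tflops, 2), "TFLOPS FP16")  # ~1.23, close to the quoted 1.2 TFLOPS
```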
Avoiding unnecessary data transfers
The idea behind this architecture is to avoid transferring data between memory and processor, which is costly in both energy and time. Integrating PCUs directly into the memory shortens these paths; this saves time and reduces power consumption, and therefore the heat generated. That is a decisive attribute for HBM chips, which are stacked on top of one another in systems where cooling is not easy.
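To make the idea concrete, here is a purely illustrative Python sketch of the processing-in-memory principle: a toy “bank” applies a reduction locally and returns only the result, instead of shipping the whole buffer to the host. The names (MemoryBank, host_sum, pim_sum) are hypothetical and do not correspond to any Samsung API.

```python
# Toy model of processing-in-memory: instead of moving a whole buffer
# from memory to the processor, the memory-side compute unit applies the
# operation locally and only the (much smaller) result crosses the bus.

class MemoryBank:
    def __init__(self, data):
        self.data = data                 # values resident in the bank

    def read_all(self):
        # Conventional path: every element travels over the memory bus.
        return list(self.data)

    def pim_reduce(self):
        # PIM path: the bank's compute unit reduces in place;
        # only a single scalar leaves the memory.
        return sum(self.data)

def host_sum(bank):
    """Conventional flow: transfer the data, then compute on the host."""
    transferred = bank.read_all()        # N elements moved
    return sum(transferred), len(transferred)

def pim_sum(bank):
    """PIM flow: compute where the data lives, transfer one result."""
    return bank.pim_reduce(), 1          # a single value moved

bank = MemoryBank(range(1_000_000))
host_result, host_traffic = host_sum(bank)
pim_result, pim_traffic = pim_sum(bank)
assert host_result == pim_result
print(f"values moved over the bus: host={host_traffic}, pim={pim_traffic}")
```

The same result is produced either way; what changes is how much data has to cross the memory bus, which is exactly the cost the PIM architecture aims to eliminate.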
In addition, Samsung indicates that this HBM-PIM memory “requires no modification of conventional memory controllers and their control protocols, making FIMDRAM ideal for rapid adoption by industry.”
HBM-PIM memory is currently being tested in AI accelerators, with all validations expected to be completed by the end of the first half of 2021. However, do not expect it to land in our graphics cards anytime soon: it will initially be aimed at HPC, data center and mobile AI applications.
Kwangil Park, senior vice president of memory product planning at Samsung Electronics, says: “Our revolutionary HBM-PIM is the industry’s first programmable PIM solution suitable for various AI-related workloads such as HPC, training and inference. We plan to build on this advancement by collaborating further with AI solution providers for even more advanced PIM applications.”