Samsung is the world’s biggest semiconductor memory chip maker and is often the first to launch new technologies in the segment. Today, the company unveiled its new CXL (Compute Express Link) memory modules and HBM3E memory, products aimed at cloud servers and supercomputers handling AI and other high-performance computing workloads.
At the MemCon 2024 expo, held at the Computer History Museum in Mountain View, California, USA, Samsung introduced the CXL Memory Module – Box (CMM-B), the CXL Memory Module – DRAM (CMM-D), the CXL Memory Module – Hybrid for Tiered Memory (CMM-H TM), and the HBM3E 12H chip.
Samsung showcases a CXL-based DRAM memory module and box form factor
CMM-B
CXL Memory Module – Box (CMM-B) is a DRAM-based product. It can accommodate eight CMM-D devices in the E3.S form factor, offering up to 2TB of DRAM capacity, a bandwidth of 60GB/s, and a latency of just 596ns. It can be used in applications such as AI, data analytics, generative AI, and in-memory databases (IMDB).
CMM-D
But what is CMM-D? It is the CXL Memory Module – DRAM, which pairs Samsung’s DRAM chips with the CXL open-standard interface to provide efficient, low-latency connectivity between the CPU and memory expansion devices, as the sketch below illustrates.
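In practice, CXL Type 3 memory expanders like the CMM-D are typically exposed to the operating system as additional, CPU-less NUMA nodes, so ordinary NUMA-aware software can place data on them. The following is a minimal sketch using Linux’s libnuma; the node ID (1) is an assumption and depends entirely on the actual system topology, not on anything Samsung has published here.

```c
/* Minimal sketch: allocating a buffer on a CXL-attached memory node.
 * Assumes the CXL expander appears as NUMA node 1 (system-dependent).
 * Build with: gcc cxl_alloc.c -lnuma
 */
#include <numa.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    if (numa_available() < 0) {
        fprintf(stderr, "NUMA is not available on this system\n");
        return EXIT_FAILURE;
    }

    int cxl_node = 1;            /* assumed node ID of the CXL expander */
    size_t size = 1UL << 30;     /* 1 GiB test buffer */

    /* Request physical placement of the buffer on the chosen node */
    void *buf = numa_alloc_onnode(size, cxl_node);
    if (buf == NULL) {
        fprintf(stderr, "allocation on node %d failed\n", cxl_node);
        return EXIT_FAILURE;
    }

    memset(buf, 0, size);        /* touch the pages so they are actually placed */
    printf("1 GiB allocated and touched on NUMA node %d\n", cxl_node);

    numa_free(buf, size);
    return EXIT_SUCCESS;
}
```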
Red Hat, the global leader in open-source software, validated Samsung’s CMM-D devices with its enterprise software last year. The two companies will continue collaborating to develop CXL open-source software and reference models, and will also partner on a range of other memory and storage products.
Samsung also partnered with Supermicro to showcase the industry’s first rack-level memory solution based on the CXL Memory Module – Box. The highly scalable solution increases memory bandwidth and capacity for data centers handling demanding workloads, and it offers the flexibility that standard architectures lack for modern applications.
CMM-H TM
In partnership with VMware by Broadcom, Samsung introduced Project Peaberry, the world’s first Field Programmable Gate Array (FPGA)-based tiered memory solution for hypervisors (the software layer that runs multiple virtual machines). Called the CXL Memory Module – Hybrid for Tiered Memory (CMM-H TM), it is a hybrid solution that combines DRAM and NAND flash memory in the form of an Add-In Card (AIC). It tackles memory management challenges, improves performance, reduces downtime, optimizes scheduling, and lowers the total cost of ownership (TCO).
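The idea behind tiered memory can be illustrated in broad strokes: frequently accessed (“hot”) pages stay in fast DRAM, while cold pages are demoted to the larger, cheaper NAND tier. The C sketch below is purely conceptual and is not Samsung’s or VMware’s implementation; every name and threshold in it is invented for illustration.

```c
/* Conceptual illustration of two-tier (DRAM + NAND) page placement.
 * All structures, names, and thresholds are hypothetical; the real
 * CMM-H TM logic lives in hardware/firmware and the hypervisor.
 */
#include <stdio.h>

enum tier { TIER_DRAM, TIER_NAND };

struct page_info {
    unsigned long id;
    unsigned long accesses_last_epoch;  /* hotness metric for this epoch */
    enum tier current_tier;
};

#define HOT_THRESHOLD 64  /* hypothetical accesses/epoch to count as "hot" */

/* Decide which tier a page should occupy for the next epoch. */
static enum tier place_page(const struct page_info *p)
{
    return (p->accesses_last_epoch >= HOT_THRESHOLD) ? TIER_DRAM : TIER_NAND;
}

int main(void)
{
    struct page_info pages[] = {
        { .id = 1, .accesses_last_epoch = 500, .current_tier = TIER_NAND },
        { .id = 2, .accesses_last_epoch = 3,   .current_tier = TIER_DRAM },
    };

    for (size_t i = 0; i < sizeof(pages) / sizeof(pages[0]); i++) {
        enum tier target = place_page(&pages[i]);
        if (target != pages[i].current_tier)
            printf("page %lu: migrate to %s\n", pages[i].id,
                   target == TIER_DRAM ? "DRAM" : "NAND");
        pages[i].current_tier = target;
    }
    return 0;
}
```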
Samsung’s HBM3E 12H memory chip offers the highest HBM capacity ever
Samsung also showcased its HBM3E 12H chip at MemCon 2024. It is the world’s first 12-stack HBM3E DRAM, offering the highest capacity ever achieved with HBM technology. HBM3E 12H chips feature Samsung’s advanced thermal compression non-conductive film (TC NCF) technology, which improves the chip’s vertical density by over 20% compared to its predecessor while also improving product yield. Samsung is sampling HBM3E 12H chips to customers and plans to start mass production in the first half of this year.
Jin-Hyeok Choi, SangJoon Hwang, Paul Turner, and Gunnar Hellekson were present on stage during the announcements. SangJoon Hwang is the Corporate Executive Vice President and Head of DRAM Product and Technology at Samsung Electronics. Jin-Hyeok Choi is the Corporate Executive Vice President of Device Solutions Research America – Memory at Samsung Electronics. Paul Turner is the Vice President of the Product Team in the VCF Division at VMware by Broadcom. Gunnar Hellekson is the Vice President and General Manager at Red Hat.
Jin-Hyeok Choi said, “AI innovation cannot continue without memory technology innovation. As the market leader in memory, Samsung is proud to continue advancing innovation – from the industry’s most advanced CMM-B technology, to powerful memory solutions like HBM3E for high-performance computing and demanding AI applications. We are committed to collaborating with our partners and serving our customers to unlock the full potential of the AI era together.”