Want access to 96TB (yes, terabytes) of RAM? Check out this CXL expansion box that shows what the future of memory will look like

Looking to boost server capacity? This 96-DIMM CXL box could be what you're looking for. Unveiled at the OCP Summit 2024 alongside Astera Labs, the CXL expansion box lets users connect up to 96 DDR5 DIMMs to a single server, providing massive memory capacities of up to tens of terabytes of memory per server.

As reported by ServeTheHome, the expansion box can be connected to up to eight Intel Xeon 6 Granite Rapids-SP servers, offering increased performance.


This tiny box from Samsung can hold 2TB of a special kind of RAM worth tens of thousands of dollars — CXL Memory Module Box hailed as the future of expansive server memory in the age of AI

At MemCon 2024, Samsung showcased its latest HBM3E technology, talked about its future HBM4 plans, and unveiled the CXL Memory Module Box, also known as CMM-B, the latest addition to its Compute Express Link (CXL) memory module portfolio.

CMM-B is essentially a memory pooling appliance for rack computing leveraging CXL. It supports disaggregated memory allocation, allowing memory capacity available in remote locations to be shared across multiple servers. Through this, CMM-B enables independent resource allocation in the rack cluster and allows for larger pools of memory to be assigned as needed. With up to 60GB/s bandwidth, Samsung says CMM-B is ideal for applications like AI, in-memory databases, and data analytics.
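The pooling model described above can be sketched in a few lines. The class and names below are purely illustrative, assuming a simple borrow-and-return allocator; they are not a real CXL or Samsung API:

```python
# Illustrative sketch (not a real CXL API): rack-level memory pooling,
# where servers borrow capacity from a shared appliance and return it.

class MemoryPool:
    """Models a CMM-B-style appliance exposing one shared capacity."""

    def __init__(self, capacity_gb):
        self.capacity_gb = capacity_gb
        self.allocations = {}          # server name -> GB currently borrowed

    def allocate(self, server, size_gb):
        used = sum(self.allocations.values())
        if used + size_gb > self.capacity_gb:
            raise MemoryError(f"pool exhausted: {used} GB of {self.capacity_gb} GB in use")
        self.allocations[server] = self.allocations.get(server, 0) + size_gb

    def release(self, server):
        # Returning capacity makes it available to other servers in the rack.
        return self.allocations.pop(server, 0)


pool = MemoryPool(capacity_gb=2048)    # a 2TB appliance, as in the CMM-B
pool.allocate("ai-train-01", 1024)     # one server borrows 1TB for a large model
pool.allocate("imdb-02", 512)          # another borrows 512GB for an in-memory DB
pool.release("ai-train-01")            # capacity returns to the pool when done
```

The point of the disaggregated design is exactly this last line: capacity freed by one server is immediately reusable by any other server in the rack, instead of sitting stranded in a single box.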


Samsung CXL upgrade – SamMobile

Samsung is the world’s biggest semiconductor memory chip maker and is often the first to launch new technologies in the segment. Today, the company unveiled its new CXL (Compute Express Link) memory modules and the HBM3E memory. These products are meant to be used in cloud servers and supercomputers for AI and other high-performance computing needs.

At the MemCon 2024 expo, happening at Santa Clara’s Computer History Museum in Silicon Valley, California, USA, Samsung introduced the CXL Memory Module – Box (CMM-B), CXL Memory Module – DRAM (CMM-D), CXL Memory Module Hybrid for Tiered Memory (CMM-H TM) and HBM3E 12H chip.

Samsung showcases a CXL-based DRAM memory module and box form factor

Samsung CXL Memory Module Box

CMM-B

CXL Memory Module – Box (CMM-B) is a DRAM product. It can accommodate eight CMM-D devices in the E3.S form factor, offering up to 2TB of DRAM capacity. It has a bandwidth of 60GB/s and a latency of just 596ns. It can be used in applications such as AI, Data Analytics, Generative AI, and In-Memory Database (IMDB).
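A quick sanity check on those figures (the per-module capacity below is an inference from the quoted totals, not a number from the announcement):

```python
# Derived arithmetic for the CMM-B specs quoted above: 2TB spread across
# eight E3.S CMM-D modules implies 256GB per module (an inference, not a
# figure Samsung quotes here).
total_capacity_gb = 2 * 1024          # 2TB appliance capacity
modules = 8                           # CMM-D devices per box
per_module_gb = total_capacity_gb // modules
print(per_module_gb)                  # 256
```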

CMM-D

But what is CMM-D? It is a CXL Memory Module – DRAM, which uses Samsung’s DRAM chips with the CXL open standard interface. It offers efficient and low-latency connectivity between the CPU and memory expansion devices.

Red Hat, the global leader in open-source software, validated Samsung’s CMM-D devices with its enterprise software last year. The two companies will continue their collaboration to develop CXL open-source and reference models. They will also partner on a range of other memory and storage products.

Samsung partnered with Supermicro to showcase the industry’s first Rack-Level memory solution based on the CXL Memory Module Box. It is highly scalable and is used to increase memory bandwidth and capacity for data centers that handle demanding workloads. It is highly flexible and can replace standard architectures that lack efficiency and flexibility for modern applications.

CMM-H TM

In partnership with VMware by Broadcom, Samsung introduced Project Peaberry. It is the world’s first Field Programmable Gate Arrays (FPGA)-based tiered memory solution for hypervisors (software that runs multiple virtual machines). It is called CXL Memory Module Hybrid for Tiered Memory (CMM-H TM). It is a hybrid solution that combines DRAM and NAND flash memory in the form of an Add-In Card (AIC). It can tackle memory management challenges, improve performance, reduce downtime, and optimize scheduling. It also reduces the total cost of ownership (TCO).
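The general idea behind a DRAM-plus-NAND tiered module can be sketched as a hot/cold page policy. The class and eviction policy below (simple LRU demotion) are illustrative assumptions, not Samsung's actual implementation:

```python
# Hypothetical sketch of the tiering idea behind CMM-H TM: keep frequently
# accessed ("hot") pages in a small, fast DRAM tier and demote cold pages
# to a larger, slower NAND tier. Policy and names are illustrative.
from collections import OrderedDict


class TieredMemory:
    def __init__(self, dram_pages):
        self.dram = OrderedDict()      # page id -> data, kept in LRU order
        self.nand = {}                 # overflow (cold) tier
        self.dram_pages = dram_pages   # DRAM tier capacity, in pages

    def access(self, page, data=None):
        if page in self.dram:
            self.dram.move_to_end(page)            # hot: refresh LRU position
        else:
            if page in self.nand:
                data = self.nand.pop(page)         # promote from NAND on access
            self.dram[page] = data
            if len(self.dram) > self.dram_pages:   # demote the coldest page
                cold, cold_data = self.dram.popitem(last=False)
                self.nand[cold] = cold_data
        return self.dram[page]


tm = TieredMemory(dram_pages=2)
tm.access("a", 1)
tm.access("b", 2)
tm.access("c", 3)                      # DRAM full: "a" is demoted to NAND
tm.access("a")                         # touching "a" promotes it back to DRAM
```

Software (here, conceptually, the hypervisor) sees one flat address space while the device keeps the working set in DRAM; that is the memory-management challenge the product targets.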

Samsung’s HBM3E 12H memory chip offers the highest HBM capacity ever

Samsung HBM3E 12H DRAM Chip

Samsung also showcased its HBM3E 12H chip at MemCon 2024. It is the world’s first 12-stack HBM3E DRAM memory chip, offering the highest capacity ever achieved using HBM technology. HBM3E 12H chips feature Samsung’s advanced thermal compression non-conductive film (TC NCF) technology to enhance the vertical density of the chip by over 20% compared to its predecessor. It also offers better product yield. Samsung is sampling HBM3E 12H chips to customers and is planning to start mass production in the first half of this year.

Jin-Hyeok Choi, SangJoon Hwang, Paul Turner, and Gunnar Hellekson were present on stage during the announcements. SangJoon Hwang is the Corporate Executive Vice President and Head of DRAM Product and Technology at Samsung Electronics. Jin-Hyeok Choi is the Corporate Executive Vice President of Device Solutions Research America – Memory at Samsung Electronics. Paul Turner is the Vice President of the Product Team in the VCF Division at VMware by Broadcom. Gunnar Hellekson is the Vice President and General Manager at Red Hat.

Jin-Hyeok Choi said, “AI innovation cannot continue without memory technology innovation. As the market leader in memory, Samsung is proud to continue advancing innovation – from the industry’s most advanced CMM-B technology, to powerful memory solutions like HBM3E for high-performance computing and demanding AI applications. We are committed to collaborating with our partners and serving our customers to unlock the full potential of the AI era together.”


Startup claims to boost LLM performance using standard memory instead of GPU HBM — but experts remain unconvinced by the numbers despite promising CXL technology

MemVerge, a provider of software designed to accelerate and optimize data-intensive applications, has partnered with Micron to boost the performance of LLMs using Compute Express Link (CXL) technology. 

The company’s Memory Machine software uses CXL to reduce idle time in GPUs caused by memory loading.
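The technique of hiding load latency from the compute device is a standard one, and can be sketched with a prefetch thread. This is a generic illustration of the idea, assuming simulated load and compute steps; it is not MemVerge's Memory Machine code:

```python
# Illustrative sketch of the general idea described above: hide memory-loading
# latency by prefetching the next batch into a buffer while the current batch
# is being computed on. Delays and the compute step are simulated stand-ins.
import queue
import threading
import time


def loader(batches, q):
    """Background thread: stages batches (e.g. from slower tiered memory)."""
    for b in batches:
        time.sleep(0.01)               # simulate pulling a batch from memory
        q.put(b)
    q.put(None)                        # sentinel: no more batches


def run(batches):
    q = queue.Queue(maxsize=2)         # small prefetch buffer (double buffering)
    t = threading.Thread(target=loader, args=(batches, q))
    t.start()
    results = []
    while (b := q.get()) is not None:  # compute loop rarely waits on a cold load
        results.append(b * 2)          # stand-in for the GPU compute step
    t.join()
    return results


print(run([1, 2, 3]))                  # [2, 4, 6]
```

Because loading and compute overlap, the consumer mostly finds the next batch already staged; that reduced idle time is the effect the software is claimed to deliver.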
