Micron has showcased its colossal 256GB DDR5-8800 MCRDIMM memory modules at the recent Nvidia GTC 2024 conference.
The high-capacity, double-height, 20-watt modules are tailored for next-generation AI servers, such as those based on Intel's Xeon Scalable 'Granite Rapids' processors, which require substantial memory for training.
Tom's Hardware, which saw the memory module firsthand and took the photo above, says the company displayed a 'Tall' version of the module at GTC, but it also intends to offer standard-height MCRDIMMs suitable for 1U servers.
Multiplexer Combined Ranks DIMMs
Both versions of the 256GB MCRDIMMs are constructed using monolithic 32Gb DDR5 ICs. The Tall module houses 80 DRAM chips on each side, while the Standard module employs 2Hi stacked packages and will run slightly hotter as a result.
MCRDIMMs, or Multiplexer Combined Ranks DIMMs, are dual-rank memory modules that employ a specialized buffer to allow both ranks to operate concurrently.
As Tom's Hardware explains, "The buffer allows the two physical ranks to act as if they were two separate modules working in parallel, thereby doubling performance by enabling the simultaneous retrieval of 128 bytes of data from both ranks per clock, effectively doubling the performance of a single module. Meanwhile, the buffer works with its host memory controller using the DDR5 protocol, albeit at speeds beyond those specified by the standard, at 8800 MT/s in this case."
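To put that doubling in perspective, here is a back-of-the-envelope sketch of the peak bandwidth involved. The figures for bus width and per-rank rate are assumptions based on standard DDR5 module design (a 64-bit data path, ECC excluded, with each physical rank running at half the host-side rate); only the 8800 MT/s host rate comes from the article.

```python
# Peak-bandwidth sketch for a DDR5-8800 MCRDIMM (assumed figures).
# Assumption: the multiplexing buffer combines 64-byte accesses from
# both physical ranks into a single 128-byte fetch, so each rank only
# needs to run at half the host transfer rate.

HOST_RATE_MT_S = 8800    # host-side transfer rate (from the article)
BUS_WIDTH_BYTES = 8      # 64-bit data path per module, ECC excluded (assumed)
RANKS = 2                # dual-rank module

rank_rate = HOST_RATE_MT_S / RANKS                    # 4400 MT/s per rank
per_rank_bw = rank_rate * BUS_WIDTH_BYTES / 1000      # GB/s per rank
module_bw = HOST_RATE_MT_S * BUS_WIDTH_BYTES / 1000   # GB/s seen by the host

print(f"per-rank bandwidth: {per_rank_bw:.1f} GB/s")  # 35.2 GB/s
print(f"module bandwidth:   {module_bw:.1f} GB/s")    # 70.4 GB/s
```

In other words, each rank individually would top out around 35 GB/s at these assumed figures; the buffer's parallel access to both ranks is what lets the module present a full 70.4 GB/s DDR5-8800 interface to the memory controller.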
Customers keen to get their hands on the new memory modules won’t have long to wait. In prepared remarks for the company’s earnings call last week, Sanjay Mehrotra, chief executive of Micron, said “We [have] started sampling our 256GB MCRDIMM module, which further enhances performance and increases DRAM content per server.”
Micron hasn’t announced pricing yet, but the cost per module is likely to exceed $10,000.