This tiny box from Samsung can hold 2TB of a special kind of RAM worth tens of thousands of dollars — CXL Memory Module Box hailed as the future of expansive server memory in the age of AI

At MemCon 2024, Samsung showcased its latest HBM3E technology, talked about its future HBM4 plans, and unveiled the CXL Memory Module Box, also known as CMM-B, the latest addition to its Compute Express Link (CXL) memory module portfolio.

CMM-B is essentially a memory pooling appliance for rack-scale computing built on CXL. It supports disaggregated memory allocation, allowing memory capacity available in remote locations to be shared across multiple servers. This enables independent resource allocation within the rack cluster and allows larger pools of memory to be assigned as needed. With up to 60GB/s of bandwidth, Samsung says CMM-B is ideal for applications like AI, in-memory databases, and data analytics.
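
On Linux systems, CXL-attached memory of this kind is typically exposed to software as a CPU-less NUMA node, so applications can place data in the expanded pool through ordinary NUMA APIs. The following is a minimal sketch of that idea using libnuma; it assumes the pooled memory shows up as NUMA node 1 (node numbering and capacity are entirely platform-specific) and is not based on Samsung's CMM-B software stack.

/* Minimal sketch: placing a buffer on a CXL-backed NUMA node with libnuma.
   Assumes the CXL memory pool is exposed as NUMA node 1 (platform-specific).
   Build with: gcc cxl_alloc.c -lnuma */
#include <numa.h>
#include <stdio.h>
#include <string.h>

int main(void) {
    if (numa_available() < 0) {
        fprintf(stderr, "NUMA support is not available on this system\n");
        return 1;
    }

    int cxl_node = 1;            /* hypothetical node ID for the CXL pool */
    size_t size = 1UL << 30;     /* 1 GiB */

    /* Ask the kernel to back this allocation with memory from that node. */
    void *buf = numa_alloc_onnode(size, cxl_node);
    if (buf == NULL) {
        fprintf(stderr, "allocation on node %d failed\n", cxl_node);
        return 1;
    }

    memset(buf, 0, size);        /* touch the pages so they are actually placed */
    printf("1 GiB placed on NUMA node %d\n", cxl_node);

    numa_free(buf, size);
    return 0;
}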

Everything you need to know about random access memory

At its simplest, RAM (Random Access Memory) is a type of computer memory, often referred to as short-term memory because it is volatile, meaning that the data is not saved when the power is turned off.

When business users switch on their computers, the operating system and applications are loaded into RAM, which is directly connected to the CPU, making the data quickly accessible for processing.

Samsung confirms next-generation HBM4 memory is in fact Snowbolt — and reveals it plans to flood the market with precious AI memory amidst growing competition with SK Hynix and Micron

Samsung has revealed it expects to triple its HBM chip production this year.

“Following the third-generation HBM2E and fourth-generation HBM3, which are already in mass production, we plan to produce the 12-layer fifth-generation HBM and 32 gigabit-based 128 GB DDR5 products in large quantities in the first half of the year,” said SangJoon Hwang, EVP and Head of DRAM Product and Technology Team at Samsung, during a speech at MemCon 2024.

This is what a single 256GB DDR5 memory module looks like — but you won’t be able to fit this Micron RAM in your desktop or laptop and it will almost certainly cost more than $10,000 if you can buy it

Micron has showcased its colossal 256GB DDR5-8800 MCRDIMM memory modules at the recent Nvidia GTC 2024 conference.

The high-capacity, double-height, 20-watt modules are tailored for next-generation AI servers, such as those based on Intel’s Xeon Scalable ‘Granite Rapids’ processors, which require substantial memory for training.
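
For context on the DDR5-8800 figure: a standard DDR5 module presents a 64-bit (8-byte) data path to the host, and MCRDIMMs reach such high transfer rates by operating two ranks simultaneously behind a multiplexing buffer. The short snippet below simply walks through the resulting peak-bandwidth arithmetic; it is an illustrative calculation, not a Micron specification.

/* Back-of-the-envelope peak bandwidth for one DDR5-8800 module.
   A DDR5 DIMM carries a 64-bit (8-byte) data path, split into two
   32-bit subchannels; ECC bits are ignored here. */
#include <stdio.h>

int main(void) {
    double transfers_per_sec = 8800e6;  /* DDR5-8800: 8,800 MT/s */
    double bytes_per_transfer = 8.0;    /* 64 data bits per transfer */

    double peak_gb_s = transfers_per_sec * bytes_per_transfer / 1e9;
    printf("Peak theoretical bandwidth: about %.1f GB/s per module\n", peak_gb_s);
    return 0;
}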

Panda Memory Foam Bamboo Pillow review

Samsung is going after Nvidia’s billions with new AI chip — Mach-1 accelerator will combine CPU, GPU and memory to tackle inference tasks but not training

Samsung is reportedly planning to launch its own AI accelerator chip, the ‘Mach-1’, in a bid to challenge Nvidia’s dominance in the AI semiconductor market.

The new chip, which will likely target edge applications with low power consumption requirements, will go into production by the end of this year and make its debut in early 2025, according to the Seoul Economic Daily.

Startup claims to boost LLM performance using standard memory instead of GPU HBM — but experts remain unconvinced by the numbers despite promising CXL technology

MemVerge, a provider of software designed to accelerate and optimize data-intensive applications, has partnered with Micron to boost the performance of LLMs using Compute Express Link (CXL) technology. 

The company’s Memory Machine software uses CXL to reduce idle time in GPUs caused by memory loading.

TikTok owner is quietly doing an ‘Apple’ — ByteDance invests in Chinese memory pioneer, months after a similar investment in a GPU vendor, as it plans for an Apple Vision Pro VR rival

TikTok’s parent company ByteDance has reportedly quietly invested in Xinyuan Semiconductors, a Chinese memory chip manufacturer. 

According to a report from Pandaily, a tech media site based in Beijing, the move positions ByteDance as the third-largest shareholder in the chipmaker, with an indirect stake of 9.5%.

Samsung’s biggest memory rivals are plotting a tie-up with Kafkaesque implications — SK Hynix and Kioxia could build lucrative HBM chips for Nvidia, AMD and others

South Korean chipmaker SK Hynix, a key Nvidia supplier, says it has already sold out of its entire 2024 production of stacked high-bandwidth memory DRAMs, crucial for AI processors in data centers. That’s a problem, given just how in demand HBM chips are right now.

However, a solution might have presented itself, as reports say SK Hynix is in talks with Japanese firm Kioxia Holdings to jointly produce HBM chips.

Nvidia RTX 5090 could have up to 77% more memory bandwidth than the 4090, a win for gamers

The Nvidia GeForce RTX 5090 has long been a hot topic in the tech rumor mill as a contender for the best graphics card of the next generation, and the latest leaks reveal even more about its memory specifications.

According to the well-known and reliable hardware leaker kopite7kimi on Twitter, the RTX 5090 will feature a 512-bit memory bus, 33% wider than the 384-bit bus on Nvidia’s RTX 4090. This would allow for higher memory bandwidth and larger memory capacities; it could even offer 32GB of VRAM, using sixteen 2GB (16Gb) GDDR7 memory chips.
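
To put the wider bus in perspective, peak GDDR bandwidth is simply the bus width in bytes multiplied by the per-pin data rate. The sketch below runs that arithmetic at the RTX 4090's 21 Gbps GDDR6X rate purely to isolate the effect of the bus width; the RTX 5090's actual memory type and speed were still unconfirmed rumors at this point.

/* Peak GDDR bandwidth = (bus width in bits / 8) * per-pin data rate.
   21 Gbps is the RTX 4090's GDDR6X rate, reused here only for comparison;
   the RTX 5090's memory speed is not confirmed. */
#include <stdio.h>

static double peak_bandwidth_gb_s(int bus_width_bits, double gbps_per_pin) {
    return (bus_width_bits / 8.0) * gbps_per_pin;
}

int main(void) {
    double rate = 21.0;  /* Gbps per pin */

    printf("384-bit bus: %.0f GB/s\n", peak_bandwidth_gb_s(384, rate));  /* ~1008 GB/s */
    printf("512-bit bus: %.0f GB/s\n", peak_bandwidth_gb_s(512, rate));  /* ~1344 GB/s */
    return 0;
}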


