The AI boom is pushing memory demand well beyond high-bandwidth memory (HBM). Low-power DRAM is now under pressure too, with ...
JEDEC is nearing completion of an LPDDR6 Processing-in-Memory (PIM) standard. By baking processing capability directly into the memory itself, the technology reduces the need to ...
Nvidia's (NVDA) plan to use smartphone-style memory chips in its AI servers could cause server-memory prices to double by late 2026, Reuters reported, citing a report by Counterpoint Research. In the ...
SOCAMM2 replaces soldered LPDDR memory with detachable, upgradable modules that combine LPDDR efficiency with server-class ...
SEOUL, South Korea--(BUSINESS WIRE)--Samsung Electronics Co., Ltd., the world leader in advanced memory technology, today announced it has begun mass production of the industry’s thinnest 12 ...
TL;DR: NVIDIA is ramping up production of LPDDR-based SOCAMM memory, targeting 600,000 to 800,000 units in 2024 for AI PC and server products. SOCAMM offers superior power efficiency, modular upgrades ...
Taiwan's United Daily News (UDN) and others reported the same day that Nanya will supply products for Nvidia's ...
Samsung on Tuesday said that it had validated its LPDDR5X memory at operating speeds of 8.5 Gbps. Samsung expects the new DRAM to be used for bandwidth-hungry applications beyond smartphones and ...
As AI workloads continue to diversify, the systems that support them are evolving just as quickly. AI is no longer confined to the hyperscale data center. It is moving to the factory floor, into ...
Rambus has launched a SOCAMM2 chipset designed to support power‑efficient, high‑performance LPDDR5X memory modules in AI ...
Interactive LLMs (chat, copilots, agents) with strict latency targets. Long-context reasoning (codebases, research, video) with massive KV (key-value) cache footprints. Ranking and recommendation models ...
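The KV-cache footprint mentioned above is what drives long-context workloads toward large, power-efficient memory pools. A minimal back-of-the-envelope sketch of that footprint follows; the model dimensions (layer count, KV heads, head size, context length) are illustrative assumptions, not figures from any of the reports quoted here.

```python
# Rough KV-cache footprint estimate for a transformer LLM.
# All model parameters below are hypothetical examples.

def kv_cache_bytes(num_layers, num_kv_heads, head_dim, seq_len, batch, dtype_bytes=2):
    """Bytes needed to cache keys and values across all layers.

    The leading factor of 2 accounts for storing both the K and V tensors.
    """
    return 2 * num_layers * num_kv_heads * head_dim * seq_len * batch * dtype_bytes

# Example: a hypothetical 32-layer model with 8 KV heads of dimension 128,
# a 128K-token context, batch size 1, FP16 (2 bytes per element).
size = kv_cache_bytes(num_layers=32, num_kv_heads=8, head_dim=128,
                      seq_len=128 * 1024, batch=1, dtype_bytes=2)
print(f"{size / 2**30:.1f} GiB")  # → 16.0 GiB
```

The footprint grows linearly with context length and batch size, which is why multi-hundred-gigabyte LPDDR pools are attractive for long-context inference.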