Nvidia’s decision to switch to smartphone-style memory chips for its AI servers could drive server-memory prices to as much as twice their current levels by late 2026, according to a report by Counterpoint Research. Global electronics supply chains have recently faced shortages of traditional memory chips as manufacturers shifted capacity toward high-end memory for AI semiconductors.
Nvidia plans to replace DDR5 server memory with LPDDR, a low-power memory type commonly used in phones and tablets, aiming to reduce AI server energy costs. However, each AI server requires far more chips than a smartphone, creating sudden demand that the market may struggle to meet. Memory suppliers like Samsung, SK Hynix, and Micron are already experiencing shortages of older DRAM products after cutting production to focus on high-bandwidth memory for AI accelerators.
Counterpoint warns that the LPDDR shift represents a major disruption, likening Nvidia’s new demand to that of a smartphone-scale customer and predicting that it could strain the supply chain. As a result, server-memory costs could double by late 2026, raising expenses for cloud providers and AI developers already contending with rising GPU and power-infrastructure costs.
