Dynamic Random Access Memory (DRAM) demand has shot through the roof, and the crunch is felt most acutely in artificial intelligence (AI) data centers. Industries are rapidly adopting AI technologies, and today they rely on high-bandwidth memory to feed graphics processing units (GPUs) and other accelerators. This unprecedented surge in demand has pushed DRAM prices to record levels, exacerbated further by the diversion of supply away from other uses. Much of the current cycle of DRAM shortages traces back to the chip supply panic induced by the COVID-19 pandemic, and today the commercial market can barely keep pace with the rapidly expanding appetite for computational power.
Businesses such as Samsung are ramping up their own production. This time, the industry must determine whether technical innovation along the supply chain or new capacity will resolve these perennial issues. With a new facility in Pyeongtaek, South Korea expected to begin producing chips by 2028, Samsung hopes to strengthen its production dominance. The pressing question remains: how long will this memory chip shortage persist, and what factors will influence its resolution?
The Surge in Demand and Rising Prices
The explosion in demand for DRAM, especially for mixed compute workloads involving GPUs and AI accelerators, has had an unprecedented impact. This surge stems from the rapid integration of AI technologies into many industries, including finance, healthcare, and manufacturing. As a result, businesses are scrambling to secure enough memory to supply their data centers.
In parallel with booming demand, DRAM pricing has shot up, in many cases doubling. This inflation is primarily the result of re-routed supply chains: manufacturers are prioritizing the critical needs of AI applications ahead of non-critical uses, and they are under growing pressure to align supply with this sudden shift in demand.
“In general, economists find that prices come down much more slowly and reluctantly than they go up. DRAM today is unlikely to be an exception to this general observation, especially given the insatiable demand for compute,” – Mina Kim.
This view risks losing sight of the difficulties manufacturers face as they steer a market that still resists price cuts.
Origins of the Current DRAM Cycle
Before we can understand what is happening in the DRAM market today, we must first look at where it all began. The current wave of shortages started at the onset of COVID-19, when a worldwide chip supply frenzy set in as businesses moved quickly to remote work and digital solutions. This shift drove dramatically increased demand across many types of chips, including DRAM.
The supply chain’s vulnerability was exposed during the pandemic, leading manufacturers to rethink their overall production strategies. While some companies managed to weather the storm by adjusting their operations, others found themselves struggling to meet increased demands. While the industry is poised to rebound, companies are moving forward with care. Most memory and storage companies have been reluctant to invest in new production capacity.
“Relief will come from a combination of incremental capacity expansions by existing DRAM leaders, yield improvements in [advanced packaging], and a broader diversification of supply chains,” – Shawn DuBravac.
This observation suggests that replenishing supply will require a multifaceted approach.
Innovations and Future Production Plans
Looking ahead, Samsung’s plans to begin production at a new facility in Pyeongtaek by 2028 signal a hopeful turn for the industry. The new plant will be dedicated to producing the advanced memory essential to fulfilling the growing needs of AI applications. Meanwhile, researchers at Samsung have contributed notable breakthroughs in memory technology.
In 2024, they demonstrated a 16-high stack of DRAM dies built with hybrid bonding methods, an innovation that could lead to even more advanced memory in the years ahead. The High Bandwidth Memory (HBM4) standard calls for 16 stacked DRAM dies, whereas today’s chips use only 12. Samsung’s results suggest that 20-die stacks are not unrealistic with further improvements to hybrid bonding technology.
The newly released B300 reflects these improvements. The card carries eight HBM chips, each built from a 12-high stack of DRAM dies. This elaborate construct is only about 750 micrometers thick and is engineered to mount a mere one millimeter from a GPU or other AI accelerator.
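The stacking figures above invite a quick back-of-envelope check. Below is a minimal Python sketch that extrapolates stack thickness from the reported 750-micrometer, 12-high stack, assuming a uniform per-die pitch. That assumption is a simplification of ours, not a published spec: real stacks typically include a thicker base logic die, and die thinning varies by generation.

```python
# Back-of-envelope HBM stack geometry, based on the article's figure of
# ~750 micrometers for a 12-high DRAM die stack. The uniform per-die pitch
# is an illustrative assumption, not a manufacturer specification.

STACK_THICKNESS_UM = 750   # reported thickness of a 12-high stack
DIES_IN_STACK = 12

def stack_thickness_um(num_dies: int) -> float:
    """Extrapolate stack thickness assuming a uniform per-die pitch."""
    pitch = STACK_THICKNESS_UM / DIES_IN_STACK  # ~62.5 um per bonded die
    return num_dies * pitch

for dies in (12, 16, 20):
    print(f"{dies}-high stack: ~{stack_thickness_um(dies):.0f} um")
```

Under this linear assumption, a 16-high HBM4 stack would come in around 1,000 micrometers and a 20-high stack around 1,250 micrometers, which hints at why thinner dies and tighter hybrid bonding are needed to keep taller stacks within existing package height budgets.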
The urgency around DRAM production extends beyond leading-edge technology to supply chain constraints. After all, new fabrication facilities can take a year and a half to build and are conservatively estimated to cost $15 billion. Manufacturers thus face a critical decision: pursue innovation or invest in expanding production capacity.