
AI is creating 'insatiable' demand for memory: Alger CEO on the semiconductor boom

Alger CEO Dan Chung called demand for memory 'insatiable' because of the AI boom. Each new generation of AI models requires far more memory for training and inference.

Source: Bloomberg Tech. Collage: Hamidun News.

Dan Chung, CEO of the investment firm Alger, told Bloomberg that demand for memory is 'insatiable' and linked it directly to the explosive development of artificial intelligence. According to Chung, this is redefining the entire landscape of technology investments and forcing a reassessment of where to look for profits in the AI boom.

Why AI Requires So Much Memory

Modern large language models are trained on petabytes of data. Every model parameter occupies several bytes of memory, so memory demand grows linearly with parameter count: doubling a model's size roughly doubles the memory it needs, a relationship that cannot be engineered away. A single training run across hundreds of thousands of GPUs requires coordinating petabytes of DRAM and fast high-bandwidth memory (HBM). To serve millions of users simultaneously, data centers deploy tens of terabytes of VRAM. Data centers running inference 24/7 consume memory like a black hole, devouring capacity around the clock.
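The linear relationship between model size and memory can be sketched with simple arithmetic. A minimal illustration, assuming fp16 weights and an Adam-style optimizer (the byte counts and the 70B figure are illustrative assumptions, not from the article):

```python
# Back-of-the-envelope memory estimates for an LLM.
# Assumptions (illustrative): fp16 weights (2 bytes/param); for training,
# fp16 gradients plus fp32 master weights and two fp32 Adam moments,
# i.e. 2 + 2 + 4 + 4 + 4 = 16 bytes per parameter, activations excluded.

def model_memory_gb(n_params: float, bytes_per_param: int = 2) -> float:
    """Memory needed just to hold the weights, in GB."""
    return n_params * bytes_per_param / 1e9

def training_memory_gb(n_params: float) -> float:
    """Weights + gradients + optimizer states, in GB (16 bytes/param)."""
    return n_params * 16 / 1e9

if __name__ == "__main__":
    n = 70e9  # a hypothetical 70B-parameter model
    print(f"inference weights: {model_memory_gb(n):.0f} GB")   # 140 GB
    print(f"training state:    {training_memory_gb(n):.0f} GB")  # 1120 GB
```

Even before counting activations or serving multiple model replicas, a single 70B-parameter model already demands hundreds of gigabytes for inference and over a terabyte of state during training, which is why doubling model size directly doubles the memory bill.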

Who Feeds the Machine

Demand extends beyond GPUs like NVIDIA's H100 to DRAM, HBM (High Bandwidth Memory) and long-term storage. Companies like Micron, Samsung and SK Hynix are seeing record orders for memory chips, with contracts stretching 2-3 years ahead. Memory manufacturers are experiencing their strongest surge in demand in two decades, and factories operating at maximum capacity are struggling to keep up.

  • Model training: petabytes of memory for each update
  • Inference at scale: multiple copies of models distributed across servers
  • Vector databases: embedding vectors for RAG consume memory in proportion to corpus size, quickly reaching enormous volumes
  • Fine-tuning: custom models for each corporate client
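The vector-database item above can also be sized with simple arithmetic: a flat index stores one embedding per document chunk, so RAM grows linearly with the corpus. A minimal sketch, assuming fp32 embeddings and a 1536-dimension vector size (both illustrative assumptions):

```python
# Raw RAM for a flat fp32 vector index used in RAG.
# Assumptions (illustrative): 1536-dimensional embeddings stored as
# 4-byte floats, no compression or quantization, index overhead ignored.

def index_memory_gb(n_vectors: int, dim: int = 1536,
                    bytes_per_value: int = 4) -> float:
    """Raw embedding storage for a flat index, in GB."""
    return n_vectors * dim * bytes_per_value / 1e9

if __name__ == "__main__":
    # 100 million document chunks
    print(f"{index_memory_gb(100_000_000):.1f} GB")  # 614.4 GB
```

At corporate scale (hundreds of millions of chunks), embeddings alone reach hundreds of gigabytes of DRAM per index, before replication for throughput, which is one of the quieter drivers of the memory demand the article describes.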

Investment Trend

Dan Chung emphasizes that the value in the AI stack is not in the processing chips themselves, but in the memory ecosystem around them. Investors who previously looked only at NVIDIA and new AI startups are now watching memory manufacturers, controllers and cooling solutions more closely. This is a long-term demand cycle — not a trend and not speculation, but an architectural necessity built into the very nature of large language models.

"AI cannot develop without memory. This is not an option, it's a limitation," he says in the interview.

What This Means

Semiconductor memory is becoming the new 'railroad': the basic infrastructure on which all other technologies are built. Investment is shifting not toward the most brilliant AI startups, but toward the companies selling shovels and pickaxes for the AI gold rush. That means long-term demand, insulation from speculative swings, and revenue visibility for years to come.

ZK
Hamidun News
AI news without noise. Daily editorial selection from 400+ sources. A product by Zhemal Khamidun, Head of AI at Alpina Digital.