MediaTek's CEO recently made a statement that reframes the economics of the AI hardware market.
The new economic reality for AI chips is that memory now constitutes roughly 50% of a chip's bill of materials (BOM) cost. For a long time, the industry was laser-focused on processing power, the 'brains' of the chip. Now, however, the primary performance bottleneck and the single largest expense is the memory required to feed those brains. This signals a fundamental shift from a 'compute-led' to a 'memory-led' industry.
So, what's driving this dramatic change? There are three key factors at play. First, the type of memory needed for AI is both technically demanding and expensive. Training large AI models requires moving massive datasets at extreme speeds, a task that depends on High-Bandwidth Memory (HBM). Independent cost breakdowns of top-tier accelerators like NVIDIA's B200 suggest that HBM alone accounts for 30-40% of the chip's total cost.
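To make the cost-share arithmetic concrete, here is a minimal sketch of how a component's slice of the BOM is computed. All prices are entirely hypothetical, chosen only so the HBM share lands in the 30-40% range cited above; they are not actual B200 figures.

```python
# Hypothetical BOM for an AI accelerator module, in USD.
# These numbers are illustrative only, not real component prices.
bom = {
    "compute_die": 3500,        # logic/processor silicon
    "hbm_stacks": 2800,         # high-bandwidth memory
    "advanced_packaging": 700,  # e.g., CoWoS-style interposer and assembly
    "substrate_power_misc": 1000,
}

total = sum(bom.values())  # 8000 in this example

def share(component: str) -> float:
    """Return a component's share of total BOM cost as a percentage."""
    return 100 * bom[component] / total

print(f"Total BOM: ${total}")
print(f"HBM share: {share('hbm_stacks'):.1f}%")          # 35.0%
memory_related = share("hbm_stacks") + share("advanced_packaging")
print(f"HBM + packaging share: {memory_related:.1f}%")   # 43.8%
```

With these made-up inputs, HBM alone is 35% of the BOM, and adding the packaging needed to connect it pushes memory-related cost toward half the total, which is the dynamic the article describes.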
Second, the demand is simply exploding. Cloud giants like Google are investing hundreds of billions of dollars into their AI infrastructure. This massive spending spree is creating a surge in demand for both HBM for training their custom Tensor Processing Units (TPUs) and slightly less expensive, but still costly, DDR5 memory for handling AI inference tasks. This dual demand pushes the overall share of memory in the total system cost even higher.
Finally, the advanced packaging technology needed to connect the processor and HBM stacks remains a major bottleneck. Companies like TSMC are struggling to meet soaring demand for their CoWoS packaging services. This scarcity adds to the overall cost, further cementing memory and its related components as the dominant expense. MediaTek has a front-row seat to this evolution, thanks to its growing partnership with Google on developing next-generation TPUs.
- XPU: A general term for a category of processors optimized for data-intensive workloads, including AI accelerators like GPUs and TPUs.
- HBM (High-Bandwidth Memory): A high-performance type of RAM that stacks memory chips vertically to achieve significantly faster data transfer speeds, crucial for AI training.
- BOM (Bill of Materials): A comprehensive list of all the raw materials, components, and assemblies required to manufacture a product.