SK Group Chairman Chey Tae-won recently made a bold statement, suggesting SK hynix's operating profit could surpass $100 billion in 2026.
This projection, while ambitious, is rooted in the explosive growth of the AI industry. First, the demand for AI infrastructure is immense. Tech giants like Microsoft, Google, and Meta are expected to spend over $600 billion on AI systems in 2026 alone. These systems are incredibly hungry for high-performance memory, creating massive, sustained demand for SK hynix's flagship products.
However, immense demand is only half of the story. The other crucial factor is a persistent supply bottleneck. The production of advanced AI chips is constrained by a critical step called advanced packaging, where even industry leaders like TSMC struggle to keep up. This packaging shortage acts as a gatekeeper, rationing the number of AI accelerators that can be built, which in turn caps the amount of HBM memory that ships with them.
This supply-demand imbalance creates a "seller's market," giving SK hynix significant pricing power. And it's not limited to its top-tier HBM. The profitability of conventional server memory, such as DDR5, is also surging, at times even surpassing that of HBM. This diversified strength provides a powerful cushion for the company's earnings.
Finally, there's a new, emerging bottleneck: electricity. AI data centers are voracious energy consumers, and the power grid is struggling to keep pace. This has led SK to explore building data centers with their own on-site power sources, highlighting another layer of the complex AI infrastructure puzzle.
So, is the $100 billion target achievable? It represents a best-case scenario that would require everything to go right—sustained AI spending, continued supply tightness, and no major disruptions. A more moderate outcome of $70-80 billion seems plausible, which would still be an incredible achievement. The chairman's forecast isn't just a number; it's a reflection of the powerful, interlocking forces shaping the AI era.
Key terms:

- HBM (High Bandwidth Memory): A type of high-performance memory built from vertically stacked DRAM dies and placed alongside AI accelerators like GPUs, prized for the very high bandwidth it provides when moving large amounts of data.
- Operating Profit: A measure of a company's profitability from its core business operations, calculated as revenue minus operating expenses.
- CoWoS (Chip-on-Wafer-on-Substrate): An advanced packaging technology from TSMC that places a logic die and HBM stacks side by side on a silicon interposer, enabling the dense, high-bandwidth connections modern AI accelerators require.
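To make the operating-profit definition above concrete, here is a minimal back-of-the-envelope sketch. The revenue and expense figures are purely hypothetical illustrations (they do not come from SK hynix or the article); only the profit formula itself follows the definition.

```python
def operating_profit(revenue: float, operating_expenses: float) -> float:
    """Operating profit: revenue from core operations minus operating expenses."""
    return revenue - operating_expenses

# Purely hypothetical figures in billions of USD, for illustration only.
scenarios = {
    "best case": (150.0, 50.0),   # a combination that would reach ~$100B
    "moderate": (120.0, 45.0),    # a combination landing in the $70-80B range
}

for name, (revenue, opex) in scenarios.items():
    print(f"{name}: ${operating_profit(revenue, opex):.0f}B")
```

Running this prints `best case: $100B` and `moderate: $75B`, showing how different revenue/cost assumptions map onto the article's two scenarios.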