TrendForce has revised its 2026 forecast for NVIDIA's AI GPUs, signaling a significant rebalancing of the chipmaker's product roadmap and a shift in where its execution risks lie.
The core change isn't a dramatic drop in overall growth, which was only trimmed from 26.8% to 26.0%, but a major rebalancing of the product mix. The forecast now expects the current Blackwell platform to make up a much larger share (71%) of 2026 shipments. In contrast, the next-generation Rubin platform's expected share has been cut from 29% to 22%, with the older Hopper platform also seeing a decline.
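The revised numbers imply more than they state outright. A quick back-of-the-envelope check, under the assumption that Blackwell, Rubin, and Hopper together account for essentially all 2026 AI GPU shipments (the report does not say this explicitly), shows how small the headline growth trim is and what Hopper's implied share would be:

```python
# Back-of-the-envelope check of TrendForce's revised 2026 figures.
# Assumption (not stated in the report): Blackwell, Rubin, and Hopper
# together make up essentially all of NVIDIA's 2026 AI GPU shipments.

blackwell_share = 0.71   # revised 2026 share (from the report)
rubin_share = 0.22       # cut from 0.29

# Implied Hopper share under the all-three-platforms assumption
hopper_share = round(1.0 - blackwell_share - rubin_share, 2)
print(hopper_share)      # 0.07 -> roughly 7% of shipments

# Growth trim: 26.8% -> 26.0% barely moves the total volume.
# New forecast total relative to the old one:
ratio = round(1.260 / 1.268, 3)
print(ratio)             # 0.994 -> less than a 1% reduction in volume
```

In other words, the forecast keeps almost the same pie but reslices it: the trim to total volume is under one percent, while seven points of expected share move from Rubin to Blackwell.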
So, what's causing this shift? The primary reason is that the Rubin platform, slated for late 2026, is facing several technical and supply chain hurdles. First, the advanced HBM4 memory it relies on requires a lengthy validation process, which is now expected to extend into the second quarter of 2026. Second, Rubin introduces a completely new system design that requires 100% liquid cooling and new networking components like ConnectX-9. These changes increase integration complexity, making a smooth, high-volume launch in 2026 less certain.
Meanwhile, the older Hopper (H200) platform has its own challenges. Its shipments to China are tangled in US-China geopolitical tensions. While sales are permitted under specific licenses, the process is fraught with uncertainty, making it difficult to forecast stable volumes. With both Rubin and Hopper facing headwinds, the mature and already-in-production Blackwell platform is poised to fill the gap. It's the safer, more reliable bet for 2026, which explains its increased share in the revised forecast.
However, NVIDIA isn't just passively waiting. The company is actively building a second engine of growth centered on AI inference. It's launching new products like LPU (Language Processing Unit) racks, designed specifically for running trained AI models rather than training them. This strategic pivot creates a new demand stream that is less dependent on the Rubin GPU timeline, providing a buffer against potential delays.
In essence, the forecast revision reflects a delay, not a derailment. The market risk for NVIDIA in 2026 is shifting from a question of demand to one of execution—can its supply chain deliver these complex new systems on time? For now, the market seems to be pricing in this uncertainty, as NVIDIA's valuation remains relatively low despite strong underlying demand.
Glossary
- HBM4 (High Bandwidth Memory 4): The next generation of high-performance memory stacked vertically, crucial for feeding data to powerful AI chips.
- AI Inference: The process of using a trained AI model to make predictions on new data. This is different from AI training, which is the process of teaching the model.
- Blackwell / Rubin / Hopper: Codenames for different generations of NVIDIA's AI GPU architectures. Hopper is the oldest, Blackwell is the current generation, and Rubin is the next generation.
