Nvidia's quarterly earnings, announced on February 25, 2026, were exceptionally strong, yet the stock market's reaction was surprisingly muted.
This signals a shift in the market's focus toward the core bottleneck in the AI accelerator supply chain. Previously, the primary concern was TSMC's advanced packaging technology, CoWoS. Now, however, the supply shortage of high-performance DRAM known as HBM (High Bandwidth Memory) has emerged as a critical new variable.
This shift is significant for three main reasons.

First, rising HBM prices are starting to undermine the formula that "increasing production protects margins." With both HBM and server DDR5 memory in tight supply, 2026 price forecasts have been revised upward, which could pressure Nvidia's high profit margins.

Second, it raises questions about Nvidia's ability to pass on costs. HBM is estimated to account for 40-50% of the manufacturing cost of the latest B200 chips. A back-of-the-envelope calculation suggests that if HBM prices rose by 20% and Nvidia could not reflect this in its selling price, its gross margin could fall by about 1.2 percentage points. While Nvidia has a history of successfully passing on costs, the market is now starting to worry about the limits of that pricing power.

Third, the valuation gap with memory semiconductor companies such as SK Hynix, which ride the same AI demand, points in the same direction. The recent outperformance of memory stocks, which are more sensitive to supply and price swings, indicates that the market is once again applying the traditional logic of the "semiconductor cycle."
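The back-of-the-envelope margin math above can be made explicit. The sketch below is illustrative only: the input values (HBM price rise, HBM's share of chip cost, cost-of-goods as a share of revenue) are hypothetical assumptions, and the article's roughly 1.2-percentage-point figure depends on its own unstated cost-structure assumptions.

```python
def gross_margin_impact_pp(hbm_price_increase: float,
                           hbm_share_of_cogs: float,
                           cogs_to_revenue: float) -> float:
    """Return the gross-margin decline in percentage points when the
    selling price stays fixed while HBM input costs rise.

    delta_margin (pp) = price increase x HBM share of COGS x COGS/revenue x 100
    """
    return hbm_price_increase * hbm_share_of_cogs * cogs_to_revenue * 100


# Hypothetical inputs: a 20% HBM price rise, HBM at 45% of chip COGS,
# and COGS at ~25% of revenue (i.e., a ~75% gross margin).
impact = gross_margin_impact_pp(0.20, 0.45, 0.25)
print(f"Gross margin falls by about {impact:.1f} percentage points")
```

Under these assumed inputs the decline is roughly 2.2 percentage points; with a smaller assumed HBM cost share or a lower cost-to-revenue ratio, the same formula yields figures closer to the article's estimate.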
Of course, the demand for AI infrastructure itself remains explosive. Both Google's parent company, Alphabet, and Meta have significantly increased their 2026 capital expenditure (CapEx) plans, showing a strong commitment to building 'AI Factories.' However, such massive investments face practical constraints like power and land availability. They could also conflict with a company's cash flow or shareholder return policies, potentially capping stock price appreciation.
Ultimately, this earnings report reaffirmed that AI demand remains robust, but it also showed the market's center of gravity shifting from 'growth' to the cyclical logic of 'supply constraints and costs.' The key for 2026 will be how successfully Nvidia can pass on rising costs to its customers. When memory and packaging capacity expansions eventually ease these bottlenecks, the valuation gap between Nvidia and memory companies will likely be re-evaluated.
Glossary
- HBM (High Bandwidth Memory): A high-performance memory standard that involves vertically stacking multiple DRAM chips to achieve significantly faster data processing speeds. It is essential for AI accelerators.
- CoWoS (Chip-on-Wafer-on-Substrate): An advanced packaging technology from TSMC where chips are mounted on a silicon wafer, which is then placed on a substrate, enabling high-performance computing.
- Semiconductor Cycle: The phenomenon where the semiconductor industry experiences periodic booms and busts. It is characterized by high price volatility due to mismatches between supply and demand.