A meeting in Seoul between an Nvidia executive and Samsung and SK hynix is a pivotal move to secure the company's AI ambitions.
This meeting is driven by Nvidia's recent strategic shift toward "physical AI," a term for AI that interacts with the real world through robotics and autonomous systems. At its GTC 2026 conference, Nvidia declared that "Physical AI has arrived," signaling that its focus is expanding beyond data centers. These robots and smart devices require immense memory bandwidth and density, just like their data-center counterparts, dramatically increasing demand for cutting-edge components.
However, the supply chain for these critical components is under strain. The most advanced memory, known as HBM4, and the advanced packaging technology needed to integrate it with GPUs, are the primary bottlenecks. SK hynix has been the leading supplier, but Samsung is quickly catching up. With packaging capacity at firms like TSMC already heavily booked by Nvidia, securing a stable, multi-source supply from these two Korean giants has become strategically essential to prevent production delays.
The context for this meeting has been building for months. The chain of events is clear:
First, the race for HBM4 began in late 2025. Reports emerged of Korean suppliers delivering samples to Nvidia and planning price hikes, while Samsung successfully passed key qualification tests, establishing itself as a credible second source to SK hynix.
Second, Nvidia’s GTC conference in March 2026 officially put physical AI at the forefront. This wasn't just a new product line; it was a fundamental expansion of the AI market, which in turn placed even greater pressure on the already tight memory supply chain.
Third, events in April 2026 confirmed the urgency. SK hynix reported record profits driven by overwhelming AI memory demand, and reports of TSMC expanding its packaging capacity underscored that the bottleneck is real and persistent. These developments set the stage for this week's direct negotiations in Seoul.
Ultimately, this meeting is about more than securing parts; it is a strategic maneuver to de-risk Nvidia's entire roadmap. By fostering both competition and collaboration between SK hynix and Samsung, Nvidia hopes to lock in the volume, pricing, and technology co-development needed to power the next generation of AI, from massive data centers to intelligent robots.
- HBM (High Bandwidth Memory): A type of high-performance computer memory used in high-end GPUs and accelerators, essential for processing the massive datasets required by AI.
- Physical AI: AI systems that can perceive, reason, and interact with the physical world, such as robots, self-driving cars, and smart drones.
- Advanced Packaging: A set of techniques, like TSMC's CoWoS, used to combine multiple chips (like a GPU and HBM) into a single, powerful package, which is a key bottleneck in AI hardware production.
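To see why HBM is treated as the critical component above, it helps to look at the raw numbers. The sketch below is an illustrative back-of-envelope calculation, not a statement about any specific Nvidia product; the interface widths and per-pin rates are the headline figures from the public JEDEC HBM3 and HBM4 specifications, and shipping parts may clock differently.

```python
# Illustrative HBM bandwidth arithmetic (figures from public JEDEC specs;
# actual products may vary).

def stack_bandwidth_gb_s(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak bandwidth of one HBM stack in GB/s: bus width x per-pin rate."""
    return bus_width_bits * pin_rate_gbps / 8  # divide by 8: bits -> bytes

hbm3 = stack_bandwidth_gb_s(1024, 6.4)  # HBM3: 1024-bit bus, 6.4 Gb/s per pin
hbm4 = stack_bandwidth_gb_s(2048, 8.0)  # HBM4: 2048-bit bus, 8 Gb/s per pin

print(f"HBM3 per stack: {hbm3:.0f} GB/s")  # ~819 GB/s
print(f"HBM4 per stack: {hbm4:.0f} GB/s")  # 2048 GB/s
```

The jump to a 2048-bit interface is also why advanced packaging is a bottleneck: doubling the bus width doubles the number of fine-pitch connections between the GPU and each memory stack, which only technologies like CoWoS can route.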
