A groundbreaking partnership between Span, Nvidia, and PulteGroup is set to bring AI data centers right into our backyards.
The explosive growth of AI has created an insatiable demand for computing power, but building traditional, hyperscale data centers is hitting a wall. Recent reports show that 30-50% of planned data center projects in the U.S. are facing delays, primarily due to a lack of available power and grid connection bottlenecks. This 'speed-to-power' gap is creating a major hurdle for AI development.
To solve this, Span has introduced the XFRA program, a creative solution that decentralizes computing power. The core idea is to build a distributed network of 'mini data centers,' called XFRA Nodes, installed in residential homes. This approach cleverly sidesteps the gridlock facing large-scale facilities.
First, the partnership leverages an existing channel. PulteGroup, a major homebuilder, already had a partnership to install Span's smart electrical panels in its new homes. This provides a ready-made pathway to deploy XFRA Nodes at scale across new communities. Second, the technology is tailored for this environment. Instead of power-hungry data center accelerators, each node uses 16 Nvidia RTX Pro 6000 Blackwell GPUs. These professional-grade GPUs deliver strong performance with a thermal and power profile that can be managed within a home's electrical system.
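To see why this hardware choice fits a residential setting, here is a back-of-the-envelope power budget for one node. The 16-GPU count and the 15 kW ceiling come from the article; the per-GPU board power and host overhead are assumptions for illustration, not published specs:

```python
# Rough power-budget check for one XFRA Node.
NUM_GPUS = 16              # from the article
GPU_WATTS = 600            # assumed board power per RTX Pro 6000 Blackwell
OVERHEAD_WATTS = 2000      # assumed CPU/networking/cooling overhead
NODE_LIMIT_WATTS = 15_000  # the article's stated 15 kW maximum draw

gpu_total = NUM_GPUS * GPU_WATTS
node_total = gpu_total + OVERHEAD_WATTS
print(f"GPUs: {gpu_total / 1000:.1f} kW, node total: {node_total / 1000:.1f} kW")
# Under these assumptions the node stays comfortably under its 15 kW cap.
assert node_total <= NODE_LIMIT_WATTS
```

Even with generous assumptions, the total lands well below what a typical data center rack of training accelerators would draw, which is what makes a home deployment plausible at all.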
So, how does a home support a mini data center? Each XFRA Node can draw up to 15 kW of power, a significant load for a standard residence. This is where Span's smart panel becomes the brain of the operation. It constantly monitors the home's energy usage and the node's demand, intelligently drawing on the home's behind-the-meter headroom, the unused capacity in its grid connection. A 15 kWh backup battery also helps smooth out power peaks, ensuring the system operates safely without overloading the home's electrical service.
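The control logic described above can be sketched in a few lines. This is a minimal illustration, not Span's actual algorithm: the service capacity, battery discharge rate, and function names are all hypothetical, while the 15 kW node limit comes from the article:

```python
# Illustrative behind-the-meter headroom control loop (all names and
# numbers except NODE_MAX_KW are assumptions, not Span's implementation).
SERVICE_KW = 48.0     # e.g. a 200 A / 240 V service: 200 * 240 / 1000 = 48 kW
NODE_MAX_KW = 15.0    # node's maximum draw, per the article
BATTERY_KW = 5.0      # assumed battery discharge rate for smoothing peaks

def node_power_budget(home_load_kw: float) -> float:
    """Cap the node's draw at the home's spare service capacity,
    letting the battery cover short peaks in household load."""
    headroom = SERVICE_KW - home_load_kw
    allowed = min(NODE_MAX_KW, headroom + BATTERY_KW)
    return max(0.0, allowed)

# Quiet house: ample headroom, the node can run at full power.
print(node_power_budget(10.0))  # 15.0
# EV charging plus HVAC: the node throttles to protect the service limit.
print(node_power_budget(40.0))  # 13.0
```

The key design point is that the panel, not the node, owns the decision: compute yields to household demand, so the homeowner never sees the node compete with their appliances for capacity.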
This initiative isn't meant to replace massive data centers but to complement them by creating a new layer of grid-edge inference capacity. It reflects Nvidia's broader strategy to find innovative ways to deploy AI compute wherever power is available. With an ambition to achieve 'gigawatt scale' by 2027—equivalent to roughly 100,000 homes—this project could pioneer a new model for a more resilient and distributed AI infrastructure.
- Grid-Edge Inference: Performing AI calculations (inference) on devices located at the 'edge' of the network, close to the end-user, rather than in a distant, centralized cloud data center.
- Behind-the-Meter Headroom: The unused portion of a home's or building's electrical service capacity that is available after accounting for all its current power needs. Span's system taps into this spare capacity.
- Hyperscale Data Center: An exceptionally large data center facility, often the size of several football fields, designed to support the massive computing needs of tech giants like Google, Amazon, and Microsoft.
