
Beyond the Buildout — AI's missing power supply is already built

April 21, 2026
Wireframe
Paul Straub
Lily Bernicker
Wireframe panel during SF Climate Week 2026 - AI & Energy Extremes



Two years ago, the working title for this panel was Will AI Break the Grid? A senior executive from a large energy company asked us to change it, uncomfortable with the implication that AI was putting the grid at risk.

In 2026, nobody needs to ask.

Community resistance to data center development, rising electricity rates, and grid constraints are making international headlines. The conversation has gone mainstream. And with it, a consensus has hardened: AI's demand for power is limitless, speed to power is the #1 strategic priority, and since the grid can't keep up, the answer is to go around it.

At AI & Energy Extremes, our SF Climate Week 2026 panel, we brought together four leaders building and operating at the center of this problem, across grid software, power electronics, compute hardware, and distributed infrastructure.

On the question of demand, there was no disagreement. Ryan Harris, CRO at Span, put it plainly: the market is "underestimating the demand, but overestimating the ability to deliver." Large customers book infrastructure at 80-90% confidence, and real need runs consistently ahead of published forecasts.

Nobody in the room foresaw an end to Jevons paradox. Rory McInerney was direct: as long as more compute means more revenue, every efficiency gain just feeds more demand for tokens. Nothing on the visible horizon breaks that loop.

But the panel pushed back on what has become the consensus response: when the grid can't keep up, go around it.

Amit Narayan, CEO of Gridcare, stated the contrarian case plainly: "We don't have a capacity issue on the system. We actually have a coordination issue."

The US grid runs at roughly 30% utilization. Data centers, in turn, use only 30 to 40% of the power delivered to them, because operators systematically overbuild for peak scenarios that almost never materialize. Multiply the two and we extract roughly 10% of the theoretical capacity that already exists.
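The chain above is a simple multiplication of the two figures quoted on the panel. A back-of-envelope sketch, using those illustrative numbers rather than measured data:

```python
# Back-of-envelope sketch of the utilization chain described above.
# Both figures are the panel's illustrative numbers, not measurements.

grid_utilization = 0.30            # US grid runs at roughly 30% utilization
dc_power_use = (0.30, 0.40)        # data centers use 30-40% of delivered power

for dc_fraction in dc_power_use:
    effective = grid_utilization * dc_fraction
    print(f"effective extraction: {effective:.0%}")
# prints 9% and 12% — on the order of a tenth of installed capacity
```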

Gridcare exists to unlock that capacity today. No new concrete required. Amit pointed to their project in Hillsboro, Oregon, one of the most constrained data center markets in the country, where they are unlocking 400 megawatts by accelerating a decade-long interconnection queue.

Utilities were never paid to maximize utilization. They were paid to guarantee reliability. As Amit put it: "If you are in the apple orchard business, and you get paid for planting more trees, but not for how many apples you harvest, then you will just plant more trees."

Bypassing the grid and building your own power is what the biggest players are now doing. Some of those projects will succeed. But Rory McInerney, who has spent decades inside compute hardware and now advises across the AI infrastructure stack, cut through the headline commitments: "When you look in detail at the contracts of some of these mega data centers, there are lots of outs — lots of outs. And the guys with the biggest money are covered."

The risk doesn't sit evenly. Capital deployed far from the revenue generator, in land and speculative power agreements, carries the most exposure when cycles shift. The giants building around the grid have the balance sheets to absorb that. Most of the market does not.

As inference becomes the dominant AI workload, the architecture that made sense for training starts to look like the wrong answer for what comes next.

Training is compute-bound. It wants centralization, long timescales, gigawatt-scale facilities removed from the end user. Inference is memory-bound and latency-sensitive; it wants to sit close to the user.

Span's XFRA platform deploys distributed inference nodes behind the meter, at homes and small businesses, capturing electrical headroom that otherwise sits unused. As Ryan put it: "98% of our homes have 80 amps of underutilized headroom — at all times." XFRA builds linearly rather than in lumpy multi-hundred-megawatt steps, avoids multi-year interconnection queues, and is structurally deflationary for electricity costs because it raises utilization on infrastructure that already exists.

The panel's most underrated idea: data centers, built and operated differently, could be the best thing that ever happened to the grid.

Grayson Zulauf of Heron Power holds this view. Their solid state transformer platform is designed not just to address the immediate transformer shortfall, but to fundamentally upgrade how grid-connected assets manage power.

A large data center has on-site generation, significant battery storage, precise load visibility, and workloads that can be shifted in time without degrading the product. These are the degrees of freedom that grid operators have wanted for decades. A data center that actively participates in demand response, co-locates with renewable generation, and treats its flexibility as an asset rather than a liability raises utilization on existing infrastructure, reduces the need for peaker plants, and makes the case for transmission investment by guaranteeing the anchor load.

"Data centers are a great accelerant," Grayson said in closing. "But we shouldn't be betting the whole grid — or the whole economy — on that."

The path through growing community resistance isn't better marketing. It's a different operating model, one where data center developers enable better power technologies and demonstrably lower costs for their neighbors by making shared infrastructure work harder.

In the permitting environment that's coming, community buy-in won't be a nice-to-have. It will be a moat.

The next wave of AI infrastructure will not only be built. Much of it will be unlocked: from homes sitting on idle capacity, from a grid running at a third of its potential, from data centers wasting half their delivered power on redundancy for failures that almost never come.

At Wireframe, we're looking to back founders who have a clear read on where the system is broken and creative, non-obvious solutions to fix it. New generation, smarter distribution, grid intelligence, edge compute, and power electronics can all help solve a problem that even enormous amounts of capital cannot solve alone.

If that describes what you're building, we'd love to hear from you.
