Hyperscaler capex is surging, vacancy has all but vanished, and AI is pushing data centre demand far beyond traditional hubs – but questions remain over power, people, and long-term returns as development spills into secondary and tertiary markets.
In sum – what to know:
Mega buildout – hyperscaler spending on AI infra is at a step change, with the top five set to deploy more than $600bn in 2026 alone, driven largely by AI training workloads.
Critical supply – vacancy rates are below two percent in primary US markets and effectively zero in hotspots like Northern Virginia, forcing expansion into new geographies.
Regional focus – secondary and tertiary markets are in focus, but power availability, workforce attraction, and the future mix of AI training versus inference workloads remain open questions.
There was a good panel discussion last Sunday (January 18) in Honolulu, the first day of PTC’26, about the “generational build-cycle” in the US data centre market, teed up nicely by Joe Valenti, managing director and co-head for media and telecom at Bank of America. “The demand profile is enormous,” he said, opening the discussion for a well-attended panel of data-centre analysts, consultants, and operators.
He reeled off some top-line stats: the top five hyperscaler companies (Amazon, Microsoft, Google, Meta, Apple) will invest $602 billion of capital in data centre infrastructure in 2026; four of them will spend $100 billion each – which represents “north of 50 percent” of their revenue in some cases. “It is a step function in historical capital deployment,” said Valenti. They will go from four percent of total US energy consumption in 2024, to 10 percent by 2030, he said.
Notably, there is virtually zero over-capacity; the vacancy rate for US data-centre real estate, whether powered shell facilities or fully fitted spaces, is just 1.6 percent in primary markets, he said – so only 1.6 percent of total inventory in those historical AI-infrastructure heartlands is currently unoccupied. “It is well less than one percent in certain markets – Northern Virginia being the most obvious one. So we have this huge imbalance of demand relative to supply.”

New regional markets
So demand is exploding and supply is nearly exhausted. And AI is the reason, linked mostly to the task of training massive frontier large language models (LLMs). Cue the discussion, proper, and a question from Valenti about the spread into secondary and tertiary markets, to offset rising demand in energy-constrained primary markets. David Liggitt, chief at data centre analytics firm Datacenter Hawk (stylised datacenterHawk), further explained the demand profile.
He said: “[The] major markets in the US have been Chicago, Phoenix, Dallas, Atlanta – markets like this. We’ve seen secondary markets – like Minneapolis, Portland, Houston, Nashville – grow over time. And we’ve [also] seen growth into tertiary markets in the last 24 months. [We] tracked 15.6 GW of demand absorption in 2025 – which is the fancy term in the real estate industry for demand. To put that in context: it was 6.8 GW in 2024, and 3 GW in 2023.
“So we have gone from 3 GW to 6.8 GW to 15.6 GW… The transactional value of that 15.6 GW [of capacity] is over $2 trillion – for those leases, when put together. So the value is significant.” Interestingly, Liggitt suggested new deployments in so-called tertiary markets are outgunning their equivalents for capacity in more familiar territories. “If the total market in Portland is 500-600 MW of capacity, a data centre… going into North Dakota might be twice that.”
His meaning is a little unclear, but the sense is that secondary markets, in some regions, could be usurped by tertiary ones – so long as power is available at more rural sites. “The question [is whether] these tertiary markets… can actually do more than [the] initial deployments – considering [they will] stress the utility system. There’s a lot [to do] for some of these secondary markets to… complete these projects. It will be fascinating to see what happens… over time.”
Power and personnel
Jason Nance, in charge of client solutions at CBRE Data Center Solutions, a service group within commercial real-estate services and investment firm CBRE, responded: “The tertiary markets are more and more in the mix… [and] could very well become primary markets.” But a part of the challenge, beyond electrical power, is (hu)man power – and how to attract workers to the back-end of nowhere to staff a big rural AI factory.
“Everybody knows [about] Abilene (in Texas, where the Stargate project is based)… [but] people forget that the infrastructure for people to support these projects [has] to be there… The biggest problem is getting people to move to these tertiary markets. People look at the power and the network, but… you still need people at the end of the day. And that may stymie growth in these tertiary markets… Sometimes it’s a bridge too far… to move to North Dakota.”
Nebraska-based 1623 Farnam runs an interconnection facility in Omaha, considered to be a secondary market; it is “one of the most connected in the entire Midwest”, hosting 60-odd ‘carriers’ and more than 35 broadband providers. “There are eight hyperscale campuses within 40 miles of our facility,” explained Bill Severn, president and chief executive at the firm. “All of them are asking for double the power that they asked for three years ago.”
1623 Farnam works closely with the local power company, and sees the challenge from both sides. “How do they solve that? Because those are not new builds; they are existing and preexisting… [And we] have to work diligently with the power company to find those nooks and crannies where a little power might be squeezed out.” Beyond the vagaries of AI planning, there is an existential question about how such big projects will pay off after the big AI work is completed.
Training and inference
Valenti explained: “Most deployments are being used for generative AI, right? We know we’re in an arms race [where] multiple parties are trying to create the best model … But we’ll get to a point where the models are trained – quote/unquote – enough. And then we will have massive capacity in places like North Dakota, Mississippi, and Louisiana… [which will need to absorb] things other than generative AI. How do you guys think about that?”
How do data centre operators, private equity firms, and sovereign wealth funds assess long-term returns on new data-centre builds, especially in remote tertiary markets – a long way from the edge, where all the real AI action is; where the data is generated and acted upon, and AI inference increasingly takes up residence? Nance responded: “There’s a lot of exuberance around AI, and if we’re not careful it can become irrational exuberance, right?
“Everybody with a dollar and a piece of land… wants to build a data centre. But anybody putting money into the market needs to understand the fundamentals… Even the storied [AI] companies out there, which are in the news all the time, have to monetize [their infrastructure investments] at the end of the day… [They have to ask] if the investment… is the right one – because they’re not all the same. They have to be aware of the irrational things.”
Liggitt rejoined: “Another term we’re hearing quite a bit… is fungibility – related to sites being acquired, and whether they [can convert] into other use cases down the road. That’s probably the best [question to ask]: whether there are other [AI] use cases that… [will] emerge when these big models are trained [that will] take that type of power… [Because] the amount of power in some more rural markets is… hard to fathom. I mean, these are such large amounts.”
He added: “The other thing [to consider] is the wave and the maturity of inference, and how that impacts our space geographically – where it impacts things; how close to the bigger cities does inference need to be, as it matures over the next five years. How far out can [data centres] really go? Because that will also have an impact on where power is deployed.” Indeed, there is still some debate about the localisation of compute power for AI inference.
Another thing: networks
Sunday afternoon, and PTC has only just started; Valenti has had half a dozen different conversations with data centre operators already, and there is little real consensus about where to put inference workloads, he says. “I’ve had feedback that there are plenty of non-latency-sensitive inference workloads that don’t need to go anywhere else; that they can be deployed in some of the regions we’re talking about. Things like video rendering, for example.”
The question is put to Severn at 1623 Farnam. “We all read the headlines about these billion-dollar deals, and everybody talks about AI inference or training. But the bit that gets lost is [in the network] in-between. It’s just not nearly as sexy. But where we are winning [with AI] is in the network – because that component has to be in place. And the networks are coming to tertiary markets, where we’re an interconnection facility.”
Valenti ran with the network theme, prompting Severn with a question about the interest in, and returns on, fibre interconnectivity – beyond just the big talk about AI data centres and training models. Severn responded with an insight about the strategy at 1623 Farnam, and who to let into its carrier hotel / meet-me room set-up in Omaha, and who to keep out. “If a client doesn’t add to the ecosystem in the building, we say no,” said Severn.
“Everything needs to be additive. Our clients average 13 cross-connects to others in the building. Which shows it’s a really sticky environment. And if you think about all the AI startups, our building could be full right now if we wanted to. We just turned down someone that wanted the rest of it, and could have generated several million dollars worth of cash flow a year. But it would take up 40 percent of the space, and wouldn’t give us the same return.”
