KPMG details how enterprises are reshaping their digital infrastructure strategies to keep pace with the fast-evolving demands of AI workloads
In sum – what you need to know:
Cloud-to-edge shift – Enterprises are increasingly relying on hyperscaler cloud services for AI workloads, but growing inference and agentic demands are pushing compute and storage closer to the edge — and even to user devices — according to KPMG’s Philip Wong.
Infra investment complexity – Building AI-capable data centers involves major pre-construction and regulatory hurdles, including zoning, land access, power commitments, and fiber connectivity, with most funding currently led by hyperscalers and infrastructure funds.
Performance and cost tracking – Wong emphasized the need for enterprises to monitor construction timelines, utilization rates, and operational expenses, while AI users must sharpen their Cloud FinOps practices to manage rising inference and data costs.
Enterprises are reshaping their digital infrastructure strategies to keep pace with the fast-evolving demands of AI workloads, with hyperscaler clouds, edge computing and data center partnerships emerging as key elements of this shift, according to Philip Wong, principal in deal advisory and strategy for technology, media and telecom at KPMG.
“For most enterprises, AI/GenAI and eventually agentic AI will drive increased requirements for compute and storage to support inference workload and data pipes for AI apps/agents,” Wong explained in a recent interview with RCR Wireless News. “Most of these requirements are likely satisfied through the use of public hyperscaler cloud services.”
However, as inference and agentic AI workloads grow, Wong noted a clear movement toward edge computing: “There will also be a shift towards having the compute and storage closer to the end users (edge). Eventually as workload, models and architecture patterns evolve, in some cases, AI workload could be performed on end user devices (laptops).”
The KPMG analyst also pointed to the accompanying rise in connectivity demands, commenting: “There will also be demand in additional connectivity and bandwidth requirements to the compute and storage resources.”
While most AI infrastructure needs are currently met through public cloud providers, some enterprises may require more control. “For certain types of enterprises, there may be a need to manage their own compute and storage capacity, in which case, they can work with a retail data center operator for co-location or a data center developer to build their own,” he said.
When it comes to building AI-capable data centers, Wong highlighted significant cost and regulatory hurdles. “Most of the AI-capable data center investments are driven by hyperscalers and data center operators, with initial funding provided by the hyperscalers or operators themselves or through infrastructure funds and real estate developers,” he said.
He also broke the costs down into several key categories: pre-construction costs such as land acquisition; regulatory and zoning approvals; power and utilities arrangements; securing capital equipment; and the construction itself.
Regulatory compliance is an additional area of focus. Wong stated: “Regulatory considerations are mostly around zoning and land use. To the extent you need to extend connectivity to the site, RoW [right-of-way] considerations for the fiber/telecom provider. Working with local utility to get commitment on power.”
Asked about the role of government in enabling national AI infrastructure, he said: “Don’t have a specific view on this but support from government (Federal and local) can help accelerate the development of AI Infrastructure. For data centers, access to power, land, connectivity, labor, and materials are critical factors that impact the speed of development. Government policies and regulations will have an impact to all those factors.”
From an advisory perspective, Wong emphasized that companies should track both construction and operational metrics when evaluating AI infrastructure investments.
“For companies investing in building data centers, monitoring construction costs and schedule overruns is key. Then, when the data center is up and running, the fill/utilization rate and cost to operate,” Wong noted. As for users of AI infrastructure, cost management remains essential. “For companies using AI infrastructure, it’s about managing Cloud FinOps well with the new inference and data workload,” he said.