
Bookmarks: Powering AI infrastructure and the Theory of Constraints

Editor’s note: I’m in the habit of bookmarking on LinkedIn and X (and in actual books) things I think are insightful and interesting. What I’m not in the habit of doing is ever revisiting those insightful, interesting bits of commentary and doing anything with them that would benefit anyone other than myself. This weekly AI infrastructure column is an effort to correct that.

Do we have the power to meet AI infrastructure demand? It’ll take focus

NVIDIA CEO Jensen Huang often describes the current AI infrastructure buildout supercycle as a trillion-dollar-plus transformation of data centers into “AI factories” where intelligence is manufactured. During his GTC keynote earlier this year, Huang described a single rack in an AI factory as containing 600,000 parts and weighing 3,000 pounds. “AI factories are so complicated,” he said. 

If AI factories are complex systems designed to manufacture intelligence, then it makes sense to borrow management frameworks from traditional manufacturing to understand how we scale them. One of the most enduring is the Theory of Constraints, introduced by Eliyahu M. Goldratt in his 1984 dialogue-driven novel The Goal. It’s a simple premise: every system has a limiting factor. Focus your efforts on that bottleneck, and you’ll get the fastest gains in throughput. Once resolved, a new constraint emerges and the process repeats.

Infrastructure Masons identifies power as the primary challenge facing the AI infrastructure supercycle

In the AI infrastructure world, the primary constraint is power. In its “State of the Digital Infrastructure Annual Report 2025,” Infrastructure Masons counts 55 gigawatts of active data center power capacity worldwide, with another 15 GW under construction and 135 GW in the development pipeline. That pipeline has jumped by 80 GW in just a year. The report authors are blunt: access to power is the top industry challenge, and AI amplifies it.

The AI arms race, marked by training increasingly massive models and rapid growth in daily users, consumes an immense amount of electricity, and cutting-edge AI data centers are closer to heavy industry than traditional data centers. As Infrastructure Masons CEO Santiago Suinaga put it in the report, “Today, the power capacity of our industry is on track to triple or even quadruple within the next five to seven years, which will put it in the same power consumption league as the cement, steel and petrochemical industries.” The mismatch between projected demand on one side and current power availability and utility upgrade timelines on the other means power is the gating factor, the constraint, in AI infrastructure deployment.

 

Image courtesy of the Theory of Constraints institute, tocinstitute.org

So how does the decades-old factory-floor logic of the Theory of Constraints help us think more clearly about AI infrastructure today? It outlines five “focusing steps” to manage bottlenecks. Here’s what that might look like when applied to the AI power problem:

  1. Identify the constraint: Power availability is now the dominant factor shaping when and where data centers get built.
  2. Exploit the constraint: Optimize around what’s available. Design within fixed power envelopes. Prioritize high-revenue workloads. Schedule compute-intensive workloads when grid pricing is lowest.
  3. Subordinate everything else to the constraint: Align rack deployments, GPU deliveries, and customer onboarding with powered site timelines. Don’t overbuild on sites where power is still speculative.
  4. Elevate the constraint: Engage earlier with utilities. Pursue PPAs and behind-the-meter solutions like fuel cells and microgrids. Invest in power procurement as a core competency, not an afterthought.
  5. Prevent inertia from becoming the constraint: Once power is no longer the bottleneck, be ready to identify and address the next one, whether that’s cooling, network fiber, permitting, or skilled labor.
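To make steps 2 and 3 concrete, here is a minimal, purely illustrative sketch of “exploit the constraint”: a toy scheduler that fits workloads into a fixed power envelope, admitting high-revenue-per-kilowatt jobs first and filling the cheapest grid hours before expensive ones. The job names, prices, and power figures are invented for illustration and do not come from the report or the column.

```python
# Toy sketch of Theory of Constraints step 2 ("exploit the constraint")
# for an AI factory whose constraint is a fixed power envelope.
# All names and numbers are hypothetical.

from dataclasses import dataclass

@dataclass
class Job:
    name: str
    power_kw: float   # steady power draw while running
    revenue: float    # expected revenue for completing the job

def schedule(jobs, hourly_price, envelope_kw):
    """Greedy plan: fill the cheapest grid hours first; within each hour,
    admit jobs in descending revenue-per-kW order until the power
    envelope (the constraint) is exhausted."""
    plan = []
    # Cheapest hours first: run compute when grid pricing is lowest.
    hours = sorted(range(len(hourly_price)), key=lambda h: hourly_price[h])
    # Prioritize high-revenue workloads per kW of the scarce resource.
    pending = sorted(jobs, key=lambda j: j.revenue / j.power_kw, reverse=True)
    for h in hours:
        used_kw = 0.0
        remaining = []
        for job in pending:
            if used_kw + job.power_kw <= envelope_kw:
                used_kw += job.power_kw
                plan.append((h, job.name))
            else:
                remaining.append(job)  # subordinated to the constraint
        pending = remaining
        if not pending:
            break
    return plan

jobs = [Job("train-A", 800, 40_000),
        Job("infer-B", 300, 24_000),
        Job("batch-C", 500, 10_000)]
prices = [90, 40, 55, 120]  # $/MWh by hour, illustrative
print(schedule(jobs, prices, envelope_kw=1000))
```

The point of the sketch is the ordering of decisions, not the numbers: the envelope is treated as fixed (step 2), and everything else, including which jobs wait, is subordinated to it (step 3).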

As Huang builds AI factories and Suinaga compares digital infrastructure to steel mills, we’d do well to ask: what’s the bottleneck? Goldratt reminds us, “Focusing on everything is synonymous with not focusing on anything.” Today, the bottleneck is power. Focus there. Then move on to what comes next.

For a big-picture analysis of AI infrastructure, including 2025 hyperscaler capex guidance, the rise of edge AI, the push to artificial general intelligence (AGI), and more, check out this long read.

ABOUT AUTHOR

Sean Kinney, Editor in Chief
Sean focuses on multiple subject areas including 5G, Open RAN, hybrid cloud, edge computing, and Industry 4.0. He also hosts Arden Media's podcast Will 5G Change the World? Prior to his work at RCR, Sean studied journalism and literature at the University of Mississippi then spent six years based in Key West, Florida, working as a reporter for the Miami Herald Media Company. He currently lives in Fayetteville, Arkansas.