
LiquidStack CEO breaks down AI infra pain points

LiquidStack CEO noted that many data center operators remain hesitant to adopt liquid cooling chiefly due to persistent misconceptions

In sum – what to know:

AI infrastructure bottlenecks – Data center operators are hitting serious delays due to limited power availability, long lead times for cooling equipment, and the inefficiency of retrofitting older air-cooled facilities for high-density AI workloads.

Liquid cooling misconceptions – Despite common concerns, liquid cooling can reduce operating costs, boost energy efficiency and fit more computing power into less space.

Future-ready solutions – As AI chips grow more powerful, liquid cooling is becoming essential rather than optional, according to LiquidStack.

As data center operators race to meet the demands of AI workloads, they are currently facing mounting obstacles across power infrastructure, cooling capacity and capital planning. In an interview with RCR Wireless News, LiquidStack CEO Joe Capes noted that the current environment is creating major pain points that slow deployment and strain budgets.

“Issues with power generation capacity and grid interconnects are slowing the deployment of AI in some regions,” Capes said, adding that this “time to power” is a critical bottleneck.

And even in a scenario where power is available, supply chain constraints continue to complicate buildouts. “The demand for power and cooling infrastructure is outstripping the industry’s ability to meet the required capacities and lead times,” he said. “This creates a struggle for data center operators to acquire necessary equipment quickly enough to meet scaling project deadlines.”

Legacy facilities rely on air cooling, which Capes said typically handles only up to 20 kW per rack. This, he continued, makes the method “nonviable from a space, energy and cost perspective” for high-density AI workloads.

Liquid cooling is plagued by misconceptions

Despite the growing urgency to scale AI infrastructure, many data center operators remain hesitant to adopt liquid cooling — largely due to persistent misconceptions about its cost, complexity and necessity. One common belief, according to Capes, is that operating expenses for liquid cooling are higher than for traditional systems. “Actually, the opposite is true. It takes far less energy to cool AI chips with liquid compared to the energy required for air cooling with fans,” Capes said. “Its Coefficient of Performance [COP] ratings are significantly higher than those of air cooling, resulting in increased operational cost savings and alignment with sustainability goals.” In sum: depending on the specific technology selected, liquid cooling systems can also require less space than traditional setups.
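To make the COP argument concrete: COP is the ratio of heat removed to the electrical energy the cooling system consumes, so a higher COP directly reduces the electricity bill for the same rack load. The sketch below works through the arithmetic with hypothetical numbers — the rack load and COP values are illustrative assumptions, not figures from LiquidStack or this article.

```python
# Illustrative cooling-energy comparison via Coefficient of Performance (COP).
# COP = heat removed / electrical energy consumed by the cooling system.
# All numeric inputs below are assumptions chosen for illustration only.

def annual_cooling_kwh(rack_load_kw: float, cop: float, hours: float = 8760) -> float:
    """Electricity (kWh/year) the cooling system draws to reject rack_load_kw of heat."""
    return rack_load_kw * hours / cop

rack_kw = 80.0     # assumed high-density AI rack load
air_cop = 3.0      # assumed COP for fan-based air cooling
liquid_cop = 15.0  # assumed COP for liquid cooling

air_kwh = annual_cooling_kwh(rack_kw, air_cop)
liquid_kwh = annual_cooling_kwh(rack_kw, liquid_cop)
print(f"Air cooling:    {air_kwh:,.0f} kWh/yr")
print(f"Liquid cooling: {liquid_kwh:,.0f} kWh/yr")
print(f"Savings:        {air_kwh - liquid_kwh:,.0f} kWh/yr")
```

With these assumed figures, the higher-COP system uses a fifth of the cooling electricity for the same heat load, which is the mechanism behind the operating-cost claim.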

Another misconception is that the upfront costs are too high to justify adoption in the short term. Capes acknowledged that retrofitting does require investment, but the economics are increasingly favorable:

“Retrofitting existing air-cooled data centers comes with a significant upfront investment, which can lead some operators to delay adoption. However, the economic benefits of liquid cooling, particularly for high-density AI workloads, can yield a return on investment in less than two years, depending on factors such as energy costs and workload intensity.”
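The payback claim reduces to a simple ratio: years to recoup equal the retrofit cost divided by annual savings. The sketch below uses entirely hypothetical inputs — the article only states that payback can be under two years depending on energy costs and workload intensity.

```python
# Simple payback-period sketch for a liquid-cooling retrofit.
# All figures are hypothetical assumptions for illustration; none come
# from LiquidStack or the article.

def payback_years(retrofit_cost: float,
                  annual_energy_savings_kwh: float,
                  price_per_kwh: float) -> float:
    """Years to recover the retrofit cost from energy savings alone."""
    return retrofit_cost / (annual_energy_savings_kwh * price_per_kwh)

years = payback_years(
    retrofit_cost=500_000,                # assumed capex
    annual_energy_savings_kwh=2_500_000,  # assumed energy savings
    price_per_kwh=0.12,                   # assumed electricity price, $/kWh
)
print(f"Payback: {years:.1f} years")
```

Under these assumptions the retrofit pays for itself in under two years; with cheaper electricity or lighter workloads the same capex takes longer to recover, which is why Capes hedges the claim on those two factors.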

Complexity is another perceived barrier. Some operators believe liquid cooling systems are difficult to deploy and operate. Liquid cooling has long been a proven solution in high-performance computing, typically deployed at sites below 10 MW. While the broader liquid cooling industry is still evolving and working toward standardization, the technology is mature and delivers significant efficiency gains. “And working with experts can make initial challenges manageable,” Capes added. “We provide comprehensive full lifecycle services, including expert installation, preventive maintenance and continuous support and hands-on training.”

Lastly, Capes addressed the belief that liquid cooling is optional for advanced AI performance. He warned that as chips become more powerful, liquid cooling will no longer be a luxury — it will be a necessity.

Earlier this month, LiquidStack launched a modular, scalable coolant distribution unit with up to 10 MW cooling capacity.

ABOUT AUTHOR

Juan Pedro Tomás
Juan Pedro covers Global Carriers and Global Enterprise IoT. Prior to RCR, Juan Pedro worked for Business News Americas, covering telecoms and IT news in the Latin American markets. He also worked for Telecompaper as their Regional Editor for Latin America and Asia/Pacific. Juan Pedro has also contributed to Latin Trade magazine as the publication's correspondent in Argentina and with political risk consultancy firm Exclusive Analysis, writing reports and providing political and economic information from certain Latin American markets. He has a degree in International Relations and a master's in Journalism, and is married with two kids.