Telcos are seizing the AI moment by transforming old infrastructure into edge assets and positioning themselves as critical players in the AI value chain.
In sum – what to know:
Edge opportunity – telco assets like metro hubs and co/lo sites can host lightweight AI workloads, making old MEC concepts newly relevant.
Ecosystem role – as always, telcos must carve out a place in the AI ecosystem to avoid becoming mere utility pipes for the big hyperscale providers.
Programmable networks – software-defined, on-demand fiber and 5G will enable telcos to support and monetize dynamic AI workloads.
Note: This article is continued from a previous entry, available here, and is taken from a longer editorial report, which is free to download – and available here, or by clicking on the image at the bottom. An attendant webinar on the same topic is available to watch on-demand here.
To an extent, the edge migration to improve the flexibility, scalability, and efficiency of AI systems plays into the hands of telcos. “It is a whole other can of worms – about hosting and selling AI services,” remarks Volker Tegtmeyer, product marketing principal and manager at Red Hat. “Which means understanding enterprise customers.”
Again (see report / previous entries), this is a carry-over from other (familiar) digital-change disciplines, which position telcos as sales channels for all kinds of network and application services.
Their success in different sectors is debatable, but it broadly reflects the complexity of the consultation and integration work needed to solve business problems in each of them. It likely improves where general-purpose cellular is a natural fit, and where industries want lighter-touch treatments (such as in retail, hospitality, logistics, finance), rather than in hard-nosed Industry 4.0 domains, where even (private) cellular is anyone’s game.
But this is by-the-by. The point is that the rush of infrastructure-building to serve varied generative and agentic AI use cases at the edge is making a new virtue of old venues – of the type telcos have anyway, ranging from big regional aggregation hubs and metro exchanges to dusty co/lo racks, and even, potentially, leftover capacity in cell towers. Even customer-premise equipment, deployed by telcos in homes and offices, has a role for lightweight inference and pre-processing.
Stephen Douglas, head of market strategy at Spirent, rejoins. “Operators realize they have to find a role in the wider ecosystem. Because if they don’t act now, they will just be dumb pipes again in five years – with someone else running AI traffic over the top,” he says. “Telcos have physical assets, connections to the grid, the connectivity backbone. Most have good strategies around sustainability. So it makes sense. It is not the only industry that could do it, but it is well positioned.”
Indeed, the whole multi-access edge computing (MEC) concept, trendy five years ago, has seen a resurgence. Fatih Nar, a chief technologist and architect at Red Hat, says: “Before this AI storm, there was this MEC wave – aimed at low-latency edge apps – which never materialized. But the edge is in place because telcos have fiber-to-the-home and -curb, and street boxes with memory storage. So they already went this way with disaggregated cloud RAN. MEC was happening even without an app to justify it.”
Increasingly, this is where the story is – about how to support AI, as much as how to use it. Douglas, recently back from MWC (at writing), reflects: “Until Barcelona, I’d have said it was mostly about using AI. And there are lots of use cases, which show tangible value; quite pragmatic, quite optimistic. But that has changed – to how they are supporting it. There is clear behind-the-scenes activity – about monetizing physical assets, both MEC and networks.”
For its part, Verizon is happy to show its hand, and discuss how it wants to pitch into this gold rush on AI infrastructure – as an explanation of its third “bucket” of AI interests, referenced earlier. Steve Szabo, vice president for technology enablement at Verizon Business, explains: “Everyone is doing things in this space, but everyone knows they can’t do everything. So the question is: how does everybody work together without stepping on each other’s toes?”
Which is the subtext here – that everyone is “encroaching on everyone else’s revenue streams,” he says. “Which just drags out our ability to quickly deliver what the market wants,” he adds. He highlights Verizon’s work to draw lines and collaborate on common goals – to rent data center assets to deploy AI hardware, optimize its backhaul to deliver AI traffic, and offer network management products to design, connect, and control AI workloads.
Google Cloud and Meta are taking network capacity from Verizon Business for AI workloads. The company claims a $1 billion sales funnel for its new AI Connect enterprise offerings. It has just signed with cloud hosting firm Vultr, which will expand its GPU-as-a-service offer via at least one Verizon data center, and “hook directly” into its fiber network. “We will power everything on the backend to get it to the co/los, cloud sites, edges,” says Szabo.
He adds: “We bring a lot to the table: our assets and land; our space, power, and cooling; our networks. We have a ton of traction; our funnel is big in [shared and dedicated] fiber – just to transport these AI workloads. The requests are coming our way… [and] the strategy is very clear: they handle the GPU-level stuff and we get them to where they need to go – and to use more network services as our network programmability advances.”
In short, Verizon is not interested in building the core infrastructure, but sees itself as the “hub and highway” beyond these central AI engine rooms – and has the proof, as well, if its order book is to be believed. As its transport networks become more software-defined (programmable) – mostly across its fiber backbone, but extending to parts of its 5G network – it will be able to instantly adjust bandwidth and routing across edge, metro, and long-haul routes.
This will help the ecosystem to meet spikes in AI demand. As per its “bucket two” discussion earlier, Verizon will afford infrastructure partners and enterprise customers the same privileges in its “low-touch AI management products” – so they can “go from 1Gbps to 10Gbps, to 100Gbps” directly, without picking up the phone to Verizon, and “build the pipes in real time to wherever they want”. Programmable 5G will be layered in as well, over time.
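To make the "low-touch" idea concrete, the kind of self-service bandwidth change described above might look something like the sketch below – a customer scaling a software-defined circuit between tiers via an API call rather than a phone call. This is purely illustrative: the names (`CircuitProfile`, `request_bandwidth_change`) and the tier validation logic are assumptions for the example, not any real Verizon API.

```python
# Hypothetical sketch of a low-touch bandwidth-on-demand request on a
# programmable (software-defined) circuit. Names and logic are
# illustrative, not a real carrier API.

from dataclasses import dataclass

# Bandwidth tiers quoted in the article: 1, 10, 100 Gbps
ALLOWED_TIERS_GBPS = (1, 10, 100)

@dataclass
class CircuitProfile:
    circuit_id: str
    bandwidth_gbps: int

def request_bandwidth_change(circuit: CircuitProfile, target_gbps: int) -> CircuitProfile:
    """Validate a requested tier and return the updated circuit profile.

    In a real programmable network, this step would push configuration to
    an SDN controller (e.g. via NETCONF or a REST API); here we simply
    update the local record to illustrate the self-service flow.
    """
    if target_gbps not in ALLOWED_TIERS_GBPS:
        raise ValueError(f"unsupported tier: {target_gbps} Gbps")
    return CircuitProfile(circuit.circuit_id, target_gbps)

# A customer scales an edge circuit from 1 Gbps to 10 Gbps, no phone call needed
circuit = CircuitProfile("metro-edge-007", 1)
circuit = request_bandwidth_change(circuit, 10)
print(circuit.bandwidth_gbps)  # 10
```

The point of the sketch is the shape of the interaction: the carrier exposes validated, programmable knobs on its transport network, and the enterprise (or infrastructure partner) turns them directly, in near real time.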