Global, programmable, and dense – from the cloud to the edge: Verizon Business talked at MWC about how it sees the new AI stack evolving for telcos, and why its investments in backbone fiber, metro access, and private networks will link cloud models and inference workloads to real-world machines.
In sum – what to know:
Layer cake – hyperscale training in centralized clouds, orchestration and transport across metro and long-haul fiber, and inference at the edge on public and private 5G.
Fiber diet – dense backbone infrastructure enables both hyperscale interconnect and enterprise connectivity, from WANs to private networks and public network slices.
AI platform – for enterprise AI, operators must deliver programmable, scalable connectivity that can dynamically respond to application demands in real time.
The AI economy is built on “layer-cake” infrastructure, says Verizon Business, where the middle layer, between the models and robots, brings connectivity and control. It is a high-fiber recipe, clearly, and the US firm is upgrading its backbone network as part of its AI Connect project to deliver programmability for new AI workloads. Such dynamism is, as yet, smarter and simpler in fiber systems than in mobile networks – so long as the cables are already in the ground. But AI Connect covers its whole AI proposition, and for ‘connectivity’ read ‘orchestration’, in fiber and 5G.
This is the gist of a conversation with Daniel Lawson, senior vice president of global solutions at Verizon Business, at MWC. What else – to put up-top, as a hits collection? “Private networks are here to stay,” he says – in case there is any doubt. Ignore the noise about Nokia, in other words; the industry knows the truth. In between the private edge and the public cloud, in the access network, there is speculative work to serve new inference workloads, provisionally going from sub-10 milliseconds on-premise to 20-80 milliseconds in the metro edge – and then “everything else”.
There is a huge opportunity, plus a little serendipity, for telcos in all of this. AI is the “gas on the fire” for enterprise services – and for the whole 5G story, actually, which just might mean the telecoms industry delivers on its own hype and promise for once. But just to pause…

Does any telco speak so well – so clearly, logically, confidently – about the multi-layered networking game in AI infrastructure right now as Verizon Business? Maybe, but not to RCR, anyway. Readers will know that, for good or bad, the RCR view of this market is skewed a little differently, from bottom to top, private edge to the public cloud. This is because RCR has shuffled the pack to capture more of the play, and the old press hound (me) on the IoT trail, which forked into private networks years ago, is now tracking everything south of the new ‘east-west’ AI fiber line.
Which sounds like random background, but points another way into the conversation with Verizon Business at MWC about just that – about how telcos might network the new AI economy between big data centers in the cloud, going via longhaul international and national fiber infrastructure, and up and down on fiber and mobile access systems – right into private networks, as required, on enterprise premises. Verizon Business talked about just this during a panel session at PTC in January; it talked about the same at an ‘innovation session’ in Paris last October.
Of all tier-one telcos, Verizon Business seems most at ease with (or articulate about) its place in the new AI world order – that, somehow, the story has spun its way, making a virtue of its stock-in-trade and making good on its older bets. It has clarified its investment strategy and commercial direction. The PTC presentation, from January, is replayed to Lawson at MWC, mixed-in with parts of the Paris talk about its fiber footprint (‘top five in the world’, as RCR recalls), and mixed-up with references to the bigger AI interconnect bonanza (see recent coverage of Lumen, Ciena, Cisco, Nokia).
He responds: “Yeah, that assessment is right on. And I’ve had conversations about it here – looking at it like a layer cake, where the top layer is the AI, whether large language or specialized models, the middle is the orchestration to connect the physical assets, and the bottom is the robotics and vehicles, and all sorts… The model sees [the asset], the agent directs the action, and the orchestration spins up the bandwidth, or whatever. All of those things play together, but it doesn’t happen without the metro and long-haul fiber, and it doesn’t happen without the last-mile.”
He goes on: “The metro and long-haul connect the central model to the inference at the edge, and that last-mile connects to the facility [on public 5G or fiber] or within the facility with private 5G. Most instances of physical AI are in these RF-challenged environments – hospitals, factories, warehouses, all those types of things. So there is a natural fit for private 5G, as well, to enable these physical AI use cases.” We could almost leave it there; point made. But the conversation with Lawson is good – and goes way further, and is worth reporting in full.
So, given time constraints at RCR, the rest is submitted below as a full Q&A transcript. It covers the gamut: from the scaling of fiber infrastructure for hyperscale training workloads, the gradual shift towards the ‘network’ edge with inference tasks, right down into the private networking domain. As above, Lawson discusses how Verizon Business thinks about latency tiers for AI inference workloads – from on-prem deployments to metro edge and central cloud. But ultimately, the conversation is about network programmability for AI, across that whole middle orchestration layer.
…
Just on private 5G, quickly: does the market need defending at all? There is a narrative doing the rounds, partly because of Nokia’s decision to exit, that it is niche, difficult, hard to make work – certainly for big telecoms companies. Or does the industry know its proper place and value by now?
“The industry is much more comfortable with the fact that private networks are here to stay. Our view right now is around how to scale. We’ve seen significant developments in the device ecosystem, and just from the top – from the operators and vendors, and whatnot – in terms of flexibility and capability, and things of that nature. And we’re starting to see neutral host [adoption], as well, which is a great anchor for everyone that comes into a facility – very important in a hospital, say – and a way also to host the private infrastructure to do robotic surgery, or AMRs or inventory scans, things like that. So it is a core part of our strategy, both domestically and abroad.”
And for Verizon Business, at group level: the fiber interconnect, the ducts and cabling that Verizon Business was talking about at PTC in relation to the One Fiber project, and metro-level densification – is there a piece above that, to interconnect data centers, or is that the same as the metro? Because it is unclear to me how much of that new AI interconnect project is going to traditional telcos, and how much is hyperscaler-owned.
“They’re related. But it’s all the same. It’s an infrastructure build.”
So is that interconnect piece the immediate task for telcos in the AI infrastructure build-out? I mean, we’re at a mobile show, talking about 5G and 6G; but is the AI infrastructure story really at that level at the moment – so far as networks go? And then the inferencing at the edge via the macro access network – is that 18 months out? Is that the next phase? Or is all of this coming together now, at the same time?
“I feel like it is coming together at the same time. The [One Fiber] work has been going for a number of years to densify the network – initially, just for operational stability and control of the backhaul infrastructure that runs the mobile network. But at the same time, as we’ve seen this explosion of data, and how AI impacts data flows across the network – I mean, fiber in general is multipurpose, right? So it allows us to do much more there – in terms of lit services; 100G, 400G, and above; and also dark fiber. And then you layer the Frontier acquisition on top, which gives even more density across the lower 48 states in the US – all of that is a great play on the consumer side, but it also adds to our strategy on the B2B side as well, because we have more assets to deliver those connections at speed.”
And just fit the whole proposition together for me – across your longhaul fiber assets, your fiber and mobile access networks. How does that all come together in service of enterprises in the AI era?
“Think of it like an enterprise flavor of the convergence play in the consumer market. Fiber is a key enabler of basic enterprise inter-connectivity. But it is also key for backhaul of the radio access network, as well as for private deployments and slices of the public macro network as well. All of that is based on application requirements to operate a certain way – the ability to reserve resources and guarantee performance in the network. All that comes back to super high-bandwidth backbone and fiber. So it is essential to who we are. We were the first to bring fiber to the home in the US. We still operate a global backbone that allows us to have these conversations at scale.
“And going back to private networks: our first one was with Associated British Ports at the port of Southampton in the UK; I just met with the folks at Thames Freeport this week as well, looking for what’s next; we are doing a lot of work in Germany with automakers and pharmaceutical manufacturers, solving problems within their plants. But all of that also connects to a global WAN, which is built on the back of that fiber backbone. Fundamentally, it is just connectivity, but there are lots of flavors of it – and on prem, whether it is private 5G or Wi-Fi or wired, it all runs on that backbone. In the end, it’s just about how to connect the robot, the camera, the whatever – to the model. But fiber is a critical enabler for the AI revolution.”
Has all of this come around to the telco worldview, a little bit – the whole AI thing? Is that fair? Because by accident or design, there was the One Fiber project to densify the RAN backhaul, and the whole MEC story, which never happened, but kind of works for inference. And I don’t know if that has been mothballed and reinvented. But that distributed fiber and 5G architecture, which you’re upgrading and enabling, plays well.
“With anything, it is a bit of both. Certainly, we did not see the uptake of edge compute that we thought we would five years ago. It was a solution looking for a problem to an extent. And there is no way we could have predicted where we would be now, five years hence, from an AI perspective – because people weren’t even really talking about it at that point. I do think you’re seeing, if not a renaissance, then some acknowledgement that there are going to be use cases that will require sub-20 or sub-10 millisecond latency. And that’s a physics problem, which moves the compute closer to the application – because you can’t outrun the speed of light.”
Where does it have to be?
“Well, so we’re looking at the edge. But like anything, right, there are different tools for different jobs. You certainly will see AI on devices, and we are seeing that already. In large enterprise deployments, we are looking at it across three paradigms: one is on-prem, so sub-10 millisecond latency; one is the telco edge and cloud edge, the metro and the geo (urban and regional) edge, in that 20-80 millisecond range; and then there’s everything else, which is just in the macro cloud. And so it depends on what you’re trying to do.”
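The three placement paradigms Lawson outlines – on-prem at sub-10 milliseconds, the metro/geo edge at 20-80 milliseconds, and everything else in the central cloud – amount to a simple latency-budget decision. A minimal sketch, purely for illustration (the function and the assumed 200 ms cloud figure are hypothetical, not anything Verizon has published):

```python
# Tiers ordered from farthest/cheapest to nearest/most constrained, each with
# an assumed worst-case round-trip latency in milliseconds. The 10 ms and
# 80 ms figures follow the interview; the 200 ms cloud figure is an assumption.
TIERS = [
    ("central-cloud", 200.0),  # "everything else" – latency-tolerant workloads
    ("metro-edge", 80.0),      # telco/cloud edge, the 20-80 ms range
    ("on-prem", 10.0),         # private 5G / local compute, sub-10 ms
]

def place_inference(budget_ms: float) -> str:
    """Return the farthest tier whose worst-case latency still fits the budget."""
    for tier, worst_case in TIERS:
        if worst_case <= budget_ms:
            return tier
    raise ValueError(f"No tier can meet a {budget_ms} ms budget")

print(place_inference(500))  # latency-tolerant -> central-cloud
print(place_inference(100))  # regional -> metro-edge
print(place_inference(50))   # tight budget -> on-prem
```

The design choice here is to prefer the farthest tier that satisfies the requirement, since centralized compute is generally cheaper – which matches the “different tools for different jobs” framing.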
And do we know that yet – what we are trying to do? Is that mapped already, or is this a speculative discipline to an extent – just to have it all covered and trust it will arrive? Do we know how much of those enterprise workloads and that inferencing is going on-premise, for which applications and industries, and how much is going in the metro edge – outside of the central cloud? Because at the moment we are still training the models in the cloud – unless it is all on-prem, on private 5G. So it is a future exercise, right?
“To a large extent, it is. Most of what is happening in the AI space, and why we’ve seen such a boom in wave services and dark fiber, is just the massive appetite for bandwidth in these large data centers for training – how to bring the data in. As we see inferencing shift to the edge, that is when those use cases will develop. We are seeing use cases already, but they are generally in small pockets. I was at a conference last week talking about advances in robotic surgery, and there are hospitals and hospital systems doing this today, and pushing the envelope on the latency and jitter they can withstand. But is there robotic surgery in every hospital? No, not even close.”
And then there is the device edge, as well, right? And what is capable in terms of edge AI processing on today’s chipsets and devices shifts all the time – and presumably the trend will be to unburden the network at every turn?
“That’s right. And per that layer cake framework, the network has to operate more like a platform than it ever has in the telco space. Go back 10 or 15 years, it was this monolithic thing, designed to [run at] the maximum. And if you went over, it was a six-month process to upgrade or whatever. Now, the work is about embedding agility, scalability, security into the platform itself – so that when a model detects a defect, via a computer vision application running quality assurance on a manufacturing line, the platform lets you zoom in, pump the resolution to 8K, and open a massive upload channel back to it – to figure out what’s going on, whether it’s with the line or machinery or the raw materials. The network has to be able to respond in real time. Snap, snap. You have to build that capability and flexibility into the network infrastructure. Some of that is in the fiber build, and some of it is about network programmability via our AI Connect portfolio.”
Is there more programmability in the fiber network or the mobile network? How much of that AI Connect portfolio is about the mobile piece?
“It’s really across the whole of it, but that core fiber transport – around lit- and dark-network programmability, and where to put compute resources – is what is really driving it.”
So, can you do more in the fiber network at the moment than you can in the mobile network?
“Oh, absolutely. Because you’re getting significantly higher throughput. And to your point, much of what is being done now is just to train the models – and so you’ve got dense, dense fiber into these data centers so they can continue to bring in data and train.”
On the mobile piece: there’s lots of talk at MWC about programmability and autonomous systems, of course. I was with Microsoft just now, which said, ‘yeah, they’re talking about level four, but they’re really down at level two or three’ – on the TM Forum rating. Which sounds right. But the same kind of story is quite well developed in fiber in terms of autonomy, dynamism, control – in terms of the type of agility you mentioned?
“It is well understood in terms of the necessity of it. The journey to full programmability is ongoing. Across both, right. In another way, it is easier to build that level of programmability into the wireless network because you’re not dealing with physical cables. If I’m bringing new fiber into a data center, then it is a construction project. Which is what One Fiber has been. But once it is there, you have tunable optics, an API control plane – so you might have one wavelength at 100G, get a new dump of data, and turn up three more so you’ve got 400G of throughput – or upgrade the link from 100G to 400G. Once you have the infrastructure in place, then you can absolutely do that.
“And when you have that, you can also create a plan that includes slicing, say, for application traffic of a certain type – for a first responder or a Zoom call, or whatever else. The network recognizes what it is, opens up the slice, and boom… I mean, you are still dealing with a physical medium, but it is radio waves, and not fiber. They are at different stages of the journey, but it is exactly why private networks have made such an impact within the four [enterprise] walls as well – because you don’t have to run fiber to all these PLCs, and because you can’t run fiber to robots or AMRs. Just because they would drive around, and pull the cable out. Which is why the flexibility and scalability of private 5G is such a dynamic use case within the four walls. But we’re doing exactly the same in the macro as well.”
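Lawson’s wavelength example – one 100G wavelength lit, then three more turned up via an API control plane for 400G of throughput – can be sketched as follows. The `OpticalLink` class is hypothetical, standing in for a vendor control-plane API rather than any real product interface:

```python
# Illustrative sketch of API-driven wavelength turn-up on tunable optics,
# assuming 100G per wavelength as in the interview.
class OpticalLink:
    def __init__(self, wave_gbps: int = 100):
        self.wave_gbps = wave_gbps  # capacity of each tunable-optic wavelength
        self.active_waves = 0

    def turn_up(self, count: int = 1) -> int:
        """Activate additional wavelengths; return total link capacity in Gbps."""
        self.active_waves += count
        return self.capacity_gbps()

    def capacity_gbps(self) -> int:
        return self.active_waves * self.wave_gbps

link = OpticalLink()
link.turn_up(1)          # first 100G wavelength lit
total = link.turn_up(3)  # new dump of data arrives: turn up three more
print(total)             # 400 Gbps, as in Lawson's example
```

The physical construction happens once; after that, capacity changes are software calls – which is the asymmetry Lawson draws between laying fiber and programming it.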
So within all of that, what is the main message for Verizon Business, at the moment – at shows like this, in meetings with partners, in business with customers? In terms of networks for AI and AI for networks, and how to monetize it all in service of enterprises, ultimately? Where’s the focus?
“Well, we think about AI in three buckets: one is applied, about how we’re using it to run more efficiently, reduce cognitive load, all those types of things; two is embedded, about how we’re building it into the product set so the products are smarter and create more customized experiences for a financial services customer versus a manufacturing customer, and so on and so forth; and three is enabled, about how to help customers along the AI journey, which is where dense fiber connectivity, private networks, and all the services that wrap around come into play.”
Which is moving fastest? Which is the one that is talked about most in the boardroom – where the roadmap decisions have to go faster?
“Probably the AI Connect piece – because, just like with fiber, it is a massive construction project, which requires capital and time. But we are very bullish because we are not just building for single use cases, but for our own backbone for our customers to use, and ultimately for the proliferation of these inferencing models. But the Verizon message is very simple: a programmable, scalable infrastructure that allows you to connect the academic to the physical. Which may be robotics or something in the consumer space, or any number of things. But without connectivity – secure, stable, scalable, all those adjectives – it is more like an academic exercise.”
And at MWC, specifically: you don’t have a stand, so far as I know, and I have hardly attended any keynotes so I have not seen you there. So what are the meetings about – when you are together with the whole ecosystem? What are you looking to solve?
“A lot of it is about alignment. Because the companies here are key to how we build and operate networks, and how we interface with customers. But the conversation spans the portfolio, right? We’ve met with folks to talk about IoT – because we can support the same types of use cases we’re enabling in the private 5G network in the macro network. We are talking with the vendors, of course, because we use them in the public network, and also for private deployments – and so we are making sure we’ve got alignment in terms of our go-to-market strategy. We are meeting with chipset makers – just because, you know, you need the chip to make the connection. We are talking with the folks that help us to orchestrate and integrate into customer environments – so we remove some of the friction from there – to turn services up and down, those types of things. The hyperscalers, too, of course.”
Are they the same conversations as last year?
“Same conversations, different momentum. We are starting to go parabolic, almost – in terms of how much things are changing. A lot of things we were talking about last year as theoretical are now happening in the real world today. So similar conversations, but lots more momentum.”
Regardless of the general AI hype, which is different altogether, it feels like maybe AI might help telecoms deliver on its own crazy hype – for once. Is that right? I mean, the 5G story, like the IoT story, was so hyped and confused, and it was only in private enterprise edge set-ups, and often in 4G ones, that we saw any of the more advanced capabilities and applications. But then enterprises are slow and complicated, by nature. But do you think that AI, now, is driving commercial interest in the kinds of Release 17/18 capabilities that got the telco industry so excited five years ago? Does it feel like the 5G hype might be delivered on, at last?
“Yes. I mean, we knew from the way the standard was built, and through the evolution of 4G, that 5G was more business oriented than consumer oriented – although it has a great impact on customer experience on the consumer side, as well. But yes, we are now seeing some of these highly sophisticated use cases, which require programmability and scalability, come to fruition – and AI has been the gas on the fire. So yeah, we are trending towards all those things we thought about five years ago when 5G started to roll out – we’re actually now starting to see those.”
And from a Verizon point of view, and how your piece is strategic for the future – I mean, there’s the branding and marketing of the consumer piece, but the real stuff, I would say, and the monetization opportunity with the ability to deliver different connectivity on-demand in different places, edge to cloud, with private networks, slices, public 5G, all your fiber stuff – is that hopefully-transformative monetization opportunity for telcos suddenly within reach?
“Absolutely. Look, Verizon is a team sport, right? The investments we’re making are to delight our consumer customers and to delight our B2B and public sector customers as well. That’s the benefit and value of building these composable, flexible networks, which are fiber based – because we can do all these different things over one common infrastructure. And we’re sticking to the core of what we do best, which is connecting things and people – to the things and people they need to be connected to. And so yeah, the monetization opportunity is super attractive for us because I can build once and use in lots of different ways – the enterprise use cases and the consumer experience as well.”
