
AT&T: Edge computing key to 5G self-driving car, AR/VR use cases

SDN and mobile edge computing ‘go hand-in-hand’

In addition to increased capacity and lower latency than LTE and its variants offer, 5G use cases, particularly latency-sensitive applications like self-driving vehicles and augmented and virtual reality, will require dense, distributed network infrastructure. Data center-based cloud computing has a major role to play in 5G, but mobile edge computing may prove an equally important enabler of next-generation network services.

On an interstate filled with autonomous vehicles, real-time data processing can quite literally mean the difference between a smooth commute and a potentially life-threatening accident. To support that need for instantaneous insight, compute and processing functions need to move from the cloud to the edge.

“Edge computing fulfills the promise of the cloud to transcend the physical constraints of our mobile devices,” Andre Fuetsch, president of AT&T Labs and chief technology officer, said in a prepared statement. “The capabilities of tomorrow’s 5G are the missing link that will make edge computing possible. And few companies have the sheer number of physical locations that AT&T has that are needed to solve the latency dilemma.” Said another way, “The cloud comes to you,” according to AT&T.

In terms of leveraging its existing network footprint, AT&T plans to distribute compute infrastructure to “tens of thousands of central offices, macro towers and small cells that are generally never farther than a few miles from our customers.”
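To see why “a few miles” matters, here is a rough back-of-the-envelope sketch in Python. The distances, the 500-mile regional data center figure, and the fiber propagation speed are illustrative assumptions, not AT&T numbers; the point is simply how quickly distance alone eats into a latency budget.

```python
# Back-of-the-envelope sketch: round-trip propagation delay to an edge site
# "a few miles" away versus an assumed regional cloud data center.
# All distances are illustrative assumptions.

SPEED_IN_FIBER_M_PER_S = 2.0e8   # light in fiber travels at roughly 2/3 of c
MILES_TO_METERS = 1609.34

def round_trip_delay_ms(distance_miles: float) -> float:
    """Round-trip propagation delay over fiber for a given one-way distance."""
    one_way_s = (distance_miles * MILES_TO_METERS) / SPEED_IN_FIBER_M_PER_S
    return 2 * one_way_s * 1000  # convert seconds to milliseconds

edge_ms = round_trip_delay_ms(5)     # central office or small cell a few miles away
cloud_ms = round_trip_delay_ms(500)  # assumed distance to a regional data center

print(f"edge:  {edge_ms:.3f} ms round trip")   # ~0.08 ms
print(f"cloud: {cloud_ms:.3f} ms round trip")  # ~8 ms
# Propagation is only part of the story; routing, queuing and processing add
# more on top, which is why proximity matters for latency-sensitive services.
```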

Edge computing to solve the AR/VR latency problem

This is something Marcus Weldon, president of Nokia Bell Labs and Nokia CTO, talks about frequently. Last year at the SCTE-ISBE Cable-Tec Expo, he discussed how futuristic applications will require a new, more distributed network architecture.

“One of the reasons we’re excited about the future is we think the future is nothing like today,” he said. “We’re going to build a new network architecture. The point of the future is still about entertaining people, but it’s equally about changing our world by instrumenting everything. You can automate all mundane tasks. Fundamentally…it’s to create time. My first task is to create time. Local is going to become more important…which is power to the cable industry, power to the local providers. Local is where the power will be in the future. The trick is to federate…to look pseudo-global.”

Specific to AR/VR, and not to be overly reductive, latency is what can make you nauseous: your brain processes imagery faster than the network can serve it to you. To truly take enterprise and consumer AR/VR experiences mainstream, you’ve got to solve for latency.
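A minimal sketch of that latency budget, under assumed figures: the roughly 20 ms motion-to-photon comfort threshold and the per-stage timings below are commonly cited illustrative values, not numbers from AT&T or Nokia.

```python
# Illustrative sketch: does a remote-rendering pipeline fit inside an assumed
# AR/VR motion-to-photon budget? All numbers here are assumptions.

MOTION_TO_PHOTON_BUDGET_MS = 20.0  # assumed comfort threshold for VR

def fits_budget(render_ms: float, network_rtt_ms: float, display_ms: float) -> bool:
    """True if rendering + network round trip + display stays within the budget."""
    total_ms = render_ms + network_rtt_ms + display_ms
    return total_ms <= MOTION_TO_PHOTON_BUDGET_MS

# Rendering in a distant cloud with a 40 ms round trip blows the budget...
print(fits_budget(render_ms=7, network_rtt_ms=40, display_ms=5))  # False
# ...while the same workload served from a nearby edge node fits comfortably.
print(fits_budget(render_ms=7, network_rtt_ms=4, display_ms=5))   # True
```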

The role of software-defined networking

AT&T described a “hand-in-hand” relationship between edge computing and software-defined networking (SDN). The carrier internally developed the ECOMP virtualization framework, then open sourced it through the Linux Foundation, where it was combined with the Huawei-backed OPEN-O project under the new moniker ONAP. AT&T said this week that its goal is to virtualize 75% of network functions by 2020. As for benchmarking, company representatives said AT&T hit 34% software control at the end of 2016, “and we aim to cross the halfway mark this year, reaching 55%.”


ABOUT AUTHOR

Sean Kinney, Editor in Chief
Sean focuses on multiple subject areas including 5G, Open RAN, hybrid cloud, edge computing, and Industry 4.0. He also hosts Arden Media's podcast Will 5G Change the World? Prior to his work at RCR, Sean studied journalism and literature at the University of Mississippi then spent six years based in Key West, Florida, working as a reporter for the Miami Herald Media Company. He currently lives in Fayetteville, Arkansas.