SoftBank said that the demonstration results show that high-performance AI models and GPUs are indispensable for achieving 5G-Advanced and 6G performance
In sum – what to know:
30% boost in 5G uplink throughput – Transformer-based AI improved real-world uplink performance compared to conventional methods, showing stronger results than earlier Convolutional Neural Network (CNN)-based research.
Latency cut below 1 ms – Processing time dropped to 338 microseconds, meeting strict real-time 5G needs while outperforming the CNN model by 26%.
Simulation doubled downlink gains – SRS prediction simulations showed throughput improvements of up to 31% for moving devices, more than doubling the gains achieved with simpler AI models.
Japanese carrier SoftBank has developed a new AI architecture using a Transformer model for radio access networks (RAN), the telco said in a release.
The telco noted that recent tests showed the system improved 5G uplink throughput by about 30% and reduced processing delays to well below the one-millisecond target for real-time communications.
The research is part of SoftBank’s work on “AI for RAN,” which applies AI to wireless signal processing. The company emphasized that the technology marks a step toward practical use of AI-RAN in live networks.
In real-world testing, the Transformer-based approach ran on GPUs and increased uplink throughput by 8% compared to SoftBank’s earlier Convolutional Neural Network (CNN) model, and by about 30% compared to conventional methods without AI. It also cut latency to an average of 338 microseconds, about 26% faster than the CNN model, SoftBank said.
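SoftBank has not published the model’s design or the exact stage of uplink processing it replaces, so the sketch below is purely illustrative of the real-time constraint the company cites: it times one forward pass of a small Transformer encoder on a GPU and compares the result with the one-millisecond budget. The layer sizes, sequence length, and input shape are all assumptions, not SoftBank’s architecture.

```python
# Illustrative latency check against a sub-millisecond budget.
# Model dimensions and input shape are assumptions for illustration only.
import time
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"

# Small Transformer encoder standing in for an uplink signal-processing model.
layer = nn.TransformerEncoderLayer(d_model=64, nhead=4, dim_feedforward=128,
                                   batch_first=True)
model = nn.TransformerEncoder(layer, num_layers=2).to(device).eval()

# One slot of received uplink samples: (batch, OFDM symbols, feature dim).
x = torch.randn(1, 14, 64, device=device)

with torch.no_grad():
    for _ in range(20):                     # warm-up runs
        model(x)
    if device == "cuda":
        torch.cuda.synchronize()
    runs = 100
    t0 = time.perf_counter()
    for _ in range(runs):
        model(x)
    if device == "cuda":
        torch.cuda.synchronize()
    avg_us = (time.perf_counter() - t0) / runs * 1e6

print(f"average inference latency: {avg_us:.0f} microseconds (budget: <1000)")
```

Whether a given model fits the budget depends heavily on the GPU, batch size, and any inference-time optimizations, which is why a timing harness like this is typically part of evaluating AI-RAN workloads.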
SoftBank also simulated use of the model for Sounding Reference Signal (SRS) prediction, a process that helps base stations assign beams to devices. The new model more than doubled throughput improvements compared to earlier tests with simpler AI models, boosting downlink speeds by 29% for devices moving at 80 km/h and 31% at 40 km/h.
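SoftBank has not detailed its SRS-prediction model, but the general idea can be illustrated: forecast the user’s channel from recent SRS measurements and pick the beam that best matches the predicted channel rather than an outdated one. In the minimal sketch below, the predictor is a simple linear-extrapolation stand-in, and the antenna count and DFT codebook are assumptions made for illustration.

```python
# Illustrative SRS-prediction step: predict the channel, then select the beam
# whose codebook vector best matches the predicted (not the stale) channel.
import numpy as np

n_antennas, n_beams = 8, 16
rng = np.random.default_rng(0)

# DFT beam codebook: each column is one candidate beamforming vector.
angles = np.arange(n_beams) / n_beams
codebook = np.exp(2j * np.pi * np.outer(np.arange(n_antennas), angles))
codebook /= np.sqrt(n_antennas)

# History of SRS channel estimates for a moving user (oldest to newest).
srs_history = (rng.standard_normal((4, n_antennas))
               + 1j * rng.standard_normal((4, n_antennas)))

# Stand-in predictor: linear extrapolation from the last two measurements.
# SoftBank's approach uses an AI model in place of this step.
predicted = srs_history[-1] + (srs_history[-1] - srs_history[-2])

# Beam selection: maximize |h^H w| over the codebook.
gains = np.abs(predicted.conj() @ codebook)
best_beam = int(np.argmax(gains))
print(f"selected beam index: {best_beam}, predicted gain: {gains[best_beam]:.2f}")
```

The benefit grows with device speed because the channel measured at the last SRS becomes stale more quickly, which is consistent with the larger gains SoftBank reports for moving devices.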
“The most significant technical challenge for the practical application of ‘AI for RAN’ is to further improve communication quality using high-performance AI models while operating under the real-time processing constraint of less than one millisecond. SoftBank addressed this by developing a lightweight and highly efficient Transformer-based architecture that focuses only on essential processes, achieving both low latency and maximum AI performance,” the carrier said.
“The demonstration results show that high-performance AI models like Transformer and the GPUs that run them are indispensable for achieving the high communication performance required in the 5G-Advanced and 6G eras. Furthermore, an AI-RAN that controls the RAN on GPUs allows for continuous performance upgrades through software updates as more advanced AI models emerge, even after the hardware has been deployed. This will enable telecommunication carriers to improve the efficiency of their capital expenditures and maximize value,” the telco added.
SoftBank also said it will accelerate the commercialization of the technologies validated in this demonstration.
SoftBank recently deployed quantum computing technology in a trial of its 5G Radio Access Network (RAN). In a Tokyo proof-of-concept, the Japanese operator used an Ising machine, a quantum system designed for combinatorial optimization, to recalibrate base station carrier-aggregation (CA) settings, achieving a 10% increase in downlink speeds and up to 50% growth in data transmission capacity.
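SoftBank has not published the optimization problem behind that trial, but carrier-aggregation reconfiguration is the kind of combinatorial assignment an Ising machine targets. The toy sketch below casts a simplified version as energy minimization and solves it with simulated annealing as a classical stand-in for the quantum hardware; the objective, user counts, and gain values are invented for illustration.

```python
# Toy carrier-aggregation assignment as energy minimization, solved with
# simulated annealing (a classical stand-in for an Ising machine).
import math
import random

random.seed(1)
n_users, n_carriers = 12, 3
# gain[u][c]: estimated throughput gain if user u aggregates carrier c (made up).
gain = [[random.uniform(0.5, 2.0) for _ in range(n_carriers)]
        for _ in range(n_users)]

def energy(assign):
    """Lower is better: negative total gain plus a quadratic load penalty."""
    load = [0] * n_carriers
    total = 0.0
    for u, c in enumerate(assign):
        total += gain[u][c]
        load[c] += 1
    return -total + 0.1 * sum(l * l for l in load)

state = [random.randrange(n_carriers) for _ in range(n_users)]
cur_e = energy(state)
best, best_e = state[:], cur_e
temp = 2.0
for _ in range(5000):
    u = random.randrange(n_users)
    old = state[u]
    state[u] = random.randrange(n_carriers)   # propose a new carrier for one user
    new_e = energy(state)
    if new_e < cur_e or random.random() < math.exp(-(new_e - cur_e) / temp):
        cur_e = new_e
        if cur_e < best_e:
            best, best_e = state[:], cur_e
    else:
        state[u] = old                         # reject the move
    temp *= 0.999                              # cool down

print("best assignment:", best, "energy:", round(best_e, 2))
```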