
Automating 5G and oRAN testing

4G revolutionized data throughput and latency, offering simplified architecture and scalable networks; 5G propels us even further. With latency below 5 ms and remarkably higher speeds, it unlocks real-time processing, enabling AI/ML in diverse scenarios: intelligent transportation for smart cities, energy-efficient homes, secure autonomous vehicles, immersive augmented reality and Industry 4.0/5.0 innovations.

However, realizing 5G’s potential demands a shift. Dynamic open architectures are needed for the radio access network (RAN), contrasting with legacy networks that were static and homogenous. This shift enables power reduction, self-healing networks, AI adaptations and customized radio deployments. The promise of open radio access network (ORAN) lies in cost reduction and revenue generation, though the complexity mandates thorough testing and real-time validation, crucial in a multi-vendor, disaggregated ecosystem.

Simultaneously, the 5G Core adopts a cloud-based microservices structure, decoupling hardware from software. Each core function becomes a plug-and-play service, facilitating third-party vendors to provide solutions in a diverse multi-vendor approach. A parallel transformation unfolds in the RAN domain, as open RAN dismantles barriers for purpose specific RANs. Its chief advantage is fostering an open, multi-vendor radio access ecosystem, empowering operators to diversify supply chains and tailor innovative solutions while driving cost efficiency through competition and resource sharing.

However, these prospects are met with challenges — multi-vendor compatibility, standards adherence, and, crucially, real-time processing and data volume performance. Rigorous testing across levels is imperative to meet functional, security, scalability and resilience demands. This calls for frequent, stringent testing and validation, exceeding those of prior network standards.

The list below provides a streamlined version of the general steps required in validating an ORAN network. Even simplified, this will require sophisticated network tools and significant effort to set up and tear down environments. 

  1. Planning and preparation:
    • Define objectives — functionality, performance, scalability, interoperability, security
    • Create scenarios — diverse test cases, edge scenarios for comprehensive coverage
    • Gather resources — hardware, software, tools for effective testing
  2. Functional testing:
    • Validate RAN functionality — radio resource management, handover, cell selection, mobility
    • Verify RIC functionality — RIC-RAN interaction focusing on E2/O2/A1 interface compliance
    • xApps — validate behavior in predefined scenarios, e.g. QoS optimization, load balancing, energy saving, network slicing control
    • rApps — validate non-real-time behaviors such as policy optimization or network healing
  3. Performance testing:
    • Capacity and throughput — user loads, traffic handling capabilities
    • Latency and delay — response times of xApps, adherence to performance thresholds
    • Convergence time for machine learning algorithms employed in rApps
    • Quality of service (QoS) — service prioritization, predefined policies
  4. Interoperability testing:
    • RAN-RIC interface — effective communication, control behavior
    • Network element interactions — seamless engagement with core networks and orchestrators, plus aggregated behavior and interactions with xApps/rApps, especially for third-party components across versions
  5. Security testing:
    • Vulnerability assessment — identifying weaknesses, susceptibility to breaches
    • Authentication and authorization — proper enforcement, preventing unauthorized access
  6. Scalability testing:
    • Evaluate scalability — network load, user count, traffic volume handling
  7. Resilience and redundancy testing:
    • Assess resilience — recovery from failures, effectiveness of redundancy
  8. Regression testing:
    • Periodic validation — assessing the impact of new updates and changes on existing functionality
  9. Negative testing:
    • Evaluate behavior induced by protocol errors or invalid message parameters
  10. Documentation and analysis:
    • Record results — document outcomes, identified defects
    • Analyze outcomes — evaluate against criteria, performance targets
  11. Test validation and sign-off:
    • Verify compliance — RAN, RIC pass required tests, meet acceptance criteria
    • Final report — comprehensive overview, recommendations for improvement
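The phases above can be sketched as an automated pipeline that runs each stage, collects defects and gates sign-off on the combined result. This is an illustrative skeleton only — the phase names and the stub checks are assumptions for the sketch, not a real ORAN test harness.

```python
# Hypothetical sketch of the validation phases as an automated pipeline.
# run_phase() is a stub; a real lab would drive test tools against the SUT.
from dataclasses import dataclass, field

@dataclass
class PhaseResult:
    phase: str
    passed: bool
    defects: list = field(default_factory=list)

PHASES = [
    "planning", "functional", "performance", "interoperability",
    "security", "scalability", "resilience", "regression", "negative",
]

def run_phase(phase: str) -> PhaseResult:
    # Stub check: in this sketch every phase trivially passes.
    return PhaseResult(phase=phase, passed=True)

def validate_oran(phases=PHASES) -> dict:
    results = [run_phase(p) for p in phases]
    # Sign-off (step 11) requires every phase to meet its acceptance criteria.
    return {
        "results": results,
        "sign_off": all(r.passed for r in results),
        "defects": [d for r in results for d in r.defects],
    }

report = validate_oran()
print(report["sign_off"])  # True only when every phase passes
```

In practice each phase would fan out into the individual test cases listed above, and the defect list would feed the documentation and analysis step.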

While there is one logical architecture for ORAN solutions, there are multiple implementation and deployment scenarios. A complete testing solution must be flexible enough to adapt to those scenarios so that each one can be instantiated, configured, engineered and tested. The configurations range from a fully disaggregated set of functions, as defined by the O-RAN Alliance, to any number of bundles of CU/DU/RU with near-real-time and non-real-time RIC and orchestration.
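To make the range of configurations concrete, the sketch below enumerates a few deployment scenarios a test environment would have to instantiate, from fully disaggregated functions to bundled variants. The scenario and bundle names are assumptions for illustration, not O-RAN Alliance terminology.

```python
# Illustrative enumeration of ORAN deployment scenarios; names are assumed.
FUNCTIONS = ["RU", "DU", "CU", "near-RT RIC", "non-RT RIC", "SMO"]

def scenarios():
    # Fully disaggregated: every function deployed and tested separately.
    yield {"name": "disaggregated", "bundles": [[f] for f in FUNCTIONS]}
    # A common bundled variant: CU and DU co-deployed.
    yield {"name": "cu-du-bundle",
           "bundles": [["RU"], ["DU", "CU"], ["near-RT RIC"],
                       ["non-RT RIC", "SMO"]]}
    # All RAN functions in one box, RICs and orchestration separate.
    yield {"name": "all-in-one-ran",
           "bundles": [["RU", "DU", "CU"], ["near-RT RIC"],
                       ["non-RT RIC", "SMO"]]}

for s in scenarios():
    print(s["name"], len(s["bundles"]))
```

A flexible test environment treats each scenario as a deployable topology: the same test suites run against every bundling so results can be compared across configurations.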

Luxoft’s Software Defined Lab (SDL) provides and supports an end-to-end testing environment that can span all types of solution infrastructures including:

  • Cloud solutions — this includes public cloud, private cloud, hybrid cloud and multi-cloud solutions
  • On-prem solutions — including virtualized and non-virtualized data centers, COTS and discrete vendor-specific components (VNFs and PNFs) and test simulation/emulation equipment

When SDL is used to manage an ORAN validation environment, it provides a software test automation environment with built-in DevOps features targeted at service providers, operators, virtual network function (VNF) vendors and network equipment manufacturers (NEMs). Its key features include:

  • An operator or a VNF vendor can comprehensively model the operation of an NFV-based network using multiple third-party VNFs
  • The operator can compare, evaluate, model, test and engineer the capacity and performance of a solution involving components in multiple infrastructures (cloud, on-prem, virtualized, physical)
  • VNFs will be automatically deployed in a desired configuration that implements a required service chain or network services (the SUT)
  • The integrated test automation environment orchestrates and executes traffic and functional tests against the assembled VNF configuration 
  • SDL can test across physical network function (PNF) and virtual network function boundaries for operators as they transition various portions of their services to NFV 
  • The same system can be used to test existing physical network functions (PNFs) using cost-effective virtual test tools and simulators and can then be used to validate/compare/contrast the behavior of those network functions as they are transitioned to VNFs
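The PNF-to-VNF comparison described in the last two points can be sketched as follows: deploy the same service chain on physical and cloud infrastructure, run identical traffic tests against both, and check that the VNF results stay within a tolerance of the PNF baseline. All function names, the example chain and the metric are hypothetical, not SDL's actual API.

```python
# Hedged sketch of a PNF-vs-VNF comparison workflow; names are illustrative.
def deploy(chain, infra):
    """Stand up the service chain (the SUT) on the given infrastructure."""
    return {"chain": chain, "infra": infra}

def run_traffic_test(sut, test):
    # A real run would drive a traffic generator against the SUT; the stub
    # returns a fixed metric so the comparison below is deterministic.
    return {"test": test, "throughput_gbps": 9.6}

def compare_pnf_vnf(chain, tests, tolerance=0.05):
    """Check that VNF behavior matches the PNF baseline within tolerance."""
    pnf = deploy(chain, "physical")
    vnf = deploy(chain, "cloud")
    deltas = {}
    for t in tests:
        base = run_traffic_test(pnf, t)["throughput_gbps"]
        cand = run_traffic_test(vnf, t)["throughput_gbps"]
        deltas[t] = abs(cand - base) / base
    return all(d <= tolerance for d in deltas.values()), deltas

ok, deltas = compare_pnf_vnf(["UPF", "firewall"], ["udp-line-rate"])
print(ok)
```

The tolerance-based comparison is what allows an operator to validate a migration step by step, rather than judging the VNF version in isolation.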

At Luxoft, we use a combination of test capabilities available from the community and from our partners, including Viavi, Spirent and others; SDL is not itself a testing tool and is agnostic about which testing tools you use, provided they can be placed under software control. These tools, together with Luxoft's proven industry-leading practice for telecom component and system validation and test automation and our unique Software Defined Lab solution, enable a cost-effective yet comprehensive solution for the evaluation, integration, validation, engineering and productization of the components necessary for your RAN and RIC solutions.

Whether you’re evaluating and verifying RAN/RIC vendors to work with your existing solution, verifying your RAN/RIC solution for use with other vendors’ networks, implementing a traditional RAN/RIC or transitioning to C-RAN, or developing your own RAN/RIC solution and need an experienced partner to develop and customize your solution, Luxoft has the team, tools, ecosystem and experience to ensure the success of your project.

Authors:
James Hopson, Technical Product Manager, Luxoft
James has been a Solutions Architect and Product Manager in telecommunications operations and virtualization for more than 30 years, most recently specializing in the areas of network and cloud solutions, NFV testing and test automation. He is the CTO for networking and cloud solutions at Luxoft and a solution manager in Luxoft’s Technology, Media and Telecom organization. James is responsible for Luxoft’s Software Defined Lab Services Accelerator which is an automated configuration and test, environment management and modelling framework for private/public/hybrid cloud, NFV, hybrid and physical environments. 


Mihnea Ionescu, Senior Software Engineer, Luxoft
With over 15 years of software development experience, Mihnea’s primary area of focus lies in wireless technologies, including WiFi, 4G and 5G. He is actively engaged in the development of Open RAN testing solutions and 5G radio access network modeling.

