
Friday, November 15, 2024

RAN, AI, AI-RAN and Open RAN

The Japanese MNO SoftBank is taking an active role in trying to bring AI to the RAN. In a research story published recently, it explains that AI-RAN integrates AI into mobile networks to enhance performance and enable low-latency, high-security services via distributed AI data centres. This innovative infrastructure supports applications like real-time urban safety monitoring and optimized network throughput. Through the AI-RAN Alliance, SoftBank collaborates with industry leaders to advance the technology and create an ecosystem for AI-driven societal and industrial solutions.

This video provides a nice short explanation of what AI-RAN means:

SoftBank's recent developments in AI-RAN technology further its mission to integrate AI with mobile networks, highlighted by the introduction of "AITRAS." This converged solution leverages NVIDIA's Grace Hopper platform and advanced orchestrators to unify vRAN and AI applications, enabling efficient and scalable networks. By collaborating with partners like Red Hat and Fujitsu, SoftBank aims to commercialize AI-RAN globally, addressing the demands of next-generation connectivity. Together, these initiatives align with SoftBank's vision of transforming telecommunications infrastructure to power AI-driven societies. Details are available on SoftBank's page here.

Last month, theNetworkingChannel hosted a webinar titled 'AI-RAN and Open RAN: Exploring Convergence of AI-Native Approaches in Future Telecommunication Technologies'. The slides have not been shared, but the details of the speakers are available here. The webinar is embedded below:

NVIDIA has a lot more technical details available on their blog post here.


Wednesday, August 10, 2022

AI/ML Enhancements in 5G-Advanced for Intelligent Network Automation

Artificial Intelligence (AI) and Machine Learning (ML) have been touted as a way to automate the network and simplify the identification and debugging of issues that will arise with increasing network complexity. For this reason, 3GPP has many different features that are already present in Release 17 and are expected to evolve further in Release 18.

I have already covered some of these topics in earlier posts. Ericsson's recent whitepaper '5G Advanced: Evolution towards 6G' also has a good summary of this topic. Here is an extract from it:

Intelligent network automation

With increasing complexity in network design, for example, many different deployment and usage options, conventional approaches will not be able to provide swift solutions in many cases. It is well understood that manually reconfiguring cellular communications systems could be inefficient and costly.

Artificial intelligence (AI) and machine learning (ML) have the capability to solve complex and unstructured network problems by using the large amounts of data collected from wireless networks. Thus, there has been a lot of attention lately on utilizing AI/ML-based solutions to improve network performance, thereby providing avenues for inserting intelligence into network operations.

AI model design, optimization, and life-cycle management rely heavily on data. A wireless network can collect a large amount of data as part of its normal operations, and this provides a good base for designing intelligent network solutions. 5G Advanced addresses how to optimize the standardized interfaces for data collection, while leaving the automation functionality, for example training and inference, up to proprietary implementation to support full flexibility in the automation of the network.

AI/ML for RAN enhancements

Three use cases have been identified in the Release 17 study item related to RAN performance enhancement by using AI/ML techniques. Selected use cases from the Release 17 technical report will be taken into the normative phase in the next releases. The selected use cases are: 1) network energy saving; 2) load balancing; and 3) mobility optimization.

The selected use cases can be supported by enhancements to current NR interfaces, targeting performance improvements through AI/ML functionality in the RAN while maintaining the 5G NR architecture. One of the goals is to ensure vendor incentives in terms of innovation and competitiveness by keeping the AI model implementation specific. For use cases involving RAN-OAM interactions, an intent-based management approach can be adopted: the intent is received by the RAN, which needs to understand it and trigger the appropriate functionality as a result.
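To make the intent-based idea concrete, here is a minimal, illustrative sketch (my own, not from the Ericsson whitepaper) of how an OAM intent such as "keep PRB utilisation below 80%" could be turned into load-balancing parameter changes. The cell names, KPI name, thresholds and offset values are all assumptions for illustration:

```python
# Minimal sketch of an intent-driven load-balancing loop.
# Cell names, thresholds and offset values are illustrative assumptions,
# not anything defined by 3GPP or the whitepaper.
from statistics import mean

def rebalance(cell_loads: dict[str, float], target: float) -> dict[str, float]:
    """Return per-cell handover offset tweaks (dB) that push traffic away
    from overloaded cells and towards lightly loaded neighbours."""
    avg = mean(cell_loads.values())
    offsets = {}
    for cell, load in cell_loads.items():
        if load > target:
            offsets[cell] = -2.0   # make the overloaded cell less attractive to UEs
        elif load < 0.5 * avg:
            offsets[cell] = +2.0   # attract traffic to the lightly loaded cell
        else:
            offsets[cell] = 0.0
    return offsets

# OAM expresses the intent; the RAN translates it into parameter changes.
intent = {"kpi": "prb_utilisation", "target": 0.8}
loads = {"cellA": 0.92, "cellB": 0.30, "cellC": 0.71}
print(rebalance(loads, intent["target"]))   # cellA pushed down, cellB made more attractive
```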

AI/ML for physical layer enhancements

It is generally expected that AI/ML functionality can be used to improve radio performance and/or reduce the complexity and overhead of the radio interface. 3GPP TSG RAN has selected three use cases to study the potential air interface performance improvements through AI/ML techniques: beam management, channel state information (CSI) feedback enhancement, and positioning accuracy enhancements for different scenarios. AI/ML-based methods may provide benefits compared to traditional methods in the radio interface. The challenge will be to define a unified AI/ML framework for the air interface through adequate AI/ML model characterization, using various levels of collaboration between the gNB and UE.
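As an illustration of one of these use cases (again my own sketch, not from the whitepaper): CSI feedback enhancement is often studied as a two-sided model, where the UE compresses the channel state with an encoder and the gNB reconstructs it with a decoder. The dimensions, layer sizes and synthetic training data below are assumptions:

```python
# Toy sketch of AI/ML-based CSI feedback compression with a split autoencoder.
# Dimensions and training details are illustrative assumptions.
import torch
import torch.nn as nn

CSI_DIM, CODE_DIM = 256, 32   # e.g. flattened channel matrix -> 32-value report

encoder = nn.Sequential(nn.Linear(CSI_DIM, 64), nn.ReLU(), nn.Linear(64, CODE_DIM))  # runs at the UE
decoder = nn.Sequential(nn.Linear(CODE_DIM, 64), nn.ReLU(), nn.Linear(64, CSI_DIM))  # runs at the gNB

optimiser = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)
loss_fn = nn.MSELoss()

for _ in range(100):                  # training with synthetic CSI samples
    csi = torch.randn(32, CSI_DIM)    # stand-in for measured channel state
    recon = decoder(encoder(csi))     # UE compresses, gNB reconstructs
    loss = loss_fn(recon, csi)
    optimiser.zero_grad()
    loss.backward()
    optimiser.step()

print(f"reconstruction MSE after training: {loss.item():.4f}")
```

The interesting standards question, as the extract notes, is exactly this split: which side trains the model, what is signalled over the air, and how much gNB-UE collaboration the framework assumes.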

AI/ML in 5G core

5G Advanced will provide further enhancements to the architecture for analytics and to ML model life-cycle management, for example, to improve the correctness of the models. The advancements in the architecture for analytics and data collection serve as a good foundation for AI/ML-based use cases within the different network functions (NFs). Additional use cases will be studied where NFs make use of analytics to support their decision making, for example, network data analytics function (NWDAF)-assisted generation of UE policy for network slicing.
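To illustrate the NWDAF-assisted idea with a deliberately simplified sketch: a policy decision can consume slice-load analytics and steer a UE towards the least loaded slice. The function names and analytics payload here are hypothetical stand-ins; the real NWDAF services are specified by 3GPP (TS 29.520):

```python
# Illustrative-only sketch of NWDAF-assisted slice selection for a UE policy.
# Function names and the analytics payload are hypothetical, not the 3GPP API.

def get_slice_load_analytics() -> dict[str, float]:
    """Stand-in for an NWDAF load analytics query."""
    return {"slice-eMBB-1": 0.85, "slice-eMBB-2": 0.40}

def select_slice_for_ue(analytics: dict[str, float]) -> str:
    # PCF-style decision: steer the UE towards the least loaded slice instance.
    return min(analytics, key=analytics.get)

print(select_slice_for_ue(get_slice_load_analytics()))   # -> slice-eMBB-2
```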

If you are interested in studying this topic further, check out 3GPP TR 37.817: Study on enhancement for data collection for NR and EN-DC. Download the latest version from here.


Monday, June 13, 2022

Tutorial on 4G/5G Mobile Network Uplink Working and Challenges

People involved with mobile technology know the challenges with the uplink in any generation of mobile network. With the increasing data rates of 4G and 5G the issue has become more prominent, as most of the headline speeds focus on the downlink while upload speeds remain quite poor.

People who follow us across our channels know of the many presentations we share from various sources, not just our own. One such presentation, by Peter Schmidt, looked at the uplink in detail. In fact, we recommend following him on Twitter if you are interested in technical details and infrastructure.

The details of his talk are as follows:

The lecture highlights the influences on the mysterious part of mobile communications: sources of interference in the uplink, their impact on mobile communication, and practices for detecting sources of RF interference.

The field strength bar graph of a smartphone (the downlink reception field strength) is only half of the truth when assessing mobile network coverage. The other half is the uplink, the direction from the end device to the base station, which is largely invisible but highly sensitive to interference. In this lecture, sources of uplink interference, their effects, and the options for measuring and analysing them will be explained.

The cellular network uplink is essential for mobile communication, but nobody can really see it. The uplink can be disrupted by jammers, repeaters and many other RF sources, and when it is jammed, mobile communication is limited. I will show what types of interference sources can disrupt the uplink, what impact this has on cellular usage, and how interference hunting can be done.

First, I explain the necessary level symmetry between the downlink (from the mobile radio base station, the eNodeB, to the end device) and the uplink (from the end device back to the eNodeB). Since the transmission powers of the end device and the eNodeB are very different, I explain the technical background to achieving this symmetry.

I then explain the problems and possibilities when measuring uplink signals at the eNodeB; it is difficult to look inside the receiver. The downlink, in comparison, is very easy to measure: you can see the bars on your smartphone, or you can use apps that provide detailed field strength information. The uplink, however, remains largely invisible, and if it is disturbed at the eNodeB, the field strength bars on the end device say nothing. I will present a way of observing the uplink that some end devices bring on board, or that can be read out of the chipset with apps.

The ways in which the uplink can be disrupted, the effects on communication, and the search for uplink interference sources round off the presentation. I will also address the problem of passive intermodulation (PIM), a (not) new source of interference in base station antenna systems, and its assessment, measurement and avoidance.
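As a rough, back-of-the-envelope illustration of the level symmetry point (the numbers below are typical textbook values, not figures from the talk): the base station transmits with far more power than the UE, and the gap has to be closed on the receive side.

```python
# Back-of-the-envelope UL/DL budget comparison; all figures are
# illustrative assumptions, not values from Peter Schmidt's talk.

dl_tx_power_dbm = 43.0     # eNodeB: ~20 W per carrier
ul_tx_power_dbm = 23.0     # UE: 200 mW (power class 3)
bs_antenna_gain_db = 18.0  # sector antenna gain, helps both directions
path_loss_db = 130.0       # the radio path is (roughly) reciprocal

dl_rx_dbm = dl_tx_power_dbm + bs_antenna_gain_db - path_loss_db   # seen at the UE
ul_rx_dbm = ul_tx_power_dbm + bs_antenna_gain_db - path_loss_db   # seen at the eNodeB

print(f"DL received: {dl_rx_dbm:.0f} dBm, UL received: {ul_rx_dbm:.0f} dBm")
# The ~20 dB gap is closed by the eNodeB's far better receiver sensitivity
# (low-noise amplifiers, narrow per-UE resource allocation), which is exactly
# why even a small rise in the uplink noise floor from interference hurts so much.
```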

The slides are available here. The original lecture was in German, a dubbed video is embedded below:

If you know of some other fantastic resources that we can share with our audience, please feel free to add them in the comments.


Tuesday, November 2, 2021

Energy Consumption in Mobile Networks and RAN Power Saving Schemes

We just made a tutorial on this topic looking at where most of the power consumption in the mobile network occurs and some of the ways this power consumption can be reduced. 

The chart in the tweet above (also in the presentation) clearly shows that the energy costs for operators run into many millions. Even small power saving schemes can have a big impact on total energy reduction, thereby saving huge amounts of energy and cost.
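As a quick, illustrative calculation of why even a modest saving matters (all the numbers below are assumptions, not figures from the tutorial or the tweet):

```python
# Rough, illustrative numbers; none of these figures come from the tutorial.
sites = 20_000                # a mid-sized national network
site_power_kw = 5.0           # average draw per site (radio, baseband, cooling)
price_per_kwh = 0.15          # assumed electricity price
saving_fraction = 0.05        # e.g. symbol/carrier shutdown during quiet hours

annual_kwh = sites * site_power_kw * 24 * 365
saved_kwh = annual_kwh * saving_fraction
print(f"Annual consumption: {annual_kwh / 1e6:.0f} GWh")
print(f"A 5% saving is worth roughly {saved_kwh * price_per_kwh / 1e6:.1f} million per year")
```

With these assumed inputs the network burns around 876 GWh a year, so even a 5% scheme is worth several million a year, which is why the small schemes in the presentation still matter.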

The March issue of ZTE Communications Magazine contains some good articles looking at how to tackle the energy challenges in the network going forward. This recent article by Ericsson is also a good source of information on this topic.

Anyway, the slides and the video of the tutorial are embedded below:


Thursday, May 13, 2021

Anomaly Detection and other AI Algorithms in RAN Optimization


Yesterday I watched this very inspiring live chat that I would like to recommend to anyone who is interested in how machine learning techniques (aka "AI") can help to optimize and troubleshoot the Radio Access Network.

[The real content of the video starts at approx. 42:00 min]

My key takeaways from this fireside chat are: 

Verizon Wireless has enough data (100… 500 time series KPIs per cell) to feed anomaly detection ML algorithms, and this generates a huge number of alarms but only a few actionable outputs. (A minimal sketch of this kind of KPI anomaly detection follows at the end of this post.)

The “big elephant” (Nick Feamster) is to identify whether these alarms indicate real problems that can and have to be fixed, or whether they just indicate the new behaviour of, e.g., a new handset or a SW version that was not present in the training phase of the ML algorithm, so that its pattern is detected as a new “anomaly”.

For Bryan Larish (Director Wireless AI Innovation, Verizon) the “big open problem” is “that it is not clear what the labels are” and “no standard training sets exist”. 

[For more details watch the video section between 52:00 min and 57:32 min and listen to Bryan’s experience!]

In most cases Verizon seems to need subject matter experts to classify and label these anomaly alarms due to “the huge diversity” in data patterns.

According to Bryan, it is only for a very few selected use cases that it is possible to build an automated loop to fix the issue. In particular, the root causes of radio interference are often mechanical or cabling issues that need manual work to fix.

All in all, my personal impression at the end of the session is that anomaly detection is currently a bit overhyped, and that the real challenges and problems to be resolved start after the anomalies are detected.

Nevertheless, as Bryan summarizes: “ML is a very, very powerful tool.” 

Strategically, however, he seems not to see a lot of value in anomaly detection by itself, asking instead: “Can we use machine learning (results) to change how we build networks in the future?”
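For readers who want to see the basic mechanics behind the takeaways above, here is a deliberately minimal sketch of per-cell KPI anomaly detection using a rolling z-score. This is my own illustration; Verizon's production pipeline is obviously far richer, with hundreds of KPIs per cell and more capable models:

```python
# Minimal sketch of KPI anomaly detection with a rolling z-score.
# Window size, threshold and the synthetic KPI are illustrative assumptions.
import numpy as np

def detect_anomalies(kpi: np.ndarray, window: int = 24, threshold: float = 3.0) -> list[int]:
    """Flag samples more than `threshold` standard deviations away
    from the trailing-window mean."""
    anomalies = []
    for t in range(window, len(kpi)):
        hist = kpi[t - window:t]
        mu, sigma = hist.mean(), hist.std()
        if sigma > 0 and abs(kpi[t] - mu) > threshold * sigma:
            anomalies.append(t)
    return anomalies

rng = np.random.default_rng(0)
throughput = rng.normal(100, 5, 200)   # synthetic hourly cell throughput
throughput[150] = 40                   # injected drop, e.g. an interference event
print(detect_anomalies(throughput))    # expect [150], plus the odd noise hit
```

Note how well this matches the discussion: the detector happily flags the injected sample, but it cannot say whether index 150 is a faulty cable, a jammer, or just a new handset behaving differently, and that labelling step is exactly where the subject matter experts come in.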