
Wednesday 23 January 2019

AI and Analytics Based Network Designing & Planning

Recently I blogged about how Deutsche Telekom is using AI for a variety of things, the most interesting (from this blog's point of view) being fiber-optic roll-out. According to their press release (shortened for easy reading):

"The shortest route to the customer is not always the most economical. By using artificial intelligence in the planning phase we can speed up our fiber-optic roll-out. This enables us to offer our customers broadband lines faster and, above all, more efficiently," says Walter Goldenits, head of Technology at Telekom Deutschland. It is often more economical to lay a few extra feet of cable. That is what the new software-based technology evaluates using digitally-collected environmental data. Where would cobblestones have to be dug up and laid again? Where is there a risk of damaging tree roots?

The effort and thus costs involved in laying cable depend on the existing structure. First, civil engineers open the ground and lay the conduits and fiber-optic cables. Then they have to restore the surface to its previous condition. Of course, the process takes longer with large paving stones than with dirt roads.

"Such huge amounts of data are both a blessing and a curse," says Prof. Dr. Alexander Reiterer, who heads the project at the Fraunhofer IPM. "We need as many details as possible. At the same time, the whole endeavour is only efficient if you can avoid laboriously combing through the data to find the information you need. For the planning process to be efficient the evaluation of these enormous amounts of data must be automated." Fraunhofer IPM has developed software that automatically recognizes, localizes and classifies relevant objects in the measurement data.

The neural network used for this recognizes a total of approximately 30 different categories through deep learning algorithms. This includes trees, street lights, asphalt and cobblestones. Right down to the smallest detail: Do the pavements feature large pavement slabs or small cobblestones? Are the trees deciduous or coniferous? The trees' root structure also has a decisive impact on civil engineering decisions.

Once the data has been collected, a specially-trained artificial intelligence is used to make all vehicles and individuals unidentifiable. The automated preparation phase then follows in a number of stages. The existing infrastructure is assessed to determine the optimal route. A Deutsche Telekom planner then double-checks and approves it.


At the recent TIP Summit 2018, Facebook talked about 'Building Better Networks with Analytics' and showed off its analytics platform. Vincent Gonguet, Product Manager, Connectivity Analytics, Facebook, talked about how Facebook is using a three-pronged approach of accelerating fiber deployment, expanding 4G coverage and planning 5G networks. The video from the summit is embedded below:

TIP Summit 2018 Day 1 Presentation - Building Better Networks with Analytics from Telecom Infra Project on Vimeo.

Some of the points highlighted in the video:
  • Connecting people requires three main focus areas: Access, Affordability and Awareness. One of the main focus areas of TIP is access.
  • 4G coverage went from 20% to 80% of world population in the last 5 years. The coverage growth is plateauing because the last 20% is becoming more and more uneconomical to connect.
  • Demand is outpacing supply in many parts of the world (indicating that networks have to be designed for capacity, not just coverage)
  • 19% of 4G traffic today can't support high-quality video, which needs about 1.5 Mbps
  • Facebook has a nice aggregated map of percentage of Facebook traffic across the world that is experiencing very low speeds, less than 0.5 Mbps
  • Talk looks at three approaches in which Facebook works with TIP members to accelerate fiber deployment, expand 4G coverage and plan 5G networks.
  • A joint fiber deployment project with Airtel and BCS in Uganda was announced at MWC 2018
  • 700 km of fiber deployment was planned to serve over 3 million people (Uganda’s population is roughly 43 million)
  • The real challenge was not just collecting data about roads, infrastructure, etc. New cities with tens of thousands of people would emerge over a period of months
  • In such situations it would be difficult for human planners to go through all the roads and select the most economical route. Also, different human planners do things in different ways, so there is no consistency. In addition, it's very hard to iterate.
  • To make deployments simpler and easier, it was decided to first provide coverage to people who need fewer kilometres of fiber. The savings from finding the optimal path for these people can go into connecting more people.
  • It is also important for the fiber networks to have redundancy but it’s difficult to do this at scale
  • An example and simulation of how fiber networks are created is available in the video  from 07:45 – 11:00.
  • Another example is that of prioritizing 4G deployments based on user experience, current network availability and presence of 4G capable devices in partnership with XL Axiata is available in the video from 11:00 – 14:13. Over 1000 sites were deployed and more than 2 million people experienced significant improvement in their speeds and the quality of videos. 
  • The final example is planning of 5G mmWave networks. This was done in partnership with Deutsche Telekom, trying to bring high speeds to 25,000 apartment homes in a sq. km in the center of Berlin. The goal was to achieve an over 1 Gbps connection using a mixture of fiber and wireless. The video looks at a simulation over Lidar data to determine where the wireless infrastructure can be deployed. The relevant part is from 14:13 – 20:25.
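The "connect the people who need the least fiber first" idea described above can be sketched as a simple greedy prioritisation. The settlement figures and fiber budget below are invented purely for illustration:

```python
# Hypothetical illustration: prioritise settlements by fiber cost-effectiveness.
# Each entry is (name, population, km of new fiber required) - invented numbers.
settlements = [
    ("A", 12000, 4.0),
    ("B", 3000, 9.5),
    ("C", 25000, 6.0),
    ("D", 800, 1.2),
]

budget_km = 12.0  # total fiber the budget allows

# Sort by people connected per km of fiber, most cost-effective first
ranked = sorted(settlements, key=lambda s: s[1] / s[2], reverse=True)

connected, used = [], 0.0
for name, pop, km in ranked:
    if used + km <= budget_km:
        connected.append(name)
        used += km

print(connected, used)  # settlements chosen within the fiber budget
```

A real planner would of course work on road graphs rather than straight-line fiber lengths, but the savings logic, spending the budget where it connects the most people per kilometre, is the same.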
Finally, you may remember my blog post on Automated 4G / 5G Hetnet Design by Keima. Some of the work they do overlaps with both examples above. I reached out to Iris Barcia to see if she had any comments on the two approaches described above. Below is her response:

“It is very encouraging that DT and Facebook are seeing the benefits of data and automation for design. I think that is the only way we’re going to be able to plan modern communication networks. We approach it from the RAN planning perspective: 8 years ago our clients could already reduce cost by automatically selecting locations with good RF performance and close to fibre nodes, alternatively locations close to existing fibre routes or from particular providers. Now the range of variables that we are capable of computing is vast and it includes aspects such as accessibility rules, available spectrum, regulations, etc. This could be easily extended to account for capability/cost of deploying fibre per type of road. 

But also, we believe in the benefit of a holistic business strategy, and over the years our algorithms have evolved to prioritise cost and consumers more precisely. For example, based on the deployment needs we can identify areas where it would be beneficial to deploy fibre: the study presented at CWTEC showed a 5G Fixed Wireless analysis per address, allowing fibre deployments to be prioritised for those addresses characterised by poor RF connectivity.”

There is no doubt in my mind that more and more of these kinds of tools that rely on Analytics and Artificial Intelligence (AI) will be required to design and plan networks. By this I don't just mean 5G and other future networks but also the existing 2G, 3G & 4G networks and Hetnets. We will have to wait and see what's next.


Related Blog Posts:

Monday 13 August 2018

Telefonica: Big Data, Machine Learning (ML) and Artificial Intelligence (AI) to Connect the Unconnected


Earlier, I wrote a detailed post on how Telefónica is on a mission to connect 100 million unconnected people with its 'Internet para todos' initiative. The video below is a good advert of what Telefónica is trying to achieve in Latin America.


I recently came across a LinkedIn post by Patrick Lopez, VP Networks Innovation @ Telefonica, on how Telefónica uses AI / ML to connect the unconnected. It was a no-brainer that this needed to be shared.



In his post, Patrick mentions the following:

To deliver internet in these environments in a sustainable manner, it is necessary to increase efficiency through systematic cost reduction, investment optimization and targeted deployments.

Systematic optimization necessitates continuous measurement of the financial, operational, technological and organizational data sets.

1. Finding the unconnected


The first challenge the team had to tackle was to understand how many unconnected people there are and where. The data set was scarce and incomplete, the census was old and the population highly mobile. In this case, the team used high-definition satellite imagery at the scale of the country and neural network models, with census data for training. Implementing visual machine learning algorithms, the model literally counted each house and each settlement across the country. The model was then enriched with cross-referenced coverage data from regulatory sources, as well as a Telefonica proprietary data set consisting of geolocalized data sessions and deployment maps. The result is a model with a visual representation: a map of the population dispersion with superimposed coverage polygons, making it possible to count and localize the unconnected populations with good accuracy (95% of the population, with less than 3% false positives and less than 240 metres deviation in the location of antennas).
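The final overlay step, settlement points against coverage polygons, comes down to a point-in-polygon test. A minimal sketch, with invented coordinates standing in for the satellite-derived settlements and the regulator/operator coverage maps:

```python
# Hypothetical sketch: count settlements that fall outside coverage polygons.
# Coordinates are invented; real inputs would be satellite-derived settlement
# detections and coverage polygons from regulatory / operator data.

def point_in_polygon(x, y, polygon):
    """Ray-casting test: is (x, y) inside the polygon (list of vertices)?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x-coordinate where the edge crosses the horizontal ray from (x, y)
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

coverage = [(0, 0), (10, 0), (10, 10), (0, 10)]    # one coverage polygon
settlements = [(5, 5), (12, 3), (9, 9), (15, 15)]  # detected settlements

unconnected = [p for p in settlements if not point_in_polygon(*p, coverage)]
print(len(unconnected))  # settlements outside coverage
```

At country scale this would run over millions of detected buildings and many coverage polygons (typically with spatial indexing), but the core geometric test is this simple.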


2. Optimizing transport



Transport networks are the most expensive part of deploying connectivity to remote areas. Optimizing transport route has a huge impact on the sustainability of a network. This is why the team selected this task as the next challenge to tackle.

The team started by adding road and infrastructure data to the model from public sources, and used graph generation to cluster population settlements. Graph analysis (shortest path, Steiner tree) yielded population-density-optimized transport routes.
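The shortest-path half of that graph analysis can be sketched with a standard Dijkstra search over a road graph. The node names and edge costs below are invented for illustration; in the project the edges would come from the public road and infrastructure data:

```python
import heapq

def dijkstra(graph, source, target):
    """Cheapest route by cumulative edge cost (e.g. km of trenching)."""
    queue = [(0.0, source, [source])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == target:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbour, weight in graph.get(node, []):
            if neighbour not in visited:
                heapq.heappush(queue, (cost + weight, neighbour, path + [neighbour]))
    return float("inf"), []

# Invented road graph: edge weights are deployment costs along each road segment
roads = {
    "PoP":    [("Town_A", 40), ("Town_B", 25)],
    "Town_A": [("Village", 10)],
    "Town_B": [("Town_A", 10), ("Village", 35)],
}

cost, route = dijkstra(roads, "PoP", "Village")
print(cost, route)
```

Note that the direct-looking route is not the cheapest one here, which mirrors the Deutsche Telekom observation above that the shortest route to the customer is not always the most economical. The Steiner-tree step generalises this to connecting many settlements at once with minimal total cost.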


3. AI to optimize network operations


To connect very remote zones, optimizing operations and minimizing maintenance and upgrades is key to a sustainable operational model. This line of work is probably the most ambitious for the team. When it can take 3 hours by plane and 4 days by boat to reach some locations, it is crucial to be able to detect, or better, predict if / when you need to perform maintenance on your infrastructure. Equally important is how you devise your routes so that you are as efficient as possible. In this case, the team built a neural network trained on historical failure analysis and fed with network metrics to provide a model capable of supervising the network health in an automated manner, with prediction of possible failures and optimized maintenance routes.
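The project uses a trained neural network for this; as a much simpler stand-in, the "metrics in, maintenance alert out" loop can be illustrated with a basic statistical deviation check. The metric values below are invented:

```python
import statistics

# Hypothetical sketch: flag a remote site for inspection when its latest metric
# deviates strongly from its recent history. This is a deliberately simple
# substitute for the neural-network model described in the post.

def needs_inspection(history, latest, threshold=3.0):
    """True when `latest` is more than `threshold` standard deviations from the mean."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return latest != mean
    return abs(latest - mean) / stdev > threshold

# Daily packet-loss percentages for a remote site (invented)
history = [0.4, 0.5, 0.3, 0.6, 0.5, 0.4, 0.5]
print(needs_inspection(history, 0.5))   # a normal day
print(needs_inspection(history, 4.8))   # strong deviation -> schedule maintenance
```

A learned model earns its keep over this kind of rule by combining many metrics and predicting failures before they happen, which is what makes the 4-days-by-boat economics work.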

I think that the type of data-driven approach to complex problem solving demonstrated in this project is key to network operators' sustainability in the future. It is not only a rural problem; it is necessary to increase efficiency and optimize deployment and operations to keep decreasing the costs.


Finally, it's worth mentioning again that I am helping CW (Cambridge Wireless) organise their annual CW TEC conference on the topic 'The inevitable automation of Next Generation Networks'. There are some good speakers and similar topics will be covered from different angles, using some other interesting approaches. The fees are very reasonable so please join if you can.

Related posts:

Sunday 29 July 2018

Automating the 5G Core using Machine Learning and Data Analytics

One of the new entities introduced by 3GPP in the 5G Core SBA (see tutorial here) is Network Data Analytics Function, NWDAF.
3GPP TR 23.791: Study of Enablers for Network Automation for 5G (Release 16) describes the following 5G Network Architecture Assumptions:

1 The NWDAF (Network Data Analytics Function) as defined in TS 23.503 is used for data collection and data analytics in centralized manner. An NWDAF may be used for analytics for one or more Network Slice.
2 For instances where certain analytics can be performed by a 5GS NF independently, an NWDAF instance specific to that analytic may be collocated with the 5GS NF. The data utilized by the 5GS NF as input to analytics in this case should also be made available to allow for the centralized NWDAF deployment option.
3 5GS Network Functions and OAM decide how to use the data analytics provided by NWDAF to improve the network performance.
4 NWDAF utilizes the existing service based interfaces to communicate with other 5GC Network Functions and OAM.
5 A 5GC NF may expose the result of the data analytics to any consumer NF utilizing a service based interface.
6 The interactions between NF(s) and the NWDAF take place in the local PLMN (the reporting NF and the NWDAF belong to the same PLMN).
7 Solutions shall not assume NWDAF knowledge about NF application logic. The NWDAF may use subscription data but only for statistical purposes.

Picture Source: Application of Data Mining in the 5G Network Architecture by Alexandros Kaloxylos

Continuing from 3GPP TR 23.791:

The NWDAF may serve use cases belonging to one or several domains, e.g. QoS, traffic steering, dimensioning, security.
The input data of the NWDAF may come from multiple sources, and the resulting actions undertaken by the consuming NF or AF may concern several domains (e.g. Mobility management, Session Management, QoS management, Application layer, Security management, NF life cycle management).
Use case descriptions should include the following aspects:
1. General characteristics (domain: performance, QoS, resilience, security; time scale).
2. Nature of input data (e.g. logs, KPI, events).
3. Types of NF consuming the NWDAF output data, how data is conveyed and nature of consumed analytics.
4. Output data.
5. Possible examples of actions undertaken by the consuming NF or AF, resulting from these analytics.
6. Benefits, e.g. revenue, resource saving, QoE, service assurance, reputation.

Picture Source: Application of Data Mining in the 5G Network Architecture by Alexandros Kaloxylos

3GPP TS 23.501 V15.2.0 (2018-06) Section 6.2.18 says:

NWDAF represents operator managed network analytics logical function. NWDAF provides slice specific network data analytics to a NF. NWDAF provides network analytics information (i.e., load level information) to a NF on a network slice instance level and the NWDAF is not required to be aware of the current subscribers using the slice. NWDAF notifies slice specific network status analytic information to the NFs that are subscribed to it. NF may collect directly slice specific network status analytic information from NWDAF. This information is not subscriber specific.

In this Release of the specification, both PCF and NSSF are consumers of network analytics. The PCF may use that data in its policy decisions. NSSF may use the load level information provided by NWDAF for slice selection.

NOTE 1: NWDAF functionality beyond its support for Nnwdaf is out of scope of 3GPP.
NOTE 2: NWDAF functionality for non-slice-specific analytics information is not supported in this Release of the specification.
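To make the consumer relationship above a little more concrete, here is a sketch of the kind of subscription an analytics consumer (e.g. the NSSF) might send to the NWDAF for slice load-level analytics. The endpoint path and field names are my assumptions, modelled loosely on the Nnwdaf_EventsSubscription service defined in 3GPP TS 29.520; check the published OpenAPI definition before relying on them:

```python
import json

# Illustrative only: field names and the resource path are assumptions based on
# the Nnwdaf_EventsSubscription service (3GPP TS 29.520), not copied from it.
subscription = {
    "eventSubscriptions": [
        {
            "event": "LOAD_LEVEL_INFORMATION",  # slice load analytics
            "loadLevelThreshold": 70,           # notify when load exceeds 70%
        }
    ],
    "notificationURI": "http://nssf.example.com/notify/slice-load",
}

body = json.dumps(subscription)
print(body)
# The consumer NF would POST this body to the NWDAF over the service-based
# interface, e.g. {apiRoot}/nnwdaf-eventssubscription/v1/subscriptions
```

The NWDAF would then notify the consumer at the given URI, and (per the text above) the NSSF could factor the reported load into slice selection while the PCF could use it in policy decisions.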

3GPP Release-16 is focusing on 5G Expansion and 5G Efficiency, SON and Big Data are part of 5G Efficiency.
Light Reading's Artificial Intelligence and Machine Learning section has a news item on this topic from Layer123's Zero Touch & Carrier Automation Congress:

The 3GPP standards group is developing a machine learning function that could allow 5G operators to monitor the status of a network slice or third-party application performance.

The network data analytics function (NWDAF) forms a part of the 3GPP's 5G standardization efforts and could become a central point for analytics in the 5G core network, said Serge Manning, a senior technology strategist at Sprint Corp.

Speaking here in Madrid, Manning said the NWDAF was still in the "early stages" of standardization but could become "an interesting place for innovation."

The 3rd Generation Partnership Project (3GPP) froze the specifications for a 5G new radio standard at the end of 2017 and is due to freeze another set of 5G specifications, covering some of the core network and non-radio features, in June this year as part of its "Release 15" update.

Manning says that Release 15 considers the network slice selection function (NSSF) and the policy control function (PCF) as potential "consumers" of the NWDAF. "Anything else is open to being a consumer," he says. "We have things like monitoring the status of the load of a network slice, or looking at the behavior of mobile devices if you wanted to make adjustments. You could also look at application performance."

In principle, the NWDAF would be able to make use of any data in the core network. The 3GPP does not plan on standardizing the algorithms that will be used but rather the types of raw information the NWDAF will examine. The format of the analytics information that it produces might also be standardized, says Manning.

Such technical developments might help operators to provide network slices more dynamically on their future 5G networks.

Generally seen as one of the most game-changing aspects of 5G, the technique of network slicing would essentially allow an operator to provide a number of virtual network services over the same physical infrastructure.

For example, an operator could provide very high-speed connectivity for mobile gaming over one slice and a low-latency service for factory automation on another -- both reliant on the same underlying hardware.

However, there is concern that without greater automation operators will have less freedom to innovate through network slicing. "If operators don't automate they will be providing capacity-based slices that are relatively large and static and undifferentiated and certainly not on a per-customer basis," says Caroline Chappell, an analyst with Analysys Mason.

In a Madrid presentation, Chappell said that more granular slicing would require "highly agile end-to-end automation" that takes advantage of progress on software-defined networking and network functions virtualization.

"Slices could be very dynamic and perhaps last for only five minutes," she says. "In the very long term, applications could create their own slices."

Despite the talk of standardization, and signs of good progress within the 3GPP, concern emerged this week in Madrid that standards bodies are not moving quickly enough to address operators' needs.

Caroline Chappell's talk is available here whereas Serge Manning's talk is embedded below:



I am helping CW organise the annual CW TEC conference on the topic The inevitable automation of Next Generation Networks
Communications networks are perhaps the most complex machines on the planet. They use vast amounts of hardware, rely on complex software, and are physically distributed over land, underwater, and in orbit. They increasingly provide essential services that underpin almost every aspect of life. Managing networks and optimising their performance is a vast challenge, and will become many times harder with the advent of 5G. The 4th Annual CW Technology Conference will explore this challenge and how Machine Learning and AI may be applied to build more reliable, secure and better performing networks.

Is the AI community aware of the challenges facing network providers? Are the network operators and providers aware of how the very latest developments in AI may provide solutions? The conference will aim to bridge the gap between AI/ML and communications network communities, making each more aware of the nature and scale of the problems and the potential solutions.

I am hoping to see some of this blog readers at the conference. Looking forward to learning more on this topic amongst others for network automation.

Related Post:

Monday 29 February 2016

The Internet of Me: It’s all about my screens - Bob Schukai


I had the pleasure of attending the IET Turing Lecture last week and listening to Robert Schukai. He gave a brilliant talk on how smartphones are changing the way we do things. It's a very interesting talk but it's nearly 87 minutes long. Slides are not available but the video is embedded below.


Tuesday 22 October 2013

Korea Telecom ‘Route Decision System’ for midnight buses

Interesting presentation from Korea Telecom at LTE Asia 2013 about how they use Big Data to decide night bus routes. Here are two pics which are self-explanatory:


We will soon start seeing operators making use of the data being collected from users, and this can also be a nice little earner for them.

Monday 29 July 2013

Big Data and Vulnerability of Cellular Systems

I am sure most of you are aware of Big Data; if not, watch the video in my old post here. Moray Rumney from Agilent recently gave a talk at #FWIC on how Big Data techniques can be used to exploit vulnerabilities in a cellular system. Though the talk focussed on GSM and 3G, it is a good intro nevertheless. The presentation is embedded below:



You can also listen to the audio of his presentation here.

Monday 22 October 2012

M2M and the 'Big Data'

A couple of months back, there was this Dilbert strip on Big Data.


Social networks, M2M devices and many other sources keep generating data all the time. This data can provide us with a lot of useful information if proper analytics can be done on it. This is a real challenge, I guess. There will also be security and privacy implications that may decide how and what data can be used, and by whom.

Here is a simple introductory video by Intel explaining what Big Data is: