Wednesday 5 September 2018

LiFi can be a valuable tool for densification

LiFi has been popping up in the news recently. I blogged about it (as LED-Fi) 10 years back. While the concept has remained the same, many of the limitations associated with the technology have been overcome. One of the companies driving LiFi is a Scottish startup called pureLiFi.


I heard Professor Harald Haas speak at the IEEE Glasgow Summit about how many of the limitations of LiFi have been overcome in the last few years (see videos below). This is welcome news, as there is a tremendous amount of visible light spectrum available for exploitation.


While many discussions on LiFi revolve around its use as an access technology, I think the real potential lies in its use as backhaul for densification.

For 5G, when we are looking at small cells every few hundred meters, probably on streetlights and lamp posts, there is a requirement for an alternative backhaul to fiber. It's difficult to run fiber to each and every lamp post. Traditionally this was solved with microwave solutions, but another option available in 5G is Integrated Access and Backhauling (IAB), also known as self-backhauling.


A better alternative could be to use LiFi for this backhauling between lamp posts or streetlights. This avoids the complications IAB faces when multiple nodes are close by, as well as any teething problems while that technology matures. This approach is of course being trialed, but as the picture above shows, rural backhaul is just one option.
LiFi is being studied by the IEEE 802.11bb group, and its potential is also being considered for 5G.

Here is a video playlist explaining LiFi technology in detail.




Further reading:

Monday 13 August 2018

Telefonica: Big Data, Machine Learning (ML) and Artificial Intelligence (AI) to Connect the Unconnected


Earlier, I wrote a detailed post on how Telefónica is on a mission to connect 100 million unconnected people with its 'Internet para todos' initiative. The video below is a good advert for what Telefónica is trying to achieve in Latin America.


I recently came across a LinkedIn post by Patrick Lopez, VP Networks Innovation @ Telefonica, on how Telefónica uses AI / ML to connect the unconnected. It was a no-brainer that this needed to be shared.



In his post, Patrick mentions the following:

To deliver internet in these environments in a sustainable manner, it is necessary to increase efficiency through systematic cost reduction, investment optimization and targeted deployments.

Systematic optimization necessitates continuous measurement of the financial, operational, technological and organizational data sets.

1. Finding the unconnected


The first challenge the team had to tackle was understanding how many unconnected people there are, and where. The data set was scarce and incomplete, the census was old and the population highly mobile. The team therefore used high-definition satellite imagery at the scale of the country, with neural network models trained on census data. Using visual machine learning algorithms, the model literally counted each house and each settlement across the country. The model was then enriched with cross-referenced coverage data from regulatory sources, as well as Telefonica's proprietary data set of geolocalized data sessions and deployment maps. The result is a model with a visual representation: a map of the population dispersion with superimposed coverage polygons, making it possible to count and localize the unconnected populations with good accuracy (95% of the population, with less than 3% false positives and less than 240 meters deviation in the location of antennas).
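The counting step at the end can be sketched in a few lines. This is a purely illustrative Python sketch: the coordinates and populations are invented, and coverage areas are approximated as rectangles rather than the real coverage polygons described in the post.

```python
# Each coverage area approximated as a rectangle: (lon_min, lat_min, lon_max, lat_max)
coverage = [(0.0, 0.0, 10.0, 10.0)]

# Settlements detected by the vision model: (name, lon, lat, estimated population)
settlements = [
    ("A", 5.0, 5.0, 200),
    ("B", 15.0, 5.0, 120),
    ("C", 12.0, 12.0, 80),
]

def covered(lon, lat, areas):
    """True if the point falls inside at least one coverage area."""
    return any(x0 <= lon <= x1 and y0 <= lat <= y1 for x0, y0, x1, y1 in areas)

def unconnected_population(settlements, areas):
    """Sum the population of settlements outside every coverage area."""
    return sum(pop for _, lon, lat, pop in settlements if not covered(lon, lat, areas))

print(unconnected_population(settlements, coverage))  # → 200 (B and C are uncovered)
```

In the real project this runs over millions of detected buildings and regulator-supplied polygons, but the superposition logic is the same.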


2. Optimizing transport



Transport networks are the most expensive part of deploying connectivity to remote areas, and optimizing transport routes has a huge impact on the sustainability of a network. This is why the team selected this as the next challenge to tackle.

The team started by adding road and infrastructure data to the model from public sources, and used graph generation to cluster population settlements. Graph analysis (shortest path, Steiner tree) yielded population-density-optimized transport routes.
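The graph techniques named here (shortest path, Steiner tree) are both available in networkx. The sketch below uses an invented toy graph: node names and edge weights (route lengths in km) are made up for illustration, not taken from the project.

```python
import networkx as nx
from networkx.algorithms.approximation import steiner_tree

# Hypothetical road/infrastructure graph; edge weights are route lengths in km
G = nx.Graph()
G.add_weighted_edges_from([
    ("hub", "a", 10), ("a", "b", 4), ("b", "village1", 3),
    ("a", "village2", 6), ("hub", "c", 12), ("c", "village2", 2),
])

# Settlement clusters that must be connected back to the transport hub
terminals = ["hub", "village1", "village2"]

# Cheapest route from the hub to one settlement cluster
path = nx.shortest_path(G, "hub", "village2", weight="weight")

# Approximate Steiner tree: a low-cost subgraph spanning all terminals at once
T = steiner_tree(G, terminals, weight="weight")
route_km = sum(d["weight"] for _, _, d in T.edges(data=True))
```

The Steiner tree here is the 2-approximation networkx ships; for country-scale graphs the same call applies, just with many more nodes.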


3. AI to optimize network operations


To connect very remote zones, optimizing operations and minimizing maintenance and upgrades is key to a sustainable operational model. This line of work is probably the most ambitious for the team. When it can take 3 hours by plane and 4 days by boat to reach some locations, it is vital to be able to detect, or better, predict if and when you need to perform maintenance on your infrastructure. Equally important is how you devise your routes so that you are as efficient as possible. In this case, the team built a neural network trained with historical failure analysis and fed with network metrics, yielding a model capable of supervising network health in an automated manner, predicting possible failures and optimizing maintenance routes.

I think the type of data-driven approach to complex problem solving demonstrated in this project is key to network operators' sustainability in the future. It is not only a rural problem: increasing efficiency and optimizing deployment and operations is necessary to keep driving down costs.


Finally, it's worth mentioning again that I am helping CW (Cambridge Wireless) organise their annual CW TEC conference on the topic 'The inevitable automation of Next Generation Networks'. There are some good speakers, and similar topics will be covered from different angles, using some other interesting approaches. The fees are very reasonable, so please join if you can.

Related posts:

Friday 10 August 2018

Changes in LTE pricing strategies


It's been a while since I blogged about pricing strategies (see old posts here, here and here). I recently enjoyed listening to Soichi Nakajima, Director of "Digital Telco and OTT" at IDATE DigiWorld, when he presented a talk on LTE pricing strategy. The slides are embedded below.



I think the slides are self-explanatory, but here is a summary worth highlighting:

How LTE plans have changed: shift in focus from data allowance to quality of service 

  • Mobile data services are still largely structured around data allowances, but high-volume and unlimited plans are increasingly common. 
  • Unlimited does not necessarily mean high-end: some target users with a small budget, providing a very slow connection. 
  • Quality of service is becoming central in structuring product lines – especially speed, which may or may not be combined with data caps – as is content quality. 
  • Certain applications are being favoured through zero rating (traffic not deducted from the customer’s allowance). This can be a way to market unlimited plans and avoid fixed-mobile substitution. 
  • Growing number of partnerships with OTT video services, rather than selling premium content plans, which are tending to wane.

The slides are available to download from the techUK page here. There is also a bonus presentation on "How to address the challenges of providing connectivity on trains".

Sunday 5 August 2018

ITU 'Network 2030': Initiative to support Emerging Technologies and Innovation looking beyond 5G advances

Source: ITU

As per this recent ITU Press Release:

The International Telecommunication Union, the United Nations specialized agency for information and communication technology (ICT), has launched a new research initiative to identify emerging and future ICT sector network demands, beyond 2030 and the advances expected of IMT-2020 (5G) systems. This work will be carried out by the newly established ITU Focus Group on Technologies for Network 2030, which is open to all interested parties.

The ITU focus group aims to guide the global ICT community in developing a "Network 2030" vision for future ICTs. This will include new concepts, new architecture, new protocols – and new solutions – that are fully backward compatible, so as to support both existing and new applications.

"The work of the ITU Focus Group on Technologies for 'Network 2030' will provide network system experts around the globe with a very valuable international reference point from which to guide the innovation required to support ICT use cases through 2030 and beyond," said ITU Secretary-General Houlin Zhao.

These ICT use cases will span new media such as hologrammes, a new generation of augmented and virtual reality applications, and high-precision communications for 'tactile' and 'haptic' applications in need of processing a very high volume of data in near real-time – extremely high throughput and low latency.   

Emphasizing this need, the focus group's chairman, Huawei's Richard Li, said, "This Focus Group will look at new media, new services and new architectures. Holographic type communications will have a big part to play in industry, agriculture, education, entertainment – and in many other fields. Supporting such capabilities will call for very high throughput in the range of hundreds of gigabits per second or even higher."

The ITU Focus Group on Technologies for 'Network 2030' is co-chaired by Verizon's Mehmet Toy, Rostelecom's Alexey Borodin, China Telecom's Yuan Zhang, Yutaka Miyake from KDDI Research, and is coordinated through ITU's Telecommunication Standardization Sector – which works with ITU's 193 Member States and more than 800 industry and academic members to establish international standards for emerging ICT innovations.

The ITU focus group reports to and will inform a new phase of work of the ITU standardization expert group for 'Future Networks' – Study Group 13. It will also strengthen and leverage collaborative relationships with and among other standards development organizations including: The European Telecommunications Standards Institute (ETSI), the Association for Computing Machinery's Special Interest Group on Data Communications (ACM SIGCOMM), and the Institute of Electrical and Electronics Engineers' Communications Society (IEEE ComSoc).
Source: ITU

According to the Focus Group page:

The FG NET-2030, as a platform to study and advance international networking technologies, will investigate the future network architecture, requirements, use cases, and capabilities of the networks for the year 2030 and beyond. 

The objectives include: 

• To study, review and survey existing technologies, platforms, and standards for identifying the gaps and challenges towards Network 2030, which are not supported by the existing and near future networks like 5G/IMT-2020.
• To formulate all aspects of Network 2030, including vision, requirements, architecture, novel use cases, evaluation methodology, and so forth.
• To provide guidelines for standardization roadmap.
• To establish liaisons and relationships with other SDOs.

An ITU interview with Dr. Richard Li, Huawei, Chairman of the ITU-T FG on Network 2030 is available on YouTube here.

A recent presentation by Dr. Richard Li on this topic is embedded below:



The first Workshop on Network 2030 will be held in New York City, United States, on 2 October 2018. Details here.

Related News:

Sunday 29 July 2018

Automating the 5G Core using Machine Learning and Data Analytics

One of the new entities introduced by 3GPP in the 5G Core SBA (see tutorial here) is Network Data Analytics Function, NWDAF.
3GPP TR 23.791: Study of Enablers for Network Automation for 5G (Release 16) describes the following 5G Network Architecture Assumptions:

1 The NWDAF (Network Data Analytics Function) as defined in TS 23.503 is used for data collection and data analytics in a centralized manner. An NWDAF may be used for analytics for one or more Network Slices.
2 For instances where certain analytics can be performed by a 5GS NF independently, an NWDAF instance specific to that analytic may be collocated with the 5GS NF. The data utilized by the 5GS NF as input to analytics in this case should also be made available to allow for the centralized NWDAF deployment option.
3 5GS Network Functions and OAM decide how to use the data analytics provided by NWDAF to improve the network performance.
4 NWDAF utilizes the existing service based interfaces to communicate with other 5GC Network Functions and OAM.
5 A 5GC NF may expose the result of the data analytics to any consumer NF utilizing a service based interface.
6 The interactions between NF(s) and the NWDAF take place in the local PLMN (the reporting NF and the NWDAF belong to the same PLMN).
7 Solutions shall not assume NWDAF knowledge about NF application logic. The NWDAF may use subscription data but only for statistical purposes.
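As a toy illustration of assumption 4 above (the NWDAF communicating with other NFs over service-based interfaces), here is a minimal publish/subscribe sketch in Python. The class and method names are invented for illustration and do not reflect the normative Nnwdaf service operations.

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Nwdaf:
    """Toy sketch of the NWDAF subscribe/notify pattern (names are illustrative only)."""
    subscribers: dict = field(default_factory=dict)

    def subscribe(self, event: str, nf_callback: Callable[[dict], None]):
        # A consumer NF registers interest in one analytics event type
        self.subscribers.setdefault(event, []).append(nf_callback)

    def publish_analytics(self, event: str, analytics: dict):
        # Notify every consumer NF (e.g. PCF, NSSF) subscribed to this event
        for callback in self.subscribers.get(event, []):
            callback(analytics)

# An NSSF-like consumer records slice-load notifications
received = []
nwdaf = Nwdaf()
nwdaf.subscribe("SLICE_LOAD", received.append)
nwdaf.publish_analytics("SLICE_LOAD", {"slice": "eMBB-01", "load_level": 0.8})
print(received)  # → [{'slice': 'eMBB-01', 'load_level': 0.8}]
```

In the actual architecture this exchange would run over HTTP/2-based service-based interfaces, but the subscribe/notify shape is the same.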

Picture Source: Application of Data Mining in the 5G Network Architecture by Alexandros Kaloxylos

Continuing from 3GPP TR 23.791:

The NWDAF may serve use cases belonging to one or several domains, e.g. QoS, traffic steering, dimensioning, security.
The input data of the NWDAF may come from multiple sources, and the resulting actions undertaken by the consuming NF or AF may concern several domains (e.g. Mobility management, Session Management, QoS management, Application layer, Security management, NF life cycle management).
Use case descriptions should include the following aspects:
1. General characteristics (domain: performance, QoS, resilience, security; time scale).
2. Nature of input data (e.g. logs, KPI, events).
3. Types of NF consuming the NWDAF output data, how data is conveyed and nature of consumed analytics.
4. Output data.
5. Possible examples of actions undertaken by the consuming NF or AF, resulting from these analytics.
6. Benefits, e.g. revenue, resource saving, QoE, service assurance, reputation.

Picture Source: Application of Data Mining in the 5G Network Architecture by Alexandros Kaloxylos

3GPP TS 23.501 V15.2.0 (2018-06) Section 6.2.18 says:

NWDAF represents operator managed network analytics logical function. NWDAF provides slice specific network data analytics to a NF. NWDAF provides network analytics information (i.e., load level information) to a NF on a network slice instance level and the NWDAF is not required to be aware of the current subscribers using the slice. NWDAF notifies slice specific network status analytic information to the NFs that are subscribed to it. NF may collect directly slice specific network status analytic information from NWDAF. This information is not subscriber specific.

In this Release of the specification, both PCF and NSSF are consumers of network analytics. The PCF may use that data in its policy decisions. NSSF may use the load level information provided by NWDAF for slice selection.

NOTE 1: NWDAF functionality beyond its support for Nnwdaf is out of scope of 3GPP.
NOTE 2: NWDAF functionality for non-slice-specific analytics information is not supported in this Release of the specification.
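The NSSF's use of NWDAF load-level information for slice selection boils down to a simple comparison. The function and the load values below are invented for illustration; the spec does not mandate any particular selection algorithm.

```python
def select_slice_instance(load_levels: dict) -> str:
    """Pick the least-loaded network slice instance (illustrative NSSF-style logic)."""
    return min(load_levels, key=load_levels.get)

# Hypothetical NWDAF load-level reports per slice instance
loads = {"slice-A": 0.85, "slice-B": 0.30, "slice-C": 0.55}
print(select_slice_instance(loads))  # → slice-B
```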

3GPP Release-16 is focusing on 5G Expansion and 5G Efficiency; SON and Big Data are part of 5G Efficiency.
The Artificial Intelligence and Machine Learning section of Light Reading has a news item on this topic from Layer123's Zero Touch & Carrier Automation Congress:

The 3GPP standards group is developing a machine learning function that could allow 5G operators to monitor the status of a network slice or third-party application performance.

The network data analytics function (NWDAF) forms a part of the 3GPP's 5G standardization efforts and could become a central point for analytics in the 5G core network, said Serge Manning, a senior technology strategist at Sprint Corp.

Speaking here in Madrid, Manning said the NWDAF was still in the "early stages" of standardization but could become "an interesting place for innovation."

The 3rd Generation Partnership Project (3GPP) froze the specifications for a 5G new radio standard at the end of 2017 and is due to freeze another set of 5G specifications, covering some of the core network and non-radio features, in June this year as part of its "Release 15" update.

Manning says that Release 15 considers the network slice selection function (NSSF) and the policy control function (PCF) as potential "consumers" of the NWDAF. "Anything else is open to being a consumer," he says. "We have things like monitoring the status of the load of a network slice, or looking at the behavior of mobile devices if you wanted to make adjustments. You could also look at application performance."

In principle, the NWDAF would be able to make use of any data in the core network. The 3GPP does not plan on standardizing the algorithms that will be used but rather the types of raw information the NWDAF will examine. The format of the analytics information that it produces might also be standardized, says Manning.

Such technical developments might help operators to provide network slices more dynamically on their future 5G networks.

Generally seen as one of the most game-changing aspects of 5G, the technique of network slicing would essentially allow an operator to provide a number of virtual network services over the same physical infrastructure.

For example, an operator could provide very high-speed connectivity for mobile gaming over one slice and a low-latency service for factory automation on another -- both reliant on the same underlying hardware.

However, there is concern that without greater automation operators will have less freedom to innovate through network slicing. "If operators don't automate they will be providing capacity-based slices that are relatively large and static and undifferentiated and certainly not on a per-customer basis," says Caroline Chappell, an analyst with Analysys Mason.

In a Madrid presentation, Chappell said that more granular slicing would require "highly agile end-to-end automation" that takes advantage of progress on software-defined networking and network functions virtualization.

"Slices could be very dynamic and perhaps last for only five minutes," she says. "In the very long term, applications could create their own slices."

Despite the talk of standardization, and signs of good progress within the 3GPP, concern emerged this week in Madrid that standards bodies are not moving quickly enough to address operators' needs.

Caroline Chappell's talk is available here whereas Serge Manning's talk is embedded below:



I am helping CW organise the annual CW TEC conference on the topic 'The inevitable automation of Next Generation Networks'.
Communications networks are perhaps the most complex machines on the planet. They use vast amounts of hardware, rely on complex software, and are physically distributed over land, underwater, and in orbit. They increasingly provide essential services that underpin almost every aspect of life. Managing networks and optimising their performance is a vast challenge, and will become many times harder with the advent of 5G. The 4th Annual CW Technology Conference will explore this challenge and how Machine Learning and AI may be applied to build more reliable, secure and better performing networks.

Is the AI community aware of the challenges facing network providers? Are the network operators and providers aware of how the very latest developments in AI may provide solutions? The conference will aim to bridge the gap between AI/ML and communications network communities, making each more aware of the nature and scale of the problems and the potential solutions.

I am hoping to see some of this blog's readers at the conference. I look forward to learning more on this topic, amongst others, for network automation.

Related Post:

Tuesday 24 July 2018

Multicast Operation on Demand (MooD) and Service Continuity for eMBMS


Many regular readers of this blog are aware that back in 2014 I wrote a post looking critically at the LTE-Broadcast business case and suggesting a few approaches to make it a success. Back in those days, 2014 was being billed as the year of LTE-Broadcast, or eMBMS (see here and here for example). I was just cautioning people against jumping on the LTE-B bandwagon.

According to a recent GSA report 'LTE Broadcast (eMBMS) Market Update – March 2018':

  • thirty-nine operators are known to have been investing in eMBMS demonstrations, trials, deployments or launches
  • five operators have now deployed eMBMS or launched some sort of commercial service using eMBMS

It's good to see some operators now getting ready to deploy eMBMS for broadcast TV scenarios. eMBMS will also be used in Mission Critical Communications for the features described here.

In recent news from the Australian operator Telstra:

Telstra is now streaming live sports content to a massive base of around 1.2 million devices each weekend and sports fans consume 37 million minutes of live content over our apps on any given weekend.

This increase brings new challenges to the way traffic on our mobile network is managed. Even though a large group of people might be streaming the same real-time content at the same time, we still need to ensure a high quality streaming experience for our customers.

This challenge makes our sporting apps a prime use case for LTE-Broadcast (LTE-B).

Earlier this year, we announced we would be turning on LTE-B functionality on the AFL Live Official app for Telstra customers with Samsung Galaxy S8 and Galaxy S9 devices. Following extensive testing, Telstra is the only operator in Australia – and one of the first in the world – to deploy LTE-B into its mobile network.

At a live demonstration in Sydney, over 100 Samsung Galaxy S8 and Galaxy S9 devices were on display showing simultaneous high definition content from the AFL Live Official app using LTE-B.

It's interesting to note here that the broadcast functionality (and probably the intelligence) is built into the app.

According to another Telstra news item (emphasis mine):

The use of LTE-Broadcast technology changes the underlying efficiency of live video delivery as each cell can now support an unlimited number of users watching the same content with improved overall quality. To date though, LTE-B technology has required that a dedicated part of each cell’s capacity be set aside for broadcasting. This had made the LTE-B business case harder to prove in for lower streaming demand rates.

This has now changed as Telstra and our partners have enabled the world’s first implementation of the Multicast Operation on Demand (MooD) feature whereby cells in the network only need to configure for LTE-B when there are multiple users watching the same content.

This combined with the Service Continuity feature allows mobile users to move around the network seamlessly between cells configured for LTE-B and those which are not.

Earlier this year we announced our intention to enable LTE-Broadcast (LTE-B) across our entire mobile network in 2018. With MooD and service continuity we are one step closer to that goal as we head into another year of major growth in sporting content demand.

Supported by technology partners Ericsson and Qualcomm, Telstra has now delivered world first capability to ensure LTE-B can be delivered as efficiently as possible.

Service Continuity will allow devices to transition in and out of LTE-B coverage areas without interruption. For instance, you might be at a music festival streaming an event on your phone but need to leave the venue and make your way back home (where LTE-B is not in use). Service Continuity means you can continue to watch the stream and the transition will be seamless – even though you have left the broadcast area.

Taking that a step further, MooD allows the network to determine how many LTE-B compatible devices in any given area are consuming the same content. MooD then intelligently activates or deactivates LTE-B, ensuring the mobile network is as efficient as possible in that location.

For example, if a die-hard football fan is streaming a match we will likely service that one user with unicast, as that is the most efficient way of delivering the content. However if more users in the same cell decide to watch the match, MooD makes the decision automatically as to whether it is more efficient to service those users by switching the stream to broadcasting instead of individual unicast streams.
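The switching logic Telstra describes can be sketched as a simple per-cell threshold rule. The threshold value below is invented for illustration; real MooD counting thresholds are operator-configured and the actual decision involves more than a raw viewer count.

```python
def mood_decision(viewers_in_cell: int, threshold: int = 3) -> str:
    """Decide unicast vs broadcast delivery for one cell (threshold is illustrative)."""
    return "broadcast" if viewers_in_cell >= threshold else "unicast"

print(mood_decision(1))  # → unicast   (lone die-hard fan stays on unicast)
print(mood_decision(5))  # → broadcast (popular match switches the cell to LTE-B)
```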

It's good to see Ericsson & Qualcomm finally taking eMBMS to commercial deployment. Back in 2015, I added their videos from MWC that year. See the post here.
I think the Telstra post already explains why MooD is needed, but the picture above from a Qualcomm whitepaper makes it much clearer. Back in 3G MBMS and the early days of eMBMS there used to be a feature called counting; MooD is effectively doing the same thing.
For Service Continuity, the paper 'Service Continuity for eMBMS in LTE/LTE-Advanced Network: Standard Analysis and Supplement' by Ngoc-Duy Nguyen and Christian Bonnet has an interesting proposal on how it should be done. I cannot be sure whether this matches the latest specifications, but it's interesting to learn how this could work when the user moves out of the coverage area in Idle or Connected mode.

Note that this Expway paper also refers to Service continuity as Session continuity.

Related posts:



Thursday 19 July 2018

5G Synchronisation Requirements


5G will probably introduce tighter synchronization requirements than LTE. A recent presentation from Ericsson provides more details.

In frequencies below 6 GHz (referred to as frequency range 1, or FR1, in the standards), both FDD and TDD bands are likely to be used, especially where existing bands are re-farmed. In frequencies above 6 GHz (referred to as frequency range 2, or FR2, even though FR2 starts from 24.25 GHz), it is expected that all bands will be TDD.

It is interesting to see that the cell phase synchronization accuracy, measured at BS antenna connectors, is specified to be better than 3 μs in 3GPP TS 38.133. This translates into a network-wide requirement of +/-1.5 microseconds and is applicable to both FR1 and FR2, regardless of the cell size.

The frequency error for NR specified in 3GPP TS 38.104 states that the base station (BS) shall be accurate to within the following ranges, observed over 1 ms:
Wide Area BS → ±0.05 ppm
Medium Range BS → ±0.1 ppm
Local Area BS → ±0.1 ppm
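These ppm figures translate into absolute frequency errors as follows. The carrier frequencies below are chosen for illustration, not taken from the presentation.

```python
def max_freq_error_hz(carrier_hz: float, ppm: float) -> float:
    """Absolute frequency error corresponding to a given ppm accuracy."""
    return carrier_hz * ppm * 1e-6

# Wide Area BS (±0.05 ppm) on an illustrative 3.5 GHz NR carrier
print(max_freq_error_hz(3.5e9, 0.05))  # ≈ 175 Hz

# Medium Range / Local Area BS (±0.1 ppm) on a 600 MHz carrier
print(max_freq_error_hz(600e6, 0.1))   # ≈ 60 Hz
```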

The presentation notes that, at the request of some operators, ITU-T is studying the feasibility of solutions targeting end-to-end time synchronization requirements on the order of +/-100 ns to +/-300 ns.

There is also a challenge of how the sync information is transported within the network. The conclusion is that while the current LTE sync requirements would work in the short term, new solutions would be required in the longer term.

If this is an area of interest, you will also enjoy the CW Heritage SIG talk by Prof. Andy Sutton, "The history of synchronisation in digital cellular networks". It's available here.

Thursday 12 July 2018

Minimum Bandwidth Requirement for 5G Non-Standalone (NSA) Deployment

I was attending the IEEE 5G World Forum live-stream, courtesy of IEEE TV, and happened to hear Egil Gronstad, Senior Director of Technology Development and Strategy at T-Mobile USA. He said that they will be building a nationwide 5G network that will initially be based on the 600 MHz band.


During the Q&A, Egil mentioned that because the USA is split into different spectrum markets, on average they have 31 MHz of 600 MHz spectrum (Band 71). The minimum is 20 MHz and the maximum is 50 MHz.

So I started wondering how they would launch 4G & 5G in the same band for nationwide coverage. They have a good video on their 5G vision, but that is probably going to come a few years down the line.

In simple terms, they will first deploy what is known as Option 3 or EN-DC. If you want a quick refresher on different options, you may want to jump to my tutorial on this topic at 3G4G here.

The Master Node (recall dual connectivity for LTE, Release-12; see here) is an eNodeB. As with any LTE node, it can use bandwidths from 1.4 MHz to 20 MHz, so the minimum bandwidth for the LTE node is 1.4 MHz.

The Secondary Node is a gNodeB. Looking at 3GPP TS 38.101-1, Table 5.3.5-1 (Channel bandwidths for each NR band), I can see the following for band n71:


NR band n71 – UE channel bandwidths (3GPP TS 38.101-1, Table 5.3.5-1):

SCS 15 kHz: 5, 10, 15 and 20 MHz
SCS 30 kHz: 10, 15 and 20 MHz
SCS 60 kHz: not supported
The minimum NR bandwidth is 5 MHz. This is paired spectrum for an FDD band, but the point I am making here is that you need just 6.4 MHz minimum to be able to support the Non-Standalone 5G option.
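The arithmetic behind that 6.4 MHz figure, using the minimum channel bandwidths quoted above:

```python
MIN_LTE_MHZ = 1.4     # smallest LTE channel bandwidth (master eNodeB)
MIN_NR_N71_MHZ = 5.0  # smallest NR channel bandwidth for band n71 at 15 kHz SCS

# Minimum (per-direction) spectrum for an Option 3 / EN-DC carrier pair
min_endc_mhz = MIN_LTE_MHZ + MIN_NR_N71_MHZ
print(round(min_endc_mhz, 1))  # → 6.4
```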

I am sure you can guess that the speeds will not really be 5G speeds with this amount of bandwidth, but I am looking forward to all those kinds of complaints in the initial phase of 5G network rollouts.

I don't know what bandwidths T-Mobile will be using, but we will probably see at least 10 MHz of NR in markets where the total spectrum is 20 MHz, and 20 MHz of NR where the total spectrum is 50 MHz.

If you look at the earlier requirement lists, the number being thrown about for bandwidth was 100 MHz below 6 GHz and up to 1 GHz above 6 GHz. I don't think there was a hard and fast requirement though.

Happy to hear your thoughts.

Tuesday 3 July 2018

Terahertz and Beyond 100 GHz progress

There seems to be a good amount of research going on into higher frequencies, to see how a lot more spectrum with a lot more bandwidth can be used in future radio communications. NTT recently released information about an "ultra high-speed IC capable of wireless transmission of 100 gigabits per second in a 300 GHz band". Before we discuss anything, let's look at what terahertz means, from this article:

Terahertz wave: Just as we use the prefix 'kilo' to mean 10^3, so we use 'giga' to mean 10^9 and 'tera' to mean 10^12. Hertz (Hz) is the unit of the physical quantity called frequency. It indicates how many times alternating electric signals and electromagnetic waves change polarity (plus and minus) per second. That is, one terahertz (1 THz = 1,000 GHz) is the frequency of an electromagnetic wave changing polarity 1 × 10^12 times per second. In general, a terahertz wave usually means an electromagnetic wave of 0.3 THz to 3 THz.

While quite a few different definitions exist, this is the one most commonly used. The following are the details of the research NTT did:

In this research, we realized 100 Gbps wireless transmission with one wave (one carrier). In the future, we can extend this to multiple carriers by making use of the wide frequency range of the 300 GHz band, and use spatial multiplexing technologies such as MIMO and OAM. It is expected to be an ultra high-speed IC technology enabling high-capacity wireless transmission of 400 gigabits per second. This is about 400 times the current LTE and Wi-Fi, and 40 times 5G, the next-generation mobile communication technology. It is also expected to be a technology that opens up the use of the unused terahertz frequency band in both communication and non-communication fields.
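Working backwards from the multiples quoted above, the comparison implies baselines of roughly 1 Gbps for LTE/Wi-Fi and 10 Gbps for 5G. A quick back-of-envelope check:

```python
carrier_gbps = 100  # demonstrated single-carrier rate in the 300 GHz band
carriers = 4        # multi-carrier / spatial-multiplexing extension described by NTT
target_gbps = carrier_gbps * carriers

# Baselines implied by the "400x LTE/Wi-Fi, 40x 5G" comparison
lte_gbps = target_gbps / 400   # → 1 Gbps
fiveg_gbps = target_gbps / 40  # → 10 Gbps
print(target_gbps, lte_gbps, fiveg_gbps)
```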

Complete article and paper available here.

Huawei has also been doing research in the W (92 – 114.5 GHz) and D (130 – 174.5 GHz) bands.


A recent presentation by Debora Gentina, ETSI ISG mWT WI#8 Rapporteur, at the UK Spectrum Policy Forum is embedded below.



This presentation can be downloaded from the UK SPF site here. Another event on beyond-100 GHz topics that took place last year has some interesting presentations too, again on the UK SPF site here.


Ericsson has an interesting article in its Technology Review, looking at beyond 100 GHz from a backhaul point of view. It's available here.

If 5G is going to start using the frequencies traditionally used by backhaul, then backhaul will have to start looking at other options too.

Happy to listen to your thoughts and insights on this topic.

Monday 25 June 2018

Free Apps for Field Testing - Part 2

The last time I wrote about free apps for field testing, many people came back and suggested additional apps that are much more commonly used. In fact, we got the following comment when 3G4G re-posted it:

As I have used both these apps frequently, here is a small summary of them.

Network Signal Guru: This is very popular and quite useful. The only issue is that you need a rooted phone with a Qualcomm chipset. Many testers have their favourite phones, and quite a few buy the latest phones, root them and start testing with NSG (Network Signal Guru).

I prefer using Motorola Moto G-series phones. They are cheap, not too difficult to root (YouTube has quite a few tutorials, and a Google search works too), and I find that their receivers are better than others': I have detected cells that other phones can't, and have even camped on and speed-tested them.

So what can NSG do?

It can provide lots of useful information on the physical layer, cell configurations, neighbour cell lists, MIMO, etc.
You can even lock the RAT to LTE / WCDMA / GSM, or lock to a specific band. This can be very useful during surveys when you want to check whether a particular frequency is visible anywhere in an area. You can also see codecs, RACH information, data information, etc.

Finally, one of the best features I find is the signalling information. Some of the details are only available with the paid option, but it's nevertheless very useful. In case you are wondering about the cost, it's roughly a £50 per month licence in the UK.


Cell Mapper: I find this much more helpful as it can be used without rooting. CellMapper is a crowd-sourced cellular tower and coverage mapping service. It's simple and only used for basic testing, but nevertheless very useful. To give you an idea: the other day I was camped on a cell with very good signal quality but very poor data rates, and there weren't many people around, so congestion didn't seem to be a factor. On investigation, I found that I was camped on the 800 MHz band, which has limited bandwidth per operator, and there was no CA.

CellMapper, as you can see, provides information about the cell you are camped on, the cell tower location, what other sectors and frequencies are present, etc.


Do you have a favorite testing app that I missed? Let me know in comments.