Sunday 12 March 2017

High Power / Performance User Equipment (#HPUE)

3GPP refers to HPUE as High Power UE while the US operator Sprint prefers to use the term High Performance UE.

HPUE was initially defined for US Public Safety Band 14 (700MHz). The intention was that these high power UEs could increase the coverage range from 4km to 8km, meaning larger coverage areas and fewer cells.

While commercial UEs (Class 3) transmit at +23dBm (max 200mW), the Public Safety community intends to use Class 1 UEs transmitting at +31dBm (max 1.25W). It was felt that this feature could also benefit some TDD bands that do not have to worry about backward compatibility. One such band, pushed by Sprint, was TDD Band 41 (2500MHz). As this band is used by commercial UEs, Class 2 power at +26dBm (max 400mW) was proposed instead of Class 1.
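The dBm figures map onto the quoted wattages via a simple conversion (power in mW = 10^(dBm/10)); a quick Python sanity check:

```python
def dbm_to_mw(dbm):
    """Convert transmit power in dBm to milliwatts."""
    return 10 ** (dbm / 10)

# The three UE power classes mentioned above
for power_class, dbm in [("Class 3", 23), ("Class 2", 26), ("Class 1", 31)]:
    print(f"{power_class}: +{dbm} dBm = {dbm_to_mw(dbm):.0f} mW")
```

This reproduces the round numbers in the text: 23 dBm is ~200 mW, 26 dBm is ~400 mW and 31 dBm is ~1.26 W.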

3GPP TS 36.886 provides the following justification:

Currently, 3GPP has defined only Power Class UE 3 as the type of UE supported for TDD LTE band 41 operations. This definition was based on aligning TDD LTE Band 41 UE power classes with prior work in 3GPP related to other bands. However, it should be mentioned that 3GPP UE Power Class 3 definition (i.e. 23dBm) was mainly driven to ensure backward compatibility with prior technologies (i.e. GSM/UMTS) [2] so that network deployment topologies remain similar. Furthermore, maintaining the same power class UE definition (i.e. Class 3) as previous technologies would maintain compliance with various national regulatory rulings, particularly in terms of SAR, for FDD LTE duplexing mode.

However, TDD LTE band 41 does not have any 3GPP legacy technologies associated with it, hence the backward compatibility consideration is not applicable in its case. Also, since band 41 is defined as a TDD LTE band, it is less susceptible to SAR levels than FDD LTE bands due to the SAR definition. Therefore, defining a new UE power class with higher than 23dBm Tx power for TDD LTE Band 41 operations would not compromise any of 3GPP foundational work, while improving UE and network performance. It should also be mentioned that 3GPP has done similar work on other bands (i.e. band 14) when defining a higher power class UE, hence the concept presented in this document is a continuation of that process.

The present document carries out a feasibility analysis for defining a UE Power class 2 (i.e. 26dBm) for operation on TDD LTE band 41. The document analyses current and future technological advancements in the area of UE RF front-end components and architectures that enable such definition while maintaining 3GPP specification and other regulatory bodies' requirements. It should be emphasized that this proposal only relates to single carrier UL operations on TDD band 41 (i.e. TM-1/2 modes) without affecting current 3GPP definition for UL carrier aggregation on band 41.

As you can see from the tweet above, Sprint's CEO is quite pleased with HPUE.

Source: Diana Goovaerts

Iain Gillott of iGR points out that HPUE applies to Sprint’s 2.5 GHz TDD network and associated spectrum, and the company claims up to a 30 percent increase in cell coverage from the new technology.  It should be noted that HPUE is a 3GPP standard that applies to the 2.5 GHz TDD band (Band 41) and is also to be used by China Mobile and Softbank.  HPUE was developed as part of the Global TDD LTE Initiative (GTI) which includes Qualcomm Technologies, Samsung, ZTE, Broadcom, MediaTek, Skyworks Solutions, Alcatel, Motorola, LG and Qorvo... The cool part: the improvement in coverage comes from simply increasing the device uplink power.  So Sprint, China Mobile and Softbank will not have to visit their cell sites to make changes; they just need 2.5 GHz TDD devices with HPUE to get the benefit.
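The coverage claim can be sanity-checked with a back-of-envelope link budget. Under a simple log-distance path-loss model (the exponent of 3.5 below is my own illustrative assumption, not a Sprint figure), the extra 3 dB of Band 41 HPUE (+26 vs +23 dBm) gives:

```python
def range_gain(extra_db, path_loss_exponent=3.5):
    """Cell range multiplier from a link-budget gain, assuming a
    log-distance path-loss model: PL ~ 10 * n * log10(d)."""
    return 10 ** (extra_db / (10 * path_loss_exponent))

r = range_gain(3.0)  # Class 2 (+26 dBm) vs Class 3 (+23 dBm)
print(f"range x{r:.2f}, area x{r**2:.2f}")
```

That works out to roughly 1.22x the range, or about 1.48x the area, which brackets the quoted "up to 30 percent" figure depending on whether coverage is counted as range or area.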


Milan Milanović recently wrote about Sprint’s Gigabit Class LTE network going live in New Orleans. One of the questions I had was why the uplink is so poor compared to the downlink. He kindly pointed out to me that this is due to TDD configuration 2.
If you are wondering what TDD Config 2 is, see the pic below
Source: ShareTechNote
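The asymmetry follows directly from the subframe pattern. A minimal sketch, using the uplink-downlink configuration strings from 3GPP TS 36.211 (Table 4.2-2):

```python
# LTE TDD uplink-downlink configurations (3GPP TS 36.211, Table 4.2-2)
# D = downlink, U = uplink, S = special subframe; 10 subframes per 10 ms frame
TDD_CONFIGS = {
    0: "DSUUUDSUUU",
    1: "DSUUDDSUUD",
    2: "DSUDDDSUDD",
    3: "DSUUUDDDDD",
    4: "DSUUDDDDDD",
    5: "DSUDDDDDDD",
    6: "DSUUUDSUUD",
}

def dl_ul_split(config):
    """Count of downlink, special and uplink subframes per radio frame."""
    frame = TDD_CONFIGS[config]
    return frame.count("D"), frame.count("S"), frame.count("U")

d, s, u = dl_ul_split(2)
print(f"Config 2: {d} DL, {s} special, {u} UL subframes per frame")
```

Config 2 gives only 2 uplink subframes out of 10, versus 6 downlink plus 2 (mostly downlink) special subframes, hence the lopsided speed test results.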

Sprint expects HPUE to appear in postpaid devices starting in 2017, including new devices from Samsung, LG, HTC, and Moto. It’s expected that all of Sprint’s new devices will have HPUE support within the next two years.

I think it would be interesting to see how this performs when there are a lot more users and devices. I am quite sure there will be more requests for HPUE in further TDD bands.

Related Links:

Monday 6 March 2017

IMT-2020 (5G) Requirements


ITU has just agreed on key 5G performance requirements for IMT-2020. A new draft report ITU-R M.[IMT-2020.TECH PERF REQ] is expected to be finally approved by ITU-R Study Group 5 at its next meeting in November 2017. The press release says "5G mobile systems to provide lightning speed, ultra-reliable communications for broadband and IoT".


The following is from the ITU draft report:

The key minimum technical performance requirements defined in this document are for the purpose of consistent definition, specification, and evaluation of the candidate IMT-2020 radio interface technologies (RITs)/Set of radio interface technologies (SRIT) in conjunction with the development of ITU-R Recommendations and Reports, such as the detailed specifications of IMT-2020. The intent of these requirements is to ensure that IMT-2020 technologies are able to fulfil the objectives of IMT-2020 and to set a specific level of performance that each proposed RIT/SRIT needs to achieve in order to be considered by ITU-R for IMT-2020.


Peak data rate: Peak data rate is the maximum achievable data rate under ideal conditions (in bit/s), which is the received data bits assuming error-free conditions assignable to a single mobile station, when all assignable radio resources for the corresponding link direction are utilized (i.e., excluding radio resources that are used for physical layer synchronization, reference signals or pilots, guard bands and guard times). 

This requirement is defined for the purpose of evaluation in the eMBB usage scenario. 
The minimum requirements for peak data rate are as follows:
Downlink peak data rate is 20 Gbit/s.
Uplink peak data rate is 10 Gbit/s.


Peak spectral efficiency: Peak spectral efficiency is the maximum data rate under ideal conditions normalised by channel bandwidth (in bit/s/Hz), where the maximum data rate is the received data bits assuming error-free conditions assignable to a single mobile station, when all assignable radio resources for the corresponding link direction are utilized (i.e. excluding radio resources that are used for physical layer synchronization, reference signals or pilots, guard bands and guard times).

This requirement is defined for the purpose of evaluation in the eMBB usage scenario.
The minimum requirements for peak spectral efficiencies are as follows: 
Downlink peak spectral efficiency is 30 bit/s/Hz.
Uplink peak spectral efficiency is 15 bit/s/Hz.
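The peak data rate and peak spectral efficiency requirements are mutually consistent: dividing one by the other shows the bandwidth a candidate technology would need to hit the peak rate under ideal conditions. A quick check:

```python
# ITU-R IMT-2020 minimum requirements quoted above
dl_peak_bps = 20e9   # 20 Gbit/s downlink peak data rate
ul_peak_bps = 10e9   # 10 Gbit/s uplink peak data rate
dl_se = 30           # bit/s/Hz downlink peak spectral efficiency
ul_se = 15           # bit/s/Hz uplink peak spectral efficiency

print(f"Implied DL bandwidth: {dl_peak_bps / dl_se / 1e6:.0f} MHz")
print(f"Implied UL bandwidth: {ul_peak_bps / ul_se / 1e6:.0f} MHz")
```

Both directions imply roughly 667 MHz of aggregated spectrum, which sits comfortably within the "up to 1 GHz" bandwidth requirement further down.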


User experienced data rate: User experienced data rate is the 5% point of the cumulative distribution function (CDF) of the user throughput. User throughput (during active time) is defined as the number of correctly received bits, i.e. the number of bits contained in the service data units (SDUs) delivered to Layer 3, over a certain period of time.

This requirement is defined for the purpose of evaluation in the related eMBB test environment.
The target values for the user experienced data rate are as follows in the Dense Urban – eMBB test environment: 
Downlink user experienced data rate is 100 Mbit/s
Uplink user experienced data rate is 50 Mbit/s


5th percentile user spectral efficiency: The 5th percentile user spectral efficiency is the 5% point of the CDF of the normalized user throughput. The normalized user throughput is defined as the number of correctly received bits, i.e., the number of bits contained in the SDUs delivered to Layer 3, over a certain period of time, divided by the channel bandwidth and is measured in bit/s/Hz. 

This requirement is defined for the purpose of evaluation in the eMBB usage scenario.
Indoor Hotspot – eMBB - Downlink: 0.3 bit/s/Hz Uplink: 0.21 bit/s/Hz
Dense Urban – eMBB - Downlink: 0.225 bit/s/Hz Uplink: 0.15 bit/s/Hz
Rural – eMBB - Downlink: 0.12 bit/s/Hz Uplink: 0.045 bit/s/Hz
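In an evaluation, the 5% point is read off the empirical CDF of per-user throughputs, i.e. the value that 95% of users exceed. A minimal illustration with synthetic numbers (the lognormal distribution below is purely illustrative, not from the ITU report):

```python
import random

def percentile_5(throughputs):
    """5% point of the empirical CDF (nearest-rank): 95% of users do better."""
    ordered = sorted(throughputs)
    index = max(0, int(0.05 * len(ordered)) - 1)
    return ordered[index]

random.seed(42)
# Synthetic per-user throughputs (Mbit/s), for illustration only
users = [random.lognormvariate(4, 1) for _ in range(10000)]
print(f"5th percentile user throughput: {percentile_5(users):.1f} Mbit/s")
```

A real evaluation would use the normalized throughput (divided by channel bandwidth) per the definition above, but the CDF mechanics are the same.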


Average spectral efficiency: Average spectral efficiency is the aggregate throughput of all users (the number of correctly received bits, i.e. the number of bits contained in the SDUs delivered to Layer 3, over a certain period of time) divided by the channel bandwidth of a specific band divided by the number of TRxPs and is measured in bit/s/Hz/TRxP.

This requirement is defined for the purpose of evaluation in the eMBB usage scenario.
Indoor Hotspot – eMBB - Downlink: 9 bit/s/Hz/TRxP Uplink: 6.75 bit/s/Hz/TRxP
Dense Urban – eMBB - Downlink: 7.8 bit/s/Hz/TRxP Uplink: 5.4 bit/s/Hz/TRxP
Rural – eMBB - Downlink: 3.3 bit/s/Hz/TRxP Uplink: 1.6 bit/s/Hz/TRxP


Area traffic capacity: Area traffic capacity is the total traffic throughput served per geographic area (in Mbit/s/m2). The throughput is the number of correctly received bits, i.e. the number of bits contained in the SDUs delivered to Layer 3, over a certain period of time.

This requirement is defined for the purpose of evaluation in the related eMBB test environment.
The target value for Area traffic capacity in downlink is 10 Mbit/s/m2 in the Indoor Hotspot – eMBB test environment.


User plane latency: User plane latency is the contribution of the radio network to the time from when the source sends a packet to when the destination receives it (in ms). It is defined as the one-way time it takes to successfully deliver an application layer packet/message from the radio protocol layer 2/3 SDU ingress point to the radio protocol layer 2/3 SDU egress point of the radio interface in either uplink or downlink in the network for a given service in unloaded conditions, assuming the mobile station is in the active state. 
This requirement is defined for the purpose of evaluation in the eMBB and URLLC usage scenarios.
The minimum requirements for user plane latency are
4 ms for eMBB
1 ms for URLLC 
assuming unloaded conditions (i.e., a single user) for small IP packets (e.g., 0 byte payload + IP header), for both downlink and uplink.


Control plane latency: Control plane latency refers to the transition time from a most “battery efficient” state (e.g. Idle state) to the start of continuous data transfer (e.g. Active state).
This requirement is defined for the purpose of evaluation in the eMBB and URLLC usage scenarios.
The minimum requirement for control plane latency is 20 ms. Proponents are encouraged to consider lower control plane latency, e.g. 10 ms.


Connection density: Connection density is the total number of devices fulfilling a specific quality of service (QoS) per unit area (per km2).

This requirement is defined for the purpose of evaluation in the mMTC usage scenario.
The minimum requirement for connection density is 1 000 000 devices per km2.


Energy efficiency: Network energy efficiency is the capability of a RIT/SRIT to minimize the radio access network energy consumption in relation to the traffic capacity provided. Device energy efficiency is the capability of the RIT/SRIT to minimize the power consumed by the device modem in relation to the traffic characteristics. 
Energy efficiency of the network and the device can relate to the support for the following two aspects:
a) Efficient data transmission in a loaded case;
b) Low energy consumption when there is no data.
Efficient data transmission in a loaded case is demonstrated by the average spectral efficiency.

This requirement is defined for the purpose of evaluation in the eMBB usage scenario.
The RIT/SRIT shall have the capability to support a high sleep ratio and long sleep duration. Proponents are encouraged to describe other mechanisms of the RIT/SRIT that improve the support of energy efficient operation for both network and device.


Reliability: Reliability relates to the capability of transmitting a given amount of traffic within a predetermined time duration with high success probability.

This requirement is defined for the purpose of evaluation in the URLLC usage scenario. 
The minimum requirement for the reliability is 1-10^-5 (i.e. 99.999%) success probability of transmitting a layer 2 PDU (protocol data unit) of 32 bytes within 1 ms in channel quality of coverage edge for the Urban Macro-URLLC test environment, assuming small application data (e.g. 20 bytes application data + protocol overhead).
Proponents are encouraged to consider larger packet sizes, e.g. layer 2 PDU size of up to 100 bytes.


Mobility: Mobility is the maximum mobile station speed at which a defined QoS can be achieved (in km/h).

The following classes of mobility are defined:
Stationary: 0 km/h
Pedestrian: 0 km/h to 10 km/h
Vehicular: 10 km/h to 120 km/h
High speed vehicular: 120 km/h to 500 km/h

Mobility classes supported:
Indoor Hotspot – eMBB: Stationary, Pedestrian
Dense Urban – eMBB: Stationary, Pedestrian, Vehicular (up to 30 km/h)
Rural – eMBB: Pedestrian, Vehicular, High speed vehicular 


Mobility interruption time: Mobility interruption time is the shortest time duration supported by the system during which a user terminal cannot exchange user plane packets with any base station during transitions.

This requirement is defined for the purpose of evaluation in the eMBB and URLLC usage scenarios.
The minimum requirement for mobility interruption time is 0 ms.


Bandwidth: Bandwidth is the maximum aggregated system bandwidth. The bandwidth may be supported by single or multiple radio frequency (RF) carriers. The bandwidth capability of the RIT/SRIT is defined for the purpose of IMT-2020 evaluation.

The requirement for bandwidth is at least 100 MHz.
The RIT/SRIT shall support bandwidths up to 1 GHz for operation in higher frequency bands (e.g. above 6 GHz). 

In case you missed it, a 5G logo has also been released by 3GPP


Related posts:



Friday 24 February 2017

Connecting Rural Scotland using Airmasts and Droneways


This week EE finally issued a press release on what they term Airmasts (see my blog post here). Back in November last year, Mansoor Hanif, Director of Converged Networks and Innovation at BT/EE, gave an excellent presentation on connecting rural Scottish islands using Airmasts and Droneways at the Facebook TIP Summit. Embedded below are the slides and video from that talk.





In other related news, AT&T is showing Flying COWs (Cell on Wings) that can transmit LTE signals


Their innovation blog says:

It is designed to beam LTE coverage from the sky to customers on the ground during disasters or big events.
...
Here’s how it works. The drone we tested carries a small cell and antennas. It’s connected to the ground by a thin tether. The tether between the drone and the ground provides a highly secure data connection via fiber and supplies power to the Flying COW, which allows for unlimited flight time.  The Flying COW then uses satellite to transport texts, calls, and data. The Flying COW can operate in extremely remote areas and where wired or wireless infrastructure is not immediately available. Like any drone that we deploy, pilots will monitor and operate the device during use.

Once airborne, the Flying COW provides LTE coverage from the sky to a designated area on the ground.  

Compared to a traditional COW, in certain circumstances, a Flying COW can be easier to deploy due to its small size. We expect it to provide coverage to a larger footprint because it can potentially fly at altitudes over 300 feet— about 500% higher than a traditional COW mast.  

Once operational, the Flying COW could eventually provide coverage to an area up to 40 square miles—about the size of 100 football fields. We may also deploy multiple Flying COWs to expand the coverage footprint.

Nokia on the other hand has also been showcasing drones and LTE connectivity for public safety at D4G Award event in Dubai


Nokia's Ultra Compact Network provides a standalone LTE network to quickly re-establish connectivity to various mission-critical applications including video-equipped drones. Drones can stream video and other sensor data in real time from the disaster site to a control center, providing inputs such as exact locations where people are stranded and nature of the difficulty of reaching the locations.

Related Posts:



Friday 17 February 2017

What's '5G' in one word for you?


Last month in the IET 'Towards 5G Mobile Technology – Vision to Reality' seminar, Dr. Mike Short threw out a challenge to all speakers to come up with one word to describe 5G technology. The speakers came up with the following 'one words':
  • Professor Mischa Dohler, Centre for Telecommunications Research, King's College London, UK - Skills
  • Professor Maziar Nekovee, Professor,University of Sussex UK - Transformative or Magic
  • Professor Andy Sutton, Principal Network Architect, BT, UK - Opportunity
  • Professor Mark Beach, University of Bristol, UK - Networked-Society
  • Mark Barrett, CMO, Bluwireless, UK - Gigabit
  • Dr Nishanth Sastry, Centre for Telecommunications Research, Kings’ College London, UK - Flexibility or Efficiency
  • Dr Reiner Hoppe, Developer Electromagnetic Solutions, Altair - Radio
  • Professor Klaus Moessner, 5G Innovation Centre, University of Surrey, UK - Capacity
  • Joe Butler, Director of Technology, Ofcom, UK - Ubiquity
  • Dr Deeph Chana, Deputy Director, Institute for Security Science and Technology, Imperial College London, UK - Accessibility
What is your one word to describe 5G? Please add in comments. I welcome critical suggestions too :-)

Anyway, for anyone interested, the following story summarises the event:

This section contained a story from Storify. Storify was shut down on 16 May 2018 and as a result the story was lost. An archived version of the story can be seen on the Wayback Machine here.

Related links:

Sunday 5 February 2017

An Introduction to IoT: Connectivity & Case Studies


I gave an introductory presentation on IoT yesterday at the University of Northampton Internet of Things event. Below is my presentation in full. It can be downloaded from Slideshare.



*** Added 18/02/2017 ***

Below is video of the presentation above and post presentation interview:

Wednesday 1 February 2017

5G Network Architecture and Design Update - Jan 2017

Andy Sutton, Principal Network Architect at BT, recently talked about the architecture update from the December 2016 3GPP meeting. The slides and video are embedded below.





You can see all the presentations from IET event 'Towards 5G Mobile Technology – Vision to Reality' here.

Eiko Seidel recently also wrote an update from 3GPP 5G Adhoc regarding RAN Internal Functional Split. You can read that report here.

Related posts:

Thursday 26 January 2017

3GPP Rel-14 IoT Enhancements


A presentation (embedded below) by the 3GPP RAN3 Chairman, Philippe Reininger, at the IoT Business & Technologies Congress (30 November, Singapore). The main topics are eMTC, NB-IoT and EC-GSM-IoT, as completed in 3GPP Release 13 and enhanced in Release 14. Thanks to Eiko Seidel for sharing the presentation.


Sunday 22 January 2017

Augmented / Virtual Reality Requirements for 5G


Ever wondered whether 5G would be good enough for Augmented and Virtual Reality or will we need to wait for 6G? Some researchers are trying to identify the AR / VR requirements, challenges from a mobile network point of view and possible options to solve these challenges. They have recently published a research paper on this topic.

Here is a summary of some of the interesting things I found in this paper:

  • Humans process nearly 5.2 gigabits per second of sound and light.
  • Without moving the head, our eyes can mechanically shift across a field of view of at least 150 degrees horizontally (i.e., 30,000 pixels) and 120 degrees vertically (i.e., 24,000 pixels).
  • The human eye can perceive much faster motion (150 frames per second). For sports, games, science and other high-speed immersive experiences, video rates of 60 or even 120 frames per second are needed to avoid motion blur and disorientation.
  • 5.2 gigabits per second of network throughput (if not more) is needed.
  • At today’s 4K resolution, 30 frames per second and 24 bits per pixel, using a 300:1 compression ratio yields 300 megabits per second of imagery. That is more than 10x the typical requirement for a high-quality 4K movie experience.
  • 5G network architectures are being designed to move the post-processing at the network edge so that processors at the edge and the client display devices (VR goggles, smart TVs, tablets and phones) carry out advanced image processing to stitch camera feeds into dramatic effects.
  • In order to tackle these grand challenges, the 5G network architecture (radio access network (RAN), Edge and Core) will need to be much smarter than ever before by adaptively and dynamically making use of concepts such as software defined networking (SDN), network function virtualization (NFV) and network slicing, to mention a few, facilitating a more flexible allocation of resources (resource blocks (RBs), access points, storage, memory, computing, etc.) to meet these demands.
  • Immersive technology will require massive improvements in terms of bandwidth, latency and reliability. The current remote-reality prototype requires 100-to-200Mbps for a one-way immersive experience. While MirrorSys uses a single 8K display, estimates suggest that photo-realistic VR will require two 16K x 16K screens (one for each eye).
  • Latency is the other big issue in addition to reliability. With an augmented reality headset, for example, real-life visual and auditory information has to be taken in through the camera and sent to the fog/cloud for processing, with digital information sent back to be precisely overlaid onto the real-world environment, and all this has to happen in less time than it takes for humans to start noticing lag (no more than 13ms). Factoring in the much needed high reliability criteria on top of these bandwidth and delay requirements clearly indicates the need for interactions between several research disciplines.
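The headline bandwidth figures are easy to reproduce from raw pixel arithmetic. A sketch of the uncompressed rates (the frame rates below are my own illustrative assumptions; the paper's exact parameters may differ, and compression brings these down by the quoted ratios):

```python
def raw_bitrate_gbps(width, height, bits_per_pixel=24, fps=30):
    """Uncompressed video bit rate in Gbit/s."""
    return width * height * bits_per_pixel * fps / 1e9

# A single 4K stream at 30 fps, 24 bits per pixel
print(f"4K @ 30 fps raw: {raw_bitrate_gbps(3840, 2160):.1f} Gbit/s")

# Two 16K x 16K eye buffers at an assumed 120 fps for photo-realistic VR
vr = 2 * raw_bitrate_gbps(16384, 16384, fps=120)
print(f"2 x 16K x 16K @ 120 fps raw: {vr:.0f} Gbit/s")
```

A raw 4K stream is already close to the ~5.2 Gbit/s that humans are said to process, and the two-eye 16K case lands in the terabit-per-second range before compression, which is why the paper treats photo-realistic VR as a grand challenge.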


These key research directions and scientific challenges are summarized in Fig. 3 (above) and discussed in the paper. I advise you to read it here.

Related posts:

Monday 16 January 2017

Gigabit LTE?


Last year Qualcomm announced the X16 LTE modem that was capable of up to 1Gbps, category 16 in DL and Cat 13 (150 Mbps) in UL. See my last post on UE categories here.


Early January, it announced Snapdragon 835 at CES that looks impressive. Android Central says "On the connectivity side of things, there's the Snapdragon X16 LTE modem, which enables Category 16 LTE download speeds that go up to one gigabit per second. For uploads, there's a Category 13 modem that lets you upload at 150Mbps. For Wi-Fi, Qualcomm is offering an integrated 2x2 802.11ac Wave-2 solution along with an 802.11ad multi-gigabit Wi-Fi module that tops out at 4.6Gb/sec. The 835 will consume up to 60% less power while on Wi-Fi."

Technology purists will know that LTE, which is widely referred to as 4G, was in fact pre-4G or, as some preferred to call it, 3.9G. New UE categories were introduced in Rel-10 to turn LTE into LTE-Advanced, with top speeds of 3Gbps. This way, the ITU requirements for a technology to be considered 4G (IMT-Advanced) were satisfied.


LTE-A was already Gigabit capable in theory, but in practice we had been seeing peak speeds of up to 600Mbps until recently. With this off my chest, let's look at what announcements are being made. Before that, you may want to revisit what 4.5G or LTE-Advanced Pro is here.

  • Qualcomm, Telstra, Ericsson and NETGEAR Announce World’s First Gigabit Class LTE Mobile Device and Gigabit-Ready Network. Gigabit Class LTE download speeds are achieved through a combination of 3x carrier aggregation, 4x4 MIMO on two aggregated carriers plus 2x2 MIMO on the third carrier, and 256-QAM higher order modulation. 
  • TIM in Italy is the first in Europe to launch 4.5G, with speeds of up to 500 Mbps, in Rome, Palermo and Sanremo
  • Telenet, in partnership with ZTE, achieved a download speed of 1.3 Gbps during a demonstration of ZTE's new 4.5G technology. That's four times faster than 4G's maximum download speed. Telenet is the first in Europe to reach this speed in real-life circumstances. The 4.5G ZTE technology uses 4x4 MIMO beamforming, 3-carrier aggregation, and 256-QAM modulation.
  • AT&T said, "The continued deployment of our 4G LTE-Advanced network remains essential to laying the foundation for our evolution to 5G. In fact, we expect to begin reaching peak theoretical speeds of up to 1 Gbps at some cell sites in 2017. We will continue to densify our wireless network this year through the deployment of small cells and the use of technologies like carrier aggregation, which increases peak data speeds. We’re currently deploying three-way carrier aggregation in select areas, and plan to introduce four-way carrier aggregation as well as LTE-License Assisted Access (LAA) this year."
  • T-Mobile USA nearly reached a Gigabit and here is what they say, "we reached nearly 1 Gbps (979 Mbps) on our LTE network in our lab thanks to a combination of three carrier aggregation, 4x4 MIMO and 256 QAM (and an un-released handset)."
  • The other US operator, Sprint, expects to unveil some of its work with 256-QAM and massive MIMO on its licensed spectrum that pushes the 1 Gbps speed boundary. It's unclear whether this will include an actual deployment of the technology.
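Where does the Gigabit figure come from? A back-of-envelope estimate (the 25% overhead/coding factor below is my own rough assumption, not a 3GPP number): each 20 MHz LTE carrier carries 100 PRBs x 12 subcarriers x 14 OFDM symbols per millisecond, and 256-QAM packs 8 bits per symbol per spatial layer.

```python
def peak_mbps_per_layer(prbs=100, qam_bits=8, overhead=0.25):
    """Back-of-envelope peak rate for one spatial layer on one 20 MHz
    LTE carrier: resource elements/ms x bits/symbol, minus a rough
    allowance for control overhead and channel coding (assumption)."""
    re_per_ms = prbs * 12 * 14        # resource elements per millisecond
    raw_mbps = re_per_ms * qam_bits / 1e3
    return raw_mbps * (1 - overhead)

# 3x carrier aggregation: 4x4 MIMO on two carriers + 2x2 on the third
layers = 4 + 4 + 2
print(f"~{layers * peak_mbps_per_layer():.0f} Mbit/s")
```

Ten spatial layers at roughly 100 Mbit/s each lands around 1 Gbit/s, which is in the right neighbourhood of T-Mobile's 979 Mbps lab result quoted above.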

So we are going to see a lot of higher speed LTE this year, and yes, we can call it Gigabit LTE, but let's not forget that the criterion for a technology to be real '4G' was that it should be able to do 1Gbps in both DL and UL. Sadly, the UL part is not going Gigabit anytime soon.

Saturday 7 January 2017

New LTE UE Categories (Downlink & Uplink) in Release-13

Just noticed that the LTE UE Categories have been updated since I last posted about them here. From Release-12 onwards, we now have the possibility of separate Downlink (ue-CategoryDL) and Uplink (ue-CategoryUL) categories.

From the latest RRC specifications, we can see that there are now two new fields that can be present: ue-CategoryDL and ue-CategoryUL.

An example defined here is as follows:

Example of RRC signalling for the highest combination
UE-EUTRA-Capability
   ue-Category = 4
      ue-Category-v1020 = 7
         ue-Category-v1170 = 10
            ue-Category-v11a0 = 12
               ue-CategoryDL-r12 = 12
               ue-CategoryUL-r12 = 13
                  ue-CategoryDL-v1260 = 16

From the RRC Specs:

  • The field ue-CategoryDL is set to values m1, 0, 6, 7, 9 to 19 in this version of the specification.
  • The field ue-CategoryUL is set to values m1, 0, 3, 5, 7, 8, 13 or 14 in this version of the specification.
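The later-release fields refine the legacy ue-Category value, so the network effectively uses the most specific field the UE reports. A hypothetical sketch of that precedence (the real rules live in 3GPP TS 36.331; this function and dictionary are illustrative only, not actual stack code):

```python
def effective_dl_category(capability):
    """Return the most specific DL category a UE reports.

    Illustrative sketch: later-release extension fields take precedence
    over the legacy ue-Category field (see 3GPP TS 36.331 for the
    normative handling)."""
    for field in ("ue-CategoryDL-v1260", "ue-CategoryDL-r12",
                  "ue-Category-v11a0", "ue-Category-v1170",
                  "ue-Category-v1020", "ue-Category"):
        if field in capability:
            return capability[field]
    return None

# The example combination from the RRC signalling above
ue = {
    "ue-Category": 4,
    "ue-Category-v1020": 7,
    "ue-Category-v1170": 10,
    "ue-Category-v11a0": 12,
    "ue-CategoryDL-r12": 12,
    "ue-CategoryUL-r12": 13,
    "ue-CategoryDL-v1260": 16,
}
print(effective_dl_category(ue))
```

For the example above this resolves to DL Category 16, while a legacy eNodeB that only understands ue-Category would read it as a Category 4 device.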

3GPP TS 36.306 section 4 provides much more details on these UE categories and their values. I am adding these pictures from the LG space website.



More info:



Saturday 10 December 2016

Free Apps for Field Testing

People who follow me on Twitter may have noticed that I often post photos when I am doing surveys, field testing, debugging, etc. In the good old days we had to carry a lot of different kinds of specialised test equipment to do basic measurements. Nowadays a lot of this can be done with the help of free apps on Android phones. The best tool, which can provide a great amount of info, is Qualcomm's QXDM, but it's really expensive.

Here are a few tools that I use. If you have one that I haven't listed below, please add it in the comments.


The screenshot shows the main tools along with my favourite, Speedtest. While I agree that Speedtest is not the most reliable way to measure the speed of your connection, I think it's the most widely used.


WiFi Analyzer is another great app that can be used at home and other locations where people complain about not getting good WiFi speeds. I have been at locations where the 2.4GHz band is absolutely packed with APs. 5GHz is also getting busier, though there are still a lot of free channels.


G-NetTrack Lite is a great tool to keep track of the cells you have been visiting. In case you are driving, this can collect a lot of valuable info. The paid version, G-NetTrack Pro, can collect the info in the form of a map that can be used for offline viewing with the help of Google Earth.


I use LTE Discovery mainly for finding the band I am currently camped on. It would be great if a tool could give the exact frequency and EARFCN, but the band is good enough too. I was once in a situation where I could see two different cells but they had the same PCI. Only after using this app did I figure out that they were on different bands.
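For what it's worth, if a tool does report the EARFCN, converting it to a frequency is simple enough to script yourself. A sketch using the downlink formula from 3GPP TS 36.101 (F_DL = F_DL_low + 0.1 x (N_DL - N_Offs-DL)); the band constants below are taken from the 36.101 band tables, but double-check them against the current spec before relying on them:

```python
# Band constants per 3GPP TS 36.101: (F_DL_low in MHz, N_Offs-DL, last DL EARFCN)
# Only the two bands discussed in this blog are included.
BANDS = {
    41: (2496.0, 39650, 41589),  # TDD Band 41 (2.5 GHz)
    14: (758.0, 5280, 5379),     # Band 14 (700 MHz Public Safety)
}

def earfcn_to_dl_mhz(band, earfcn):
    """Downlink carrier frequency in MHz for a given band and EARFCN."""
    f_low, n_offs, n_max = BANDS[band]
    if not n_offs <= earfcn <= n_max:
        raise ValueError("EARFCN outside the band's DL range")
    return f_low + 0.1 * (earfcn - n_offs)

print(f"Band 41, EARFCN 40620 -> {earfcn_to_dl_mhz(41, 40620):.1f} MHz")
```

So two same-PCI cells with different EARFCNs resolve immediately to different carrier frequencies.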


Finally, Network Cell Info Lite gives neighbour cells, which can often be useful. I am not sure if these are the neighbours from System Information, from Measurement Control messages sent by the network, or just something like detected cells that the phone sees around it.

Ping and IPConfig are other tools that can come in handy sometimes.

Are there any other tools that you like? Please share using comments.

Free Apps for Field Testing - Part 2

Sunday 4 December 2016

5G, Hacking & Security


It looks like devices that are not manufactured with security and privacy in mind are going to be the weakest link in future network security problems. I am sure you have read about how hacked cameras and routers enabled the Mirai botnet to take out major websites in October. Since then, there has been no shortage of stories about how IoT devices can be hacked. In fact the one I really liked was 'Researchers hack Philips Hue lights via a drone; IoT worm could cause city blackout' 😏.


Enter 5G, and the problem could be made much worse. With high speed data transfer and signalling, these devices could mount an instantaneous attack on a very large scale, generating a signalling storm that can take a network down in no time.

Giuseppe Targia of Nokia presented an excellent summary of some of these issues at the iDate Digiworld Summit 2016. His talk is embedded below:



You can check out many interesting presentations from the iDate Digiworld Summit 2016 on Youtube and Slideshare.

Related posts:


Wednesday 23 November 2016

Facebook's Attempt to Connect the Unconnected

I am sure that by now everyone is aware of Facebook's attempt to connect the people in rural and remote areas. Back in March they published the State of Connectivity report highlighting that there are still over 4 billion people that are unconnected.


The chart above is very interesting and shows that there are still people who use 2G to access Facebook. Personally, I am not sure if these charts take Wi-Fi into account or not.

In my earlier post in the Small Cells blog, I have made a case for using Small Cells as the best solution for rural & remote coverage. There are a variety of options for power including wind turbines, solar power and even the old fashioned diesel/petrol generators. The main challenge is sometimes the backhaul. To solve this issue Facebook has been working on its drones as a means of providing the backhaul connectivity.


Recently Facebook held its first Telecom Infra Project (TIP) Summit in California. The intention was to bring the diverse set of members (over 300 as I write this post) into a room to discuss ideas and ongoing projects.


There were quite a few interesting talks (videos available here). I have embedded the slides and the talk by SK Telecom below, but before that I want to highlight an important point made by AMN.


As can be seen in the picture above, technology is just one of the challenges in providing rural and remote connectivity. There are other challenges that have to be considered too.

Embedded below is the talk by Dr. Alex Jinsung Choi, CTO of SK Telecom and TIP Chairman, followed by the slides.



For more info, download the TIP slides from here.

Thursday 17 November 2016

5G, Debates, Predictions and Stories

This post contains a summary of three interesting events that took place recently.


CW (Cambridge Wireless) organised a couple of debates on 5G, as can be seen from the topics above. Below are the summary video and the Twitter discussion story.





The second story is from 'The Great Telco Debate 2016', organised by TM Forum.


I am not embedding the story, but anyone interested can read the Twitter summary here: https://storify.com/zahidtg/the-great-telco-debate-2016



Finally, there was 'Predictions: 2017 and Beyond', organised by CCS Insight. The whole Twitter discussion is embedded below.


Saturday 12 November 2016

Verizon's 5G Standard

Earlier this year I wrote a LinkedIn post on how operators are setting a timetable for 5G (5G: Mine is bigger than yours), and recently Dean Bubley of Disruptive Analysis wrote a similar post, also on LinkedIn, with a bit more detail (5G: Industry Politics, Use-Cases & a Realistic Timeline).


Some of you may be unaware that the US operator Verizon has formed the 'Verizon 5G Technology Forum' (V5GTF) with the intention of developing the first set of standards that can influence the direction of 3GPP standardization and also provide an early-mover advantage to itself and its partners.

The following from a Light Reading news article summarizes the situation well:

Verizon has posted its second round of work with its partners on a 5G specification. The first round was around the 5G radio specification; this time the work has been on the mechanics of connecting to the network. The operator has been working on the specification with Cisco Systems Inc., Ericsson AB, Intel Corp., LG Electronics Inc., Nokia Corp., Qualcomm Inc. and Samsung Corp. via the 5G Technology Forum (V5GTF) it formed late in 2015.

Sanyogita Shamsunder, director of strategy at Verizon, says that the specification is "75% to 80% there" at least for a "fixed wireless use case." Verizon is aiming for a "friendly, pre-commercial launch" of a fixed wireless pilot in 2017, Koeppe notes.

Before we go further, let's watch this excellent video by R&S in which Andreas Roessler explains what Verizon is up to:



Verizon and SKT are both trying to be 5G leaders and to roll out pre-standard 5G as soon as they can. In fact, Qualcomm recently released a 28 GHz modem that will be used in separate pre-standard 5G cellular trials by Verizon and Korea Telecom.

Quoting from the EE Times article:

The Snapdragon X50 delivers 5 Gbits/second downlinks and multiple gigabit uplinks for mobile and fixed-wireless networks. It uses a separate LTE connection as an anchor for control signals while the 28 GHz link delivers the higher data rates over distances of tens to hundreds of meters.

The X50 uses eight 100 MHz channels, a 2x2 MIMO antenna array, adaptive beamforming techniques and 64 QAM to achieve a 90 dB link budget. It works in conjunction with Qualcomm’s SDR05x mmWave transceiver and PMX50 power management chip. So far, Qualcomm is not revealing more details of modem that will sample next year and be in production before June 2018.

Verizon and Korea Telecom will use the chips in separate trials starting late next year, anticipating commercial services in 2018. The new chips mark a departure from prototypes not intended as products that Qualcomm Research announced in June.

Korea Telecom plans a mobile 5G offering at the February 2018 Winter Olympics. Verizon plans to launch in 2018 a less ambitious fixed-wireless service in the U.S. based on a specification it released in July. KT and Verizon are among a quartet of carriers that formed a group in February to share results of early 5G trials.
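The headline numbers quoted above can be sanity-checked with some back-of-envelope arithmetic. The sketch below is my own illustration, not from the article: it assumes roughly one OFDM symbol per second per Hz and a combined coding/control overhead factor of about 50%, both of which are assumptions rather than published X50 figures.

```python
# Rough sanity check of the Snapdragon X50 headline downlink rate.
# Illustrative only: real overheads depend on frame structure and coding.
bandwidth_hz = 8 * 100e6   # eight 100 MHz channels aggregated
mimo_streams = 2           # 2x2 MIMO -> two spatial streams
bits_per_symbol = 6        # 64-QAM carries 6 bits per symbol
efficiency = 0.5           # assumed combined coding/overhead factor

raw_bps = bandwidth_hz * mimo_streams * bits_per_symbol
usable_bps = raw_bps * efficiency

print(f"raw: {raw_bps / 1e9:.1f} Gbit/s")      # ~9.6 Gbit/s
print(f"usable: {usable_bps / 1e9:.1f} Gbit/s")  # ~4.8 Gbit/s
```

With these assumptions the usable rate lands in the region of the quoted 5 Gbit/s, which suggests the headline figure is an aggregate peak across the full 800 MHz rather than a per-channel rate.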

For its part, the 3GPP standards group is also stepping up the pace of the 5G standards efforts it officially started earlier this year. It endorsed last month a proposal to consider moving the date for finishing Phase I, an initial version of 5G anchored to LTE, from June 2018 to as early as December 2017, according to a recent Qualcomm blog.

Coming back to Verizon's 5G standard, is it good enough and compatible with 3GPP standards? The answer right now seems to be NO.


The following is from Rethink Wireless:

The issue is that Verizon’s specs include a subcarrier spacing value of 75 kHz, whereas the 3GPP has laid out guidelines that subcarrier spacing must increase by 30 kHz at a time, according to research from Signals Research Group. This means that different networks can work in synergy if required without interfering with each other.

Verizon’s 5G specs do stick to 3GPP requirements in that it includes MIMO and millimeter wave (mmWave). MmWave is a technology that both AT&T and Verizon are leading the way in – which could succeed in establishing spectrum which is licensed fairly traditionally as the core of the US’s high frequency build outs.

A Verizon-fronted group recently rejected a proposal from AT&T to push the 3GPP into finalizing an initial 5G standard for late 2017, thus returning to the original proposed time of June 2018. Verizon was supported by Samsung, ZTE, Deutsche Telecom, France Telecom, TIM and others, which were concerned the split would defocus SA and New Radio efforts and even delay those standards being finalized.

Verizon has been openly criticized in the industry, mostly by AT&T (unsurprisingly), as its hastiness may lead to fragmentation – yet it still looks likely to beat AT&T to be the first operator to deploy 5G, if only for fixed access.

Verizon probably wants the industry to believe that it was prepared for eventualities such as this – prior to the study from Signal Research Group, the operator said its pre-standard implementation will be close enough to the standard that it could easily achieve full compatibility with simple alterations. However, Signals Research Group’s president Michael Thelander has been working with the 3GPP since the 5G standard was birthed, and he begs to differ.

Thelander told FierceWireless, “I believe what Verizon is doing is not hardware-upgradeable to the real specification. It’s great to be trialing, even if you define your own spec, just to kind of get out there and play around with things. That’s great and wonderful and hats off to them. But when you oversell it and call it 5G and talk about commercial services, it’s not 5G. It’s really its own spec that has nothing to do with Release 16, which is still three years away. Just because you have something that operates in millimeter wave spectrum and uses Massive MIMO and OFDM, that doesn’t make it a 5G solution.”
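To illustrate the subcarrier-spacing mismatch mentioned above (my own sketch, not from the article): the numerology 3GPP eventually settled on for NR scales subcarrier spacing as 15 kHz × 2^μ, giving 15, 30, 60, 120 kHz and so on. Verizon's 75 kHz does not fall in that family:

```python
# Check whether a subcarrier spacing fits the 15 kHz * 2**mu scaling
# adopted in 3GPP NR numerology (illustrative sketch).
def is_nr_scs(scs_khz: float) -> bool:
    ratio = scs_khz / 15.0
    # The ratio must be a positive integer power of two (1, 2, 4, 8, ...).
    if ratio < 1 or ratio != int(ratio):
        return False
    r = int(ratio)
    return r & (r - 1) == 0

print(is_nr_scs(30))  # True:  30 = 15 kHz * 2
print(is_nr_scs(75))  # False: 75 / 15 = 5, not a power of two
```

A 75 kHz grid therefore cannot be nested inside the standard numerology, which is why coexistence with 3GPP-compliant networks in adjacent or shared spectrum becomes awkward.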

Back in the 3G days, NTT DoCoMo was the leader in standards and didn't have the patience to wait for the 3GPP standards to be completed. As a result, it launched its first 3G network, called FOMA (Freedom of Mobile Multimedia Access), based on a pre-standard version of the specs. This meant handset manufacturers had to tweak their software to cope with this version, and the network suffered from a lack of economies of scale. Early 3G phones were also unable to roam onto the DoCoMo network. In a way, Verizon is going down the same path.

While there can be some good learning from this pre-5G standard, it may be a good idea not to get too tied to it. A standard that is not compliant will not achieve the required economies of scale, whether with handsets, dongles or other hotspot devices.


Related posts: