Showing posts with label Spectrum. Show all posts

Monday, 24 September 2018

5G New Radio Standards and other Presentations


The recent Cambridge Wireless event 'Radio technology for 5G – making it work' was excellent, with every speaker delivering an interesting and insightful presentation. The presentations are all available for everyone to view and download for a limited time here.

I blogged about the base station antennas last week, but a couple of other presentations stood out for me.


The first was an excellent presentation from Sylvia Lu of u-blox, who is also my fellow CW Board Member. Her talk covered a variety of topics including IoT, IIoT, LTE-V2X and cellular positioning, including 5G NR positioning trends. The presentation is embedded below and is available to download from SlideShare.





The other presentation on 5G NR was from Yinan Qi of Samsung R&D. His presentation looked at a variety of topics, mainly Layer 1, including Massive MIMO, beamforming, beam management, bandwidth parts, reference signals, phase noise, etc. It is embedded below and can be downloaded from SlideShare.




Related Posts:

Wednesday, 5 September 2018

LiFi can be a valuable tool for densification

LiFi has been popping up in the news recently. I blogged about it (as LED-Fi) 10 years back. While the concept has remained the same, many of the limitations associated with the technology have been overcome. One of the companies driving LiFi is a Scottish startup called pureLiFi.


I heard Professor Harald Haas speak at the IEEE Glasgow Summit about how many of the limitations of LiFi have been overcome in the last few years (see videos below). This is welcome news, as there is a tremendous amount of visible light spectrum available for exploitation.


While many discussions on LiFi revolve around its use as an access technology, I think the real potential lies in its use as backhaul for densification.

For 5G, when we are looking at small cells every few hundred metres, probably on streetlights and lamp posts, there is a requirement for an alternative backhaul to fibre. It's difficult to run fibre to each and every lamp post. Traditionally this was solved with microwave solutions, but another option available in 5G is Integrated Access and Backhaul (IAB), also known as self-backhauling.


A better alternative could be to use LiFi for this backhauling between lamp posts or streetlights. This would avoid the complications IAB faces when multiple nodes are close together, as well as any teething problems while the technology matures. This approach is of course being trialled, and as the picture above shows, rural backhaul is just one option.
LiFi is being studied in the IEEE 802.11bb group, and its potential is also being considered for 5G.

Here is a video playlist explaining LiFi technology in detail.




Further reading:

Thursday, 12 July 2018

Minimum Bandwidth Requirement for 5G Non-Standalone (NSA) Deployment

I was attending the IEEE 5G World Forum live-stream, courtesy of IEEE.tv, and happened to hear Egil Gronstad, Senior Director of Technology Development and Strategy at T-Mobile USA. He said that they will be building a nationwide 5G network that will initially be based on the 600 MHz band.


During the Q&A, Egil mentioned that because of the way the USA has different markets, on average they have 31 MHz of 600 MHz (Band 71). The minimum is 20 MHz and the maximum is 50 MHz.

So I started wondering how they would launch 4G & 5G in the same band for nationwide coverage. They have a good video on their 5G vision, but that is of course probably going to come a few years down the line.

In simple terms, they will first deploy what is known as Option 3 or EN-DC. If you want a quick refresher on different options, you may want to jump to my tutorial on this topic at 3G4G here.

The Master Node (recall dual connectivity for LTE in Release-12; see here) is an eNodeB. As with any LTE node, it can use channel bandwidths from 1.4 MHz to 20 MHz, so the minimum bandwidth for the LTE node is 1.4 MHz.

The Secondary Node is a gNodeB. Looking at 3GPP TS 38.101-1, Table 5.3.5-1 (Channel bandwidths for each NR band), I can see the following for band n71:


NR band / SCS / UE channel bandwidth (band n71; wider channel bandwidths up to 100 MHz are defined in the table but not supported for n71):

| NR Band | SCS (kHz) | 5 MHz | 10 MHz | 15 MHz | 20 MHz |
|---------|-----------|-------|--------|--------|--------|
| n71     | 15        | Yes   | Yes    | Yes    | Yes    |
| n71     | 30        | –     | Yes    | Yes    | Yes    |
| n71     | 60        | –     | –      | –      | –      |

The minimum NR bandwidth is 5 MHz. Of course, this is paired spectrum for an FDD band, but the point I am making here is that you need just 6.4 MHz (1.4 MHz LTE + 5 MHz NR) per direction to be able to support the Non-Standalone 5G option.
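As a back-of-the-envelope check, the arithmetic can be sketched as follows (my own illustration; the bandwidth lists follow 3GPP TS 36.101 and TS 38.101-1):

```python
# Minimum spectrum for 5G NSA (EN-DC) in band 71: the smallest LTE anchor
# carrier plus the smallest NR carrier supported in the band.
LTE_BANDWIDTHS_MHZ = [1.4, 3, 5, 10, 15, 20]   # LTE channel bandwidths
NR_N71_BANDWIDTHS_MHZ = [5, 10, 15, 20]        # NR band n71, 15 kHz SCS

def min_nsa_bandwidth_mhz():
    """Smallest LTE anchor + smallest NR carrier (per direction, FDD)."""
    return min(LTE_BANDWIDTHS_MHZ) + min(NR_N71_BANDWIDTHS_MHZ)

print(f"{min_nsa_bandwidth_mhz():.1f} MHz")  # 6.4 MHz
```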

I am sure you can guess that the speeds will not really be 5G speeds with this amount of bandwidth, but I am looking forward to all these kinds of complaints in the initial phase of 5G network rollouts.

I don't know what bandwidths T-Mobile will be using, but we will probably see at least 10 MHz of NR where the total spectrum is 20 MHz, and 20 MHz of NR where the total spectrum is 50 MHz.

If you look at the earlier requirements list, the number being thrown about for bandwidth was 100 MHz below 6 GHz and up to 1 GHz above 6 GHz. I don't think there was a hard and fast requirement though.

Happy to hear your thoughts.

Tuesday, 3 July 2018

Terahertz and Beyond 100 GHz progress

There seems to be a good amount of research going on at higher frequencies to see how a lot more spectrum, with a lot more bandwidth, can be used in future radio communications. NTT recently released information about an "ultra high-speed IC capable of wireless transmission of 100 gigabits per second in a 300 GHz band". Before we discuss anything, let's look at what terahertz means, from the same article.

Terahertz wave: Just as we use the prefix 'kilo' to mean 10^3, so we use the term 'giga' to mean 10^9 and the term 'tera' to mean 10^12. "Hertz (Hz)" is a unit of a physical quantity called frequency. It indicates how many times alternating electric signals and electromagnetic waves change polarity (plus and minus) per second. That is, one terahertz (1 THz = 1,000 GHz) is the frequency of an electromagnetic wave changing polarity 1 × 10^12 times per second. In general, a terahertz wave often indicates an electromagnetic wave of 0.3 THz to 3 THz.

While there are quite a few different definitions, this is the one most commonly used. The following are the details of the research NTT did.
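As a side note (my own quick calculation, not from the NTT article), the wavelength at these frequencies shows why the 300 GHz band sits at the boundary between millimetre waves and the sub-millimetre terahertz region:

```python
# Free-space wavelength: lambda = c / f.
C = 299_792_458  # speed of light, m/s

def wavelength_mm(freq_ghz):
    """Wavelength in millimetres for a frequency given in GHz."""
    return C / (freq_ghz * 1e9) * 1e3

print(round(wavelength_mm(300), 3))   # 0.999 -> about 1 mm at 300 GHz
print(round(wavelength_mm(3000), 3))  # 0.1   -> 0.1 mm at 3 THz
```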

In this research, we realized 100 Gbps wireless transmission with one wave (one carrier), so in the future, we can extend to multiple carriers by making use of the wide frequency band of 300 GHz band, and use spatial multiplexing technology such as MIMO and OAM. It is expected to be an ultra high-speed IC technology that enables high-capacity wireless transmission of 400 gigabits per second. This is about 400 times the current LTE and Wi-Fi, and 40 times 5G, the next-generation mobile communication technology. It is also expected to be a technology that opens up utilization of the unused terahertz wave frequency band in the communications field and non-communication fields.

Complete article and paper available here.

Huawei has also been doing research in the W (92 – 114.5 GHz) and D (130 – 174.5 GHz) bands.


A recent presentation by Debora Gentina, ETSI ISG mWT WI#8 Rapporteur at the UK Spectrum Policy Forum is embedded below.



This presentation can be downloaded from the UK SPF site here. Another event on beyond-100 GHz that took place last year has some interesting presentations too, again on the UK SPF site here.


Ericsson has an interesting article in its Technology Review, looking at beyond 100 GHz from a backhaul point of view. It's available here.

If 5G is going to start using the frequencies traditionally used by backhaul then backhaul will have to start looking at other options too.

Happy to listen to your thoughts and insights on this topic.

Wednesday, 16 May 2018

100 Gbps wireless transmission using Orbital Angular Momentum (OAM) multiplexing


From a press release by NTT Group:

Nippon Telegraph and Telephone Corporation (NTT, Head Office: Chiyoda-ku, Tokyo, President and CEO: Hiroo Unoura) has successfully demonstrated for the first time in the world 100 Gbps wireless transmission using a new principle — Orbital Angular Momentum (OAM) multiplexing — with the aim of achieving terabit-class wireless transmission to support demand for wireless communications in the 2030s. It was shown in a laboratory environment that dramatic leaps in transmission capacity could be achieved by an NTT devised system that mounts data signals on the electromagnetic waves generated by this new principle of OAM multiplexing in combination with widely used Multiple-Input Multiple-Output (MIMO) technology. The results of this experiment revealed the possibility of applying this principle to large-capacity wireless transmission at a level about 100 times that of LTE and Wi-Fi and about 5 times that of 5G scheduled for launch. They are expected to contribute to the development of innovative wireless communications technologies for next-generation of 5G systems such as connected cars, virtual-reality/augmented-reality (VR/AR), high-definition video transmission, and remote medicine.


NTT is to present these results at Wireless Technology Park 2018 (WTP2018) to be held on May 23 – 25 and at the 2018 IEEE 87th Vehicular Technology Conference: VTC2018-Spring, an international conference sponsored by the Institute of Electrical and Electronics Engineers (IEEE) to be held on June 3 – 6.


For more technical details look at the bottom of this link.

Related Post:

Friday, 8 December 2017

Monday, 27 November 2017

5G and CBRS Hype?

The dissenting voices on 5G and CBRS are getting louder. While there are many analysts & operators who have been cautioning against 5G, it's still moving ahead at a rapid pace. At the recent Huawei Mobile Broadband Forum, for example, BT's boss admitted that making the case for 5G is hard. Bruno Jacobfeuerborn, CTO of Deutsche Telekom, on the other hand, is sitting on the fence. Dean Bubley's LinkedIn post is interesting too.



Anyway, we have storified most of the tweets from Huawei Mobile Broadband Forum here.


Signals Research Group recently published their Signals Flash report (a lighter companion to the more detailed Signals Ahead reports), looking at 5G and CBRS in addition to other topics. I have embedded the report below (with permission, thanks Mike), but you can download your own copy from here.

The summary from their website gives a good idea of what it is about:

CBRS – Much Ado About Not Very Much.  The FCC is heading in the right direction with how it might regulate the spectrum. However, unless you are a WISP or a private entity looking to deploy a localized BWA service, we don’t see too many reasons to get excited.

Handicapping the 5G Race.  Millimeter wave networks will be geographically challenged, 600 MHz won’t scale or differentiate from LTE, Band 41 may be the most promising, but this isn’t saying much. Can network virtualization make a winner?

It makes no Cents! Contrary to widespread belief,  5G won’t be a new revenue opportunity for operators – at least in the near term. The vertical markets need to get on board while URLLC will lag eMBB and prove far more difficult to deploy.

This Fierce Wireless article summarises the issues with CBRS well.

“While (some) issues are being addressed, the FCC can’t solve how to carve up 150 MHz of spectrum between everyone that wants a piece of the pie, while also ensuring that everyone gets a sufficient amount of spectrum,” the market research firm said in a report. “The 150 MHz is already carved up into 70 MHz for PAL (Priority Access License) and 80 MHz for GAA (General Authorized Access). The pecking order for the spectrum is incumbents, followed by PAL, and then by GAA…. 40 MHz sounds like a lot of spectrum, but when it comes to 5G and eMBB, it is only somewhat interesting, in our opinion. Further, if there are multiple bidders going after the PAL licenses then even achieving 40 MHz could be challenging.”

Signals said that device compatibility will also be a significant speed bump for those looking to leverage CBRS. Manufacturers won’t invest heavily to build CBRS-compatible phones until operators deploy infrastructure “in a meaningful way,” but those operators will need handsets that support the spectrum for those network investments to pay dividends. So while CBRS should prove valuable for network operators, it may not hold as much value for those who don’t own wireless infrastructure.

“The device ecosystem will develop but it is likely the initial CBRS deployments will target the more mundane applications, like fixed wireless access and industrial IoT applications,” the firm said. “We believe infrastructure and devices will be able to span the entire range of frequencies—CBRS and C-Band—and the total amount of available spectrum, combined with the global interest in the C-Band for 5G services, will make CBRS more interesting and value to operators. Operators will just have to act now, and then wait patiently for everything to fall into place.”

While many parts of the world are focusing on using frequencies around and above 3.5 GHz for 5G, the USA would be the only country using them for 4G. I suspect that many popular devices may not support CBRS, but it could be good for Fixed Wireless Access (FWA). It remains to be seen whether economies of scale can be achieved.


Saturday, 7 October 2017

2G / 3G Switch Off: A Tale of Two Worlds

Source: Wikipedia

2G/3G switch-off is always a topic of discussion at most conferences. While many companies are putting their eggs in the 4G & 5G baskets, 2G & 3G are not going away anytime soon.

Based on my observations and many discussions that I have had over the past few months, I see a pattern emerging.

In most developed nations, 2G will be switched off first (or some operators may leave a very thin layer), followed by re-farming of 3G. Operators will switch off 3G at the earliest possible opportunity, as most users will have moved to 4G. Users who have not moved to 4G will be forced to change operators or upgrade their devices. This scenario is still probably 6 – 10 years out.



As we all know, 5G will need capacity (and coverage) layers in sub-6 GHz, so the 3G frequencies will be re-farmed to either 4G or 5G, just as 2G is already being re-farmed to 4G. Some operators may choose to re-balance their holdings, with some lower frequencies exchanged for use with 5G (subject to enough bandwidth being available).


On the other hand, in developing and less-developed nations, 3G will generally be switched off before 2G. The main reason is that there are still a lot of feature phone users who rely on 2G technologies. Most, if not all, 3G phones support 2G, so existing 3G users will be forced onto 2G. Those who can afford to will upgrade to newer smartphones, while those who can't will have to grudgingly use 2G or change operators (not all operators in a country will do this at the same time).

Many operators in the developing countries believe that GSM will be around until 2030. While it may be difficult to predict that far in advance, I am inclined to believe this.

For anyone interested, here is a document listing 2G/3G switch off dates that have been publicly announced by the operators.



Let me know what you think.

Further reading:

Thursday, 20 July 2017

Second thoughts about LTE-U / LAA

It's been a while since I wrote about LTE-U / LAA on this blog. I have written a few posts on the Small Cells blog, but they seem dated as well. For anyone needing a quick refresher on LTE-U / LAA, please head over to IoTforAll or ShareTechNote. This post is not about the technology per se but the overall ecosystem, with LTE-U / LAA (and even MulteFire) being part of that.

Let's recap the market status quickly. T-Mobile US already has LTE-U active, and LAA was tested recently. SK Telecom achieved 1 Gbps in LAA trials with Ericsson. AT&T has decided to skip the non-standard LTE-U and go straight to standards-based LAA. MTN & Huawei have trialled LAA for in-building use in South Africa. All of this sounds good and inspires confidence in the technology; however, some observations are worrying me.


A couple of years back, when the LTE-U idea was conceived, followed by LAA, the 5 GHz channels were relatively empty. Recently I have started to see that they are all filling up.

In any mall, hotel, service station or even big building I go to, the channels all seem to be occupied. And while supplemental downlink channels are 20 MHz each, the Wi-Fi channels can be 20 MHz, 40 MHz, 80 MHz or even 160 MHz wide.
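To see how easily a 20 MHz supplemental downlink carrier and a wide Wi-Fi channel collide, here is a rough sketch (my own illustration; it uses the usual 5 GHz numbering where the centre frequency is 5000 + 5 × channel, in MHz):

```python
# 5 GHz channel centre frequency (MHz): 5000 + 5 * channel_number.
# Two carriers overlap if their frequency ranges intersect.
def channel_range_mhz(channel, width_mhz):
    centre = 5000 + 5 * channel
    return centre - width_mhz / 2, centre + width_mhz / 2

def overlaps(ch_a, w_a, ch_b, w_b):
    lo_a, hi_a = channel_range_mhz(ch_a, w_a)
    lo_b, hi_b = channel_range_mhz(ch_b, w_b)
    return lo_a < hi_b and lo_b < hi_a

# A 20 MHz LAA carrier on channel 36 sits inside an 80 MHz Wi-Fi
# channel centred on channel 42 (which spans channels 36-48).
print(overlaps(36, 20, 42, 80))  # True
print(overlaps(36, 20, 60, 20))  # False
```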

On many occasions I have had to switch off my Wi-Fi because the speeds were so poor (due to the high number of active users) and go back to using 4G. How will this affect the supplemental downlink in LTE-U / LAA? How will it affect Wi-Fi users?

On my smartphone, most days I get 30 – 40 Mbps download speeds, and that works perfectly fine for all my needs. The only reason we would need higher speeds is for tethering, to use laptops for work, listen to music, play games or watch videos. Most people I know or work with don't require gigabit speeds at the moment.

Once a user receiving high-speed data on their device via LTE-U / LAA creates a Wi-Fi hotspot, it may use the same 5 GHz channels that the network is using for the supplemental downlink. How do you manage this interference? I am looking forward to discussions on technical fora where users ask why their download speeds fall as soon as they switch their Wi-Fi hotspot on.

The fact is that in non-dense areas (rural, suburban or even general built-up areas), operators do not have to worry about the network being overloaded and can use their licensed spectrum; nobody is planning to deploy LTE-U / LAA there. In dense and ultra-dense areas there are many users, many Wi-Fi access points, ad-hoc Wi-Fi networks and many other sources of interference. In theory LTE-U / LAA can help significantly, but with so many sources of interference, it's uncertain whether it would be a win-win for everyone or just more interference for everyone to deal with.

Further reading:

Sunday, 19 March 2017

Latest on 5G Spectrum - March 2017

In an earlier post I mentioned that three different types of spectrum will be needed for 5G: a coverage layer, a capacity layer and a high-throughput layer. There is now a consensus within the industry on this approach.


At a 5G seminar back in January, a few speakers felt that there is an informal agreement about the frequencies that will be used. One such slide, from Ofcom, can be seen in the picture above. Ofcom has also recently released a report expanding on this further.


Analysys Mason has nicely summarized the bands suggested by Ofcom and possibly available in the UK for 5G in the picture above.

The Global mobile Suppliers Association (GSA) has also nicely summarised the bands under investigation and trial as follows:

Coverage Layer: 600 MHz, 700 MHz, 800 MHz, 900 MHz, 1.5 GHz, 2.1 GHz, 2.3 GHz and 2.6 GHz

Capacity Layer:

Europe: 3400 – 3800 MHz (awarding trial licenses)

China: 3300 – 3600 MHz (ongoing trial), 4400 – 4500 MHz, 4800 – 4990 MHz

Japan: 3600 – 4200 MHz and 4400 – 4900 MHz

Korea: 3400 – 3700 MHz

USA: 3100 – 3550 MHz (and 3700 – 4200 MHz)

High Throughput Layer:

USA: 27.5 – 28.35 GHz and 37 – 40 GHz, pre-commercial deployments in 2018

Korea: 26.5 – 29.5 GHz, trials in 2018 and commercial deployments in 2019

Japan: 27.5 – 28.28 GHz, trials planned from 2017 and potentially commercial deployments in 2020

China: focusing on 24.25 – 27.5 GHz and 37 – 43.5 GHz studies

Sweden: 26.5 – 27.5 GHz, awarding trial licenses for use in 2018 and onwards

EU: 24.25 – 27.5 GHz for commercial deployments from 2020

Finally, as a reminder, the list of bands originally approved for IMT-2020 (5G) is as follows:


Another potential band not mentioned above is 66 – 76 GHz. This band is adjacent to 60 GHz Wi-Fi (57 – 66 GHz), so lessons learned from that band can be applied to the 5G band too.

Related links:



Sunday, 12 March 2017

High Power / Performance User Equipment (#HPUE)

3GPP refers to HPUE as High Power UE while the US operator Sprint prefers to use the term High Performance UE.

HPUE was initially defined for the US Public Safety Band 14 (700 MHz). The intention was that these high-power UEs could increase the coverage range from 4 km to 8 km. This would mean larger coverage areas and fewer cells.

While commercial UEs (class 3) transmit at +23 dBm (max 200 mW), the public safety community intends to use class 1 UEs transmitting at +31 dBm (max 1.25 W). It was felt that this feature could also benefit some TDD bands that do not have to worry about backward compatibility. One such band, pushed by Sprint, is TDD Band 41 (2500 MHz). As this band is for commercial UEs, class 2 power at +26 dBm (max 400 mW) was proposed instead of class 1.
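The dBm figures above map to milliwatts via P(mW) = 10^(dBm/10); a quick conversion sketch:

```python
# Convert transmit power in dBm to milliwatts: P_mW = 10 ** (dBm / 10).
def dbm_to_mw(dbm):
    return 10 ** (dbm / 10)

for power_class, dbm in [("Class 3 (commercial)", 23),
                         ("Class 2 (HPUE, Band 41)", 26),
                         ("Class 1 (Band 14 public safety)", 31)]:
    print(f"{power_class}: +{dbm} dBm = {dbm_to_mw(dbm):.0f} mW")
# Class 3 -> ~200 mW, Class 2 -> ~398 mW, Class 1 -> ~1259 mW (1.25 W)
```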

3GPP TS 36.886 provides the following justification:

Currently, 3GPP has defined only Power Class UE 3 as the type of UE supported for TDD LTE band 41 operations. This definition was based on aligning TDD LTE Band 41 UE power classes with prior work in 3GPP related to other bands. However, it should be mentioned that 3GPP UE Power Class 3 definition (i.e. 23dBm) was mainly driven to ensure backward compatibility with prior technologies (i.e. GSM/UMTS) [2] so that network deployment topologies remain similar. Furthermore, maintaining the same power class UE definition (i.e. Class 3) as previous technologies would maintaining compliance with various national regulatory rulings, particularly in terms of SAR, for FDD LTE duplexing mode. 

However, TDD LTE band 41 does not have any 3GPP legacy technologies associated with it, hence the backward compatibility consideration is not applicable in its case. Also, since band 41 is defined as a TDD LTE band, it is less susceptible to SAR levels that FDD LTE bands due to SAR definition. Therefore, defining a new UE power class with higher than 23dBm Tx power for TDD LTE Band 41 operations would not compromise any of 3GPP foundational work, while improving UE and network performance. It should also be mentioned that 3GPP has done similar work on other bands (i.e. band 14) when defining a higher power class UE, hence the concept presented in this document is a continuation of that process.

The present document carries out a feasibility analysis for defining a UE Power class 2 (i.e. 26dBm) for operation on TDD LTE band 41. The document analyses current and future technological advancements in the area of UE RF front-end components and architectures that enable such definition while maintaining 3GPP specification and other regulatory bodies' requirements. It should be emphasized that this proposal only relates to single carrier UL operations on TDD band 41 (i.e. TM-1/2 modes) without affecting current 3GPP definition for UL carrier aggregation on band 41.

As you can see from the tweet above, Sprint's CEO is quite pleased with HPUE.

Source: Diana Goovaerts

Iain Gillott of iGR points out that HPUE applies to Sprint's 2.5 GHz TDD network and associated spectrum, and the company claims up to a 30 percent increase in cell coverage from the new technology. It should be noted that HPUE is a 3GPP standard that applies to the 2.5 GHz TDD band (Band 41) and is also to be used by China Mobile and SoftBank. HPUE was developed as part of the Global TDD LTE Initiative (GTI), which includes Qualcomm Technologies, Samsung, ZTE, Broadcom, MediaTek, Skyworks Solutions, Alcatel, Motorola, LG and Qorvo... The cool part: the improvement in coverage comes simply from increasing the device's uplink power. So Sprint, China Mobile and SoftBank will not have to visit their cell sites to make changes; they just need 2.5 GHz TDD devices with HPUE to get the benefit.


Milan Milanović recently wrote about Sprint's Gigabit Class LTE network going live in New Orleans. One of the questions I had was why the uplink is so poor compared to the downlink. He kindly pointed out to me that this is TDD Config 2.
If you are wondering what TDD Config 2 is, see the pic below.
Source: ShareTechNote
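The downlink/uplink asymmetry falls straight out of the subframe pattern. A small sketch (patterns per 3GPP TS 36.211, Table 4.2-2; 'D' = downlink, 'U' = uplink, 'S' = special subframe):

```python
# LTE TDD UL/DL configurations (3GPP TS 36.211, Table 4.2-2).
# Each string is one 10 ms radio frame, one character per 1 ms subframe.
TDD_CONFIGS = {
    0: "DSUUUDSUUU",
    1: "DSUUDDSUUD",
    2: "DSUDDDSUDD",
    3: "DSUUUDDDDD",
    4: "DSUUDDDDDD",
    5: "DSUDDDDDDD",
    6: "DSUUUDSUUD",
}

def dl_ul_count(config):
    """Count downlink and uplink subframes per frame for a configuration."""
    frame = TDD_CONFIGS[config]
    return frame.count("D"), frame.count("U")

dl, ul = dl_ul_count(2)
print(f"Config 2: {dl} DL vs {ul} UL subframes per frame")  # 6 DL vs 2 UL
```

With only 2 of 10 subframes carrying uplink (plus a little capacity in the special subframes), a 3:1 or worse downlink bias in measured throughput is exactly what you would expect.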

Sprint expects HPUE to appear in postpaid devices starting in 2017, including new devices from Samsung, LG, HTC, and Moto. It’s expected that all of Sprint’s new devices will have HPUE support within the next two years.

I think it will be interesting to see what happens when there are a lot more users and devices. I am quite sure there will be more requests for HPUE in other TDD bands.

Related Links:

Sunday, 11 September 2016

How much spectrum would 5G need?


The above picture is a summary of the spectrum that was agreed to be studied for IMT-2020 (5G). You can read more about that here. I have often seen discussions about how much spectrum in total each operator would need. It's a complex question, and we cannot be sure until 5G is defined completely. There have been some discussions about the requirements, which I am listing below. More informed readers, please feel free to add your views as comments.


Real Wireless has done some demand analysis on how much spectrum is required for 5G. A report by them for the European Commission is due to be published soon. As can be seen in the slide above, one of the use cases is a multi-gigabit motorway. If operators deploy 5G the way they have deployed 4G, then 56 GHz of spectrum would be required. If they move to a 100% shared approach, where all operators act as MVNOs and another entity deploys all the infrastructure, including spectrum, then the requirement drops to 14 GHz.

This is in addition to all the spectrum for 2G, 3G & 4G that operators already hold. I have embedded the presentation below; it can also be downloaded from here:



The UK Spectrum Policy Forum (UKSPF) recently held a workshop on Frequency bands for 5G, the presentations for which are available to download on the link I provided.


It's going to be a huge challenge to estimate which applications will require how much spectrum, and what their priority will be compared to other applications. mmMAGIC is one such group looking at spectrum requirements, use cases, new concepts, etc. They have estimated that around 3.1 GHz would be required by each operator for 99% reliability, which seems more reasonable. It will be interesting to see how much operators are willing to spend for such a quantity of spectrum.



Related posts:



Sunday, 29 May 2016

5G & 802.11ax


Samsung is one of the 5G pioneers and has been active in this area for quite a while, working on different technology areas but also making results and details available for others to appreciate and get an idea of what 5G is all about.

I published a post back in 2014 about their trials going on then. Since then they have been improving on those results. They also recently published their 5G vision paper, which is available here and here.



In the recent 5G Huddle, Raj Gawera from Samsung gave an excellent presentation (below) on the topic of "The future connected world". 



What we really liked is how closely aligned 5G and 802.11ax can be considered, not only in terms of requirements but also in roadmap.

Anyway, here is the presentation embedded below. Let me know what you think in the comments below.


Saturday, 9 January 2016

5G Spectrum Discussions

While most people are looking at 5G from the point of view of new technologies and innovative use cases (even lumping everything under the sun as part of 5G), many are unaware of the importance of spectrum and of the recently concluded ITU World Radiocommunication Conference 2015 (WRC-15).

As can be seen in the picture above, quite a few bands above 24 GHz were identified for 5G. Some of these bands already have a primary allocation for the mobile service, which means mobile services can be deployed in them. For 3G and 4G, the spectrum used was in bands below 4 GHz, with 1800 MHz being the most popular band, so there was never a worry about these high-frequency bands being used for mobile communication.

As these bands have now been selected for study by the ITU, 5G cannot be deployed in them until after WRC-19, where the results of the studies will be presented. There is a small problem though: some of the bands initially proposed for 5G are not included in this list of bands to be studied. This means some of the proponent countries may go ahead and deploy 5G in those bands anyway.

For the three bands that do not already have mobile as a primary allocation, additional effort will be required to make mobile a primary allocation. This assumes that no problems are identified in the feasibility studies to be conducted for these bands.


To see the real benefits of 5G, an operator would need to use a combination of low and high frequency bands, as can be seen in the picture above: low frequencies for coverage, and high frequencies for capacity and higher data rates.


As I mentioned in an earlier blog post, 5G will come in two phases. Phase 1 will be Rel-15 in H2 2018 and Phase 2 will be Rel-16 in December 2019. Phase 1 of 5G will generally consist of deployments in lower frequency bands, as the higher frequency bands will probably only get approval after WRC-19. Once these new bands have been cleared for 5G, Phase 2 would be ready for deployment in the high frequency bands.

This also brings us to the point that 5G Phase 1 won't be significantly different from LTE-A Pro (or 4.5G). It may be slightly faster and maybe a little more efficient.

One thing I suspect will happen is the start of 3G network switch-offs. The most commonly used 3G (UMTS) frequency is 2100 MHz (2.1 GHz). If a network has to keep some 3G running, it will generally be on this frequency, which will also allow international users to roam onto the network. All other 3G frequencies will soon start migrating to 4G or maybe even 5G Phase 1.

Anyway, two interesting presentations, on 5G access and the future of mmWave spectrum, are embedded below. They are both available to download from the UK Spectrum Policy Forum (SPF) notes page here.








Further reading:


Saturday, 19 December 2015

ADS-B to enable global flight tracking


One of the things the World Radiocommunication Conference 2015 (WRC-15) did was provide a universal spectrum allocation for flight tracking. What this means in simple terms is that, once fully implemented, flights will hopefully no longer be lost like MH370. By 2018 it will be possible to accurately track flights by satellite across nearly 100% of the globe, up from 30% today.

To understand this better, see the video below:


Automatic Dependent Surveillance (ADS) is a surveillance technique in which aircraft automatically provide, via a data link, data derived from on-board navigation and position-fixing systems, including aircraft identification, four-dimensional position and additional data as appropriate. ADS data is displayed to the controller on a screen that replicates a radar screen. ICAO Doc 4444 PANS-ATM notes that air traffic control service, may be predicated on the use of ADS provided that identification of the aircraft involved is unambiguously established. Two main versions of ADS are currently in use:

  • Automatic Dependent Surveillance-Broadcast (ADS-B) is a function on an aircraft or surface vehicle that broadcasts position, altitude, vector and other information for use by other aircraft, vehicles and by ground facilities. It has become the main application of the ADS principle.
  • Automatic Dependent Surveillance-Contract (ADS-C) functions similarly to ADS-B but the data is transmitted based on an explicit contract between an ANSP and an aircraft. This contract may be a demand contract, a periodic contract, an event contract and/or an emergency contract. ADS-C is most often employed in the provision of ATS over transcontinental or transoceanic areas which see relatively low traffic levels. 

The ITU press release on this topic:

The frequency band 1087.7-1092.3 MHz has been allocated to the aeronautical mobile-satellite service (Earth-to-space) for reception by space stations of Automatic Dependent Surveillance-Broadcast (ADS-B) emissions from aircraft transmitters.

The frequency band 1087.7-1092.3 MHz is currently being utilized for the transmission of ADS-B signals from aircraft to terrestrial stations within line-of-sight. The World Radiocommunication Conference (WRC-15) has now allocated this frequency band in the Earth-to-space direction to enable transmissions from aircraft to satellites. This extends ADS-B signals beyond line-of-sight to facilitate reporting the position of aircraft equipped with ADS-B anywhere in the world, including oceanic, polar and other remote areas.

WRC-15 recognized that as the standards and recommended practices (SARP) for systems enabling position determination and tracking of aircraft are developed by the International Civil Aviation Organization (ICAO), the performance criteria for satellite reception of ADS-B signals will also need to be addressed by ICAO.

This agreement follows the disappearance and tragic loss of Malaysian Airlines Flight MH370 in March 2014 with 239 people on board, which spurred worldwide discussions on global flight tracking and the need for coordinated action by ITU and other relevant organizations.

For more details see: globalflightsafety.org