Thursday, 14 March 2019

A Primer on Pagers


After the Avengers clip of Nick Fury paging Captain Marvel, a few people asked me about the what, why and how of pagers. Similar questions came up when Tron: Legacy was released. It took me a while, but I finally managed to put together a non-technical introduction to pagers for people with a very basic understanding of technology.

Slides



Video


If you used pagers in the past, tell me about the good, the bad and the ugly of pagers.

Tuesday, 12 March 2019

Can Augmented & Mixed Reality be the Killer App 5G needs?


Last October, Deutsche Telekom, Niantic and MobiledgeX announced a partnership to create advanced augmented reality experiences over mobile network technologies. I was lucky to find some time to go and play the resulting game at the Deutsche Telekom booth. The amount of processing needed for this to work at its best meant that the new Samsung Galaxy S10+ was required, but I felt that it also occasionally struggled with the amount of data being transferred.


The pre-MWC press release said:

Deutsche Telekom, Niantic Inc., MobiledgeX and Samsung Showcase World’s First Mobile Edge Mixed Reality Multi-Gamer Experience

At the Deutsche Telekom booth at MWC 2019 (hall 3, booth 3M31) the results of the previously announced collaboration between Deutsche Telekom, Niantic, Inc., and MobiledgeX are on display and you’re invited to play. Niantic’s “Codename: Neon”, the world’s first edge-enhanced Mixed Reality Multiplayer Experience, delivered by ultra-low latency, Deutsche Telekom edge-enabled network, and Samsung Galaxy S10+ with edge computing enablement, will be playable by the public for the first time. 

“The ultra-low latency that Mobile Edge Computing (MEC) enables, allows us to create more immersive, exciting, and entertaining gameplay experiences. At Niantic, we’ve long celebrated adventures on foot with others, and with the advent of 5G networks and devices, people around the world will be able to experience those adventures faster and better,” said Omar Téllez, Vice-President of Strategic Partnerships at Niantic.

The collaboration is enabled using MobiledgeX’s recently announced MobiledgeX Edge-Cloud R1.0 product. Key features include device and platform-independent SDKs, a Distributed Matching Engine (DME) and a fully multi-tenant control plane that supports zero-touch provisioning of edge cloud resources as close as possible to the users. Immediate examples of what this enables include performance boosts for Augmented Reality and Mixed Reality (MR) experiences as well as video and image processing that meets local privacy regulations. 

Samsung has been working together with Deutsche Telekom, MobiledgeX, and Niantic on a natively edge-capable connectivity and authentication in Samsung Galaxy S10+ to interface with MobiledgeX Edge-Cloud R1.0 and dynamically access the edge infrastructure it needs so that augmented reality and mixed reality applications can take advantage of edge unmodified. Samsung will continue such collaborations with industry-leading partners not only to embrace a native device functionality of edge discovery and usage for the mobile devices and consumers, but also to seek a way together to create new business models and revenue opportunities leading into 5G era.

Deutsche Telekom’s ultra-low latency network was able to deliver on the bandwidth demands of “Codename: Neon” because it deployed MobiledgeX’s edge software services, built on dynamically managed decentralized cloudlets. “From our initial partnership agreement in October, we are thrilled to showcase the speed at which we can move from idea to experience, with full end-to-end network integration, delivered on Samsung industry leading edge native devices,” said Alex Jinsung Choi, Senior Vice President Strategy and Technology Innovation at Deutsche Telekom.

From the gaming industry to industrial IoT and computer vision applications, consumer or enterprise, this experience is a great example of the interactive AR experiences coming from companies like Niantic in the near future. As AR/VR/MR immersive experiences continue to shape our expectations, devices, networks and clouds need to collaborate seamlessly and dynamically.

This video from the Deutsche Telekom booth shows what the game actually feels like:



Niantic CEO John Hanke delivered a keynote at Mobile World Congress 2019 (embedded below). According to the Fortune article "Why the Developer of the New 'Harry Potter' Mobile Game and 'Pokemon Go' Loves 5G":

Hanke showed a video of a prototype game Niantic has developed codenamed Neon that allows multiple people in the same place at the same time to play an augmented reality game. Players can shoot at each other, duck and dodge, and pick up virtual reality items, with each player’s phone showing them the game’s graphics superimposed on the real world. But the game depends on highly responsive wireless connections for all the phones, connections unavailable on today’s 4G LTE networks.

“We’re really pushing the boundaries of what we can do on today’s networks,” Hanke said. “We need 5G to deliver the kinds of experiences that we are imagining.”

Here is the video; it's very interesting and definitely worth a watch. For those who may not know, Niantic spun out of Google in October 2015, soon after Google's announcement of its restructuring as Alphabet Inc. During the spinout, Niantic announced that Google, Nintendo, and The Pokémon Company would invest up to $30 million in Series-A funding.



So what do you think: can AR / MR be the killer app 5G needs?

Thursday, 7 March 2019

Updated 5G Terminology Presentation (Feb 2019)


I made this video before MWC with the intention of educating attendees about the various architecture options and 5G terminologies being discussed. As always, I am happy to get feedback on what can be done better. Slides followed by video below.







A complete list of our training resources is available on the 3G4G page here.

Saturday, 2 March 2019

Beyond-5G and 6G at #MWC19


MWC is huge and there is absolutely no way I managed to cover even 1% of the floor, even though I spent half a day, every day, looking at the demos and talking to companies. I came across just a couple of parties looking at post-5G research. One was Mehdi Bennis from the University of Oulu, a good friend of this blog, and the other was Interdigital, which has featured heavily on the 3G4G blogs too.

From the standards point of view, I am only aware of the ITU 'Network 2030' focus group (FG NET-2030), which is looking at how future network architectures, requirements, use cases and capabilities will change by 2030 and beyond. I blogged about it here.

It's too early to call anything 6G because we don't yet realise the ways in which 5G will change the world, or the limitations that will feed into the requirements of IMT-2030 (just guessing the probable name).

So here is the first video from Mehdi Bennis.






I also caught up with Interdigital and got a very detailed video on their vision of what comes beyond 5G.



I would love to know what else I missed on 6G and Beyond-5G at MWC 2019.


Sunday, 17 February 2019

Displaying 5G Network Status Icon on Smartphones and Other Devices

A more up-to-date presentation & video on this topic is available on the 3G4G '5G Training' page here.

Who would have thought that displaying a network status icon on 5G devices would be so much fun? Typically the network icons have been:
2G - GSM, G, G+, E
3G - 3G, H, H+
4G - 4G, 4G+

Back in 2017, Samsung devices started displaying the 4G+ icon. Samsung told MyBroadband:

that by default its devices require a network to support Category 6 LTE, and for the total combined bandwidth to exceed 20MHz, before they will display the “4G+” icon.

Networks in South Africa frequently don’t have over 20MHz of aggregated bandwidth available, though.

As a result, one network asked Samsung to reduce the combined bandwidth requirement for the 4G+ icon to display to 15MHz, which Samsung approved.

“Samsung’s global policy regarding the display of the LTE/LTE-A/4G/4G+ network icon is that the network icon display is operator-configurable upon official request and Samsung approval,” it said.

The reason this is interesting is that LTE is really 3.9G but is generally called 4G. LTE-A is supposed to be 4G because in theory it meets the IMT-Advanced criteria. Then we have LTE-Advanced Pro, which is known as 4.5G. While the majority of operators display 4.5G as 4G or 4G+, a couple of operators have decided to get a bit innovative.

AT&T started by updating the network icons of some of their devices to 5GE, which is their way of saying 4.5G. The E stands for Evolution or, as some people joked, for the economy (or value) version, as opposed to the premium version.


Brazilian operator Claro decided to use a 4.5G icon, but with the 5 in a much larger font than the 4 (see the pic above). Some people have called this a dishonest attempt.

I see a few people asking how devices can decide whether they are on 4G or 4.5G. There is no standard procedure for this; it is UE specific. One way is to look at RRC messages: if the system information messages contain optional IEs from 3GPP Release-13, then the network supports LTE-A Pro, and if the device also supports LTE-A Pro features, it can display 4.5G or 5GE, etc. Another approach is to check the optional IEs present in the NAS Attach Accept message. As this arrives slightly later in the registration process, the device displays 4G first and, once registration is complete, 4.5G. Note that there is no requirement from the standards point of view about displaying the network status indication icon up to 4G/4.5G.
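To illustrate, here is a hypothetical sketch of what such UE icon logic might look like. The flag names are my own inventions standing in for information a real baseband would extract from the SIBs and the NAS Attach Accept; no 3GPP specification defines this behaviour.

```python
# Hypothetical sketch of a UE's icon-selection logic. The boolean flags are
# stand-ins for checks a real baseband would perform on received messages.

def select_icon(sib_has_rel13_ies: bool,
                attach_accept_has_rel13_ies: bool,
                ue_supports_lte_a_pro: bool) -> str:
    """Pick the LTE status-bar icon; assumes the UE is camped on LTE."""
    if sib_has_rel13_ies and attach_accept_has_rel13_ies and ue_supports_lte_a_pro:
        return "4.5G"   # or "5GE" / big-5 "4.5G", per operator branding
    return "4G"         # shown until registration confirms LTE-A Pro support

print(select_icon(True, False, True))  # -> "4G" (Attach Accept not yet seen)
print(select_icon(True, True, True))   # -> "4.5G"
```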

To avoid such confusion in 5G, 3GPP issued a first Liaison Statement, S2-175303, in which it said:

With this number of System and Radio access options available, one or more new status icons are expected to appear on the User Interface of future (mobile) devices. A user should expect consistency across devices and networks as to what icons actually mean (i.e. what services might be expected when an icon is displayed).

While 3GPP specifications are not expected to define or discuss Service or RAT indicators in the User Interface themselves, 3GPP should provide the necessary tools in EPS and 5GS to enable them. It is therefore necessary to understand the conditions required for displaying these icons and with which granularity so we can identify what information ought to be available in/made available to the device.

SA2 understands that Status Icons related to 5G might be displayed for example on a UE display taking into account all or some combinations of these items (other items may exist):
- Access Restriction Data in subscription (with the potential exception of emergency access); 
- UE CN registration (i.e. is UE EPC- and/or 5GC-registered?);
- UE capabilities; 
- Network capabilities; 
- UE is camping on a cell of NG-RAN supporting NR only, E-UTRA only or, the ability to activate dual connectivity with another RAT (NR or E-UTRA);
- UE is camping on a cell of E-UTRAN (connected to EPC) with the ability to activate dual connectivity with NR as secondary cell;
- UE is in connected mode using NR, E-UTRA (in 5GS) or dual connectivity between E-UTRA and NR.

Given the above, SA2 would like to kindly ask for any feedback from GSMA FNW and NGMN on requirements and granularity for Service indicators and/or RAT indicators related to 5G.

GSMA responded in R2-1713952, identifying 6 cases (see the first picture on top):

The configurations consist of the following states and are as described in Table 1:

  1. EPS NR NSA (EN-DC) capable UE attached to EPC and currently in IDLE state under or in RRC_connected state connected to E-UTRAN cell not supporting LTE-NR dual connectivity 
  2. EPS NR NSA (EN-DC) capable UE attached to EPC and currently in IDLE state under or in RRC_Connected state connected to AND active on LTE for uplink and downlink on only E-UTRAN cell supporting LTE-NR dual connectivity and has not detected NR coverage (i.e. UE is not under NR coverage and/or not configured to make NR measurements)
  3. EPS NR NSA (EN-DC) capable UE attached to EPC and currently in RRC_Connected state connected to E-UTRAN cell (supporting dual connectivity) and active on LTE for uplink and downlink only and has detected NR coverage (i.e. UE is under NR coverage and has been configured to make NR measurements) 
  4. EPS NR NSA (EN-DC) capable UE attached to EPC and currently in IDLE state under E-UTRAN cell supporting LTE-NR dual connectivity and has detected NR coverage (i.e. UE is under NR coverage and has been configured to make NR measurements)
  5. EPS NR NSA (EN-DC) capable UE attached to EPC and currently in RRC_Connected state connected to E-UTRAN cell (supporting dual connectivity) and active on LTE and NR for uplink and/or downlink
  6. 5GS capable UE attached to 5GC and currently in IDLE state under or in RRC_Connected state connected to NG-RAN (eLTE (option 5 or 7) or NR (option 2 or 4) cell)

As there is no consensus on a single preferred configuration, it is desirable to make the display of 5G status icon in the UE configurable such that the display of 5G status icon can be made depending on operator preference. 

This proposal by GSMA was noted by 3GPP in R2-1803949:

RAN WG2 would like to inform GSMA and SA2 that, according to GSMA and SA2 recommendations (LSs R2-1713952 and S2-175270, respectively), RAN WG2 introduced 1 bit indication per PLMN called “upperLayerIndication” within LTE SIB 2. 

This bit enables the realization of the configurations based on UE states as per recommendation from GSMA (e.g. RRC_IDLE UE as for State 2 in LS R2-1713952 from GSMA)”. 

For idle mode UEs this is the only mechanism agreed. 

Actions: RAN WG2 would like to ask GSMA and SA2 to take the information above into account. 
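Putting the GSMA configurations and the RAN2 mechanism together, one can imagine idle-mode logic along the lines of the sketch below. This is my own illustration; the actual mapping of states to icons is left operator-configurable.

```python
# Illustrative idle-mode icon logic for an EN-DC-capable UE, keyed off the
# per-PLMN "upperLayerIndication" bit in LTE SIB2. The policy flag reflects
# the fact that GSMA left the icon mapping up to each operator.

def idle_mode_icon(upper_layer_indication: bool,
                   operator_shows_5g_when_available: bool = True) -> str:
    if upper_layer_indication and operator_shows_5g_when_available:
        return "5G"  # camped on a cell supporting LTE-NR dual connectivity
    return "4G"      # plain LTE cell, or operator prefers a conservative icon

print(idle_mode_icon(True))   # -> "5G"
print(idle_mode_icon(False))  # -> "4G"
```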

Hopefully there will be less confusion about the status icons when 5G is rolled out. In the meantime, we might see some more 4.5G icon innovations.

Tuesday, 12 February 2019

Prof. Andy Sutton: 5G Radio Access Network Architecture Evolution - Jan 2019


Prof. Andy Sutton delivered his annual IET talk last month at the 6th Annual 5G Conference. You can watch the videos from that event here (not all had been uploaded at the time of writing this post). His talks have always been very popular on this blog, with last year's talk being the 2nd most popular, while the one from 2017 was the most popular. Thanks also to the IET for hosting this annual event and to IET.tv for making these videos available for free.

The slides and video are embedded below, but new starters may first want to check out our tutorial on 5G network architecture options here before jumping in.




As always, this is full of useful information, with insight into how BT/EE is thinking about deploying 5G in the UK.


Sunday, 10 February 2019

Theoretical Throughput Calculation of FDD 5G New Radio (NR)


A nice video by Peter Clarke on 5G NR throughput calculation for FDD. Right now it's only available in video form, but it will hopefully become a tool on his excellent website here. A tool for 4G throughput calculation is already available here.
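Until the tool appears, here is a rough sketch of the approximate data-rate formula from 3GPP TS 38.306 (section 4.1.2) on which such calculators are based. The example configuration below is my own and assumes a single FDD carrier.

```python
def nr_data_rate_mbps(layers: int, mod_order: int, scaling: float,
                      n_prb: int, mu: int, overhead: float) -> float:
    """Approximate NR data rate for one carrier, per 3GPP TS 38.306 (4.1.2)."""
    r_max = 948 / 1024                    # maximum LDPC code rate
    t_symbol = 1e-3 / (14 * 2 ** mu)      # average OFDM symbol duration (s)
    bps = (layers * mod_order * scaling * r_max
           * (n_prb * 12) / t_symbol * (1 - overhead))
    return bps / 1e6

# Example: FR1 FDD downlink, 100 MHz at 30 kHz SCS (273 PRBs, mu=1),
# 4 MIMO layers, 256QAM (8 bits/symbol); TS 38.306 overhead for FR1 DL is 0.14.
print(round(nr_data_rate_mbps(layers=4, mod_order=8, scaling=1.0,
                              n_prb=273, mu=1, overhead=0.14)))  # ~2337 Mbps
```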





Wednesday, 6 February 2019

AI in 5G – the why and how

The IET recently held the 6th Annual 5G conference, bringing together key players in the 5G world. You can watch the videos from that event here (not all had been uploaded at the time of writing this post).

We reached out to Dr. Yue Wang to ask if she would share her presentation with us, and she has kindly done so. The presentation and video are embedded below.







Monday, 4 February 2019

A quick tutorial on Open RAN, vRAN & White Box RAN


I made a short tutorial based on my understanding of Open RAN, Virtualised RAN and White Box RAN. Slides and video are embedded below.






Saturday, 2 February 2019

ITU-NGMN Joint Conference on “Licensing practices in 5G industry segments"


IPR, licensing and royalties are always an interesting topic. In the end, they decide the price at which a device is sold. To put it simply: cost of device = cost of manufacturing + marketing + sales and distribution + support and insurance + profit + IPR. The licensing cost is often added at the end because it can be applicable to the selling price of the device.
The above tweet is interesting as it lists the IPR costs charged by the major patent holders. I wrote an earlier post detailing the 5G patent holders here. Things have moved on significantly since then. In addition to the royalties charged by the 5G patent holders (which also cover legacy technologies like 2G, 3G, 4G & Wi-Fi), there are patents for messaging, codecs (separate for audio / video), etc. To be fair, it's a complex process. This is why I am sometimes shocked to see 4G smartphones selling for £20 ($25).
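As a toy illustration of why royalties levied on the selling price get added at the end (all figures below are invented):

```python
# Toy example: royalties charged as a percentage of the selling price mean
# the price has to be solved for, not just summed up. Numbers are invented.

manufacturing, marketing, sales, support, profit = 12.0, 2.0, 2.0, 1.5, 1.5
base = manufacturing + marketing + sales + support + profit   # $19.00

royalty_rate = 0.10   # hypothetical total stacked IPR rate on selling price

# price = base + royalty_rate * price  =>  price = base / (1 - royalty_rate)
price = base / (1 - royalty_rate)
print(f"Selling price ${price:.2f}, of which IPR ${royalty_rate * price:.2f}")
# -> Selling price $21.11, of which IPR $2.11
```

With numbers like these, it is easy to see how little room a £20 ($25) smartphone leaves for the royalty stack.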

Coming back to the conference, all the presentations are available on the ITU page here.

Sylvia Lu, who wears many different hats (including ones for CW, UK5G and u-blox) and is a friend of the 3G4G blog, was one of the speakers at this conference. Here is a tweet on what she had to say about the event:
For those who may not know, FRAND stands for Fair, Reasonable, And Non-Discriminatory. Wikipedia has a nice article explaining it here.

The NGMN press release on this conference mentions the following:

The Next Generation Mobile Networks (NGMN) Alliance has jointly organised and executed a successful conference on Licensing Practices in 5G Industry Segments with the International Telecommunication Union (ITU), bringing various experts from around the world together to discuss licensing practices and challenges of 5G.

The conference featured moderators and speakers from some of the biggest names in telecoms, including AT&T, Deutsche Telekom, NTT DOCOMO, Orange, Ericsson, Nokia and Microsoft. Also in attendance were key stakeholders of vertical industries, including Audi, Bosch, Panasonic and u-Blox, and patent pool administrators, namely Avanci, MPEG-LA, Sisvel and Via Licensing, who co-sponsored the event, the European Telecommunications Standards Institute (ETSI), the Japanese and the European Patent Offices, and the European Commission.

Focusing on the development of 5G and the Internet of Things (IoT), the conference facilitated sharing and discussing of present-day licensing practices and related issues across different industry segments.

A host of insightful sessions took place igniting an inclusive exchange on:
  • Patent licensing practices with interactive discussions that focused on issues stakeholders need to be aware of.
  • Sharing licensors’, licensees’ and pool administrators’ requirements on patent pools/platforms.
  • Identifying proposed practices and conducts for licensors and licensees.
  • Listing requirements for increasing transparency and assessing essentiality of Standard Essential Patents declared to Standards Developing Organisations.
“It’s great to notice that our joint ITU-NGMN conference has been such a success,” said Dr. Peter Meissner, CEO of NGMN. “Obviously, the 5G Eco-System is different. New use cases beyond mobile broadband - like massive IoT as well as highly demanding requirements from vertical industries on low latency, ultra-high reliability and security - are causing substantial network transformations. All these challenges have implications on the intellectual property of mobile network operators and across the different industry segments. Conferences like this are key in identifying IPR issues and exploring solutions for the enlarged eco-system.”

If you have any insights on this topic, or just any comment in general, feel free to add them as comments below.



Wednesday, 23 January 2019

AI and Analytics Based Network Designing & Planning

I recently blogged about how Deutsche Telekom is using AI for a variety of things, the most interesting (from this blog's point of view) being fiber-optic roll-out. According to their press release (shortened for easy reading):

"The shortest route to the customer is not always the most economical. By using artificial intelligence in the planning phase we can speed up our fiber-optic roll-out. This enables us to offer our customers broadband lines faster and, above all, more efficiently," says Walter Goldenits, head of Technology at Telekom Deutschland. It is often more economical to lay a few extra feet of cable. That is what the new software-based technology evaluates using digitally-collected environmental data. Where would cobblestones have to be dug up and laid again? Where is there a risk of damaging tree roots?

The effort and thus costs involved in laying cable depend on the existing structure. First, civil engineers open the ground and lay the conduits and fiber-optic cables. Then they have to restore the surface to its previous condition. Of course, the process takes longer with large paving stones than with dirt roads.

"Such huge amounts of data are both a blessing and a curse," says Prof. Dr. Alexander Reiterer, who heads the project at the Fraunhofer IPM. "We need as many details as possible. At the same time, the whole endeavour is only efficient if you can avoid laboriously combing through the data to find the information you need. For the planning process to be efficient the evaluation of these enormous amounts of data must be automated." Fraunhofer IPM has developed software that automatically recognizes, localizes and classifies relevant objects in the measurement data.

The neural network used for this recognizes a total of approximately 30 different categories through deep learning algorithms. This includes trees, street lights, asphalt and cobblestones. Right down to the smallest detail: Do the pavements feature large pavement slabs or small cobblestones? Are the trees deciduous or coniferous? The trees' root structure also has a decisive impact on civil engineering decisions.

Once the data has been collected, a specially-trained artificial intelligence is used to make all vehicles and individuals unidentifiable. The automated preparation phase then follows in a number of stages. The existing infrastructure is assessed to determine the optimal route. A Deutsche Telekom planner then double-checks and approves it.
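The core idea in the quote above, that the shortest route is not always the most economical, boils down to a cost-weighted shortest-path search. Here is a minimal sketch under made-up surface costs; the actual Fraunhofer IPM planning software is of course far more sophisticated.

```python
import heapq

# Illustrative digging cost per metre, by surface type (made-up figures).
COST_PER_M = {"dirt": 1.0, "asphalt": 3.0, "cobblestone": 6.0, "tree_roots": 9.0}

def cheapest_route(graph, start, goal):
    """Dijkstra over edges (neighbour, length_m, surface) -> (cost, path)."""
    queue, seen = [(0.0, start, [start])], set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nbr, length_m, surface in graph.get(node, []):
            if nbr not in seen:
                heapq.heappush(queue,
                               (cost + length_m * COST_PER_M[surface],
                                nbr, path + [nbr]))
    return float("inf"), []

# The direct street is shorter but cobbled; the detour is longer but cheap dirt.
graph = {"A": [("B", 100, "cobblestone"), ("C", 150, "dirt")],
         "C": [("B", 80, "dirt")], "B": []}
print(cheapest_route(graph, "A", "B"))  # -> (230.0, ['A', 'C', 'B']), beating 600.0 direct
```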


At the recent TIP Summit 2018, Facebook talked about ‘Building Better Networks with Analytics’ and showed off its analytics platform. Vincent Gonguet, Product Manager, Connectivity Analytics at Facebook, described how Facebook uses a three-pronged approach of accelerating fiber deployment, expanding 4G coverage and planning 5G networks. The video from the summit follows:

TIP Summit 2018 Day 1 Presentation - Building Better Networks with Analytics from Telecom Infra Project on Vimeo.

Some of the points highlighted in the video:
  • Getting people connected requires three main focus areas: Access, Affordability and Awareness. One of the main focus areas of TIP is access.
  • 4G coverage went from 20% to 80% of the world's population in the last 5 years. Coverage growth is plateauing because the last 20% is becoming more and more uneconomical to connect.
  • Demand is outpacing supply in many parts of the world (indicating that networks have to be designed for capacity, not just coverage).
  • 19% of 4G traffic can't support high-quality video today (at about 1.5 Mbps).
  • Facebook has a nice aggregated map of the percentage of Facebook traffic across the world that experiences very low speeds, less than 0.5 Mbps.
  • The talk looks at three ways in which Facebook works with TIP members: accelerating fiber deployment, expanding 4G coverage and planning 5G networks.
  • A joint fiber deployment project with Airtel and BCS in Uganda was announced at MWC 2018.
  • 700 km of fiber deployment was planned to serve over 3 million people (Uganda's population is roughly 43 million).
  • The real challenge was not just collecting data about roads, infrastructure, etc. New cities can emerge over a period of months, with tens of thousands of people.
  • In such situations it is difficult for human planners to go through all the roads and select the most economical route. Also, different human planners do things in different ways, so there is no consistency. In addition, it is very hard to iterate.
  • To make deployments simpler and easier, it was decided to first provide coverage to the people who need the fewest km of fiber. The savings from finding the optimal path for them can go into connecting more people.
  • It is also important for fiber networks to have redundancy, but this is difficult to do at scale.
  • An example and simulation of how fiber networks are created is available in the video from 07:45 – 11:00.
  • Another example, in partnership with XL Axiata, is the prioritization of 4G deployments based on user experience, current network availability and the presence of 4G-capable devices; it is available in the video from 11:00 – 14:13. Over 1000 sites were deployed and more than 2 million people experienced significant improvements in their speeds and video quality.
  • The final example is the planning of 5G mmWave networks. This was done in partnership with Deutsche Telekom, trying to bring high speeds to 25,000 apartment homes within a square km in the centre of Berlin. The goal was to achieve an over-1 Gbps connection using a mixture of fiber and wireless. The video looks at the simulation of Lidar data to determine where the wireless infrastructure can be deployed. The relevant part is from 14:13 – 20:25.

Finally, you may remember my blog post on Automated 4G / 5G Hetnet Design by Keima. Some of the work they do overlaps with both of the examples above. I reached out to Iris Barcia to see if she had any comments on the two different approaches; below is her response:

“It is very encouraging that DT and Facebook are seeing the benefits of data and automation for design. I think that is the only way we’re going to be able to plan modern communication networks. We approach it from the RAN planning perspective: 8 years ago our clients could already reduce cost by automatically selecting locations with good RF performance and close to fibre nodes, alternatively locations close to existing fibre routes or from particular providers. Now the range of variables that we are capable of computing is vast and it includes aspects such as accessibility rules, available spectrum, regulations, etc. This could be easily extended to account for capability/cost of deploying fibre per type of road. 

But also, we believe in the benefit of a holistic business strategy, and over the years our algorithms have evolved to prioritise cost and consumers more precisely. For example, based on the deployment needs we can identify areas where it would be beneficial to deploy fibre: the study presented at CWTEC showed a 5G Fixed Wireless analysis per address, allowing fibre deployments to be prioritised for those addresses characterised by poor RF connectivity.”

There is no doubt in my mind that more and more of these kinds of tools, relying on analytics and Artificial Intelligence (AI), will be required to design and plan networks. By this I don't just mean 5G and other future networks, but also the existing 2G, 3G & 4G networks and Hetnets. We will have to wait and see what's next.



Wednesday, 16 January 2019

5G Slicing Templates

We looked at slicing not long ago in this post here, shared by the ITU, from Huawei. The other day I read a discussion on how to define slicing. Here is my definition:

Network slicing allows sharing of the physical network infrastructure resources across independent virtual networks, thereby giving an illusion of multiple logically separate end-to-end networks, each bound by its own SLAs, service quality and performance guarantees to meet a desired set of requirements. While it is being officially defined for 5G, there is no reason a proprietary implementation could not be created for earlier generations (2G, 3G or 4G) or Wi-Fi.

The picture above, from a China Mobile presentation, explains the slice creation process nicely:

  1. Industry customers order network slices from operators and provide the network requirements, including network slice type, capacity, performance and required coverage, expressed as a General Service Template (GST). Operators generate network slices according to these needs.
  2. The GST is translated into an NST (Network Slice Template).
  3. The network slice instantiation process is triggered.
  4. The necessary resources are allocated and the slice is created.
  5. Slice management information is exposed: industry customers obtain management information for their ordered slices through open interfaces (such as the number of access users, etc.).

For each specific requirement, a slicing template is generated and then translated into an actual slice. Let's look at some examples.

Let's take the example of a power grid. The picture below shows the scenario, the requirements and the network slicing template.
As can be seen, the RAN requirements are timing and low latency, while the QoS requirement in the core is 5 ms latency with a guaranteed 2 Mbps throughput. There are other requirements as well; the main transport requirement is hard isolation.

The network requirements for AR gaming are high reliability, low latency and a high density of devices. This translates into a main RAN requirement of low jitter and latency, a transport requirement of isolation between TICs (telecom integrated clouds), and core QoS requirements of 80 ms latency and a 2 Mbps guaranteed bit rate.
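To make this concrete, here is a hypothetical sketch of the AR gaming requirements above expressed as a template and translated per domain. The field names are invented for illustration; the real GST/NST attribute sets differ.

```python
# Hypothetical GST for the AR gaming slice above, translated into per-domain
# slice parameters. Field names are invented; real template attributes differ.

ar_gaming_gst = {
    "slice_type": "AR gaming",
    "max_latency_ms": 80,          # core QoS requirement from the example
    "guaranteed_bitrate_mbps": 2,  # GBR from the example
    "device_density": "high",
    "reliability": "high",
}

def gst_to_nst(gst):
    """Toy translation of customer requirements into per-domain parameters."""
    return {
        "ran":       {"low_jitter": True, "low_latency": True},
        "transport": {"isolation_between_tics": True},
        "core":      {"qos_latency_ms": gst["max_latency_ms"],
                      "gbr_mbps": gst["guaranteed_bitrate_mbps"]},
    }

print(gst_to_nst(ar_gaming_gst))
```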




Monday, 7 January 2019

The business case of densifying LoRaWAN deployments

LoRaWAN has recently emerged as one of the key radio technologies for addressing the challenges of Low Power Wide Area Network (LPWAN) deployments, namely power efficiency, long range, scalable deployments and cost-effectiveness.

The LoRa Alliance has seen exponential growth to 500+ members, with heavyweights such as Google, Alibaba and Tencent recently joining the alliance.

The first wave of LoRaWAN was primarily focused on large country-wide deployments led by operators such as KPN, Orange, Swisscom and many more. However, the next wave, which is already arriving, consists of private LoRaWAN deployments by large enterprises and the enabling of roaming for interconnection amongst public/private networks (especially for use cases that involve LPWAN geolocation [8] [9]). As IoT deployments grow in both densification and geographical footprint, it is inevitable that network design becomes one of the important factors ensuring the long-term success and profitability of both operators and end-customers relying on LoRaWAN connectivity for their IoT use cases.


A typical example is the recent 3-million water meter contract awarded to Orange by Veolia's subsidiary Birdz [12]: such large-scale projects require careful network planning to achieve the required densification and quality of service while optimizing costs.

A Closer Look at Densification techniques for LoRaWAN


LoRaWAN deployments use a star topology with a frequency reuse factor of 1, which allows simplicity in network deployment and ongoing densification: there is no need for frequency pattern planning or reshuffling as more gateways are added to the infrastructure.

Compared to mesh technologies, the single hop to the network infrastructure minimizes power consumption, as nodes do not need to relay communications from other nodes. Another advantage is that gradual initial network deployment in sparse mode with low node density is possible, whereas mesh requires a minimum node density to operate. Even more importantly, LoRaWAN is immune to the exponential packet loss suffered by multi-hop RF mesh technologies in the presence of increasing interferers and a rising noise floor.

Another unique feature of LoRaWAN networks is that uplink messages can be received by any gateway (Rx macro-diversity); it is the function of the network server to remove duplicates in the uplink and select the best gateway for downlink transmission based on the uplink RSSI estimates. This allows features such as geolocation to be easily built into a LoRaWAN deployment and enables uplink macro-diversity that significantly improves network capacity and QoS (Quality of Service).
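A minimal sketch of that deduplication and gateway-selection step might look like this (the data structures are illustrative, not any specific network server's API):

```python
# Minimal sketch of uplink deduplication: several gateways report the same
# frame, the network server keeps one copy and remembers the best-RSSI
# gateway for any downlink. Data structures are illustrative.

def deduplicate(uplink_reports):
    """uplink_reports: dicts with 'dev_eui', 'frame_cnt', 'gateway', 'rssi'."""
    best = {}
    for r in uplink_reports:
        key = (r["dev_eui"], r["frame_cnt"])        # identifies one frame
        if key not in best or r["rssi"] > best[key]["rssi"]:
            best[key] = r                            # keep strongest reception
    return best

reports = [
    {"dev_eui": "70B3D5", "frame_cnt": 42, "gateway": "gw-1", "rssi": -110},
    {"dev_eui": "70B3D5", "frame_cnt": 42, "gateway": "gw-2", "rssi": -95},
]
for frame, r in deduplicate(reports).items():
    print(frame, "-> downlink via", r["gateway"])    # gw-2 wins at -95 dBm
```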

LoRaWAN also supports Adaptive Data Rate (ADR), which allows the network server to dynamically change end-device parameters such as transmit power, frequency and spreading factor via downlink MAC commands. Optimization of these settings is key to increasing capacity and reducing the power consumption of end-devices.
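Real network-server ADR implementations are proprietary (as noted in the conclusion below), but a toy version of the decision conveys the idea. The SNR floors are the commonly quoted demodulation limits for 125 kHz LoRa; the margin and step sizes are my own assumptions.

```python
# Toy ADR step: if the uplink SNR margin is comfortable, move the device to a
# faster spreading factor, then shave TX power. Thresholds are assumptions.

SNR_FLOOR = {7: -7.5, 8: -10.0, 9: -12.5, 10: -15.0, 11: -17.5, 12: -20.0}

def adr_step(sf, tx_power_dbm, measured_snr_db, margin_db=10.0):
    headroom = measured_snr_db - SNR_FLOOR[sf] - margin_db
    while headroom >= 2.5 and sf > 7:            # each SF step buys ~2.5 dB
        sf, headroom = sf - 1, headroom - 2.5
    while headroom >= 3.0 and tx_power_dbm > 2:  # then reduce power in 3 dB steps
        tx_power_dbm, headroom = tx_power_dbm - 3, headroom - 3.0
    return sf, tx_power_dbm

print(adr_step(sf=12, tx_power_dbm=14, measured_snr_db=5.0))
# -> (7, 14): fastest SF reached, power left unchanged
```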

The optimization of LoRaWAN parameters, along with densification, can lead to massive capacity increases in the network. In fact, the capacity of a LoRaWAN network can scale almost indefinitely with densification.

Figure 1: Actility Webinar - Designing LoRaWAN network for Dense Deployment  [1] [2] [3]


The future of LoRaWAN networks, particularly in urban environments where the noise floor is expected to rise due to increased traffic, lies in micro-cellular networks.

How does densification lead to lower TCO for Enterprise deployment?


As the network is densified by deploying more LoRaWAN gateways, and adaptive data rate and power control algorithms are applied intelligently in the network, the power consumption of end-devices drops dramatically, and with it the Total Cost of Ownership (TCO). The figures below show clearly that densification can lead to up to 10X savings in power consumption and an overall reduction in the 10-year TCO of an enterprise deployment. Changing batteries requires manual labor and is a cost that can dominate the 10-year TCO of a large-scale enterprise deployment (e.g. smart gas/water meters).

Figure 2: Battery Lifetime Improvement with densification [1] [2] [3]


Figure 3: Impact on 10-year TCO due to densification [1] [2] [3]



Densification leads to a very dramatic reduction in the power consumption of end-devices, thus reducing the overall Total Cost of Ownership (TCO).


LoRaWAN offers disruptive Deployment Models


LoRaWAN is generally deployed in unlicensed spectrum, which allows anyone to roll out an IoT/LPWAN network based on LoRaWAN. This enables three deployment models:


1. Public Operator Network: In this traditional model, the operator invests in a regional or nation-wide network and sells connectivity services to its customers.


2. Private/Enterprise Network: In this model, enterprise customers typically set up LoRaWAN gateways on private premises (e.g. an airport), and either have these gateways managed by an operator or use their own LoRaWAN network platform.


This mode of deployment is a game changer for dense device use cases, as network capacity and enhanced QoS can be provided at marginal additional cost. It is possible because LoRaWAN runs in unlicensed spectrum and gateways are quite inexpensive and easy to deploy.


3. Hybrid model: This is the most interesting model, and LoRaWAN allows it due to its open architecture.

This is difficult, if not impossible, with other competing LPWA technologies and Cellular IoT (due to licensed spectrum and the absence of a roaming/peering model between private and public networks). There are initiatives like CBRS and MulteFire from 3GPP players, but they are still in progress and far from maturity for large-scale IoT deployments (especially for use cases that demand 10-15+ years of battery lifetime).

In the hybrid model, the operator provides light country-wide outdoor coverage, while different stakeholders such as private enterprises or individuals help densify the network further, based on their needs, on their premises, via managed networks. This model enables a win-win private/public partnership, sharing the costs and revenues from the network and densifying it where the applications and devices are most present.

This model is possible because multiple gateways can receive LoRaWAN messages and the network server removes duplicates. For cases where different operators/enterprises run their own networks, the LoRa Alliance has already approved a roaming architecture in the “LoRaWAN Backend Interfaces 1.0 Specification” [6] [7] to enable network collaboration.

This model significantly reduces the operator's investment and offers a disruptive business model to build IoT capacity where it is most needed.

Figure 4: LoRaWAN Hybrid Deployment Model (source : Actility)


LoRaWAN enables public-private deployments that allow a disruptive model for cost/revenue sharing and for densifying the network where it is needed most, depending on IoT application needs.

LoRaWAN densification: A Key driver for reduction in Operator TCO

When designing and deploying a LoRaWAN network, the operator must balance the cost of a dense network (and its served sensors) against the cost of a sparse network (and its served sensors).

Traditional vs Opportunistic network designs


In the traditional deployment model, the operator deploys LoRaWAN gateways on telecom towers. This entails leasing space from the tower owner, purchasing a waterproof outdoor gateway, climbing the tower to hang the gateway, and perhaps paying for additional power, zoning, permitting and backhaul. The operator does a detailed RF propagation study and hangs enough gateways to provide coverage for the sensor locations required for the services to be offered.


Another option is to opportunistically deploy “femto” gateways in devices that the operator is already fielding. These gateways are stateless and thus do not add much complexity to the hosting device. An 8-channel LoRaWAN reference design is mated to the host device using either USB or I2C. The options here are quite diverse: the operator can embed a simple 8-channel gateway into WiFi hotspots, power supplies, amplifiers, cable modems, thermostats, virtual assistants, or any mass-produced device that already has backhaul. The bill-of-materials adder is quite modest, the power consumption and heat dissipation are less than 3 Watts, and the size delta is roughly 7 cm by 3 cm.


Calculating the number of opportunistic gateways needed to provide adequate coverage for a given deployment can be challenging. The height of a gateway has a large impact on its coverage. A gateway deployed on the 20th story of an apartment building has a much better coverage pattern than the same gateway deployed in the basement of a single-family home. Gateways deployed in WiFi hotspots mounted on power poles have a different coverage area than gateways deployed on light poles. So the actual number of gateways deployed in each scenario varies widely. When you complete the detailed design of each network type, you typically find that the opportunistic deployment model allows the operator to cover a given area by deploying roughly 100 times as many gateways for roughly 1/10th of the cost (compared to the traditional 3rd-party leased tower model).

Example use-case with water meters


For the rest of this analysis, we will assume that the operator needs to deploy a LoRaWAN network to service 100K water meters. Water meters represent a difficult RF propagation scenario: they are installed at or below ground level, must last 20 years, and suburban meters tend to accumulate grass and dirt over time. Let's assume a North American deployment model, where we have the option of using a high-power (27dBm) or a low-power (17dBm) meter.

One possible design is to use a tower-based approach, in which the operator typically ends up deploying high-power water meters in order to reduce the number of (expensive) tower leases. In order to run at high power, North American regulations require the sensor to transmit across 50+ channels, which drives the operator to deploy 64-channel gateways. Let's assume that the average distance between a water meter and a tower-based gateway is ~3km and that the sensors need to send one reading per day. Many of the meters thus operate at SF10 at 27dBm. The sensor designer includes a high-power RF amplifier, calculates the energy requirements over the life of the sensor, and sizes the battery appropriately.

Another possible design is to opportunistically deploy thousands of femto gateways into the area. The question boils down to “How many femto gateways do I need to cover the desired area?”. Working backwards from the densest possible deployment, most MSOs (Multiple-System Operators) serve about 1/3 of the households in their footprint. In many urban environments, the average distance between a given operator's subscribers is 30 meters; if such an operator could opportunistically deploy in most of those sites, they would have inter-gateway distances as small as 30 meters. For the purposes of this analysis, let's say that the average distance between a sensor and the closest gateway is reduced from 3000 meters to 100 meters. When a sensor is 100 meters from a gateway, it can typically operate at SF7 at 17dBm (or lower). Clearly, the network designer must account for a distribution of distances between a given sensor and its closest gateway, but the overall power savings are significant.
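The power argument can be made concrete with the standard LoRa time-on-air formula (from the Semtech SX127x datasheet); the transmit-current figures below are ballpark assumptions, not from any particular meter design.

```python
import math

def lora_airtime_s(sf: int, payload_bytes: int, bw_hz: int = 125_000,
                   preamble: int = 8, cr: int = 1, crc: bool = True,
                   explicit_header: bool = True) -> float:
    """LoRa packet time-on-air, per the Semtech SX127x datasheet formula."""
    t_sym = (2 ** sf) / bw_hz
    de = 1 if (sf >= 11 and bw_hz == 125_000) else 0  # low-data-rate optimisation
    ih = 0 if explicit_header else 1
    num = 8 * payload_bytes - 4 * sf + 28 + 16 * int(crc) - 20 * ih
    n_payload = 8 + max(math.ceil(num / (4 * (sf - 2 * de))) * (cr + 4), 0)
    return (preamble + 4.25 + n_payload) * t_sym

# Ballpark TX currents: ~420 mA at +27 dBm (external PA), ~45 mA at +17 dBm.
for sf, dbm, i_ma in [(10, 27, 420), (7, 17, 45)]:
    t = lora_airtime_s(sf, payload_bytes=12)
    print(f"SF{sf} @ +{dbm} dBm: {t * 1e3:.0f} ms on air, "
          f"{i_ma * t:.1f} mC per daily reading")
# SF10 @ +27 dBm: ~289 ms and ~121 mC; SF7 @ +17 dBm: ~41 ms and ~1.9 mC,
# i.e. well over an order of magnitude less charge drawn per message.
```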

It is also instructive to compare the overall capacity of a tower-based LoRaWAN network with that of the opportunistic LoRaWAN network. Remembering that 100 eight-channel opportunistic gateways cost about 1/10th as much as a single 64-channel gateway, we realize that we get ~13 times as much network capacity for 1/10th of the cost. As sensor density increases, we could deploy additional opportunistic gateways and get ~130 times as much network capacity for the same cost as a tower-based network.
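A quick back-of-envelope check of those ratios (the unit costs are placeholders chosen to match the 100-gateways-for-1/10th-cost observation above; only the ratios matter):

```python
# Placeholder unit costs; chosen so 100 femto gateways cost 1/10th of a tower.
tower_cost, tower_channels = 10_000.0, 64   # one 64-channel tower gateway
femto_cost, femto_channels = 10.0, 8        # one embedded 8-channel gateway

n_femto = int((tower_cost / 10) / femto_cost)                # 100 gateways
print(n_femto * femto_channels / tower_channels)             # 12.5 ~ "13x" capacity
n_femto_equal_cost = int(tower_cost / femto_cost)            # 1000 gateways
print(n_femto_equal_cost * femto_channels / tower_channels)  # 125.0 ~ "130x"
```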

When we compare the cost of building a sensor designed to last 20 years using SF10 at 27dBm with one using SF7 at 17dBm, we find that we can save more than $10 per sensor by deploying the denser network.

So, in addition to saving a significant amount of capital by opportunistically deploying the gateways, the operator can save more than $10 per water meter by deploying a dense network; that is more than $1M saved on the 100K water-meter deployment. When additional use cases are layered in, the dense LoRaWAN network provides sensor savings on each additional set of sensors. Most sensors do not have the 20-year requirement and thus do not save the same amount, but batteries are one of the primary cost drivers for any sensor.


Conclusion

This analysis is somewhat simplified, and a very large-scale deployment may require a certain amount of traditional gateway placement to provide an “umbrella” of coverage that is then densified using opportunistic methods. By densifying the network, the overall sensor power budget is decreased significantly. One could also envision a deployment model in which an opportunistic gateway is deployed in conjunction with a set of services: the operator would add IoT-based services to an existing bundle (say voice/video/data, thermostat control or a personal assistant) and know that the sensors would be co-resident with the gateway.

What is the future of LoRaWAN?



LoRaWAN exhibits significant capacity gains and massive reductions in power consumption and TCO when ADR algorithms are used intelligently in the network. We have shown how LoRaWAN networks are deployed for coverage and how network capacity can be scaled gracefully by adding more gateways.

There are already 16 channels in the EU, and there have been recent modifications of the regulatory framework to relax spectrum requirements and increase the permitted transmit power, duty cycle and number of channels [22].

Moreover, Semtech has released the latest generation of LoRa chipsets [23] with the following key features:
  • 50% less power in receive mode
  • 20% extended cell range
  • +22 dBm transmit power
  • A 45% reduction in size: 4mm by 4mm
  • Global continuous frequency coverage: 150-960MHz
  • Simplified user interface with implementation of commands
  • New spreading factor of SF5 to support dense networks
  • Protocol compatible with existing deployed LoRaWAN networks

These LoRaWAN features and the upcoming changes to EU regulations will allow unlicensed LoRaWAN deployments to scale significantly for years to come, meeting the needs of IoT applications and use cases. LoRaWAN capacity does indeed depend on regional and morphology parameters, but as we have shown in the results above, if the network is deployed carefully and advanced algorithms such as ADR are used, there can be a dramatic increase in network capacity and a massive reduction in TCO. This will be one of the main factors determining the success of LoRaWAN deployments as the demands and breadth of IoT applications scale in the future.

We also showed earlier how LoRaWAN offers an innovative public/private deployment model in which operators can build capacity incrementally and supplement it with extra capacity by leveraging gateways deployed by private individuals/enterprises. Typically, in cellular networks anywhere from 5-10% of IoT devices are at the cell edge and in outage [10]. This applies especially to deep-indoor nodes (for example, smart meters with an additional 30 dB penetration loss). Such nodes can only be covered by densification of the cellular network, which is expensive considering it is being done only for 5-10% of IoT devices. One way to address this problem is to deploy private LoRaWAN at the cell edge and use a multi-technology IoT platform that combines both LoRaWAN and Cellular IoT [11].

On the other hand, LoRaWAN offers a cost-effective way to augment network capacity where it's needed most. LoRaWAN gateways are very cost-effective and can be deployed using Ethernet/3G/4G backhaul with minimal investment in comparison to 3GPP small cells. This allows an IoT network to be built in a cost-effective manner and scaled progressively based on application needs. We believe this deployment model has a dramatic effect on the ROI of LoRaWAN-based IoT connectivity.

The LoRa Alliance has standardized the roaming feature, which enables multiple LoRaWAN networks to serve IoT devices collaboratively. Macro-diversity across deployments enables operators/enterprises to jointly densify their networks, providing better coverage at lower cost. The future of LoRaWAN, as shown below, will be private/enterprise network deployments and disruptive business models through roaming with public networks [4] [5] [6] [7].

Figure 5: Future of LoRaWAN deployments


LoRaWAN provides a horizontal connectivity solution that addresses the wide-ranging needs of IoT applications in LPWAN deployments. However, these benefits are only achievable with intelligent network server algorithms, which are proprietary to network solution vendors.

For any questions, contact the author below:

https://www.linkedin.com/in/rohit-gupta-2b51503a/



References:
[1] Actility webinar Replay: Designing a LoRaWAN Network for Dense Deployment,
https://www.youtube.com/watch?v=xQOZWUQdvf0

[2] Actility webinar slides: Designing a LoRaWAN Network for Dense Deployment, https://www.slideshare.net/Actility/designing-lorawan-for-dense-iot-deployments-webinar.

[3] Actility Whitepaper: Designing a LoRaWAN Network for Dense Deployment, https://www.slideshare.net/Actility/designing-lorawan-networks-for-dense-iot-deployments

[4] Actility webinar slides: Industrial IoT - Transforming businesses today with LoRaWAN, https://www.slideshare.net/Actility/actility-and-factory-systemes-explain-how-iot-is-transforming-industry

[5] Actility webinar Replay: Industrial IoT - Transforming businesses today with LoRaWAN, https://www.youtube.com/watch?v=pRoEbWjffBA

[6] Actility webinar slides: LoRaWAN Roaming Webinar, https://www.slideshare.net/Actility/lorawan-roaming

[7] Actility webinar Replay: LoRaWAN Roaming webinar, https://www.youtube.com/watch?v=tWP6VV1CKEg

[8] Actility webinar slides: Multi-technology IoT Geolocation, https://www.slideshare.net/Actility/multi-technology-geolocation-webinar

[9] Actility webinar Replay: Multi-technology IoT Geolocation, https://www.youtube.com/watch?v=YzFZqMBI2QA

[10] http://vbn.aau.dk/files/236150948/vtcFall2016.pdf

[11] Actility Whitepaper: How to build a multi-technology scalable IoT connectivity Platform, https://www.slideshare.net/Actility/whitepaper-how-to-build-a-mutiltechnology-scalable-iot-connectivity-platform

[12] https://www.orange.com/en/Press-Room/press-releases/press-releases-2018/Nova-Veolia-and-its-subsidiary-Birdz-choose-Orange-Business-Services-to-help-them-digitalize-Veolia-s-remote-water-meter-reading-services-in-France