
Sunday, 7 May 2017

10 years battery life calculation for Cellular IoT

In my last post here, I made an attempt to place the different cellular and non-cellular LPWA technologies together in a picture. Someone pointed out that the pictures above, from the LoRa Alliance whitepaper, are even better, and I agree.

Most IoT technologies list their battery life as 10 years. There is an article on Medium rightly pointing out that in Verizon's LTE-M network, an IoT device's battery may not last very long.

The problem is that the 10-year battery life is a headline figure, and in the real world it is sometimes not that critical. It all depends on the application. For example, this Iota Pet Tracker uses Bluetooth but only claims a battery life of "weeks". I guess ztrack, based on LoRa, would give similar results. I have to admit that non-cellular technologies should have longer battery life, but it all depends on the application and use case. An IoT device in a car may not have to worry too much about power consumption. The same goes for a fleet tracker that has solar power, or one whose battery only needs to outlast the fleet itself, etc.


So, coming back to the power consumption. Martin Sauter, in his excellent Wireless Moves blog post, provided the calculation that I am copying below with some additions:

The calculation can be found in 3GPP TR 45.820, for NB-IoT in Chapter 7.3.6.4 on ‘Energy consumption evaluation’.

The battery capacity used for the evaluation was 5 Wh. That’s about half, or even only a third, of the battery capacity in a smartphone today. So yes, that is quite a small battery indeed. The chapter also contains an assumption on how much power the device draws in different states. In the ‘idle’ state, which the device is in most often, power consumption is assumed to be 0.015 mW.

How long would the battery be able to power the device if it were always in the idle state? The calculation is easy and you end up with 38 years. That doesn’t include battery self-discharge, and I wondered how much that would be over 10 years. According to the Varta handbook of primary lithium cells, self-discharge of a non-rechargeable lithium battery is less than 1% per year. So subtract roughly 4 years from that number.
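The idle-only figure is easy to reproduce; a minimal sketch using only the numbers quoted above:

```python
# Sketch of the idle-only lifetime estimate from 3GPP TR 45.820 (values from the text).
battery_wh = 5.0          # battery capacity, Wh
idle_mw = 0.015           # assumed idle-state power draw, mW

idle_hours = battery_wh * 1000 / idle_mw      # mWh / mW = hours
idle_years = idle_hours / (24 * 365)
print(f"Idle-only lifetime: {idle_years:.1f} years")   # ~38 years

# Self-discharge of a primary lithium cell is <1%/year (Varta handbook),
# which knocks roughly 4 years off that figure over its lifetime.
```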

Obviously, the device is not always in idle and when transmitting the device is assumed to use 500 mW of power. Yes, with this power consumption, the battery would not last 34 years but less than 10 hours. But we are talking about NB-IoT so the device doesn’t transmit for most of the time. The study looked at different transmission patterns. If 200 bytes are sent once every 2 hours, the device would run on that 5 Wh battery for 1.7 years. If the device only transmits 50 bytes once a day the battery would last 18.1 years.
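Under the same assumptions, the duty-cycled figures can be sketched with a simple average-power model. The 4.6 s on-air time per report below is my own hypothetical value, tuned to land near the TR's 1.7-year result; the TR's actual model also accounts for receive windows, PA efficiency and more:

```python
# Toy average-power model for the duty-cycled case (my own simplification of TR 45.820).
battery_mwh = 5.0 * 1000
idle_mw, tx_mw = 0.015, 500.0
report_interval_s = 2 * 3600      # one report every 2 hours
tx_time_s = 4.6                   # hypothetical on-air time per 200-byte report

avg_mw = idle_mw + (tx_time_s / report_interval_s) * (tx_mw - idle_mw)
years = battery_mwh / avg_mw / (24 * 365)
print(f"Estimated lifetime: {years:.1f} years")   # ~1.7 years

# Sanity check from the text: transmitting continuously at 500 mW would
# drain the 5 Wh battery in just 5 Wh / 0.5 W = 10 hours.
print(battery_mwh / tx_mw)   # 10.0 hours
```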

So yes, 10 years is quite feasible for devices that collect very little data and transmit it only once or twice a day.

The conclusions from the report clearly state:

The achievable battery life for a MS using the NB-CIoT solution for Cellular IoT has been estimated as a function of reporting frequency and coupling loss. 

It is important to note that these battery life estimates are achieved with a system design that has been intentionally constrained in two key respects:

  • The NB-CIoT solution has a frequency re-use assumption that is compatible with a stand-alone deployment in a minimum system bandwidth for the entire IoT network of just 200 kHz (FDD), plus guard bands if needed.
  • The NB-CIoT solution uses a MS transmit power of only +23 dBm (200 mW), resulting in a peak current requirement that is compatible with a wider range of battery technologies, whilst still achieving the 20 dB coverage extension objective.  

The key conclusions are as follows:

  • For all coupling losses (so up to 20 dB coverage extension compared with legacy GPRS), a 10 year battery life is achievable with a reporting interval of one day for both 50 bytes and 200 bytes application payloads.
  • For a coupling loss of 144 dB (so equal to the MCL for legacy GPRS), a 10 year battery life is achievable with a two hour reporting interval for both 50 bytes and 200 bytes application payloads. 
  • For a coupling loss of 154 dB, a 10 year battery life is achievable with a 2 hour reporting interval for a 50 byte application payload. 
  • For a coupling loss of 154 dB with 200 byte application payload, or a coupling loss of 164 dB with 50 or 200 byte application payload, a 10 year battery life is not achievable for a 2 hour reporting interval. This is a consequence of the transmit energy per data bit (integrated over the number of repetitions) that is required to overcome the coupling loss and so provide an adequate SNR at the receiver. 
  • Use of an integrated PA only has a small negative impact on battery life, based on the assumption of a 5% reduction in PA efficiency compared with an external PA.

Further improvements in battery life, especially for the case of high coupling loss, could be obtained if the common assumption that the downlink PSD will not exceed that of legacy GPRS was either relaxed to allow PSD boosting, or defined more precisely to allow adaptive power allocation with frequency hopping.
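To put the energy-per-bit argument in rough numbers (my own back-of-envelope illustration, not from the TR): at a fixed +23 dBm transmit power, each extra 10 dB of coupling loss requires roughly 10x the repetitions, and hence roughly 10x the transmit energy per bit:

```python
# Rough scaling of transmit energy per bit with coupling loss at fixed TX power:
# N dB of extra coupling loss needs ~10**(N/10) times more repetitions/energy.
def energy_scale(extra_coupling_loss_db: float) -> float:
    return 10 ** (extra_coupling_loss_db / 10)

for cl in (144, 154, 164):
    print(f"{cl} dB MCL -> ~{energy_scale(cl - 144):.0f}x the energy per bit of 144 dB")
```

This is why the 2-hour reporting interval that works at 144 dB stops being feasible at 154-164 dB in the conclusions above.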

In a future post I will look at the technology aspects of how 3GPP made enhancements in Rel-13 to reduce power consumption in CIoT.

Also have a look at this GSMA whitepaper on 3GPP LPWA, which lists application requirements that are quite handy.

Monday, 1 May 2017

Variety of 3GPP IoT technologies and Market Status - May 2017



I have seen many people wondering whether so many different types of IoT technologies are needed, 3GPP or otherwise. The story behind that is that for many years 3GPP did not focus much on creating an IoT variant of the standards. Their hope was that users would make use of LTE Cat 1 for IoT; later on they created LTE Cat 0 (see here and here).

The problem with this approach was that the market was ripe for different types of IoT solutions that 3GPP could not satisfy. The table below gives an indication of the different types of technologies, but there are many others not listed here.


The most popular IoT (or M2M) technology to date is the humble 2G GSM/GPRS. A couple of weeks back, Vodafone announced that it had reached a milestone of 50 million IoT connections worldwide. They are also adding roughly one million new connections every month. The majority of these are GSM/GPRS.

Different operators have been assessing their strategy for IoT devices. Some operators have either switched off or are planning to switch off their 2G networks. Others have a long-term plan for 2G networks and would rather switch off their 3G networks to refarm the spectrum to more efficient 4G. A small chunk of 2G spectrum, on the other hand, would be a good option for voice and for existing IoT devices with small amounts of data transfer.

In fact, this is one of the reasons that in Release-13 GSM is being enhanced for IoT. This new version is known as Extended Coverage – GSM – Internet of Things (EC-GSM-IoT). According to GSMA, "It is based on eGPRS and designed as a high capacity, long range, low energy and low complexity cellular system for IoT communications. The optimisations made in EC-GSM-IoT that need to be made to existing GSM networks can be made as a software upgrade, ensuring coverage and accelerated time to-market. Battery life of up to 10 years can be supported for a wide range use cases."

The most popular of the non-3GPP IoT technologies are Sigfox and LoRa. Both have gained significant ground and many backers in the market. This, along with the gap in the market and the need for low-power IoT technologies that transfer just a small amount of data and have a long battery life, motivated 3GPP to create new IoT technologies that were standardised as part of Rel-13 and are being further enhanced in Rel-14. A summary of these technologies can be seen below.


If you look at the first picture at the top (modified from Qualcomm's original here), you will see that these different IoT technologies, 3GPP or otherwise, address different needs. No wonder many operators are using the unlicensed LPWA IoT technologies as a starting point, hoping to complement them with 3GPP technologies when ready.

Finally, it looks like there is a difference in understanding of the standards between Ericsson and Huawei, and as a result their implementations are incompatible. Hopefully this will be sorted out soon.


Market Status:

Telefonica has publicly said that Sigfox is the best way forward for the time being. No news about any 3GPP IoT technologies.

Orange has rolled out LoRa network but has said that when NB-IoT is ready, they will switch the customers on to that.

KPN deployed LoRa throughout the Netherlands, thereby making it the first country in the world with complete coverage. They haven't ruled out NB-IoT when available.

SK Telecom completed its nationwide LoRa IoT network deployment in South Korea last year. It sees LTE-M and LoRa as its 'Two Main IoT Pillars'.

Deutsche Telekom has rolled out its NarrowBand-IoT (NB-IoT) network across eight countries in Europe (Germany, the Netherlands, Greece, Poland, Hungary, Austria, Slovakia, Croatia).

Vodafone is fully committed to NB-IoT. Their network is already operational in Spain and will be launching in Ireland and Netherlands later on this year.

Telecom Italia is in process of launching NB-IoT. Water meters in Turin are already sending their readings using NB-IoT.

China Telecom, in conjunction with Shenzhen Water and Huawei launched 'World's First' Commercial NB-IoT-based Smart Water Project on World Water Day.

SoftBank is deploying LTE-M (Cat-M1) and NB-IoT networks nationwide, powered by Ericsson.

Orange Belgium plans to roll out nationwide NB-IoT & LTE-M IoT networks in 2017.

China Mobile is committed to 3GPP based IoT technologies. It has conducted outdoor trials of NB-IoT with Huawei and ZTE and is also trialing LTE-M with Ericsson and Qualcomm.

Verizon has launched Industry’s first LTE-M Nationwide IoT Network.

AT&T will be launching its LTE-M network later this year in the US as well as Mexico.

Sprint said it plans to deploy LTE Cat 1 technology in support of the Internet of Things (IoT) across its network by the end of July.


Saturday, 15 April 2017

Self-backhauling: Integrated access and backhaul links for 5G


One of the items proposed during the 3GPP RAN Plenary #75, held in Dubrovnik, Croatia, was a Study on Integrated Access and Backhaul for NR (NR = New Radio). RP-17148 provides more details as follows:

One of the potential technologies targeted to enable future cellular network deployment scenarios and applications is the support for wireless backhaul and relay links enabling flexible and very dense deployment of NR cells without the need for densifying the transport network proportionately. 

Due to the expected larger bandwidth available for NR compared to LTE (e.g. mmWave spectrum) along with the native deployment of massive MIMO or multi-beam systems in NR creates an opportunity to develop and deploy integrated access and backhaul links. This may allow easier deployment of a dense network of self-backhauled NR cells in a more integrated manner by building upon many of the control and data channels/procedures defined for providing access to UEs. An example illustration of a network with such integrated access and backhaul links is shown in Figure 1, where relay nodes (rTRPs) can multiplex access and backhaul links in time, frequency, or space (e.g. beam-based operation).

The operation of the different links may be on the same or different frequencies (also termed ‘in-band’ and ‘out-band’ relays). While efficient support of out-band relays is important for some NR deployment scenarios, it is critically important to understand the requirements of in-band operation which imply tighter interworking with the access links operating on the same frequency to accommodate duplex constraints and avoid/mitigate interference. 

In addition, operating NR systems in mmWave spectrum presents some unique challenges including experiencing severe short-term blocking that cannot be readily mitigated by present RRC-based handover mechanisms due to the larger time-scales required for completion of the procedures compared to short-term blocking. Overcoming short-term blocking in mmWave systems may require fast L2-based switching between rTRPs, much like dynamic point selection, or modified L3-based solutions. The above described need to mitigate short-term blocking for NR operation in mmWave spectrum along with the desire for easier deployment of self-backhauled NR cells creates a need for the development of an integrated framework that allows fast switching of access and backhaul links. Over-the-air (OTA) coordination between rTRPs can also be considered to mitigate interference and support end-to-end route selection and optimization.

The benefits of integrated access and backhaul (IAB) are crucial during network rollout and the initial network growth phase. To leverage these benefits, IAB needs to be available when NR rollout occurs. Consequently, postponing IAB-related work to a later stage may have adverse impact on the timely deployment of NR access.


There is also an interesting presentation on this topic from Interdigital on the 5G Crosshaul group here. I found the following points worth noting:

  • This will create a new type of interference (access-backhaul interference) to mitigate and will require sophisticated (complex) scheduling of the channel resources (across the two domains, access and backhaul).
  • One of the main drivers is small-cell densification, calling for cost-effective and low-latency backhauling.
  • The goal would be to maximize efficiency through joint optimization/integration of access and backhaul resources.
  • The existing approach of fronthaul using CPRI will not scale for 5G; self-backhaul may be an alternative in the shape of wireless fronthaul.
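As a toy illustration of the scheduling problem (my own sketch, not from the study item): an in-band relay cannot transmit on the access link while receiving on the backhaul link on the same frequency, so the simplest scheduler just time-multiplexes the two within a frame:

```python
# Minimal time-division split between access and backhaul for an in-band
# relay node. Real IAB scheduling is adaptive and far more complex.
def partition_frame(n_slots: int, backhaul_fraction: float) -> list[str]:
    """Assign each slot in the frame to either the backhaul or access link."""
    n_backhaul = round(n_slots * backhaul_fraction)
    return ["backhaul"] * n_backhaul + ["access"] * (n_slots - n_backhaul)

frame = partition_frame(10, 0.3)
print(frame)   # 3 backhaul slots followed by 7 access slots
```

The trade-off is immediate: every slot given to backhaul is capacity taken from access, which is why the presentation calls for joint optimization of the two.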

Let me know what you think.




Monday, 6 March 2017

IMT-2020 (5G) Requirements


ITU has just agreed on key 5G performance requirements for IMT-2020. A new draft report, ITU-R M.[IMT-2020.TECH PERF REQ], is expected to be finally approved by ITU-R Study Group 5 at its next meeting in November 2017. The press release says "5G mobile systems to provide lightning speed, ultra-reliable communications for broadband and IoT".


The following is from the ITU draft report:

The key minimum technical performance requirements defined in this document are for the purpose of consistent definition, specification, and evaluation of the candidate IMT-2020 radio interface technologies (RITs)/Set of radio interface technologies (SRIT) in conjunction with the development of ITU-R Recommendations and Reports, such as the detailed specifications of IMT-2020. The intent of these requirements is to ensure that IMT-2020 technologies are able to fulfil the objectives of IMT-2020 and to set a specific level of performance that each proposed RIT/SRIT needs to achieve in order to be considered by ITU-R for IMT-2020.


Peak data rate: Peak data rate is the maximum achievable data rate under ideal conditions (in bit/s), which is the received data bits assuming error-free conditions assignable to a single mobile station, when all assignable radio resources for the corresponding link direction are utilized (i.e., excluding radio resources that are used for physical layer synchronization, reference signals or pilots, guard bands and guard times). 

This requirement is defined for the purpose of evaluation in the eMBB usage scenario. 
The minimum requirements for peak data rate are as follows:
Downlink peak data rate is 20 Gbit/s.
Uplink peak data rate is 10 Gbit/s.


Peak spectral efficiency: Peak spectral efficiency is the maximum data rate under ideal conditions normalised by channel bandwidth (in bit/s/Hz), where the maximum data rate is the received data bits assuming error-free conditions assignable to a single mobile station, when all assignable radio resources for the corresponding link direction are utilized (i.e. excluding radio resources that are used for physical layer synchronization, reference signals or pilots, guard bands and guard times).

This requirement is defined for the purpose of evaluation in the eMBB usage scenario.
The minimum requirements for peak spectral efficiencies are as follows: 
Downlink peak spectral efficiency is 30 bit/s/Hz.
Uplink peak spectral efficiency is 15 bit/s/Hz.
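As a side note, the peak data rate and peak spectral efficiency requirements together imply the channel bandwidth the evaluation assumes (this derivation is mine, not the report's):

```python
# Implied bandwidth when both peak requirements are met simultaneously:
# bandwidth = peak data rate / peak spectral efficiency.
dl_gbps, ul_gbps = 20, 10      # peak data rates, Gbit/s
dl_se, ul_se = 30, 15          # peak spectral efficiencies, bit/s/Hz

dl_bw_mhz = dl_gbps * 1e9 / dl_se / 1e6
ul_bw_mhz = ul_gbps * 1e9 / ul_se / 1e6
print(f"{dl_bw_mhz:.0f} MHz DL, {ul_bw_mhz:.0f} MHz UL")   # ~667 MHz in both directions
```

That is, both directions implicitly assume roughly 667 MHz of aggregated spectrum, consistent with the up-to-1-GHz bandwidth requirement later in the report.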


User experienced data rate: User experienced data rate is the 5% point of the cumulative distribution function (CDF) of the user throughput. User throughput (during active time) is defined as the number of correctly received bits, i.e. the number of bits contained in the service data units (SDUs) delivered to Layer 3, over a certain period of time.

This requirement is defined for the purpose of evaluation in the related eMBB test environment.
The target values for the user experienced data rate are as follows in the Dense Urban – eMBB test environment: 
Downlink user experienced data rate is 100 Mbit/s
Uplink user experienced data rate is 50 Mbit/s


5th percentile user spectral efficiency: The 5th percentile user spectral efficiency is the 5% point of the CDF of the normalized user throughput. The normalized user throughput is defined as the number of correctly received bits, i.e., the number of bits contained in the SDUs delivered to Layer 3, over a certain period of time, divided by the channel bandwidth and is measured in bit/s/Hz. 

This requirement is defined for the purpose of evaluation in the eMBB usage scenario.
Indoor Hotspot – eMBB - Downlink: 0.3 bit/s/Hz Uplink: 0.21 bit/s/Hz
Dense Urban – eMBB - Downlink: 0.225 bit/s/Hz Uplink: 0.15 bit/s/Hz
Rural – eMBB - Downlink: 0.12 bit/s/Hz Uplink: 0.045 bit/s/Hz


Average spectral efficiency: Average spectral efficiency  is the aggregate throughput of all users (the number of correctly received bits, i.e. the number of bits contained in the SDUs delivered to Layer 3, over a certain period of time) divided by the channel bandwidth of a specific band divided by the number of TRxPs and is measured in bit/s/Hz/TRxP.

This requirement is defined for the purpose of evaluation in the eMBB usage scenario.
Indoor Hotspot – eMBB - Downlink: 9 bit/s/Hz/TRxP Uplink: 6.75 bit/s/Hz/TRxP
Dense Urban – eMBB - Downlink: 7.8 bit/s/Hz/TRxP Uplink: 5.4 bit/s/Hz/TRxP
Rural – eMBB - Downlink: 3.3 bit/s/Hz/TRxP Uplink: 1.6 bit/s/Hz/TRxP


Area traffic capacity: Area traffic capacity is the total traffic throughput served per geographic area (in Mbit/s/m2). The throughput is the number of correctly received bits, i.e. the number of bits contained in the SDUs delivered to Layer 3, over a certain period of time.

This requirement is defined for the purpose of evaluation in the related eMBB test environment.
The target value for Area traffic capacity in downlink is 10 Mbit/s/m2 in the Indoor Hotspot – eMBB test environment.


User plane latency: User plane latency is the contribution of the radio network to the time from when the source sends a packet to when the destination receives it (in ms). It is defined as the one-way time it takes to successfully deliver an application layer packet/message from the radio protocol layer 2/3 SDU ingress point to the radio protocol layer 2/3 SDU egress point of the radio interface in either uplink or downlink in the network for a given service in unloaded conditions, assuming the mobile station is in the active state. 
This requirement is defined for the purpose of evaluation in the eMBB and URLLC usage scenarios.
The minimum requirements for user plane latency are
4 ms for eMBB
1 ms for URLLC 
assuming unloaded conditions (i.e., a single user) for small IP packets (e.g., 0 byte payload + IP header), for both downlink and uplink.


Control plane latency: Control plane latency refers to the transition time from a most “battery efficient” state (e.g. Idle state) to the start of continuous data transfer (e.g. Active state).
This requirement is defined for the purpose of evaluation in the eMBB and URLLC usage scenarios.
The minimum requirement for control plane latency is 20 ms. Proponents are encouraged to consider lower control plane latency, e.g. 10 ms.


Connection density: Connection density is the total number of devices fulfilling a specific quality of service (QoS) per unit area (per km2).

This requirement is defined for the purpose of evaluation in the mMTC usage scenario.
The minimum requirement for connection density is 1 000 000 devices per km2.


Energy efficiency: Network energy efficiency is the capability of a RIT/SRIT to minimize the radio access network energy consumption in relation to the traffic capacity provided. Device energy efficiency is the capability of the RIT/SRIT to minimize the power consumed by the device modem in relation to the traffic characteristics. 
Energy efficiency of the network and the device can relate to the support for the following two aspects:
a) Efficient data transmission in a loaded case;
b) Low energy consumption when there is no data.
Efficient data transmission in a loaded case is demonstrated by the average spectral efficiency.

This requirement is defined for the purpose of evaluation in the eMBB usage scenario.
The RIT/SRIT shall have the capability to support a high sleep ratio and long sleep duration. Proponents are encouraged to describe other mechanisms of the RIT/SRIT that improve the support of energy efficient operation for both network and device.


Reliability: Reliability relates to the capability of transmitting a given amount of traffic within a predetermined time duration with high success probability

This requirement is defined for the purpose of evaluation in the URLLC usage scenario. 
The minimum requirement for the reliability is 1-10^-5 success probability of transmitting a layer 2 PDU (protocol data unit) of 32 bytes within 1 ms in channel quality of coverage edge for the Urban Macro-URLLC test environment, assuming small application data (e.g. 20 bytes application data + protocol overhead). 
Proponents are encouraged to consider larger packet sizes, e.g. layer 2 PDU size of up to 100 bytes.


Mobility: Mobility is the maximum mobile station speed at which a defined QoS can be achieved (in km/h).

The following classes of mobility are defined:
Stationary: 0 km/h
Pedestrian: 0 km/h to 10 km/h
Vehicular: 10 km/h to 120 km/h
High speed vehicular: 120 km/h to 500 km/h

Mobility classes supported:
Indoor Hotspot – eMBB: Stationary, Pedestrian
Dense Urban – eMBB: Stationary, Pedestrian, Vehicular (up to 30 km/h)
Rural – eMBB: Pedestrian, Vehicular, High speed vehicular 


Mobility interruption time: Mobility interruption time is the shortest time duration supported by the system during which a user terminal cannot exchange user plane packets with any base station during transitions.

This requirement is defined for the purpose of evaluation in the eMBB and URLLC usage scenarios.
The minimum requirement for mobility interruption time is 0 ms.


Bandwidth: Bandwidth is the maximum aggregated system bandwidth. The bandwidth may be supported by single or multiple radio frequency (RF) carriers. The bandwidth capability of the RIT/SRIT is defined for the purpose of IMT-2020 evaluation.

The requirement for bandwidth is at least 100 MHz
The RIT/SRIT shall support bandwidths up to 1 GHz for operation in higher frequency bands (e.g. above 6 GHz). 

In case you missed it, a 5G logo has also been released by 3GPP.





Thursday, 26 January 2017

3GPP Rel-14 IoT Enhancements


A presentation (embedded below) by the 3GPP RAN3 Chairman, Philippe Reininger, at the IoT Business & Technologies Congress (30 November, in Singapore). The main topics are eMTC, NB-IoT and EC-GSM-IoT, as completed in 3GPP Release 13 and enhanced in Release 14. Thanks to Eiko Seidel for sharing the presentation.


Saturday, 12 November 2016

Verizon's 5G Standard

Earlier this year I wrote a LinkedIn post on how operators are setting a timetable for 5G (5G: Mine is bigger than yours), and recently Dean Bubley of Disruptive Analysis wrote a similar kind of post, also on LinkedIn, with a bit more detail (5G: Industry Politics, Use-Cases & a Realistic Timeline).


Some of you may be unaware that the US operator Verizon has formed 'Verizon 5G Technology Forum' (V5GTF) with the intention of developing the first set of standards that can also influence the direction of 3GPP standardization and also provide an early mover advantage to itself and its partners.

The following from Light Reading news summarizes the situation well:

Verizon has posted its second round of work with its partners on a 5G specification. The first round was around the 5G radio specification; this time the work has been on the mechanics of connecting to the network. The operator has been working on the specification with Cisco Systems Inc., Ericsson AB, Intel Corp., LG Electronics Inc., Nokia Corp., Qualcomm Inc. and Samsung Corp. via the 5G Technology Forum (V5GTF) it formed late in 2015.

Sanyogita Shamsunder, director of strategy at Verizon, says that the specification is "75% to 80% there" at least for a "fixed wireless use case." Verizon is aiming for a "friendly, pre-commercial launch" of a fixed wireless pilot in 2017, Koeppe notes.

Before we go further, let's see this excellent video by R&S in which Andreas Roessler explains what Verizon is up to:



Verizon and SKT are both trying to be 5G leaders and are trying to roll out pre-standard 5G whenever they can. In fact, Qualcomm recently released a 28 GHz modem that will be used in separate pre-standard 5G cellular trials by Verizon and Korea Telecom.

Quoting from the EE times article:

The Snapdragon X50 delivers 5 Gbits/second downlinks and multiple gigabit uplinks for mobile and fixed-wireless networks. It uses a separate LTE connection as an anchor for control signals while the 28 GHz link delivers the higher data rates over distances of tens to hundreds of meters.

The X50 uses eight 100 MHz channels, a 2x2 MIMO antenna array, adaptive beamforming techniques and 64 QAM to achieve a 90 dB link budget. It works in conjunction with Qualcomm’s SDR05x mmWave transceiver and PMX50 power management chip. So far, Qualcomm is not revealing more details of the modem, which will sample next year and be in production before June 2018.
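The quoted figures roughly hang together. Here is a back-of-envelope check with my own assumed OFDM overheads (the symbol efficiency and code rate below are guesses for illustration, not Qualcomm's numbers):

```python
# Rough peak-throughput sanity check of the quoted Snapdragon X50 figures.
bandwidth_hz = 8 * 100e6       # eight 100 MHz channels
layers = 2                     # 2x2 MIMO
bits_per_symbol = 6            # 64-QAM
symbol_efficiency = 0.75       # assumed usable symbols per Hz (CP, guards, pilots)
code_rate = 0.7                # assumed channel-coding rate

peak_bps = bandwidth_hz * symbol_efficiency * bits_per_symbol * layers * code_rate
print(f"~{peak_bps / 1e9:.1f} Gbit/s")   # ~5 Gbit/s, in line with the quoted figure
```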

Verizon and Korea Telecom will use the chips in separate trials starting late next year, anticipating commercial services in 2018. The new chips mark a departure from prototypes not intended as products that Qualcomm Research announced in June.

Korea Telecom plans a mobile 5G offering at the February 2018 Winter Olympics. Verizon plans to launch in 2018 a less ambitious fixed-wireless service in the U.S. based on a specification it released in July. KT and Verizon are among a quartet of carriers that formed a group in February to share results of early 5G trials.

For its part, the 3GPP standards group is also stepping up the pace of the 5G standards efforts it officially started earlier this year. It endorsed last month a proposal to consider moving the date for finishing Phase I, an initial version of 5G anchored to LTE, from June 2018 to as early as December 2017, according to a recent Qualcomm blog.

Coming back to Verizon's 5G standard, is it good enough and compatible with 3GPP standards? The answer right now seems to be NO.


The following is from Rethink Wireless:

The issue is that Verizon’s specs include a subcarrier spacing value of 75 kHz, whereas the 3GPP has laid out guidelines that subcarrier spacing must increase by 30 kHz at a time, according to research from Signals Research Group. This means that different networks can work in synergy if required without interfering with each other.

Verizon’s 5G specs do stick to 3GPP requirements in that it includes MIMO and millimeter wave (mmWave). MmWave is a technology that both AT&T and Verizon are leading the way in – which could succeed in establishing spectrum which is licensed fairly traditionally as the core of the US’s high frequency build outs.

A Verizon-fronted group recently rejected a proposal from AT&T to push the 3GPP into finalizing an initial 5G standard for late 2017, thus returning to the original proposed time of June 2018. Verizon was supported by Samsung, ZTE, Deutsche Telecom, France Telecom, TIM and others, which were concerned the split would defocus SA and New Radio efforts and even delay those standards being finalized.

Verizon has been openly criticized in the industry, mostly by AT&T (unsurprisingly), as its hastiness may lead to fragmentation – yet it still looks likely to beat AT&T to be the first operator to deploy 5G, if only for fixed access.

Verizon probably wants the industry to believe that it was prepared for eventualities such as this – prior to the study from Signal Research Group, the operator said its pre-standard implementation will be close enough to the standard that it could easily achieve full compatibility with simple alterations. However, Signals Research Group’s president Michael Thelander has been working with the 3GPP since the 5G standard was birthed, and he begs to differ.

Thelander told FierceWireless, “I believe what Verizon is doing is not hardware-upgradeable to the real specification. It’s great to be trialing, even if you define your own spec, just to kind of get out there and play around with things. That’s great and wonderful and hats off to them. But when you oversell it and call it 5G and talk about commercial services, it’s not 5G. It’s really its own spec that has nothing to do with Release 16, which is still three years away. Just because you have something that operates in millimeter wave spectrum and uses Massive MIMO and OFDM, that doesn’t make it a 5G solution.”
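On the subcarrier-spacing point raised above: the NR numerologies that 3GPP eventually standardized scale the 15 kHz LTE baseline by powers of two, so Verizon's 75 kHz (15 kHz x 5) falls outside the family. A one-liner makes the mismatch explicit:

```python
# 3GPP NR subcarrier spacings: 15 kHz scaled by powers of two.
nr_spacings_khz = [15 * 2**mu for mu in range(5)]
print(nr_spacings_khz)            # [15, 30, 60, 120, 240]
print(75 in nr_spacings_khz)      # False: Verizon's 75 kHz is not an NR numerology
```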

Back in the 3G days, NTT Docomo was the leader in standards and didn't have the patience to wait for the 3GPP standards to be completed. As a result, it launched its first 3G network, called FOMA (Freedom of Mobile Multimedia Access), based on a pre-standard version of the specs. This resulted in handset manufacturers having to tweak their software to cope with this version, and it suffered from poor economies of scale. Early 3G phones were also not able to roam onto the Docomo network. In a way, Verizon is going down the same path.

While there can be some good learning from this pre-5G standard, it may be a good idea not to get too tied to it. A standard that is not compliant will not achieve the required economies of scale, either with handsets or with dongles and other hotspot devices.





Sunday, 6 November 2016

LTE, 5G and V2X

3GPP has recently completed the initial Cellular V2X standard. The following is from the news item:

The initial Cellular Vehicle-to-Everything (V2X) standard, for inclusion in the Release 14, was completed last week - during the 3GPP RAN meeting in New Orleans. It focuses on Vehicle-to-Vehicle (V2V) communications, with further enhancements to support additional V2X operational scenarios to follow, in Release 14, targeting completion during March 2017.
The 3GPP Work Item Description can be found in RP-161894.
V2V communications are based on D2D communications defined as part of ProSe services in Release 12 and Release 13 of the specification. As part of ProSe services, a new D2D interface (designated as PC5, also known as sidelink at the physical layer) was introduced and now as part of the V2V WI it has been enhanced for vehicular use cases, specifically addressing high speed (up to 250Kph) and high density (thousands of nodes).

...


For distributed scheduling (a.k.a. Mode 4) a sensing with semi-persistent transmission based mechanism was introduced. V2V traffic from a device is mostly periodic in nature. This was utilized to sense congestion on a resource and estimate future congestion on that resource. Based on estimation resources were booked. This technique optimizes the use of the channel by enhancing resource separation between transmitters that are using overlapping resources.
The design is scalable for different bandwidths including 10 MHz bandwidth.
Based on these fundamental link and system level changes there are two high level deployment configurations currently defined, and illustrated in Figure 3.
Both configurations use a dedicated carrier for V2V communications, meaning the target band is only used for PC5 based V2V communications. Also in both cases GNSS is used for time synchronization.
In “Configuration 1”, scheduling and interference management of V2V traffic are supported by distributed algorithms (Mode 4) implemented between the vehicles. As mentioned earlier, the distributed algorithm is based on sensing with semi-persistent transmission. Additionally, a new mechanism where resource allocation depends on geographical information is introduced; it counters the near-far effect arising from in-band emissions.
In “Configuration 2” scheduling and interference management of V2V traffic is assisted by eNBs (a.k.a. Mode 3) via control signaling over the Uu interface. The eNodeB will assign the resources being used for V2V signaling in a dynamic manner.
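The sensing-based semi-persistent selection behind Mode 4 can be illustrated with a toy sketch. The function name, the two-sample sensing window and the 20% candidate fraction below are illustrative assumptions; the actual procedure specified by 3GPP (TS 36.213) adds RSRP thresholds, reselection counters and further details:

```python
import random

def select_sps_resource(measured_energy, keep_best_fraction=0.2):
    """Toy Mode 4 resource selection: average the sensed energy per
    candidate resource over the sensing window, then pick randomly
    from the least-congested fraction to reduce collisions."""
    averages = {res: sum(samples) / len(samples)
                for res, samples in measured_energy.items()}
    ranked = sorted(averages, key=averages.get)  # quietest first
    n_candidates = max(1, int(len(ranked) * keep_best_fraction))
    return random.choice(ranked[:n_candidates])

# Example: resources 0 and 1 have been quietest during the sensing window,
# so the transmitter books one of them semi-persistently.
history = {r: [r * 10.0 + 1.0, r * 10.0 + 2.0] for r in range(10)}
chosen = select_sps_resource(history)
```

Picking randomly among the best candidates, rather than always taking the single quietest resource, is what prevents every nearby vehicle from booking the same slot.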

5G Americas has also published a whitepaper on V2X Cellular Solutions. From the press release:

Vehicle-to-Everything (V2X) communications and solutions enable the exchange of information between vehicles and much more - people (V2P), such as bicyclists and pedestrians for alerts, vehicles (V2V) for collision avoidance, infrastructure (V2I) such as roadside devices for timing and prioritization, and the network (V2N) for real time traffic routing and other cloud travel services. The goal of V2X is to improve road safety, increase the efficiency of traffic, reduce environmental impacts and provide additional traveler information services. 5G Americas, the industry trade association and voice of 5G and LTE for the Americas, today announced the publication of a technical whitepaper titled V2X Cellular Solutions that details new connected car opportunities for the cellular and automotive industries.




The whitepaper describes the benefits that Cellular V2X (C-V2X) can provide to support the U.S. Department of Transportation objectives of improving safety and reducing vehicular crashes. Cellular V2X can also be instrumental in transforming the transportation experience by enhancing traveler and traffic information for societal goals.

C-V2X is part of the 3GPP specifications in Release 14. 3GPP announced the completion of the initial C-V2X standard in September 2016. There is a robust evolutionary roadmap for C-V2X towards 5G with a strong ecosystem in place. C-V2X will be a key technology enabler for the safer, more autonomous vehicle of the future.

The whitepaper is embedded below:




Related posts:
Further Reading:



Sunday, 14 August 2016

3GPP Release-14 & Release-15 update

3GPP is on track for 5G, as per a news item on the 3GPP website. At 5G World in London in June, Erik Guttman, 3GPP TSG SA Chairman and consultant for Samsung Electronics, spoke about progress on Release-14 and Release-15. Here is his presentation.



According to 3GPP:

The latest plenary meeting of the 3GPP Technical Specifications Groups (TSG#72) has agreed on a detailed workplan for Release-15, the first release of 5G specifications.
The plan includes a set of intermediate tasks and check-points (see graphic below) to guide the ongoing studies in the Working Groups. These will get 3GPP in a position to make the next major round of workplan decisions when transitioning from the ongoing studies to the normative phase of the work in December 2016:- the start of SA2 normative work on Next Generation (NexGen) architecture and in March 2017:- the beginning of the RAN Working Group’s specification of the 5G New Radio (NR).
3GPP TSG RAN further agreed that the target NR scope for Release 15 includes support of the following:
  • Standalone and Non-Standalone NR operation (with work for both starting in conjunction and running together)
    • Non-standalone NR in this context implies using LTE as the control plane anchor; Standalone NR implies full control plane capability for NR.
    • Some potential architecture configuration options are shown in RP-161266 for information and will be analyzed further during the study
  • Target use cases: Enhanced Mobile Broadband (eMBB), as well as Low Latency and High Reliability to enable some Ultra-Reliable and Low Latency Communications (URLLC) use cases
  • Frequency ranges below 6 GHz and above 6 GHz
During the discussion at TSG#72 the importance of forward compatibility - in both radio and protocol design - was stressed, as this will be key for phasing-in the necessary features, enabling all identified use cases, in subsequent releases of the 5G specification.


Telecom TV has posted a video interview with Erik Guttman which is embedded below:



Related posts:



Wednesday, 13 July 2016

Feasibility Study on New Services and Markets Technology Enablers for 5G

3GPP SA1 (see this tutorial about 3GPP if you are unfamiliar with it) recently released four new Technical Reports outlining the New Services and Markets Technology Enablers (SMARTER) for next-generation mobile telecommunications.

3GPP TR 22.891 has already identified over 70 different use cases, which fall into different groups as can be seen in the picture above. These groups are massive Internet of Things (MTC), Critical Communications, enhanced Mobile Broadband, Network Operation and Enhancement of Vehicle-to-Everything (eV2X).

The first four items have their own technical reports (see below), but work on the last item has only recently started and does not yet have a TR to show to the outside world. It is foreseen that, when there are results from the eV2X study, these will be taken on board in the SMARTER work (thanks to Toon Norp for this info).

The four Technical Reports (TR) are:
  • TR 22.861, FS_SMARTER – massive Internet of Things (MTC): Massive Internet of Things focuses on use cases with massive number of devices (e.g., sensors and wearables). This group of use cases is particularly relevant to the new vertical services, such as smart home and city, smart utilities, e-Health, and smart wearables.
  • TR 22.862, FS_SMARTER – Critical Communications: The main areas where improvements are needed for Critical Communications are latency, reliability, and availability to enable, for example, industrial control applications and tactile Internet. These requirements can be met with an improved radio interface, optimized architecture, and dedicated core and radio resources.
  • TR 22.863, FS_SMARTER – enhanced Mobile Broadband: Enhanced Mobile Broadband includes a number of different use case families related to higher data rates, higher density, deployment and coverage, higher user mobility, devices with highly variable user data rates, fixed mobile convergence, and small-cell deployments.
  • TR 22.864, FS_SMARTER – Network Operation: The use case group Network Operation addresses the functional system requirements, including aspects such as: flexible functions and capabilities, new value creation, migration and interworking, optimizations and enhancements, and security.
Embedded below is 3GPP TR 22.891, which contains a lot of interesting use cases and makes useful reading.




Tuesday, 29 March 2016

5G Study Item (SI) for RAN Working Groups Approved


This is from a LinkedIn post by Eiko Seidel.

Earlier this month (7-10 March 2016), 3GPP TSG RAN Plenary Meeting #71 took place in Göteborg, Sweden. The first 5G study item for the working groups was approved, involving RAN1, RAN2, RAN3 and RAN4. For details, please have a look at RP-160671.

The study aims to develop a next-generation radio access technology to meet a broad range of use cases, including enhanced mobile broadband, massive MTC, critical MTC, and additional requirements defined during the RAN requirements study.

The new RAT will consider frequency ranges up to 100 GHz. 

The detailed objective of the study item is a single technical framework addressing all usage scenarios, requirements and deployment scenarios, including enhanced mobile broadband, massive machine-type communications and ultra-reliable and low-latency communications.

The new RAT shall be inherently forward compatible. It is assumed that the normative specification would occur in two phases: Phase I (to be completed in June 2018) and Phase II (to be completed in December 2019). 

The fundamental physical layer signal waveform will be based on OFDM, with potential support of non-orthogonal waveform and multiple access. Basic frame structure(s) and Channel coding scheme(s) will be developed. 
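Since the fundamental waveform is OFDM-based, the basic transmit-side operation can be sketched in a few lines. The FFT size and cyclic prefix length below follow LTE's 20 MHz numerology purely for illustration; NR's final numerology was still open at this point, and the subcarrier mapping here is deliberately simplified:

```python
import numpy as np

def ofdm_symbol(subcarrier_symbols, fft_size=2048, cp_len=144):
    """Map modulated symbols onto subcarriers, convert to the time
    domain with an IFFT, and prepend a cyclic prefix (a copy of the
    symbol's tail) to absorb multipath delay spread."""
    grid = np.zeros(fft_size, dtype=complex)
    grid[:len(subcarrier_symbols)] = subcarrier_symbols  # toy mapping
    time_domain = np.fft.ifft(grid) * np.sqrt(fft_size)
    return np.concatenate([time_domain[-cp_len:], time_domain])

sym = ofdm_symbol(np.array([1 + 1j, 1 - 1j, -1 + 1j]))
# the first cp_len samples are an exact copy of the symbol's tail
```

The cyclic prefix is what turns multipath channel convolution into a simple per-subcarrier multiplication at the receiver, which is why OFDM remains the baseline that any non-orthogonal additions would build on.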

Architecture work is going to be interesting, with a study of different options for splitting the architecture into a “central unit” and a “distributed unit”, with a potential interface in between, including transport, configuration and other required functional interactions between these nodes. Furthermore, the RAN-CN interface and functional split need to be studied, along with the realization of Network Slicing, QoS support, etc.


The proposed timeline for 5G was also presented as follows:



Saturday, 12 December 2015

LTE-Advanced Pro (a.k.a. 4.5G)

3GPP announced back in October that the next evolution of the 3GPP LTE standards will be known as LTE-Advanced Pro. I am sure this will be shortened to LTE-AP in presentations and discussions, but it should not be confused with access points.

The 3GPP press release mentioned the following:

LTE-Advanced Pro will allow mobile standards users to associate various new features – from the Release’s freeze in March 2016 – with a distinctive marker that evolves the LTE and LTE-Advanced technology series.

The new term is intended to mark the point in time where the LTE platform has been dramatically enhanced to address new markets as well as adding functionality to improve efficiency.

The major advances achieved with the completion of Release 13 include: MTC enhancements, public safety features – such as D2D and ProSe - small cell dual-connectivity and architecture, carrier aggregation enhancements, interworking with Wi-Fi, licensed assisted access (at 5 GHz), 3D/FD-MIMO, indoor positioning, single cell-point to multi-point and work on latency reduction. Many of these features were started in previous Releases, but will become mature in Release 13.

As well as sign-posting the achievements to date, the introduction of this new marker confirms the need for LTE enhancements to continue along their distinctive development track, in parallel to the future proposals for the 5G era.


Some vendors have been exploring ways of differentiating the advanced features of Release-13 and have been using the term 4.5G. While 3GPP does not officially support the 4.5G (or even 4G) terminology, the new term has been welcomed by operators and vendors alike.

I blogged about Release-13 before, here, which includes a 3GPP presentation and 4G Americas whitepaper. Recently Nokia (Networks) released a short and sweet video and a whitepaper. Both are embedded below:



The Nokia whitepaper (table of contents below) can be downloaded from here.


Wednesday, 18 November 2015

Cellular IoT (CIoT) or LoRa?

Back in September, 3GPP reached a decision to standardise NarrowBand IoT (NB-IoT). People familiar with the evolution of LTE-A UE categories may be a bit surprised by this. Up to Release-11, the lowest data rate device was UE Cat-1, which could do 10 Mbps in DL and 5 Mbps in UL. This was power hungry and not really that useful for low data rate sensor devices. Then we got Cat-0 as part of Release-12, which simplified the design and offered 1 Mbps in both DL and UL.

Things start to become a bit complex in Release-13. The picture above from Qualcomm explains the evolution and use cases very well. However, to add more detail to it, here are some extracts from the 4G Americas whitepaper (embedded below).


In support of IoT, 3GPP has been working on several related solutions, generating an abundance of LTE-based and GSM-based proposals. As a consequence, 3GPP has been developing three different cellular IoT standard solutions in Release-13:
  • LTE-M, based on LTE evolution
  • EC-GSM, a narrowband solution based on GSM evolution, and
  • NB-LTE, a narrowband cellular IoT solution, also known as Clean Slate technologies
However, in October 2015, the 3GPP RAN body agreed to study the combination of the two narrowband IoT technical solutions, EC-GSM and NB-LTE, for standardization as a single NB-IoT technology by the December 2015 timeframe. This is in consideration of the need to support different operation modes and to avoid divided industry support for two different technical solutions. It has been agreed that NB-IoT will support three modes of operation:
  • ‘Stand-alone operation’ utilizing, for example, the spectrum currently being used by GERAN systems as a replacement of one or more GSM carriers,
  • ‘Guard band operation’ utilizing the unused resource blocks within an LTE carrier’s guard-band, and
  • ‘In-band operation’ utilizing resource blocks within a normal LTE carrier.

The following is a brief description of the various standard solutions being developed at 3GPP as of October 2015:

LTE-M: 3GPP RAN is developing LTE-Machine-to-Machine (LTE-M) specifications for supporting LTE-based low cost CIoT in Rel-12 (Low-Cost MTC) with further enhancements planned for Rel-13 (LTE eMTC). LTE-M supports data rates of up to 1 Mbps with lower device cost and power consumption and enhanced coverage and capacity on the existing LTE carrier.

EC-GSM: In the 3GPP GERAN #62 study item “Cellular System Support for Ultra Low Complexity and Low Throughput Internet of Things”, narrowband (200 kHz) CIoT solutions for migration of existing GSM carriers sought to enhance coverage by 20 dB compared to legacy GPRS, and achieve a ten year battery life for devices that were also cost efficient. Performance objectives included improved indoor coverage, support for massive numbers of low-throughput devices, reduced device complexity, improved power efficiency and latency. Extended Coverage GSM (EC-GSM) was fully compliant with all five performance objectives according to the August 2015 TSG GERAN #67 meeting report. GERAN will continue with EC-GSM as a work item within GERAN with the expectation that standards will be frozen by March 2016. This solution necessarily requires a GSM network.
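The 20 dB coverage target is easy to put in context with a back-of-the-envelope calculation. The 144 dB legacy GPRS reference maximum coupling loss (MCL) below is the figure used in 3GPP TR 45.820; treat the whole calculation as illustrative rather than a formal link budget:

```python
# Maximum coupling loss (MCL) budget for the EC-GSM coverage target
gprs_reference_mcl_db = 144.0   # legacy GPRS reference MCL (TR 45.820)
coverage_extension_db = 20.0    # EC-GSM / CIoT enhancement objective
target_mcl_db = gprs_reference_mcl_db + coverage_extension_db

# In linear terms, 20 dB means tolerating 100x more path loss,
# which is what lets signals reach deep-indoor devices such as
# basement-mounted meters.
linear_factor = 10 ** (coverage_extension_db / 10)

print(target_mcl_db, linear_factor)
```

The extra 20 dB is typically bought with repetitions and longer transmission times rather than more transmit power, which is also why the extended-coverage data rates are so low.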

NB-LTE: In August 2015, work began in 3GPP RAN Rel-13 on a new narrowband radio access solution also termed as Clean Slate CIoT. The Clean Slate approach covers the Narrowband Cellular IoT (NB-CIoT), which was the only one of six proposed Clean Slate technologies compliant against a set of performance objectives (as noted previously) in the TSG GERAN #67 meeting report and will be part of Rel-13 to be frozen in March 2016. Also contending in the standards is Narrowband LTE Evolution (NB-LTE) which has the advantage of easy deployment across existing LTE networks.

Rel-12 introduces important improvements for M2M, such as lower device cost and longer battery life. Further improvements are envisioned in Rel-13, such as enhanced coverage, even lower device cost and longer battery life. The narrowband CIoT solutions also aim for lower cost, lower device power consumption and better coverage; however, they will also have reduced data rates. The NB Clean Slate CIoT is expected to support data rates of 160 bps with extended coverage.
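To get a feel for what 160 bps means in practice, here is a quick illustrative calculation (protocol headers, channel coding and retransmissions are ignored, so real airtime would be longer):

```python
def airtime_seconds(payload_bytes, rate_bps=160):
    """Time on air for a payload at the 160 bps extended-coverage
    rate quoted above, ignoring headers and retransmissions."""
    return payload_bytes * 8 / rate_bps

# A 100-byte sensor report would take 5 seconds to send
report_time = airtime_seconds(100)
```

For a meter sending one short report per day, even such long transmissions are acceptable and keep the radio asleep almost all of the time, which is the whole point of the design.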

Table 7.1 provides some comparison of the three options to be standardized, as well as the 5G option, and shows when each release is expected to be finalized.

Another IoT technology that has been giving the cellular IoT industry a run for its money is LoRa, backed by the LoRa Alliance. I blogged about LoRa in May and it has been a very popular post. An extract from a recent Rethink Research article follows:

In the past few weeks, the announcements have been ramping up. Semtech (the creator of the LoRa protocol itself, and the key IP owner) has been most active, announcing that The Lace Company, a wireless operator, has deployed LoRa network architecture in over a dozen Russian cities, claiming to cover 30 million people over 9,000 km². Lace is currently aiming at building out Russian coverage, but will be able to communicate with other LoRa devices over the LoRa cloud, as the messages are managed on cloud servers once they have been transmitted from end-device to base unit via LoRaWAN.

“Our network allows the user to connect to an unlimited number of smart sensors,” said Igor Shirokov, CEO of Lace Ltd. “We are providing connectivity to any device that supports the open LoRaWAN standard. Any third party company can create new businesses and services in IoT and M2M market based on our network and the LoRaWAN protocol.”

Elsewhere, UAE telco Du has launched a test LoRa network in Dubai as part of a smart city test project. “This is a defining moment in the UAE’s smart city transformation,” said Carlos Domingo, senior executive officer at Du. “We need a new breed of sensor-friendly network to establish the smart city ecosystem. Thanks to Du, this capability now exists in the UAE. Today we’ve shown how our network capabilities and digital know-how can deliver the smart city ecosystem Dubai needs. We will not stop in Dubai; our deployment will continue country-wide throughout the UAE.”

But the biggest recent LoRa news is that Orange has committed itself to a national French network rollout, following an investment in key LoRa player Actility. Orange has previously trialed a LoRa network in Grenoble, and has said that it opted for LoRa over Sigfox thanks to its more open ecosystem – although it’s worth clarifying here that Semtech still gets a royalty on every LoRa chip that’s made, and will continue to do so until it chooses not to or instead donates the IP to the non-profit LoRa Alliance itself.

It would be interesting to see whether this LoRa vs. CIoT contest ends up the same way as WiMAX vs. LTE.

Embedded below is the 4G Americas whitepaper as well as a LoRa presentation from Semtech:






Further reading: