Monday, 9 December 2013

Rise of the "Thing"

Light Reading carried an interesting cartoon on how M2M works. I wouldn't be surprised if some of the M2M applications at present do work like this. Jokes apart, last week the UK operator EE gave a very interesting presentation on 'Scaling the network for the Rise of the Thing'.

A question often asked is "What is the difference between the 'Internet of Things' (IoT) and 'Machine to Machine' (M2M)?" This can generate long discussions and could be a lecture in its own right. Quora has a discussion on the same topic here. The picture above, from the EE presentation, is a good way of showing that M2M is a subset of IoT.

It's also interesting to note how these 'things' will affect signalling. I often come across people who ask why there is any need to move from GPRS to LTE, since most M2M devices transfer only small amounts of data. The 2G and 3G networks were designed primarily for voice, with data as a secondary function. These networks may cope well now, but what happens when the predicted 50 billion connected devices arrive by 2020 (or 500 billion by 2030)? The current networks would drown in control signalling, which would often result in congestion. Congestion control is just one of the things 3GPP is working on for M2M-type devices, as blogged earlier here. In fact, the Qualcomm presentation blogged about before does a decent job of comparing the various technologies for IoT; see here.
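
To put some rough numbers on the signalling argument, here is a minimal back-of-envelope sketch in Python. The device counts are the predictions quoted above; the per-device wake-up rate and the number of signalling messages per wake-up are purely assumed figures for illustration, not values from any specification.

```python
# Back-of-envelope estimate of network-wide control-plane load from M2M devices.
# Device counts are the predictions quoted above; the wake-up rate and the
# messages-per-wake-up figure are ASSUMED purely for illustration.

def signalling_msgs_per_sec(devices, wakeups_per_hour, msgs_per_wakeup):
    """Average control-plane messages per second across the whole network."""
    return devices * wakeups_per_hour * msgs_per_wakeup / 3600.0

scenarios = {
    "2020 (50 billion devices)": 50e9,
    "2030 (500 billion devices)": 500e9,
}

WAKEUPS_PER_HOUR = 1   # assume each device reports once an hour
MSGS_PER_WAKEUP = 20   # assume ~20 signalling messages per connection setup/teardown

for label, devices in scenarios.items():
    rate = signalling_msgs_per_sec(devices, WAKEUPS_PER_HOUR, MSGS_PER_WAKEUP)
    print(f"{label}: ~{rate:,.0f} signalling messages per second")
```

Even though each report would carry only a few bytes of user data, the signalling volume scales with the number of devices, which is exactly why congestion control and signalling-efficient procedures matter more here than raw data capacity.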

The EE presentation is embedded as follows:



Another good website of IoT examples that I was recently made aware of is http://postscapes.com/internet-of-things-examples/ - it is worth a look to see how IoT could help us in the future.

Sunday, 1 December 2013

Quick summary on LTE and UMTS / HSPA Release-12 evolution by 3GPP



A quick summary from 3GPP about Release-12 progress (release planned for June 2014), from the recent ETSI Future Mobile Summit. The presentation and video are embedded below.





Wednesday, 27 November 2013

ETSI Summit on Future Mobile and Standards for 5G



Edited from the original in 3GPP News:

The ETSI Future Mobile Summit has heard how the mobile internet will evolve over the next ten to fifteen years, and how 3GPP systems will ensure future stability as the network copes with an explosive growth in complexity and usage.


With 3GPP providing the evolutionary framework for mobility, via its Releases of new functionality and features, the more radical thinking at the Summit came in the form of research projects and some future-focused industry initiatives, such as the WWRF, the METIS Project and the DVB Project.

In his keynote address, Mario Campolargo of the European Commission introduced a new research & innovation initiative that will provide momentum to funded research. The 5G Public Private Partnership is being launched as a blueprint for the deployment of 5G in the years after 2020.



In summing up the Summit's main themes, the ETSI CTO, Adrian Scrase, identified some certainties: "...traffic will continue to increase, connected devices will increase dramatically over time, new device types will significantly contribute to that increase (e.g. probes, sensors, meters, machines, etc.) and new sectors will bring new priorities (e.g. critical infrastructures)."

On the concept of 5G, Mr. Scrase reported that ultra-reliable 5G networks should, among other things, enable the tactile internet, give the perception of infinite capacity and bring in augmented reality.



Download the presentations:
5G, the way forward!
Mario Campolargo, Director, Net Futures, DG Connect, European Commission
A new initiative, the 5G PPP, to accelerate and structure research & innovation: "...Industry to co-create the 'vision' and build global convergence by end 2015."
Who needs 5G?
Hans D. Schotten, University of Kaiserslautern
Long Term Evolution of LTE (linear evolution) or something new (5G)?
Why 5G?
Rahim Tafazolli, Director of CCSR and 5GIC, The University of Surrey
Perceived infinite capacity, a new communication paradigm for 5G and beyond
The 5G mobile and wireless communications system
Afif Osseiran, Project Coordinator of METIS
Explanation of selected 5G scenarios and examples of 5G technology components
Next generation wireless for a cognitive & energy-efficient future
Nigel Jefferies, Wireless World Research Forum Chairman
"New technology challenges: huge number of nodes, latency, energy efficiency, backhaul and over-the-air signalling design... May require a whole new approach to: physical layer, air interface and spectrum usage, resource management & optimization..."
3GPP RAN has started a new innovation cycle which will be shaping next generation cellular systems
Spectrum for 5G, a big deal?
Jens Zander, KTH, Royal Institute of Technology
A World Divided - the coverage world versus the capacity world
Opportunities for TV services over future mobile networks
Nick Wells, Chairman Technical Module, DVB
Can broadcasters and the mobile industry cooperate to define a new worldwide standard that will benefit both?
3GPP core network & services evolution
Atle Monrad, 3GPP CT Chairman
Architecture evolution, more new nodes, CS-domain removal? New ways of designing networks?
The impact of NFV on future mobile
Uwe Janssen, Deutsche Telekom, lead delegate to the Network Functions Virtualisation ISG
The challenge for operators, suppliers and standards bodies
The tactile internet - Driving 5G
Gerhard Fettweis, Technical University of Dresden
3D chip-stacks & high-rate inter-chip communications, monitoring/sensing, tactile internet latency goals
Summit conclusions
Adrian Scrase, ETSI CTO, Head of 3GPP MCC
Includes the 'Standardization Challenges' raised by the Summit.

Saturday, 23 November 2013

Bandwidth is not the answer – it’s stationarity


Martin Geddes gave an interesting presentation at the Future of Broadband workshop. The ITU has the following write-up on that workshop:

Eye-opening, evangelical and extremely well attended: this afternoon’s Future of Broadband workshop was all about exploding established concepts on how telcos should go about improving both customer experience and their bottom line.
Ranking broadband in terms of speed is the standard approach, but speed is not the only thing that matters in this business, according to Martin Geddes of Geddes Consulting, running the workshop in conjunction with Neill Davies of Predictable Network Solutions.  He illustrated his point with a series of examples drawn from customers accessing broadband at different speeds – but with unexpectedly different experiences.
Slower broadband, whether over cable, satellite or fibre, in many cases offered a better quality of customer experience than the faster variant. Why? Variability, or rather lack of variability, is the key. A stable service, even if it is slower, enables POTS-quality VoIP, whereas a highly variable, faster service delivers a less satisfactory customer experience – and, by definition, an unhappier customer.
“The hidden secret of networking is that the network delivers loss and delay between packets,” said Geddes, “There is more to broadband than speed or capacity: with many customers wanting lots of different things at once, we also need an absence of variability, and that is what we call stationarity.”
Looked at from the network operator side, there are two key areas to consider: what is driving the cost of broadband and pushing capex sky high, and how to retain and increase your customer base to bring in the revenue. The answers, it seems, are not immediately obvious.
To start with, the knee-jerk telco reaction of pouring capex into infrastructure upgrades and increased capacity is simply not the way to ensure good quality of service and happy customers.  Demand for broadband is highly elastic, expanding to consume whatever supply is on offer and creating a “jack-hammer effect” – which produces variability. Paradoxically, increased investment in bandwidth may be behind that very poor service which leads to customer churn and the panicked assumption that another upgrade is necessary – an “investment cycle of doom.”
This is a deep systemic problem in the industry investment machine. Rushing to premature upgrades masks the real core issue, that of quality of service. The presenters demonstrated this in a heaven-hell model, where full network capacity and happy customers is telco heaven – and the converse, unhappy customers and an underused network, is of course telco hell. Getting the balance right is not easy, as increasing local network capacity pushes down the quality of experience for applications with strong stationarity requirements – exactly what the customer is after.
For Martin, there is a tiny root cause of this: all current packet-based infrastructure relies on it being idle and keeping queues empty to ensure good quality. So your assets must stay idle to keep your customer. The solution lies in thinking about how to reframe both this problem, and the exact nature of the resource the operators are selling.
“Don’t make packets move for their own sake, but focus on customer experience. Change the resource model,” urged Martin. “Throw away the bandwidth model and thought process.” Efficiently allocating resources to customers is more important than bandwidth. Increase capacity, but only in a very targeted way.  In other words, meet heterogeneous  demand with a differentiated product.
This, then, is how to ensure a future of broadband heaven: understand that quality of experience is a function of loss and delay. Characterize your supply requirements properly. Work out what customers are after, certifying fitness of purpose for a particular, actual customer demand rather than a generalised one-size-fits-all concept. And, in the words of the workshop presenters: “Don’t sell bandwidth – sell differential experiences.”
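
To make the stationarity argument a little more concrete, here is a small, self-contained simulation sketch; it is not taken from the presentation and uses entirely invented numbers. It compares a slower but stable link with a faster but highly variable one, using the fraction of packets that arrive within a VoIP-friendly one-way delay budget as a crude proxy for quality of experience.

```python
# Toy illustration of the stationarity argument: a slower but stable link can
# beat a faster but highly variable one for delay-sensitive traffic (e.g. VoIP).
# All numbers here are made up purely for illustration.
import random

def delay_samples(base_ms, jitter_ms, n=10_000, seed=42):
    """Per-packet one-way delays: a fixed base plus random queueing jitter."""
    rng = random.Random(seed)
    return [base_ms + rng.expovariate(1.0 / jitter_ms) for _ in range(n)]

def fraction_within_budget(delays, budget_ms=150.0):
    """Share of packets meeting a VoIP-like one-way delay budget."""
    return sum(d <= budget_ms for d in delays) / len(delays)

# "Slow but stable": higher base delay, little jitter.
stable = delay_samples(base_ms=60.0, jitter_ms=10.0)
# "Fast but variable": lower base delay, heavy queueing jitter.
variable = delay_samples(base_ms=20.0, jitter_ms=120.0)

print(f"Slow/stable link : {fraction_within_budget(stable):.1%} of packets within 150 ms")
print(f"Fast/variable link: {fraction_within_budget(variable):.1%} of packets within 150 ms")
```

With these made-up parameters the stable link gets nearly every packet within the budget, while the nominally faster but jittery link misses it for a large share of packets, which is the essence of the "slower can feel better" observation above.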

His presentation is embedded as follows:



Tuesday, 12 November 2013

Mobile Video Offload using Wi-Fi is the only solution in the coming years

A very interesting infographic from Skyfire some months back highlighted some very valid issues about video on mobiles.


Personally, I do watch quite a bit of video on my phone and tablet, but only when connected via Wi-Fi. Occasionally, when I am out and someone sends me a video clip on WhatsApp or a link to a YouTube video, I do try to watch it. Most of the time the quality is too disappointing. It could be because my operator has been rated as the worst operator in the UK. Anyway, as the infographic above suggests, some kind of optimisation is needed to make sure that end users are happy. OR, the users can offload to Wi-Fi when possible to get a better experience.

This is one of the main reasons why operators are actively considering offloading to Wi-Fi and putting carrier Wi-Fi solutions in place. The standards bodies are actively working in the same direction. Two of my recent posts, on 'roaming using ANDSF' and 'challenges with seamless cellular/Wi-Fi handover', have been quite popular.



Recently I attended a webinar on the topic of 'Video Offload'. While the webinar reinforced my beliefs about why offload should be done, it did teach me a thing or two (like when a hotspot is called a homespot - see here). The presentation and the video are embedded below. Before that, I want to show the result of a poll conducted during the webinar, where the attendees (and I would imagine there were quite a few) were asked how they think MNOs will approach the Wi-Fi solution in their networks. The result is as follows:



Here is the presentation:



Here is the video of the event:


Sunday, 10 November 2013

SIPTO Evolution


A couple of years back I did a post on SIPTO (Selected IP Traffic Offload) and related technologies coming as part of Rel-10. I also put up a comparison of SIPTO, LIPA and IFOM here. Having left the topic for a couple of years, I found that there have been some enhancements to the architecture beyond the basic one described here.

I have embedded the NEC paper below for anyone wanting to investigate further the different options shown in the picture above. I think that even though the operator may offload certain types of traffic locally, they would still consider that data as part of the bundle and would like to charge for it. At the same time there would be a requirement on the operator for lawful interception, so I am not sure how this will be managed for the different architectures. Anyway, feel free to leave comments if you have any additional information.



Wednesday, 6 November 2013

The Relentless Rise of Mobile Technology


Mobiles have been rising and rising. A couple of weeks back I read: 'Mobile is considered the first and most important screen by nearly half of the 18- to 34-year-old demographic, according to research commissioned by Weve.'


The finding placed mobile ahead of laptops or PCs (chosen by 30.6 per cent) and way ahead of TV (12.4 per cent) as the first and most important screen in the lives of people between the ages of 18 and 34. 
Just 5.8 per cent of those surveyed in the age group chose a tablet as their "first screen".
The research also found that 45 per cent of 18- to 34-year-olds consider their mobile their first choice of device when interacting with online content, placing the platform just ahead of laptops and PCs, which scored 43 per cent. 
Among the wider 18 to 55 age group surveyed, a PC or laptop was seen as the "first screen" with 39.8 per cent naming either computer as their most important screen, while smartphones came second on 28 per cent. 
TV was in third place with 27 per cent of people naming it as their most important screen. Five per cent of the total group said they considered a tablet their "first screen". 
Only a quarter of the 18 to 55 age group said mobile would be their first choice platform if they wanted to access the internet, while nearly two thirds preferred to use a PC or laptop.
Tomi Ahonen has long been referring to mobile as the 7th Mass Media.

So when I saw the picture above (and there are more like it) in Ben Evans' slide deck (embedded below), it just reiterated my belief that mobile will take over the world sooner or later. Anyway, the slides are interesting to go through.



Monday, 4 November 2013

Key challenges with automatic Wi-Fi / Cellular handover

Recently, at a conference, I mentioned that 3GPP is working on standards that will allow automatic and seamless handovers between cellular and Wi-Fi. At the same time, operators may want a control whereby they can automatically switch on a user's Wi-Fi radio (if switched off) and offload to Wi-Fi whenever possible. This upset quite a few people, who argued about the problems this could cause and the issues that would need to be solved.

I have been meaning to list the possible issues in this scenario of automatically handing over between Wi-Fi and cellular; luckily, I found that they have been listed very well in the recent 4G Americas whitepaper. The whitepaper is embedded below, but here are the issues I had been wanting to discuss:

In particular, many of the challenges facing Wi-Fi/Cellular integration have to do with realizing a complete intelligent network selection solution that allows operators to steer traffic in a manner that maximizes user experience and addresses some of the challenges at the boundaries between RATs (2G, 3G, LTE and Wi-Fi).
Figure 1 (see above) illustrates four of the key challenges at the Wi-Fi/Cellular boundary.
1) Premature Wi-Fi Selection: As devices with Wi-Fi enabled move into Wi-Fi coverage, they reselect to Wi-Fi without comparative evaluation of existing cellular and incoming Wi-Fi capabilities. This can result in degradation of end user experience due to premature reselection to Wi-Fi. Real time throughput based traffic steering can be used to mitigate this.
2) Unhealthy choices: In a mixed wireless network of LTE, HSPA and Wi-Fi, reselection may occur to a strong Wi-Fi network, which is under heavy load. The resulting ‘unhealthy’ choice results in a degradation of end user experience as performance on the cell edge of a lightly loaded cellular network may be superior to performance close to a heavily loaded Wi-Fi AP. Real time load based traffic steering can be used to mitigate this.
3) Lower capabilities: In some cases, reselection to a strong Wi-Fi AP may result in reduced performance (e.g. if the Wi-Fi AP is served by lower bandwidth in the backhaul than the cellular base station presently serving the device). Evaluation of criteria beyond wireless capabilities prior to access selection can be used to mitigate this.
4) Ping-Pong: This is an example of reduced end user experience due to ping-ponging between Wi-Fi and cellular accesses. This could be a result of premature Wi-Fi selection and mobility in a cellular environment where signal strengths are very similar in both access types. Hysteresis concepts similar to those used in cellular IRAT selection, applied between Wi-Fi and cellular accesses, can be used to mitigate this (a rough sketch of the idea follows this list).
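
The whitepaper itself does not prescribe an algorithm, but as a rough illustration of the hysteresis idea in point 4, here is a minimal Python sketch with invented thresholds: the device only switches access when the candidate beats the current one by a margin, and only after it has done so for several consecutive measurements.

```python
# Minimal sketch of hysteresis-based Wi-Fi/cellular selection to avoid ping-pong.
# Thresholds, margins and the "score" inputs are invented for illustration;
# a real implementation would also use policy (e.g. ANDSF), load and throughput.

class AccessSelector:
    def __init__(self, hysteresis_db=6.0, time_to_trigger=3):
        self.current = "cellular"
        self.hysteresis_db = hysteresis_db      # candidate must win by this margin...
        self.time_to_trigger = time_to_trigger  # ...for this many consecutive samples
        self._better_count = 0

    def update(self, cellular_score_db, wifi_score_db):
        """Feed one measurement pair; returns the access to use after this sample."""
        candidate = "wifi" if self.current == "cellular" else "cellular"
        scores = {"cellular": cellular_score_db, "wifi": wifi_score_db}
        if scores[candidate] > scores[self.current] + self.hysteresis_db:
            self._better_count += 1
        else:
            self._better_count = 0
        if self._better_count >= self.time_to_trigger:
            self.current, self._better_count = candidate, 0
        return self.current

# Example: Wi-Fi fluctuates around the cellular level; without the margin and
# time-to-trigger the device would flip between accesses on almost every sample.
selector = AccessSelector()
for cell, wifi in [(-70, -68), (-70, -60), (-70, -62), (-70, -61), (-70, -63), (-70, -75)]:
    print(selector.update(cell, wifi))
```

The same structure extends naturally to the throughput- and load-based steering mentioned in points 1 and 2 by replacing the signal-strength scores with composite scores that also account for load and backhaul capacity.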
Here is the paper: