Showing posts with label Edge and Fog Computing. Show all posts

Tuesday, 12 March 2019

Can Augmented & Mixed Reality be the Killer App 5G needs?


Last October Deutsche Telekom, Niantic and MobiledgeX announced a partnership to create advanced augmented reality experiences over mobile network technologies. I was lucky to find some time to go and play it at the Deutsche Telekom booth. The amount of processing needed for this to work at its best meant that the new Samsung Galaxy S10+ was required, though I felt it also occasionally struggled with the amount of data being transferred.


The pre-MWC press release said:

Deutsche Telekom, Niantic Inc., MobiledgeX and Samsung Showcase World’s First Mobile Edge Mixed Reality Multi-Gamer Experience

At the Deutsche Telekom booth at MWC 2019 (hall 3, booth 3M31) the results of the previously announced collaboration between Deutsche Telekom, Niantic, Inc., and MobiledgeX are on display and you’re invited to play. Niantic’s “Codename: Neon”, the world’s first edge-enhanced Mixed Reality Multiplayer Experience, delivered by ultra-low latency, Deutsche Telekom edge-enabled network, and Samsung Galaxy S10+ with edge computing enablement, will be playable by the public for the first time. 

“The ultra-low latency that Mobile Edge Computing (MEC) enables, allows us to create more immersive, exciting, and entertaining gameplay experiences. At Niantic, we’ve long celebrated adventures on foot with others, and with the advent of 5G networks and devices, people around the world will be able to experience those adventures faster and better,” said Omar Téllez, Vice-President of Strategic Partnerships at Niantic.

The collaboration is enabled using MobiledgeX’s recently announced MobiledgeX Edge-Cloud R1.0 product. Key features include device and platform-independent SDKs, a Distributed Matching Engine (DME) and a fully multi-tenant control plane that supports zero-touch provisioning of edge cloud resources as close as possible to the users. Immediate examples of what this enables include performance boosts for Augmented Reality and Mixed Reality (MR) experiences as well as video and image processing that meets local privacy regulations. 

Samsung has been working together with Deutsche Telekom, MobiledgeX, and Niantic on a natively edge-capable connectivity and authentication in Samsung Galaxy S10+ to interface with MobiledgeX Edge-Cloud R1.0 and dynamically access the edge infrastructure it needs so that augmented reality and mixed reality applications can take advantage of edge unmodified. Samsung will continue such collaborations with industry-leading partners not only to embrace a native device functionality of edge discovery and usage for the mobile devices and consumers, but also to seek a way together to create new business models and revenue opportunities leading into 5G era.

Deutsche Telekom’s ultra-low latency network was able to deliver on the bandwidth demands of “Codename: Neon” because it deployed MobiledgeX’s edge software services, built on dynamically managed decentralized cloudlets. “From our initial partnership agreement in October, we are thrilled to showcase the speed at which we can move from idea to experience, with full end-to-end network integration, delivered on Samsung industry leading edge native devices,” said Alex Jinsung Choi, Senior Vice President Strategy and Technology Innovation at Deutsche Telekom.

From the gaming industry to industrial IoT, and computer vision applications, consumer or enterprise, the experience is a great example of interactive AR experiences coming from companies like Niantic in the near future.  As AR/VR/MR immersive experiences continue to shape our expectations, devices, networks and clouds need to seamlessly and dynamically collaborate.
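The Distributed Matching Engine mentioned in the press release essentially maps each client to the closest edge cloudlet that can serve it. A minimal sketch of that idea is below; the registry, coordinates, load values and function names are all hypothetical illustrations, not MobiledgeX's actual API.

```python
import math

# Hypothetical cloudlet registry: (name, latitude, longitude, current load 0..1)
CLOUDLETS = [
    ("frankfurt-edge", 50.11, 8.68, 0.40),
    ("munich-edge", 48.14, 11.58, 0.85),
    ("berlin-edge", 52.52, 13.40, 0.10),
]

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def match_cloudlet(lat, lon, max_load=0.8):
    """Pick the nearest cloudlet that is not overloaded."""
    candidates = [c for c in CLOUDLETS if c[3] < max_load]
    return min(candidates, key=lambda c: haversine_km(lat, lon, c[1], c[2]))

# A user in Stuttgart: Munich is nearby but overloaded,
# so the matcher falls back to Frankfurt.
print(match_cloudlet(48.78, 9.18)[0])  # frankfurt-edge
```

The real system of course does far more (zero-touch provisioning, authentication, multi-tenancy), but the core matching decision is this kind of proximity-plus-capacity trade-off.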

Niantic CEO John Hanke delivered a keynote at Mobile World Congress 2019 (embedded below). According to a Fortune article, "Why the Developer of the New 'Harry Potter' Mobile Game and 'Pokemon Go' Loves 5G":

Hanke showed a video of a prototype game Niantic has developed codenamed Neon that allows multiple people in the same place at the same time to play an augmented reality game. Players can shoot at each other, duck and dodge, and pick up virtual reality items, with each player’s phone showing them the game’s graphics superimposed on the real world. But the game depends on highly responsive wireless connections for all the phones, connections unavailable on today’s 4G LTE networks.

“We’re really pushing the boundaries of what we can do on today’s networks,” Hanke said. “We need 5G to deliver the kinds of experiences that we are imagining.”

Here is the video; it's very interesting and definitely worth a watch. For those who may not know, Niantic spun out of Google in October 2015, soon after Google's announcement of its restructuring as Alphabet Inc. During the spinout, Niantic announced that Google, Nintendo, and The Pokémon Company would invest up to $30 million in Series-A funding.



So what do you think, can AR / MR be the killer App 5G needs?

Tuesday, 12 February 2019

Prof. Andy Sutton: 5G Radio Access Network Architecture Evolution - Jan 2019


Prof. Andy Sutton delivered his annual IET talk last month at the 6th Annual 5G Conference. You can watch the videos from that event here (not all had been uploaded at the time of writing this post). His talks have always been very popular on this blog: last year's talk was the 2nd most popular post, while the one in 2017 was the most popular. Thanks also to the IET for hosting this annual event and to IET.tv for making these videos available for free.

The slides and video are embedded below, but if you are new to the subject, you may first want to check out our tutorial on 5G network architecture options here.




As always, this is full of useful information, with insight into how BT/EE is thinking about deploying 5G in the UK.

Related Posts:

Tuesday, 1 May 2018

MAMS (Multi Access Management Services) at MEC integrating LTE and Wi-Fi networks

I have come across Multi Access Management Services (MAMS) a few times recently, so here is a short post on the topic. At present, MAMS is under review in the IETF and is supported by Nokia, Intel, Broadcom, Huawei, AT&T and KT.

I first heard about MAMS at a Small Cell Forum event in Mumbai; the slides for that particular Nokia presentation are here.

As you can see from the slide above, MAMS can optimise inter-working of different access domains, particularly at the Edge. A recent presentation from Nokia (here) on this topic provides much more detailed insight.

From the presentation:

        MAMS (Multi Access Management Services) is a framework for:

        - integrating different access network domains based on user plane (e.g. IP layer) interworking,
        - with the ability to select access and core network paths independently,
        - and user plane treatment based on traffic types,
        - that can dynamically adapt to changing network conditions,
        - based on negotiation between client and network.

        The technical content is available as the following drafts*:



-            MAMS User Plane Specification: https://tools.ietf.org/html/draft-zhu-intarea-mams-user-protocol-02




*Currently under review. Co-authors: Nokia, Intel, Broadcom, Huawei, AT&T, KT.
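The steps the framework describes (pick access and core paths per traffic type, adapt as conditions change) can be sketched very roughly as follows. This is purely illustrative of the idea; the path attributes, thresholds and function names are my own assumptions, not anything defined in the IETF drafts.

```python
# Hypothetical view of two access paths and their measured conditions.
PATHS = {
    "lte":  {"latency_ms": 30, "loss_pct": 0.1, "metered": True},
    "wifi": {"latency_ms": 12, "loss_pct": 2.0, "metered": False},
}

def select_path(traffic_type, paths):
    """Latency-sensitive flows take the lowest-latency path; bulk
    transfers prefer an unmetered path; everything else avoids loss.
    In MAMS this decision would be negotiated between client and
    network and re-run as conditions change."""
    if traffic_type == "realtime":
        return min(paths, key=lambda p: paths[p]["latency_ms"])
    if traffic_type == "bulk":
        unmetered = [p for p in paths if not paths[p]["metered"]]
        if unmetered:
            return unmetered[0]
    return min(paths, key=lambda p: paths[p]["loss_pct"])

print(select_path("realtime", PATHS))  # wifi (lowest latency)
print(select_path("default", PATHS))   # lte (lowest loss)
```

The point of the real framework is that this selection happens per traffic type and per direction, independently for access and core, rather than binding a device to one access network wholesale.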

The slides provide much more detail, including the different use cases (pic below) for integrating LTE and Wi-Fi at the Edge.


Here are the references for anyone wishing to look at this in more detail:

Sunday, 22 January 2017

Augmented / Virtual Reality Requirements for 5G


Ever wondered whether 5G will be good enough for Augmented and Virtual Reality, or will we need to wait for 6G? Some researchers are trying to identify the AR/VR requirements and challenges from a mobile network point of view, and possible options to solve these challenges. They have recently published a research paper on the topic.

Here is a summary of some of the interesting things I found in this paper:

  • Humans process nearly 5.2 gigabits per second of sound and light.
  • Without moving the head, our eyes can mechanically shift across a field of view of at least 150 degrees horizontally (i.e. ~30,000 pixels) and 120 degrees vertically (i.e. ~24,000 pixels).
  • The human eye can perceive much faster motion (up to 150 frames per second). For sports, games, science and other high-speed immersive experiences, video rates of 60 or even 120 frames per second are needed to avoid motion blur and disorientation.
  • 5.2 gigabits per second of network throughput (if not more) is needed.
  • Today's 4K resolution, at 30 frames per second and 24 bits per pixel, using a 300:1 compression ratio, yields 300 megabits per second of imagery. That is more than 10x the typical requirement for a high-quality 4K movie experience.
  • 5G network architectures are being designed to move post-processing to the network edge so that processors at the edge and the client display devices (VR goggles, smart TVs, tablets and phones) carry out advanced image processing to stitch camera feeds into dramatic effects.
  • In order to tackle these grand challenges, the 5G network architecture (radio access network (RAN), Edge and Core) will need to be much smarter than ever before, adaptively and dynamically making use of concepts such as software defined networking (SDN), network function virtualization (NFV) and network slicing, to mention a few, facilitating more flexible allocation of resources (resource blocks (RBs), access points, storage, memory, computing, etc.) to meet these demands.
  • Immersive technology will require massive improvements in terms of bandwidth, latency and reliability. A current remote-reality prototype requires 100 to 200 Mbps for a one-way immersive experience. While MirrorSys uses a single 8K display, estimates suggest photo-realistic VR will require two 16K x 16K screens (one for each eye).
  • Latency is the other big issue in addition to reliability. With an augmented reality headset, for example, real-life visual and auditory information has to be taken in through the camera and sent to the fog/cloud for processing, with digital information sent back to be precisely overlaid onto the real-world environment, and all this has to happen in less time than it takes for humans to start noticing lag (no more than 13ms). Factoring in the much needed high-reliability criteria on top of these bandwidth and delay requirements clearly indicates the need for interactions between several research disciplines.
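The bandwidth figures above come from straightforward resolution-times-frame-rate arithmetic, which is easy to reproduce. The quick calculation below shows raw and compressed bitrates for a plain 4K stream and for the full human field of view quoted above; note that the simple 4K arithmetic comes out lower than the paper's 300 Mbps figure, which presumably bakes in additional assumptions about the immersive field of view.

```python
def bitrate_bps(width, height, bits_per_pixel=24, fps=30, compression=1):
    """Video bitrate in bits per second, optionally after compression."""
    return width * height * bits_per_pixel * fps / compression

# Plain 4K stream, raw and at a 300:1 compression ratio
print(f"4K raw:       {bitrate_bps(3840, 2160) / 1e9:.2f} Gbps")
print(f"4K at 300:1:  {bitrate_bps(3840, 2160, compression=300) / 1e6:.1f} Mbps")

# Full human field of view quoted above (~30,000 x 24,000 pixels)
print(f"FoV raw:      {bitrate_bps(30000, 24000) / 1e9:.1f} Gbps")
print(f"FoV at 300:1: {bitrate_bps(30000, 24000, compression=300) / 1e9:.2f} Gbps")
```

Even with aggressive 300:1 compression, a full field-of-view stream lands in the gigabits-per-second range, which is why the paper treats multi-gigabit throughput as a baseline requirement.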


These key research directions and scientific challenges are summarized in Fig. 3 (above) and discussed in the paper. I advise you to read it here.

Related posts:

Saturday, 21 November 2015

'Mobile Edge Computing' (MEC) or 'Fog Computing' (fogging) and 5G & IoT


Picture Source: Cisco

Clouds are up in the sky whereas fog sits low, on the ground. This is how Fog Computing is described, in contrast to cloud computing. Fog sits at the edge of the network (hence the term edge computing) to reduce latency and perform an initial level of processing, thereby reducing the amount of information that needs to be exchanged with the cloud.

The same paradigm is used in 5G to refer to edge computing, which is required when we are targeting 1 ms latency in certain cases.
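The "initial level of processing" at the fog node is the key to reducing what crosses the wide-area link. A toy sketch of the idea, with entirely hypothetical data and function names: the fog node condenses a window of raw sensor readings into a summary, and only the summary travels to the cloud.

```python
# Illustrative sketch: a fog node aggregates raw IoT sensor readings
# locally and forwards only a compact summary to the cloud, shrinking
# the data exchanged over the wide-area link.

def fog_summarise(readings):
    """Reduce a window of raw readings to count/min/max/mean before upload."""
    return {
        "n": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
    }

raw = [21.4, 21.5, 21.6, 29.9, 21.5, 21.4]  # e.g. one minute of temperatures
summary = fog_summarise(raw)
print(summary["n"], summary["max"])  # 6 29.9
```

Six readings become four numbers here; with thousands of sensors reporting every second, this kind of local aggregation (plus local anomaly detection to decide what is worth forwarding at all) is where fog computing pays for itself.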

As this whitepaper from Ovum & Eblink explains:

Mobile Edge Computing (MEC): Where new processing capabilities are introduced in the base station for new applications, with a new split of functions and a new interface between the baseband unit (BBU) and the remote radio unit (RRU).
...
Mobile Edge Computing (MEC) is an ETSI initiative, where processing and storage capabilities are placed at the base station in order to create new application and service opportunities. This new initiative is called “fog computing” where computing, storage, and network capabilities are deployed nearer to the end user.

MEC contrasts with the centralization principles discussed above for C-RAN and Cloud RAN. Nevertheless, MEC deployments may be built upon existing C-RAN or Cloud RAN infrastructure and take advantage of the backhaul/fronthaul links that have been converted from legacy to these new centralized architectures.

MEC is a long-term initiative and may be deployed during or after 5G if it gains support in the 5G standardization process. Although it is in contrast to existing centralization efforts, Ovum expects that MEC could follow after Cloud RAN is deployed in large scale in advanced markets. Some operators may also skip Cloud RAN and migrate from C-RAN to MEC directly, but MEC is also likely to require the structural enhancements that C-RAN and Cloud RAN will introduce into the mobile network.

The biggest challenge facing MEC in the current state of the market is its very high costs and questionable new service/revenue opportunities. Moreover, several operators are looking to invest in C-RAN and Cloud RAN in the near future, which may require significant investment to maintain a healthy network and traffic growth. In a way, MEC is counter to the centralization principle of Centralized/Cloud RAN and Ovum expects it will only come into play when localized applications are perceived as revenue opportunities.

And similarly this Interdigital presentation explains:

Extends cloud computing and services to the edge of the network and into devices. Similar to cloud, fog provides network, compute, storage (caching) and services to end users. The distinguishing feature of Fog reduces latency & improves QoS resulting in a superior user experience

Here is a small summary of the patents related to IoT and Fog Computing that have been filed.