Showing posts with label AR / VR / MR / XR.

Wednesday 14 August 2024

3GPP Release 18 Description and Summary of Work Items

The first official version of 3GPP TR 21.918, "Release 18 Description; Summary of Rel-18 Work Items", has been published. Release 18 is the first official release of 5G-Advanced. Quoting from the report:

Release 18 specifies further improvements of the 5G-Advanced system. 

These improvements consist both in enhancements of concepts/Features introduced in the previous Releases and in the introduction of new topics.

Some of the key improvements are:

  • a further integration of Satellite (NTN) access (introduced in Rel-17) into the 5G System (5GS), 
  • more efficient support of the Internet of Things (IoT) and Machine-Type Communication (MTC), including via satellite coverage,
  • and several aspects of proximity communication and location (Sidelink, Proximity, Location and Positioning), better support of industrial needs (Verticals, Industries, Factories, Northbound API), Multicast and Broadcast Services (MBS), Network Slicing and Uncrewed Aerial Vehicles (UAV).

As for the new topics, some of the key aspects are:

  • Energy Efficiency (EE)
  • Artificial Intelligence (AI)/Machine Learning (ML)
  • eXtended, Augmented and Virtual Reality (XR, AR, VR), immersive communications

The following list is from the v1.0.0 table of contents to make it easier to find the list of topics. If it interests you, download the latest version of the technical report from the directory here.

5 Satellite / Non-Terrestrial Network (NTN)
5.1 General aspects
5.1.1 User plane: “5G system with satellite backhaul”
5.1.2 Discontinuous coverage: “Satellite access Phase 2”
5.1.3 Radio: "NR NTN enhancements"
5.1.4 Charging and Management aspects of Satellite
5.2 Specific aspects
5.2.1 IoT (Internet of Things) NTN enhancements
5.2.2 Guidelines for Extra-territorial 5G Systems
5.2.3 5G system with satellite access to Support Control and/or Video Surveillance
5.2.4 Introduction of the satellite L-/S-band for NR
5.2.5 Other band-related aspects of satellite

6 Internet of Things (IoT), Machine-Type Communication (MTC)
6.1 Personal IoT and Residential networks
6.2 Enhanced support of Reduced Capability (RedCap) NR devices
6.3 NR RedCap UE with long eDRX for RRC_INACTIVE State
6.4 Application layer support for Personal IoT Network
6.5 5G Timing Resiliency System
6.6 Mobile Terminated-Small Data Transmission (MT-SDT) for NR
6.7 Adding new NR FDD bands for RedCap in Rel-18
6.8 Signal level Enhanced Network Selection
6.9 IoT NTN enhancements

7 Energy Efficiency (EE)
7.1 Enhancements of EE for 5G Phase 2
7.2 Network energy savings for NR
7.3 Smart Energy and Infrastructure

8 Uncrewed Aerial Vehicles (UAV), UAS, UAM
8.1 Architecture for UAV and UAM Phase 2
8.2 Architecture for UAS Applications, Phase 2
8.3 NR support for UAV
8.4 Enhanced LTE Support for UAV

9 Sidelink, Proximity, Location and Positioning
9.1 5GC LoCation Services - Phase 3
9.2 Expanded and improved NR positioning
9.3 NR sidelink evolution
9.4 NR sidelink relay enhancements
9.5 Proximity-based Services in 5GS Phase 2
9.6 Ranging-based Service and sidelink positioning
9.7 Mobile Terminated-Small Data Transmission (MT-SDT) for NR
9.8 5G-enabled fused location service capability exposure

10 Verticals, Industries, Factories, Northbound API
10.1 Low Power High Accuracy Positioning for industrial IoT scenarios
10.2 Application enablement aspects for subscriber-aware northbound API access
10.3 Smart Energy and Infrastructure
10.4 Generic group management, exposure and communication enhancements
10.5 Service Enabler Architecture Layer for Verticals Phase 3
10.6 SEAL data delivery enabler for vertical applications
10.7 Rel-18 Enhancements of 3GPP Northbound and Application Layer interfaces and APIs
10.8 Charging Aspects of B2B
10.9 NRF API enhancements to avoid signalling and storing of redundant data
10.10 GBA_U Based APIs
10.11 Other aspects

11 Artificial Intelligence (AI)/Machine Learning (ML)
11.1 AI/ML model transfer in 5GS
11.2 AI/ML for NG-RAN
11.3 AI/ML management & charging
11.4 NEF Charging enhancement to support AI/ML in 5GS

12 Multicast and Broadcast Services (MBS)
12.1 5G MBS Phase 2
12.2 Enhancements of NR MBS
12.3 UE pre-configuration for 5MBS
12.4 Other MBS aspects

13 Network Slicing
13.1 Network Slicing Phase 3
13.2 Enhancement of NSAC for maximum number of UEs with at least one PDU session/PDN connection
13.3 Enhancement of Network Slicing UICC application for network slice-specific authentication and authorization
13.4 Charging Aspects of Network Slicing Phase 2
13.5 Charging Aspects for NSSAA
13.6 Charging enhancement for Network Slice based wholesale in roaming
13.7 Network Slice Capability Exposure for Application Layer Enablement
13.8 Other slice aspects

14 eXtended, Augmented and Virtual Reality (XR, AR, VR), immersive communications
14.1 XR (eXtended Reality) enhancements for NR
14.2 Media Capabilities for Augmented Reality
14.3 Real-time Transport Protocol Configurations
14.4 Immersive Audio for Split Rendering Scenarios  (ISAR)
14.5 Immersive Real-time Communication for WebRTC
14.6 IMS-based AR Conversational Services
14.7 Split Rendering Media Service Enabler
14.8 Extended Reality and Media service (XRM)
14.9 Other XR/AR/VR items

15 Mission Critical and emergencies
15.1 Enhanced Mission Critical Push-to-talk architecture phase 4
15.2 Gateway UE function for Mission Critical Communication
15.3 Mission Critical Services over 5MBS
15.4 Mission Critical Services over 5GProSe
15.5 Mission Critical ad hoc group Communications
15.6 Other Mission Critical aspects

16 Transportations (Railways, V2X, aerial)
16.1 MBS support for V2X services
16.2 Air-to-ground network for NR
16.4 Interconnection and Migration Aspects for Railways
16.5 Application layer support for V2X services; Phase 3
16.6 Enhanced NR support for high speed train scenario in frequency range 2 (FR2)

17 User Plane traffic and services
17.1 Enhanced Multiparty RTT
17.2 5G-Advanced media profiles for messaging services
17.3 Charging Aspects of IMS Data Channel
17.4 Evolution of IMS Multimedia Telephony Service
17.5 Access Traffic Steering, Switch and Splitting support in the 5G system architecture; Phase 3
17.6 UPF enhancement for Exposure and SBA
17.7 Tactile and multi-modality communication services
17.8 UE Testing Phase 2
17.9 5G Media Streaming Protocols Phase 2
17.10 EVS Codec Extension for Immersive Voice and Audio Services
17.11 Other User Plane traffic and services items

18 Edge computing
18.1 Edge Computing Phase 2
18.2 Architecture for enabling Edge Applications Phase 2
18.3 Edge Application Standards in 3GPP and alignment with External Organizations

19 Non-Public Networks
19.1 Non-Public Networks Phase 2
19.2 5G Networks Providing Access to Localized Services
19.3 Non-Public Networks Phase 2

20 AM and UE Policy
20.1 5G AM Policy
20.2 Enhancement of 5G UE Policy
20.3 Dynamically Changing AM Policies in the 5GC Phase 2
20.4 Spending Limits for AM and UE Policies in the 5GC
20.5 Rel-18 Enhancements of UE Policy

21 Service-based items
21.1 Enhancements on Service-based support for SMS in 5GC
21.2 Service based management architecture
21.3 Automated certificate management in SBA
21.4 Security Aspects of the 5G Service Based Architecture Phase 2
21.5 Service Based Interface Protocol Improvements Release 18

22 Security-centric aspects
22.1 IETF DTLS protocol profile for AKMA and GBA
22.2 IETF OSCORE protocol profiles for GBA and AKMA
22.3 Home network triggered primary authentication
22.4 AKMA phase 2
22.5 5G Security Assurance Specification (SCAS) for the Policy Control Function (PCF)
22.6 Security aspects on User Consent for 3GPP services Phase 2
22.7 SCAS for split-gNB product classes
22.8 Security Assurance Specification for AKMA Anchor Function (AAnF)
22.9 Other security-centric items

23 NR-only items
23.1 Not band-centric
23.1.1 NR network-controlled repeaters
23.1.2 Enhancement of MIMO OTA requirement for NR UEs
23.1.3 NR MIMO evolution for downlink and uplink
23.1.4 Further NR mobility enhancements
23.1.5 In-Device Co-existence (IDC) enhancements for NR and MR-DC
23.1.6 Even Further RRM enhancement for NR and MR-DC
23.1.7 Dual Transmission Reception (TxRx) Multi-SIM for NR
23.1.8 NR support for dedicated spectrum less than 5MHz for FR1
23.1.9 Enhancement of NR Dynamic Spectrum Sharing (DSS)
23.1.10 Multi-carrier enhancements for NR
23.1.11 NR RF requirements enhancement for frequency range 2 (FR2), Phase 3
23.1.12 Requirement for NR frequency range 2 (FR2) multi-Rx chain DL reception
23.1.13 Support of intra-band non-collocated EN-DC/NR-CA deployment
23.1.14 Further enhancements on NR and MR-DC measurement gaps and measurements without gaps
23.1.15 Further RF requirements enhancement for NR and EN-DC in frequency range 1 (FR1)
23.1.16 Other non-band related items
23.2 Band-centric
23.2.1 Enhancements of NR shared spectrum bands
23.2.2 Addition of FDD NR bands using the uplink from n28 and the downlink of n75 and n76
23.2.3 Complete the specification support for BandWidth Part operation without restriction in NR
23.2.4 Other NR band related topics

24 LTE-only items
24.1 High Power UE (Power Class 2) for LTE FDD Band 14
24.2 Other LTE-only items

25 NR and LTE items
25.1 4Rx handheld UE for low NR bands (<1GHz) and/or 3Tx for NR inter-band UL Carrier Aggregation (CA) and EN-DC
25.2 Enhancement of UE TRP and TRS requirements and test methodologies for FR1 (NR SA and EN-DC)
25.3 Other items

26 Network automation
26.1 Enablers for Network Automation for 5G phase 3
26.2 Enhancement of Network Automation Enablers

27 Other aspects
27.1 Support for Wireless and Wireline Convergence Phase 2
27.2 Secondary DN Authentication and authorization in EPC IWK cases
27.3 Mobile IAB (Integrated Access and Backhaul) for NR
27.4 Further NR coverage enhancements
27.5 NR demodulation performance evolution
27.6 NR channel raster enhancement
27.7 BS/UE EMC enhancements for NR and LTE
27.8 Enhancement on NR QoE management and optimizations for diverse services
27.9 Additional NRM features phase 2
27.10 Further enhancement of data collection for SON (Self-Organising Networks)/MDT (Minimization of Drive Tests) in NR and EN-DC
27.11 Self-Configuration of RAN Network Entities
27.12 Enhancement of Shared Data ID and Handling
27.13 Message Service within the 5G system Phase 2
27.14 Security Assurance Specification (SCAS) Phase 2
27.15 Vehicle-Mounted Relays
27.16 SECAM and SCAS for 3GPP virtualized network products
27.17 SECAM and SCAS for 3GPP virtualized network products
27.18 MPS for Supplementary Services
27.19 Rel-18 enhancements of session management policy control
27.20 Seamless UE context recovery
27.21 Extensions to the TSC Framework to support DetNet
27.22 Multiple location report for MT-LR Immediate Location Request for regulatory services
27.23 Enhancement of Application Detection Event Exposure
27.24 General Support of IPv6 Prefix Delegation in 5GS
27.25 5G Timing Resiliency System
27.26 MPS when access to EPC/5GC is WLAN
27.27 Data Integrity in 5GS
27.28 Security Enhancement on RRCResumeRequest Message Protection

28 Administration, Operation, Maintenance and Charging-centric Features
28.1 Introduction
28.2 Intent driven Management Service for Mobile Network phase 2
28.3 Management of cloud-native Virtualized Network Functions
28.4 Management of Trace/MDT phase 2
28.5 Security Assurance Specification for Management Function (MnF)
28.6 5G performance measurements and KPIs phase 3
28.7 Access control for management service
28.8 Management Aspects related to NWDAF
28.9 Management Aspect of 5GLAN
28.10 Charging Aspects of TSN
28.11 CHF Distributed Availability
28.12 Management Data Analytics phase 2
28.12 5G System Enabler for Service Function Chaining
28.13 Other Management-centric items

29 Other Rel-18 Topics

If you find them useful, please get the latest document from here.


Monday 25 July 2022

Demystifying and Defining the Metaverse

There is no shortage of Metaverse papers and articles, as it is the latest in the long list of technologies promising to change the world. A couple of months back I wrote a post about it on the 6G blog here.

IEEE hosted a Metaverse Congress with the Kickoff Session 'Demystifying and Defining the Metaverse' this month, as can be seen in the Tweet above. The video embedded below covers the following talks:

  • 0:01:24 - Opening Remarks by Eva Kaili (Vice President, European Parliament)
  • 0:09:51 - Keynote - Metaverse Landscape and Outlook by Yu Yuan (President-Elect, IEEE Standards Association)
  • 0:29:30 - Keynote - Through the Store Window by Thomas Furness (“Grandfather of Virtual Reality”)
  • 0:52:30 - Keynote - XR: The origin of the Metaverse as Water-Human-Computer Interaction (WaterHCI) by Steve Mann (“Father of Wearable Computing”)
  • 1:22:17 - Keynote - A Vision of the Metaverse: AI Infused, Physically Accurate Virtual Worlds by Rev Lebaredian (VP of Omniverse & Simulation Technology, NVIDIA)

Some fantastic definitions, explanations, use cases and vision on the Metaverse. The final speaker nicely summarised the Metaverse in the slide shown below.

Point 6, that the Metaverse is device independent, is worth highlighting. I argued something similar about the way we try to link everything to 6G (just as we linked everything to 5G before). We are still in the beginning phase; a lot of updates and clarifications will come in the next few years before the Metaverse starts taking its final shape.


Saturday 4 April 2020

5G eXtended Reality (5G-XR) in 5G System (5GS)


We have been meaning to make a tutorial on augmented reality (AR), virtual reality (VR), mixed reality (MR) and extended reality (XR) for a while, but we have only now managed to do it. Embedded below are the video and slides for the tutorial, as well as a playlist of different XR use cases from around the world.

If you are not familiar with the 5G Service Based Architecture (SBA) and 5G Core (5GC), it is best to check this earlier tutorial before going further. A lot of comments generally suggest using Wi-Fi rather than 5G indoors, and we completely agree. The 3GPP 5G architecture is designed to cater for any access in addition to 5G access; we have explained this here and here. This guest post also nicely explains the network convergence of mobile, broadband and Wi-Fi.





XR use cases playlist



A lot of the information on this topic comes from Qualcomm, GSMA, 3GPP and 5G Americas whitepapers; links to all of them are in the slides.



Friday 20 March 2020

Real-life 5G Use Cases for Verticals from China

GSMA have recently published a series of reports related to China. This includes 'The Mobile Economy China' report as well as reports on 'Impacts of mmWave 5G in China', '5G use cases for verticals China 2020' and 'Powered by SA case studies'. They are all available here.

China currently has 1.65bn mobile connections (excluding licensed cellular IoT), which is expected to grow to 1.73bn by 2025. The report quotes 1.20bn unique mobile subscribers, expected to grow to 1.26bn by 2025. With a population of 1.44 billion, that assumes practically everyone over 10 years old has a smartphone. 2G and 3G are being phased out, so only 4G and 5G will be around in 2025; the picture will be different for IoT.

The 5G Use Cases for Verticals China 2020 report comprises 15 outstanding examples of 5G-empowered applications for verticals, ranging from industrial manufacturing, transportation, electric power, healthcare and education to content creation, and zooms into the practical scenarios, technical features and development opportunities for the next-generation technology. Every use case represents the relentless efforts of 5G pioneers who are open, cooperative and innovative.

  1. Flexible Smart Manufacturing with 5G Edge Computing (RoboTechnik, China Mobile, Ericsson)
  2. 5G Smart Campus in Haier Tianjin Washing Machine Factory (China Mobile, Haier)
  3. Aircraft Surface Inspection with 5G and 8K at Commercial Aircraft Corporation of China (Comac, China Unicom, Huawei)
  4. Xinfengming Group’s Smart Factory Based on MEC Technology (Xinfengming, China Mobile, ZTE)
  5. SANY Heavy Industry 5G and Smart Manufacturing (Sany, China Mobile, China Telecom, ZTE)
  6. Xiangtan Iron & Steel's 5G Smart Plant (Xisc, China Mobile, Huawei)
  7. The Tianjin 5G Smart Port (Tianjin, China Unicom, ZTE, Trunk)
  8. 5G Intelligent Connected Vehicle Pilot in Wuhan (China Mobile, Huawei, et al.)
  9. 5G BRT Connected Vehicle-Infrastructure Cooperative System (China Unicom, DTmobile, et al.)
  10. 5G for Smart Grid (China Mobile, Huawei, et al.)
  11. Migu's "Quick Gaming" Platform (China Mobile, et al.)
  12. 5G Cloud VR Demonstration Zone in Honggutan, Nanchang, Jiangxi Province (Besttone, China Telecom, Huawei)
  13. 5G Cloud VR Education Application Based on AI QoE (China Telecom, Nokia, et al.)
  14. China MOOC Conference: 5G + Remote Virtual Simulation Experiment (China Unicom, Vive HTC, Dell Technologies, et al.)
  15. 5G-empowered Hospital Network Architecture Standard (CAICT, China Mobile, China Telecom, China Unicom, Huawei, et al.)

They are all detailed in the report here.

I have written about 5G Use Cases in an earlier blog post, which also contains a video playlist of use cases from around the world. There are not many from China in there at the moment, but they should be added as and when they become available and I discover them.



Friday 4 October 2019

CW Seminar: The present, the future & challenges of AR/VR (#CWFDT)


One of my roles is as a SIG champion of the CW (Cambridge Wireless) Future Devices & Technologies Group. We recently organised an event on "The present, the future & challenges of AR/VR". The CW team has kindly summarised it here. I have also tried to collect all the tweets from the day here.

Why is this important? Most of the posts on this blog are about mobile technology, and I am guessing most of the readers are from that industry too. While we focus heavily on connectivity, it is the experience that makes the difference for most consumers. On the Operator Watch blog, I recently wrote about South Korea and the operator LG Uplus. According to MSIT in May 2019, average data usage by 5G users in Korea is as high as 18.3GB, while 4G users averaged 9GB over the same period, so 5G data usage is about twice that of 4G. This remarkable traffic growth is driven by UHD and AR/VR content. According to LG Uplus, new services featuring AR and VR functions are proving popular and already account for 20% of 5G traffic, compared with 5% for 4G.

Coming back to the CW event, some of the presentations were shared and are available here for a limited time. There were so many learnings for me that it is difficult to remember and list all of them here.

Our newest SIG champ Nadia Aziz covered many different topics (presentation here), including how to quickly start making your own AR/VR apps and how AR apps will be used more and more for social media marketing in the future.


Mariano Cigliano, Creative Developer at Unit9 (presentation here), discussed the journey of their company and what they have learned along the way while developing their solution to disrupt the design process by integrating immersive technologies.


Aki Jarvinen from Digital Catapult (presentation here) explained brown-boxing and bodystorming: both very simple techniques that can help get the app designer's story straight and save a lot of time, effort and money while creating the app.


James Watson from Immerse (presentation here) talked about VR training. There are so many possibilities if it is done correctly, and it can be more interactive than online or classroom training.



Schuyler Simpson, Vice President - Strategic Partnerships & Operations at Playfusion (presentation here), discussed the reality of enhanced reality, diving deep into the challenges of creating an experience that resonates best with audiences. In his own words, "Enhanced Reality blends visual, audio, haptic, and intelligent components to create highly personalized, immersive, and most importantly, valuable experiences for organizations and their audiences."

The most valuable learning of the day was the exercise of creating an AR/VR app (in theory only), assuming no technology limitations. The whole journey consisted of:

  • Brainstorm the use case
  • Identify the key pain points
  • Sort the pain points by priority and select the top 3 or 5
  • Map the customer journey
  • Define the persona for which the app is being designed
  • Map their journey
  • Identify the touch points
  • Work out what can be improved at those touch points
  • Design a VR/AR application for the defined problem
  • Storyboard the AR/VR use case
  • Consider UX design aspects – spatial, emotional, etc.
  • Scribe a prototype
  • Play it back to others.


Thanks to everyone who helped make this whole event possible, from the SIG champs to the CW team and the host & sponsors NTT Data. Special thanks to our newest SIG champ, Nadia Aziz for tirelessly working to make this event a success.


Tuesday 12 March 2019

Can Augmented & Mixed Reality be the Killer App 5G needs?


Last October, Deutsche Telekom, Niantic and MobiledgeX announced a partnership to create advanced augmented reality experiences over mobile network technologies. I was lucky to find some time to go and play it at the Deutsche Telekom booth. The amount of processing needed for this to work at its best meant that the new Samsung Galaxy S10+ was required, but I felt that it still occasionally struggled with the amount of data being transferred.


The pre-MWC press release said:

Deutsche Telekom, Niantic Inc., MobiledgeX and Samsung Showcase World’s First Mobile Edge Mixed Reality Multi-Gamer Experience

At the Deutsche Telekom booth at MWC 2019 (hall 3, booth 3M31) the results of the previously announced collaboration between Deutsche Telekom, Niantic, Inc., and MobiledgeX are on display and you’re invited to play. Niantic’s “Codename: Neon”, the world’s first edge-enhanced Mixed Reality Multiplayer Experience, delivered by ultra-low latency, Deutsche Telekom edge-enabled network, and Samsung Galaxy S10+ with edge computing enablement, will be playable by the public for the first time. 

“The ultra-low latency that Mobile Edge Computing (MEC) enables, allows us to create more immersive, exciting, and entertaining gameplay experiences. At Niantic, we’ve long celebrated adventures on foot with others, and with the advent of 5G networks and devices, people around the world will be able to experience those adventures faster and better,” said Omar Téllez, Vice-President of Strategic Partnerships at Niantic.

The collaboration is enabled using MobiledgeX’s recently announced MobiledgeX Edge-Cloud R1.0 product. Key features include device and platform-independent SDKs, a Distributed Matching Engine (DME) and a fully multi-tenant control plane that supports zero-touch provisioning of edge cloud resources as close as possible to the users. Immediate examples of what this enables include performance boosts for Augmented Reality and Mixed Reality (MR) experiences as well as video and image processing that meets local privacy regulations. 

Samsung has been working together with Deutsche Telekom, MobiledgeX, and Niantic on a natively edge-capable connectivity and authentication in Samsung Galaxy S10+ to interface with MobiledgeX Edge-Cloud R1.0 and dynamically access the edge infrastructure it needs so that augmented reality and mixed reality applications can take advantage of edge unmodified. Samsung will continue such collaborations with industry-leading partners not only to embrace a native device functionality of edge discovery and usage for the mobile devices and consumers, but also to seek a way together to create new business models and revenue opportunities leading into 5G era.

Deutsche Telekom’s ultra-low latency network was able to deliver on the bandwidth demands of “Codename: Neon” because it deployed MobiledgeX’s edge software services, built on dynamically managed decentralized cloudlets. “From our initial partnership agreement in October, we are thrilled to showcase the speed at which we can move from idea to experience, with full end-to-end network integration, delivered on Samsung industry leading edge native devices,” said Alex Jinsung Choi, Senior Vice President Strategy and Technology Innovation at Deutsche Telekom.

From the gaming industry to industrial IoT and computer vision applications, consumer or enterprise, this is a great example of the interactive AR experiences coming from companies like Niantic in the near future. As AR/VR/MR immersive experiences continue to shape our expectations, devices, networks and clouds need to collaborate seamlessly and dynamically.
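To make the edge-discovery idea in the press release a little more concrete, here is a deliberately simplified sketch of the client-side flow. Everything in it (the class names, the cloudlet list, the coordinates and the nearest-distance rule) is a made-up illustration of the concept, not the actual MobiledgeX SDK or DME API.

```python
# Hypothetical illustration of the "distributed matching engine" idea: the
# client asks for the closest edge cloudlet and sends its latency-critical
# AR traffic there instead of to a distant central cloud.
from dataclasses import dataclass
import math

@dataclass
class Cloudlet:
    name: str
    lat: float
    lon: float
    fqdn: str   # where the edge instance of the game backend is reachable

# Made-up registry of edge sites; in practice this would live in the operator's
# matching engine, not in the client.
CLOUDLETS = [
    Cloudlet("barcelona-edge", 41.39, 2.16, "bcn.edge.example.net"),
    Cloudlet("madrid-edge",    40.42, -3.70, "mad.edge.example.net"),
]

def find_closest_cloudlet(lat: float, lon: float) -> Cloudlet:
    """Crude stand-in for the matching engine: pick the nearest site."""
    return min(CLOUDLETS, key=lambda c: math.hypot(c.lat - lat, c.lon - lon))

# A device on the MWC show floor gets matched to the Barcelona cloudlet and
# streams its AR state updates there.
edge = find_closest_cloudlet(41.37, 2.15)
print(f"Connect AR session to {edge.fqdn}")
```

The real system of course also handles authentication, zero-touch provisioning of backend instances and re-matching as the user moves; the sketch only shows the basic "find the nearest edge" step.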

This video from the Deutsche Telekom booth shows what the game actually feels like:



Niantic CEO John Hanke delivered a keynote at Mobile World Congress 2019 (embedded below). According to the Fortune article, "Why the Developer of the New 'Harry Potter' Mobile Game and 'Pokemon Go' Loves 5G":

Hanke showed a video of a prototype game Niantic has developed codenamed Neon that allows multiple people in the same place at the same time to play an augmented reality game. Players can shoot at each other, duck and dodge, and pick up virtual reality items, with each player’s phone showing them the game’s graphics superimposed on the real world. But the game depends on highly responsive wireless connections for all the phones, connections unavailable on today’s 4G LTE networks.

“We’re really pushing the boundaries of what we can do on today’s networks,” Hanke said. “We need 5G to deliver the kinds of experiences that we are imagining.”

Here is the video; it is very interesting and definitely worth a watch. For those who may not know, Niantic spun out of Google in October 2015, soon after Google announced its restructuring as Alphabet Inc. During the spin-out, Niantic announced that Google, Nintendo, and The Pokémon Company would invest up to $30 million in Series A funding.



So what do you think, can AR / MR be the killer App 5G needs?

Tuesday 16 January 2018

3GPP-VRIF workshop on Virtual Reality Ecosystem & Standards in 5G

It's been a year since I last posted about Augmented / Virtual Reality Requirements for 5G. The topic of Virtual Reality has since made good progress in 5G. There are two technical reports looking at VR specifically. They are:

The second one is still a work in progress though. 

Anyway, back in December, 3GPP and the Virtual Reality Industry Forum (VRIF) held a workshop on the VR Ecosystem & Standards. All the materials, including the agenda, are available here. The final report is not there yet, but I assume there will be a press release when it is published.

There are quite a few interesting presentations; here is what stood out for me:

From the presentation by Gordon Castle, Head of Strategy Development, Ericsson:





From the presentation by Martin Renschler, Senior Director Technology, Qualcomm:


For anyone wanting to learn more about 6 degrees of freedom (6-DoF), see this Wikipedia entry. According to the Nokia presentation, Facebook's marketing people call this "6DOF", while the engineers at MPEG call it "3DOF+".
XR is 'cross reality': any hardware that combines aspects of AR, MR and VR, such as Google Tango.
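As a small aside on the terminology, the sketch below (illustrative type names only, not any standard API) shows the practical difference: a 3DoF pose tracks head orientation only, while a 6DoF pose adds translation, which is what lets you physically move around virtual objects.

```python
# Illustrative only: the difference between 3DoF and 6DoF head tracking.
# A 3DoF pose carries orientation only; 6DoF adds position (translation).
from dataclasses import dataclass

@dataclass
class Pose3DoF:
    yaw: float    # degrees, looking left/right
    pitch: float  # degrees, looking up/down
    roll: float   # degrees, tilting the head

@dataclass
class Pose6DoF(Pose3DoF):
    x: float      # metres, side-to-side movement
    y: float      # metres, up/down movement
    z: float      # metres, forward/backward movement

# With 3DoF you can look around from a fixed point; with 6DoF you can also
# lean in or walk around a virtual object.
seated = Pose3DoF(yaw=45.0, pitch=-10.0, roll=0.0)
walking = Pose6DoF(yaw=45.0, pitch=-10.0, roll=0.0, x=0.5, y=0.0, z=1.2)
print(seated)
print(walking)
```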

From the presentation by Devon Copley, Former Head of Product, Nokia Ozo VR Platform:
Some good stuff in the presentation.

From the presentation by Youngkwon Lim, Samsung Research America: the presentation provided a link to a recent YouTube video on this topic. I really liked it, so I am embedding it here:



Finally, from the presentation by Gilles Teniou, SA4 Vice-Chairman and Video SWG Chairman, 3GPP:





You can check and download all the presentations here.


Sunday 22 January 2017

Augmented / Virtual Reality Requirements for 5G


Ever wondered whether 5G will be good enough for augmented and virtual reality, or whether we will need to wait for 6G? Some researchers are trying to identify the AR/VR requirements and challenges from a mobile network point of view, along with possible options to solve these challenges. They have recently published a research paper on this topic.

Here is a summary of some of the interesting things I found in this paper:

  • Humans process nearly 5.2 gigabits per second of sound and light.
  • Without moving the head, our eyes can mechanically shift across a field of view of at least 150 degrees horizontally (i.e., 30,000 pixels) and 120 degrees vertically (i.e., 24,000 pixels).
  • The human eye can perceive much faster motion (150 frames per second). For sports, games, science and other high-speed immersive experiences, video rates of 60 or even 120 frames per second are needed to avoid motion blur and disorientation.
  • 5.2 gigabits per second of network throughput (if not more) is needed.
  • Today's 4K resolution at 30 frames per second and 24 bits per pixel, with a 300:1 compression ratio, yields 300 megabits per second of imagery. That is more than 10x the typical requirement for a high-quality 4K movie experience.
  • 5G network architectures are being designed to move post-processing to the network edge, so that processors at the edge and the client display devices (VR goggles, smart TVs, tablets and phones) carry out advanced image processing to stitch camera feeds into dramatic effects.
  • To tackle these grand challenges, the 5G network architecture (radio access network (RAN), edge and core) will need to be much smarter than ever before, adaptively and dynamically making use of concepts such as software defined networking (SDN), network function virtualization (NFV) and network slicing, to mention a few, facilitating a more flexible allocation of resources (resource blocks (RBs), access points, storage, memory, computing, etc.) to meet these demands.
  • Immersive technology will require massive improvements in terms of bandwidth, latency and reliability. A current remote-reality prototype requires 100 to 200 Mbps for a one-way immersive experience. While MirrorSys uses a single 8K display, estimates suggest photo-realistic VR will require two 16K x 16K screens (one for each eye).
  • Latency is the other big issue in addition to reliability. With an augmented reality headset, for example, real-life visual and auditory information has to be taken in through the camera and sent to the fog/cloud for processing, with digital information sent back to be precisely overlaid onto the real-world environment, and all of this has to happen in less time than it takes for humans to start noticing lag (no more than 13ms). Factoring in the much-needed high-reliability criteria on top of these bandwidth and delay requirements clearly indicates the need for interactions between several research disciplines. (A back-of-the-envelope sketch of this kind of throughput and latency arithmetic follows this list.)
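To make that throughput and latency arithmetic a little more tangible, here is a minimal back-of-the-envelope sketch in Python. The resolution, frame rate, compression ratio and per-stage delays are illustrative assumptions loosely based on the bullet points above, not figures taken from the paper, so the printed numbers are only order-of-magnitude indications.

```python
# Back-of-the-envelope throughput and latency-budget arithmetic for immersive
# video. All parameters are illustrative assumptions, not figures from the paper.

def compressed_bitrate_bps(width_px, height_px, bits_per_pixel, fps, compression_ratio):
    """Raw pixel rate divided by an assumed compression ratio."""
    raw_bps = width_px * height_px * bits_per_pixel * fps
    return raw_bps / compression_ratio

# A "full field of view" feed roughly matching the 30,000 x 24,000 pixel figure
# quoted above, at 120 fps, 24 bits per pixel and 300:1 compression.
full_fov_bps = compressed_bitrate_bps(30_000, 24_000, 24, 120, 300)
print(f"Full-FoV feed at 300:1 compression: {full_fov_bps / 1e9:.1f} Gbps")

# Motion-to-photon budget: capture, uplink, edge processing, downlink and
# rendering all have to fit inside roughly 13 ms before users notice lag.
BUDGET_MS = 13
stage_delays_ms = {"capture": 2, "uplink": 3, "edge processing": 4,
                   "downlink": 3, "render": 2}   # purely illustrative split
total_ms = sum(stage_delays_ms.values())
verdict = "within" if total_ms <= BUDGET_MS else "over"
print(f"End-to-end pipeline delay: {total_ms} ms ({verdict} the {BUDGET_MS} ms budget)")
```

Even with this fairly optimistic per-stage split, the pipeline overshoots the 13 ms budget, which is exactly why the paper argues for pushing processing to the edge.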


These key research directions and scientific challenges are summarized in Fig. 3 (above) and discussed in the paper. I advise you to read it here.


Tuesday 4 December 2012

5 videos on Augmented Reality

Looks like Augmented Reality (AR) is getting hot, just in time for Christmas. I wonder how many products will be sold based on AR. As I suggested in an earlier post, there may be 1 Billion users by 2020. Here are the videos:

Google's Ingress is an AR-based game:



Augmented Reality Book of Spells, Harry Potter experience:

I wonder when, if ever, it will come to a mobile near you.

LightBeam - Interacting with Augmented Real-World Objects in Pico Projections:



The next is a bit old but worth mentioning:

LuminAR from MIT


Finally, the science of haptics will allow us to "touch" objects in a virtual world in the future:

Augmented Reality and Touch



Saturday 16 June 2012

1 Billion Augmented Reality (AR) users by 2020

It has been slow, but I am getting more and more convinced that AR can do far more than we think it can. Part of my pessimism was due to the fact that it is placed at the Peak of Inflated Expectations on the Gartner Hype Cycle and was predicted to head for the trough. But in the end, success depends on what the available apps are like.

Part of my optimism stems from the fact that things have been changing rapidly. Take, for example, the 'Augmented Future' video. When I watched it, I thought this would happen, but maybe quite a few years down the road. Then came the 'Project Glass' video, and suddenly your thinking shifts from 'how would this be done' to 'when will this be available'. The latest news I read was that the prototypes are being tested in Google's offices.

I am sure the first few releases will be far from perfect and will have limited features, security issues, etc., but it certainly seems possible. I don't know how it works, but it could actually be synced with a device in your pocket and just be an add-on that communicates via something like Bluetooth.

At a recent event, Intel showed off their new Ultrabook features using Augmented Reality. See the video:



And there is another video, of BBC Frozen Planet, where people can put themselves alongside the augmented creatures. See below:



These just go to show what can be done with augmented reality. With more and more powerful devices available to us at reasonable prices, all that remains is to create the apps, and users will try to make the most of them.

I have already posted some videos and presentations from an event back in March that talk more about the apps and the platforms here.

The idea of 1 billion AR users is not mine; it was used by Tomi Ahonen in a recent TEDx presentation and his blog post. The TEDx video follows:



You can read more about Tomi's ideas on Augmented Reality in his blog post here.

Sunday 8 April 2012

Security issues in new technologies

I have attended a lot of events and talks in the last month where people talked about Augmented Reality, Proximity Marketing, QR codes, etc., but nobody seems to talk about security. It is being taken for granted. For example, Macs have been said to be virus-proof, and they probably are, but other apps may be infectable; in this case it was Java that allowed a Mac botnet about 0.6 million strong.

Some years back, proximity marketing via Bluetooth was a big thing, and we were lucky to be involved in a couple of projects making it possible. But then the Bluetooth virus came to light and people stopped leaving their Bluetooth on in public places. It doesn't look like Bluetooth-based proximity marketing has gone very far since those days.

QR codes are a simple way for advertisers to redirect end users to their websites, but I recently read that a rogue QR code can redirect end users to a site that can be used to hack their phones. The main point made is that 99% of the time QR codes are read by mobile phones, and 99% of those phones are either iPhones or Androids, which helps attackers narrow down the exploits.
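As an illustration of the kind of mitigation a QR reader app could apply, here is a minimal sketch. It assumes the QR code has already been decoded to a string (the decoding itself is out of scope here), and the allowlisted domains are made up.

```python
# A minimal sketch of a check a QR reader could apply before opening a
# decoded payload. Domains and rules are illustrative only.
from urllib.parse import urlparse

TRUSTED_DOMAINS = {"example.com", "advertiser.example.net"}  # illustrative allowlist

def is_safe_to_open(decoded_payload: str) -> bool:
    url = urlparse(decoded_payload)
    if url.scheme not in ("http", "https"):
        return False                      # reject tel:, sms:, javascript:, etc.
    host = url.hostname or ""
    return any(host == d or host.endswith("." + d) for d in TRUSTED_DOMAINS)

print(is_safe_to_open("https://example.com/promo"))         # True
print(is_safe_to_open("javascript:alert(1)"))                # False
print(is_safe_to_open("http://evil.example.org/exploit"))    # False
```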

There is a good chance that when these new technologies reach mass adoption, security is going to be a big issue, and I am not sure enough is being done. If you have any pointers on security issues, please feel free to comment.

Wednesday 4 April 2012

Project Glass: One day... By Google


I seem to like the Corning ones, which I blogged about here, more.

** New Edits 05/04/12 09:40 **
From CNET:

Google's augmented reality glasses are real! Dubbed Project Glass, the long-rumoured lenses that show you heads-up information about the world around you have been confirmed by the company.
At the moment, Google's announcement is limited to a Google+ page.
Here is a parody of the above video from Tom Scott: