Originally Posted on CW Blog here.
Thursday, 16 March 2017
Originally Posted on CW Blog here.
Friday, 2 September 2016
Now that I have had time to think about the questions, here are some more detailed thoughts. As always, feedback, comments and suggestions are welcome.
Q: What will network architecture look like in the 5G era?
I have long argued that 5G will not be a single technology but a combination of multiple old and new technologies. You will often find various terms like Multi-stream Aggregation (MSA), Opportunistic Aggregation and Multi-connectivity being used to explain this. Not only will 2G, 3G and 4G have a role to play, Wi-Fi and other unlicensed technologies would be a part of 5G too.
I have had many discussions on this topic with respected analysts and many of them agree.
One of the approaches being proposed for the initial version of 5G is the non-standalone version, which uses LTE as the control-plane anchor and the new 5G radio for the user plane. Not only will this be easier to deploy alongside the existing LTE network, it should also be faster and, hopefully, less costly.

This #5G chart sums it up. Start from where we are, go in 4 different directions at once, give it all the same name https://t.co/ETldL8Sr8N — Dan Warren (@TMGB) June 28, 2016
Q: To what extent is 5G dependent on virtualization?
Networks and Network Functions are progressively being virtualized, independently of 5G. Having said that, virtualization will play a big role in achieving the 5G architecture. Mobile operators can’t be expected to keep paying for proprietary hardware; virtualization would help with cost reduction and quick deployments.
Network slicing for instance will help partition the network for different requirements, on the fly depending on what is going on at any particular time.
Q: What is your view on the interplay between standards and open-source developments?
Standards enable cost reduction by achieving economies of scale, whereas open-source development enables innovation and quick deployment. Both are needed, and they will, willingly or unwillingly, co-exist.
Q: What do you see as the 3 greatest technical uncertainties or challenges en route to 5G?
While there are many known and unknown challenges with 5G, some obvious ones that we can see are:
- Spectrum identification and harmonization.
- Getting to the right architecture which is backward compatible and future proof, without making it too complex
- SON – Once you have everything in place, you have to make many different parts of the network work together under different kinds of loads and traffic. SON (Self-Organising Networks) will play a crucial role here.
Q: What would 5G actually mean for consumers, business and IoT? / What will 5G allow me to do that I can’t right now with 4G?
There are a lot of interesting use cases being discussed like remote operations and remote controlled cars but most of them do not represent the general consumers and some of them are just gimmicks.
I really like the NGMN whitepaper that laid out some simple use cases.
If done properly, 5G will allow:
- Simplification of the network, resulting in low latency – this means your content will load faster and the delay between requests and responses is small.
- Reasonable-speed broadband everywhere – this will also depend on operators' rollout plans, but the different technologies in a 5G network would (should) enable reliable, decent-speed broadband not just in the middle of the cell but also at the edges. In fact, the concept of cell edges should be re-examined in 5G and a solution found to stop data rates falling off there.
- Connectivity on the move – whether we are talking about connectivity in trains/buses or from a public-safety point of view, it is important to define group connectivity, direct communications, etc.
Q: What will set companies apart in the development of 5G?
The days of vendor lock-in are over. What will set companies apart is their willingness to work with other companies through open APIs and interfaces. Operator networks will include solutions from many different vendors, and for operators to bring innovative solutions to market quickly, they need vendors to work together rather than against each other.
Q: There is a lot of talk about the vision for 2020. What do you think the world will look like in terms of connectivity in 2030?
It would be fair to say that by 2030, connectivity will have reached a completely new dimension. One big area of development being ignored by the mainstream mobile community is satellite communications. Many low-earth-orbit (LEO) constellations and high-throughput satellites (HTS) are being developed. A LEO/HTS combination can provide high-speed connectivity with 4G-like latency, and high throughput for planes and ships that cannot be served by ground-based mobile technology. Broadband access everywhere will only become a reality with satellite technology complementing mobile technology.
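As a rough illustration of why LEO matters for latency, a back-of-envelope propagation calculation (ignoring processing and routing delays, and assuming an illustrative 1,200 km LEO altitude) shows the gap between GEO and LEO:

```python
# Rough one-way propagation delay (ground -> satellite -> ground),
# ignoring processing, queuing and routing delays.
C_KM_S = 299_792.458  # speed of light in km/s

def bent_pipe_delay_ms(altitude_km):
    # the signal travels up to the satellite and back down
    return 2 * altitude_km / C_KM_S * 1000

geo = bent_pipe_delay_ms(35_786)  # geostationary orbit altitude
leo = bent_pipe_delay_ms(1_200)   # a typical LEO constellation altitude

print(f"GEO: ~{geo:.0f} ms, LEO: ~{leo:.0f} ms")  # GEO ~239 ms, LEO ~8 ms
```

The two-orders-of-magnitude difference is why LEO, unlike GEO, can plausibly deliver the "4G-like latency" mentioned above.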
Disclaimer: This blog is maintained in my personal capacity and this post expresses my own personal views, not the views of my employer or anyone else.
Saturday, 12 March 2016
In one of my roles, I get to see some of these developments happening in the satellite world. Here are some of the recent things that I have learned.
In a recent presentation by Intelsat (embedded below), they showed how we will have truly high-throughput global coverage with the help of GEO and LEO satellites. Depending on the application, services can take advantage of either or both. Ubiquitously connected cars, planes, trains, ships and other vehicles will soon be a reality.
Intelsat is not the only operator innovating and coming up with some amazing solutions; see, for example, here and here.
Eutelsat, on the other hand, is trying something that has not been done before: its Quantum-class satellites will create and modify beams dynamically to provide coverage whenever and wherever needed. See their presentation here.
These are just a few examples; there are many other operators I have not mentioned here. Most of them have some sort of ambitious plan that they intend to realise before 2020.
So what role will these satellites play in the 5G world? We will look at this question in the Satellite Applications & Services Conference in October but I am interested in hearing your thoughts.
Saturday, 19 December 2015
One of the things the World Radiocommunication Conference 2015 (WRC-15) did was provide a universal spectrum allocation for flight tracking. In simple terms, once fully implemented, this means flights will hopefully no longer be lost the way MH370 was. By 2018 it will be possible to accurately track flights with satellites across nearly 100% of the globe, up from 30% today.
To understand this better, see the video below:
Automatic Dependent Surveillance (ADS) is a surveillance technique in which aircraft automatically provide, via a data link, data derived from on-board navigation and position-fixing systems, including aircraft identification, four-dimensional position and additional data as appropriate. ADS data is displayed to the controller on a screen that replicates a radar screen. ICAO Doc 4444 PANS-ATM notes that air traffic control service may be predicated on the use of ADS provided that identification of the aircraft involved is unambiguously established. Two main versions of ADS are currently in use:
- Automatic Dependent Surveillance-Broadcast (ADS-B) is a function on an aircraft or surface vehicle that broadcasts position, altitude, vector and other information for use by other aircraft, vehicles and by ground facilities. It has become the main application of the ADS principle.
- Automatic Dependent Surveillance-Contract (ADS-C) functions similarly to ADS-B but the data is transmitted based on an explicit contract between an ANSP and an aircraft. This contract may be a demand contract, a periodic contract, an event contract and/or an emergency contract. ADS-C is most often employed in the provision of ATS over transcontinental or transoceanic areas which see relatively low traffic levels.
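As a small aside, the fixed header of an ADS-B extended squitter is simple enough to pick apart by hand. The sketch below, using a widely circulated sample message, extracts the downlink format, ICAO aircraft address and type code; full position decoding (with its compact position reporting, CPR) is considerably more involved:

```python
def parse_adsb_header(hex_msg):
    """Extract the downlink format, ICAO address and type code
    from a 112-bit (28 hex digit) Mode S extended squitter."""
    raw = bytes.fromhex(hex_msg)
    df = raw[0] >> 3               # downlink format: first 5 bits (17 = ADS-B)
    icao = raw[1:4].hex().upper()  # 24-bit aircraft address
    tc = raw[4] >> 3               # type code: first 5 bits of the ME field
    return df, icao, tc

# A widely used sample message (an aircraft identification broadcast):
print(parse_adsb_header("8D4840D6202CC371C32CE0576098"))  # (17, '4840D6', 4)
```

Type codes 1-4 indicate identification messages; airborne position messages use other type-code ranges.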
The ITU press release on this topic:
The frequency band 1087.7-1092.3 MHz has been allocated to the aeronautical mobile-satellite service (Earth-to-space) for reception by space stations of Automatic Dependent Surveillance-Broadcast (ADS-B) emissions from aircraft transmitters.
The frequency band 1087.7-1092.3 MHz is currently being utilized for the transmission of ADS-B signals from aircraft to terrestrial stations within line-of-sight. The World Radiocommunication Conference (WRC-15) has now allocated this frequency band in the Earth-to-space direction to enable transmissions from aircraft to satellites. This extends ADS-B signals beyond line-of-sight to facilitate reporting the position of aircraft equipped with ADS-B anywhere in the world, including oceanic, polar and other remote areas.
WRC-15 recognized that as the standards and recommended practices (SARP) for systems enabling position determination and tracking of aircraft are developed by the International Civil Aviation Organization (ICAO), the performance criteria for satellite reception of ADS-B signals will also need to be addressed by ICAO.
This agreement follows the disappearance and tragic loss of Malaysian Airlines Flight MH370 in March 2014 with 239 people on board, which spurred worldwide discussions on global flight tracking and the need for coordinated action by ITU and other relevant organizations.
For more details see: globalflightsafety.org
Monday, 9 November 2015
While there are many parameters to consider when designing the next generation network, speed is the simplest one to understand and sell to the end user.
Last week, I gave a keynote at the International Telecom Sync Forum (ITSF) 2015. In this analyst keynote, I looked at how networks are evolving and getting more complex, with many interesting options and features among which operators have to choose.
There won't just be multiple generations of technologies existing at the same time; there will also be small-cell networks, macro networks, drone- and balloon-based networks, and satellite-based networks.
My presentation is embedded below. If you want to download it, please fill in the form at the bottom of this page.
Just after my keynote, I came across this news in the Guardian about 'Alphabet and Facebook develop rival secret drone plans'; it's an interesting read. As you may be aware, Google is actively working with Sri Lanka and Indonesia to provide seamless internet access nationally.
It was nice to hear EE provide the second keynote which focused on 5G. I especially liked this slide which summarised their key 5G research areas. Their presentation is embedded below and available to download from slideshare.
The panel discussion was interesting as well. As the conference focused on timing and synchronisation, the questions were on those topics too. I have listed some of them below and am interested to hear your thoughts:
- Who cares about syncing the core? Everything has moved to packets, and the only reason for sync is to coordinate access points in wireless for higher-level services. We have multiple options to sync the edge, so why bother syncing the core at all?
- We need synchronisation to improve the user's experience, right? Given the ever-improving quality of the time bases embedded within equipment, what exactly would happen to the user experience if synchronisation collapsed? Or is good sync all about the operator's experience?
- IoT and its impact on synchronisation: can we afford it? M2M divisions generate a very small fraction of operators' revenue; is that going to change, and will it allow the investment that the required sync technology might need?
Sunday, 25 October 2015
Dish states that there are misconceptions about what satellite technology can deliver for 5G networks. Essentially Dish says that satellites will be capable of delivering two-way communications to support 5G.
A hybrid ground and space 5G network would use small satellites that each use a "spot beam" to provide a dedicated area of two-way coverage on the ground. This is different than the old model of using one satellite with a single beam to provide a one-way service like a TV broadcast over a landmass.
Dish argues that newer, smaller satellites equipped with the latest multi-antenna arrays (MIMO) would allow for "ubiquitous connectivity through hybrid satellite and terrestrial networks," the operator writes. In this model, satellites could connect areas that would otherwise be hard to network, like mountains and lakes.
The presentation from Dish is as follows:
Alcatel-Lucent provided a whitepaper along with its presentation. The paper offers an interesting view of 5G from their point of view. It's embedded below:
The presentation from Kyocera focused on TD-LTE, which I think will play a prominent role in 5G. With wide channels, TD-LTE's channel reciprocity lets the channel be estimated accurately, something FDD struggles with at high frequencies. Their presentation is available here.
The presentation from NEC focussed on different technologies that will play a role in 5G. Their presentation is available here.
KT plans to commercialise 5G by 2017, even though 5G will not be fully specified according to 3GPP by then. Anyway, here is the presentation by KT.
Sunday, 16 August 2015
If we look at the different access technologies, each has its own evolution in the coming years. Some of these are:
- Fixed/Terrestrial broadband: (A)DSL, Cable, Fiber
- Mobile Broadband: 3G, 4G and soon 5G
- Wireless Broadband: WiFi
- Laser communications
- LiFi or LED based communications
- High frequency sound based communications
- Seamless handovers between cellular and WiFi
There has been interest in moving to higher frequencies. These bands can be used for access as well as backhaul; the same applies to most of the access technologies listed above, which can also serve as backhaul to enable other access technologies.
While planned networks will remain commonplace, other topologies like mesh networks will gain ground too. Device-to-device and direct communications will help create ad-hoc networks.
While current networks are mostly stationary, mobile networks will also become common. Opportunity Driven Multiple Access (ODMA) or Multihop Cellular Networks (MCN) would let devices use other devices to reach their destination. Non-standardised proprietary solutions (for example FireChat) will become common too. Security, privacy and trust will play an important role here.
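To illustrate the multihop idea, the toy sketch below (hypothetical device names and topology, not any standardised ODMA/MCN algorithm) finds the fewest-hop relay chain from a device to a gateway with a breadth-first search:

```python
from collections import deque

def relay_path(links, src, dst):
    """Breadth-first search for the fewest-hop relay chain
    in an ad-hoc device-to-device network."""
    queue = deque([[src]])
    seen = {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in links.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # destination unreachable

# Hypothetical topology: phone A reaches the gateway only via B, or via C then D
links = {
    "A": ["B", "C"], "B": ["A", "GW"], "C": ["A", "D"],
    "D": ["C", "GW"], "GW": ["B", "D"],
}
print(relay_path(links, "A", "GW"))  # ['A', 'B', 'GW']
```

In a real ad-hoc network the link set changes constantly as devices move, which is exactly where the security, privacy and trust questions above come in.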
Satellite networks, the truly global connectivity providers, will play an important role too. While backhauling small cells on planes, trains and ships will be an important part of satellite networks, they may be used for access too. OneWeb plans to launch 900 micro-satellites to provide high-speed global connectivity. While communication at such high frequencies means small-form-factor devices like mobiles can't receive the signals easily, connected cars could use satellite connectivity very well.
Samsung has an idea to provide connectivity through 4,600 satellites, able to transmit 200GB monthly to 5 billion people worldwide. While this is very ambitious, it's not the only innovative and challenging idea. I am sure we all know about Google Loon. Facebook, on the other hand, wants to use solar-powered drones (UAVs) to offer free internet access to users who cannot get online.
As I mentioned, security and privacy will be a big challenge for devices able to connect to multiple access networks and other devices. An often-overlooked challenge is timing and sync between different networks. In an ideal world all these networks would be phase- and time-synchronised to each other so as not to cause interference, but in reality this will be a challenging task, especially with ad-hoc and moving networks.
I will be giving a keynote at ITSF 2015 in November in Edinburgh. This is a different type of conference, looking at time and synchronisation aspects in telecoms. While I will be providing a generic overview of where the technologies are moving (continuing from my presentation at the Phase Ready conference), I am looking forward to hearing about these challenges and their solutions at this conference.
Andy Sutton (Principal Network Architect) and Martin Kingston (Principal Designer) at EE have shared some of their thoughts on this topic, which follow below and are available to download here.
Sunday, 21 June 2015
Last week I attended an event at the University of Surrey about providing high-speed connectivity to unserved and under-served areas in future. While there is no arguing that satellites are a great option for unserved areas, under-served areas can really benefit from such initiatives too.
The proposal is to have a specialised Intelligent User Gateway (IUG) that can connect to ADSL, mobile and satellite. The assumption is that in areas of poor connectivity, ADSL can provide 2Mbps and mobile something similar, up to 8Mbps, while satellites can easily do 20Mbps.
While satellite broadband has the advantage of high speeds, it often suffers from high latency. ADSL, on the other hand, has very low latency but may not be fast enough for streaming-type applications. Mobile generally falls in between on latency and speed. Using Multipath TCP and some intelligent routing algorithms, decisions can be taken to optimise for latency and speed.

Evolution path of Satellite Broadband: 2005: 2 - 3 Mbps 2010: 10 - 20 Mbps 2015: 30 - 30 Mbps 2020: roughly 100 Mbps — Zahid Ghadialy (@zahidtg) June 11, 2015
I did see some impressive demos in the lab, and it did what it says on the tin. The real challenge will be the business models. While ADSL can offer unlimited internet, both mobile and satellite broadband will have caps. I was told that limits could be imposed so that once the mobile/satellite data allowance is used up, only ADSL would be used. Maybe a more complex algorithm could be implemented in future, taking into account cost and the priority of the application/service being used.
For example, sometimes I want to watch long videos on YouTube but am happy to start buffering an hour in advance; it is not critical that I watch them right away. I would be more than happy to save my mobile/satellite broadband allowance for a day when I need to watch something more urgently. And if the end of the month is approaching and I have a lot of allowance left, maybe I don't mind using the quota, since I would lose it anyway. It is always challenging to put this intelligence into the routing decision algorithms, though.
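A decision function along those lines might look like the sketch below. The path names, latencies, rates and thresholds are purely illustrative, not the project's actual algorithm:

```python
def pick_path(paths, latency_sensitive, urgency):
    """Choose an access path given per-path latency (ms), rate (Mbps),
    remaining monthly quota (MB, None = unlimited) and the request's needs.
    urgency is a 0..1 score for how soon the content is needed."""
    def usable(p):
        return p["quota_mb"] is None or p["quota_mb"] > 0
    candidates = [p for p in paths if usable(p)]
    if latency_sensitive:
        # interactive traffic: minimise latency regardless of quota
        return min(candidates, key=lambda p: p["latency_ms"])
    if urgency < 0.5:
        # background traffic (e.g. pre-buffering video): prefer
        # uncapped paths, even if they are slower
        uncapped = [p for p in candidates if p["quota_mb"] is None]
        if uncapped:
            return min(uncapped, key=lambda p: p["latency_ms"])
    # otherwise take the fastest usable path
    return max(candidates, key=lambda p: p["rate_mbps"])

paths = [
    {"name": "adsl", "latency_ms": 30, "rate_mbps": 2, "quota_mb": None},
    {"name": "mobile", "latency_ms": 60, "rate_mbps": 8, "quota_mb": 500},
    {"name": "satellite", "latency_ms": 600, "rate_mbps": 20, "quota_mb": 1000},
]
print(pick_path(paths, latency_sensitive=False, urgency=0.2)["name"])  # adsl
print(pick_path(paths, latency_sensitive=False, urgency=0.9)["name"])  # satellite
```

A fuller version would also weigh cost per megabyte and how much of the billing month remains, as discussed above.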
Anyway, the combined presentations are embedded below and you can download them from the BATS project page here:
Saturday, 16 May 2015
I saw the above picture recently on Twitter. While it's great to see how connected our future homes and even cities will be, it will be interesting to see which technologies are used to connect these devices.
Cambridge Wireless had a smart homes event last month, there were some interesting presentations that I have detailed below.
The first of these technologies is LoRa. As can be seen, it's billed as the ultimate long-range (10 miles) and low-power (10-year battery lifetime) technology. It uses spread spectrum, making it robust to channel noise. Here is the presentation:
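The range/battery trade-off falls out of LoRa's spreading factor: higher spreading factors improve sensitivity (hence range) at the cost of bit rate. Using the commonly cited approximation for the LoRa PHY bit rate (spreading factor bits per chirp, BW/2^SF chirps per second, scaled by the FEC coding rate):

```python
def lora_bitrate(sf, bw_hz=125_000, coding_rate=4/5):
    """Approximate LoRa PHY bit rate: SF bits per chirp,
    bw_hz / 2**sf chirps per second, scaled by the coding rate."""
    return sf * (bw_hz / 2**sf) * coding_rate

for sf in (7, 12):
    print(f"SF{sf}: {lora_bitrate(sf):.0f} bit/s")
# SF7: 5469 bit/s, SF12: 293 bit/s
```

Going from SF7 to SF12 in a 125 kHz channel drops the rate from roughly 5.5 kbit/s to under 300 bit/s, which is why long-range LoRa links suit small, infrequent sensor messages.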
The next technology is Zigbee 3.0. According to Zigbee Alliance:
The new standard unifies ZigBee standards found in tens of millions of devices delivering benefits to consumers today. The ZigBee 3.0 standard enables communication and interoperability among devices for home automation, connected lighting, energy efficiency and other markets so more diverse, fully interoperable solutions can be delivered by product developers and service providers. All device types, commands, and functionality defined in current ZigBee PRO-based standards are available to developers in the new standard.
ZigBee 3.0 defines the widest range of device types including home automation, lighting, energy management, smart appliance, security, sensors, and health care monitoring products. It supports both easy-to-use DIY installations as well as professionally installed systems. Based on IEEE 802.15.4, which operates at 2.4 GHz (a frequency available for use around the world), ZigBee 3.0 uses ZigBee PRO networking to enable reliable communication in the smallest, lowest-power devices. Current ZigBee Certified products based on ZigBee Home Automation and ZigBee Light Link are interoperable with ZigBee 3.0. A complete list of standards that have been merged to create ZigBee 3.0 can be seen on the website at www.ZigBee.org.
“The ZigBee Alliance has always believed that true interoperability comes from standardization at all levels of the network, especially the application level which most closely touches the user,” said Tobin J. M. Richardson, President and CEO of the ZigBee Alliance. “Lessons learned by Alliance members when taking products to market around the world have allowed us to unify our application standards into a single standard. ZigBee 3.0 will allow product developers to take advantage of ZigBee’s unique features such as mesh networking and Green Power to deliver highly reliable, secure, low-power, low-cost solutions to any market.”
Finally, we have Bluetooth Smart mesh.
CSRmesh enables Bluetooth® low energy devices not only to receive and act upon messages, but also to repeat those messages to surrounding devices thus extending the range of Bluetooth Smart and turning it into a mesh network for the Internet of Things.
While the CW event was not able to cover all possible technologies (and believe me, there are loads of them), there are other popular contenders. Cellular IoT (CIoT) is one of them. I have blogged about LTE Cat-0 here and 5G here.
A new IEEE Wi-Fi standard, 802.11ah, using the 900MHz band has been in the works and will address the need for connectivity for large numbers of things over long distances. A typical 802.11ah access point could associate more than 8,000 devices within a range of 1 km, making it ideal for areas with a high concentration of things. The Wi-Fi Alliance is committed to getting this standard ratified soon; with it, Wi-Fi has the potential to become a ubiquitous standard for IoT. See also this article by Frank Rayal on the topic.
Finally, there is SIGFOX. According to their website:
SIGFOX uses a UNB (Ultra Narrow Band) based radio technology to connect devices to its global network. The use of UNB is key to providing a scalable, high-capacity network, with very low energy consumption, while maintaining a simple and easy to rollout star-based cell infrastructure.
The network operates in the globally available ISM bands (license-free frequency bands) and co-exists in these frequencies with other radio technologies, but without any risk of collisions or capacity problems. SIGFOX currently uses the most popular European ISM band on 868MHz (as defined by ETSI and CEPT) as well as the 902MHz in the USA (as defined by the FCC), depending on specific regional regulations.
Communication on SIGFOX is secured in many ways, including anti-replay, message scrambling, sequencing, etc. The most important aspect of transmission security is however that only the device vendors understand the actual data exchanged between the device and the IT systems. SIGFOX only acts as a transport channel, pushing the data towards the customer's IT system.
An important advantage provided by the use of the narrow band technology is the flexibility it offers in terms of antenna design. On the network infrastructure end it allows the use of small and simple antennas, but more importantly, it allows devices to use inexpensive and easily customizable antennas.
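To get a feel for how constrained such ultra-narrow-band links are (Sigfox uplink frames are commonly cited as carrying at most 12 payload bytes), here is a sketch of packing a sensor reading into a compact payload; the field layout and scaling are illustrative assumptions, not Sigfox's own format:

```python
import struct

def pack_reading(temp_c, humidity_pct, battery_mv):
    """Pack a sensor reading into a compact big-endian payload:
    temperature as hundredths of a degree (signed 16-bit),
    humidity as a whole percent (8-bit), battery in millivolts (16-bit)."""
    payload = struct.pack(">hBH", round(temp_c * 100), humidity_pct, battery_mv)
    assert len(payload) <= 12  # must fit a 12-byte UNB uplink frame
    return payload

frame = pack_reading(21.37, 48, 3300)
print(len(frame), frame.hex())  # 5 0859300ce4
```

Five bytes leaves room to spare, but anything verbose (JSON, text) is out of the question at this frame size, which is why UNB devices send pre-agreed binary layouts.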
Sigfox is also working on project Mustang, a three-year effort to build a hybrid satellite/terrestrial IoT (internet of things) network. According to Rethink Research:
The all-French group also contains aerospace firm Airbus, research institute CEA-Leti and engineering business Sysmeca. The idea is to use Sigfox as the terrestrial data link, with satellite backhaul and connections to planes and boats provided by a low-earth orbit (LEO) satellite constellation.
The satellite link could be added to either the end devices or the base station, so that if a device was unable to connect to the terrestrial Sigfox network, it could fall back to the satellite.
While the power requirements for this would be prohibitive for ultra-low power, battery-operated devices, for those with a wired power supply and critical availability requirements (such as smart meters, alarms, oil tankers and rigs) the redundancy would be an asset. These devices may transmit small amounts of data but when they do need to communicate, the signal must be assured.
The Sigfox base station could be fitted with a satellite uplink as a primary uplink as well as a redundancy measure in some scenarios where terrestrial network reach cannot be achieved. With a three-year lifecycle, Mustang’s participants are looking to create a seamless global network, and note that the planned dual-mode terrestrial/satellite terminal will enable switching between the two channels in response to resource availability.
The group says that the development of this terminal modem chipset is a priority, with later optimization of the communication protocols being the next step before an application demonstration using an airplane.
The project adds that the full potential of the IoT can only be achieved by offering affordable mobile communications at a global scale and reach. Key to this is adapting existing networks, according to the group, which explains why Sigfox has been chosen – given that the company stresses the affordability of its system.
Saturday, 28 March 2015
Last week at work we released a report titled "UK Spectrum Usage & Demand". The only time most people hear about spectrum is when auctions are going on; often a small chunk of spectrum gets sold off for billions of dollars or pounds, and that surely makes headlines. As I recently found out, 50% of the spectrum in the UK is shared and 25% is licence-exempt.
This first edition of the report focuses on Public Mobile, Utilities, Business Radio and Space/Satellites. Space is becoming an important area of focus here as it is a significant contributor to the UK economy.
Anyway, the report is embedded below and is available to download from here:
Sunday, 15 March 2015
Some of you may remember that a couple of years back Ericsson showed an example of using LTE in extreme conditions. The video below shows that LTE can work in these scenarios.
Various acronyms are used for this type of communication; the most common are Direct Air-to-Ground Communications (DA2GC), Air-to-Ground (A2G) and Ground-to-Air (G2A).
While LTE or any cellular technology (see my post on Flying Small Cells) may be a good option for short-distance communication, a complete solution, including communication over the sea, would require satellite connectivity as well. As I have mentioned in a previous blog post, 75Mbps connectivity will soon be possible with satellites.
Those interested in the workings of air-ground-air communications will find the presentation below useful. A much more detailed ECC CEPT report from last year is available here.
The next challenge is to explore whether LTE can be used for Mission Critical Air Ground Air communications. 3GPP TSG RAN recently conducted study on the feasibility and the conclusions are as follows:
- Air-to-Ground communications can be provided using the LTE standards (rel-8 and beyond depending on the targeted scenarios).
- 3GPP UE RF requirements might need to be adapted
- It may be possible to enhance the performance of the communications with some standards changes, but these are in most cases expected to be non-fundamental optimizations
- Engineering and implementation adaptations are required depending on the deployment scenario. In particular, the ECC report comments that, from an implementation point of view, synchronization algorithms have to be modified compared to terrestrial mobile radio usage in order to cope with the high Doppler frequency shift of the targeted scenario. In addition, some network management adaptations might be needed. From an engineering perspective, the ground base station antenna has to be up-tilted to cover aircraft heights of up to 12 km above ground. It is also expected that inter-site distances would be dominated by the altitudes to be supported.
- A2G technology using legacy LTE has been studied and successfully trialled covering different kinds of services: surfing, downloading, e-mail, Skype video, audio applications and video conferencing. Related results can be found in several documents from the ECC and from companies. The trials generally assumed dedicated spectrum, with communications in the aircraft cabin using Wi-Fi or GSMOBA standards while LTE is used for the broadband Direct-Air-to-Ground connection between the aircraft station and the ground base station.
- It is understood that it is possible to operate A2G communications over spectrum that is shared with ground communications. However, due to interference it is expected that the ground communications would suffer from capacity losses depending on the deployment scenario. Therefore, it is recommended to operate A2G communication over a dedicated spectrum.
- It can be noted that ETSI studies concluded that spectrum above 6 GHz is not appropriate for such applications.
- LTE already provides solutions for seamless mobility between cells. Cells intended for terrestrial UEs and cells intended for A2G UEs might operate on different frequencies.
- Cell range in LTE is limited by the maximum timing advance (around 100km). Larger ranges could be made possible by means of implementation adaptations.
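The two numbers behind these conclusions, the roughly 100 km timing-advance limit and the Doppler problem, can be checked with a back-of-envelope calculation. The carrier frequency (2.6 GHz) and aircraft speed (1,000 km/h) below are illustrative assumptions:

```python
C = 299_792_458  # speed of light, m/s

# Maximum LTE cell range implied by the timing advance: the 11-bit TA
# field in the random-access response spans 0..1282 steps of 16 Ts each,
# where Ts = 1/30.72 MHz is the LTE basic time unit.
ts = 1 / 30_720_000
max_delay = 1282 * 16 * ts            # maximum round-trip propagation delay
max_range_km = C * max_delay / 2 / 1000
print(f"max cell range ~{max_range_km:.0f} km")   # ~100 km

# Doppler shift for an aircraft at 1000 km/h heading straight
# toward a 2.6 GHz ground station: fd = (v / c) * fc
v = 1000 / 3.6                         # m/s
fd = v / C * 2.6e9
print(f"Doppler shift ~{fd:.0f} Hz")               # ~2400 Hz
```

A shift of a couple of kilohertz is an order of magnitude beyond what terrestrial LTE receivers normally see, which is why the synchronization algorithms need the modifications mentioned above.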
Friday, 21 November 2014
Came across the following Inmarsat press release:
His presentation is above and the video is as follows; please skip forward to 1:36:00 to watch his part.
Friday, 26 November 2010
To download see: http://www.cambridgewireless.co.uk
Thursday, 1 July 2010
Friday, 23 April 2010
Satellite navigation systems take their location cues from 30 GPS satellites that circle the Earth twice a day transmitting status, date and time, and orbital information. Soon there will be around 100 satellites to lock on to as GPS is joined by global constellations from Europe (Galileo), Russia (GLONASS), and China (Compass).
GPS wasn't built to help us find our way to the shops - it was a Cold War project funded by the US Department of Defense to ensure that nuclear submarines could surface and target their missiles accurately. There are strategic rumblings about the new satellite constellations too, but the current consensus is that civilians have most to gain from more accurate and reliable location and tracking applications. That's if receiver designers can get the power consumption under control.
Russia's GLONASS system used to be famous for its satellites failing faster than they were launched, but since last month it has had 24 functioning satellites in orbit. Meanwhile, Europe's much-delayed Galileo system will have 14 satellites operating by 2014, according to the European Commission, with the full 30 available by 2017. The US GPS system is being modernised to become GPS III by 2013, with additional navigation signals for both civilian and military use. Information about China's Compass system is sketchier - it was going to be a regional system but is now understood to be global.
'All this activity is great news because whatever the application, there will potentially be multiple constellations to get a position fix from, which will help with signal integrity in safety-critical environments such as maritime, aviation or rail, and accuracy for mobile phone users in urban areas,' says Andrew Sage, director of Helios, a consultancy specialising in satellite navigation.
A GPS receiver should be able to 'see' at least four GPS satellites anytime, anywhere on the globe and establish three position coordinates (latitude, longitude, and altitude). But in city streets hemmed in by tall buildings, a receiver is unlikely to be able to detect more than two satellites, and the signals will often have bounced off structures.
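The reason four satellites are needed can be sketched in code: each pseudorange measurement involves four unknowns (three position coordinates plus the receiver's clock bias), so four equations are the minimum for a fix. The following is a toy illustration only, not real GPS processing; the satellite coordinates, receiver position and clock bias are invented numbers, and a simple Gauss-Newton iteration stands in for a production solver.

```python
# Toy GNSS fix: solve for (x, y, z, clock bias b) from four pseudoranges
# rho_i = ||p - s_i|| + c*b. All positions and measurements are made up.
import math

C = 299_792_458.0  # speed of light, m/s

def solve4(A, y):
    """Solve a 4x4 linear system by Gaussian elimination with pivoting."""
    n = 4
    M = [row[:] + [y[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fix_position(sats, pseudoranges, iters=10):
    """Gauss-Newton: iterate linearised corrections to position and bias."""
    x, y, z, b = 0.0, 0.0, 0.0, 0.0  # start the guess at Earth's centre
    for _ in range(iters):
        rows, resid = [], []
        for (sx, sy, sz), rho in zip(sats, pseudoranges):
            d = math.dist((x, y, z), (sx, sy, sz))
            # Jacobian row: partial derivatives of rho w.r.t. (x, y, z, b)
            rows.append([(x - sx) / d, (y - sy) / d, (z - sz) / d, C])
            resid.append(rho - (d + C * b))
        dx, dy, dz, db = solve4(rows, resid)
        x, y, z, b = x + dx, y + dy, z + dz, b + db
    return x, y, z, b

# Four satellites at roughly GPS orbital radius (~26,600 km), invented.
sats = [(26.6e6, 0.0, 0.0), (0.0, 26.6e6, 0.0), (0.0, 0.0, 26.6e6),
        (15.0e6, 15.0e6, 15.0e6)]
truth = (3.9e6, 3.0e6, 4.0e6)   # a point near the Earth's surface
bias = 1e-4                      # a 100-microsecond receiver clock error
rhos = [math.dist(truth, s) + C * bias for s in sats]
print(fix_position(sats, rhos))  # recovers both position and clock bias
```

With fewer than four usable satellites the system is underdetermined, which is exactly the urban-canyon problem the article describes.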
'For the average pedestrian, the position fix can be a long way out and very unpredictable,' says Sage. 'Most users don't see that today because GPS receivers match us to maps and smooth the errors out. But if you are walking around a city and not on a road in a car, multi-path reflections are a problem.'
The more satellites visible from within these 'urban canyons', the easier it is to carry out consistency checks on the received signals. 'Even when you can't isolate the multipath-contaminated signals, the more signals you have, the more your errors average out,' says Dr Paul Groves, lecturer in global navigation satellite systems (GNSS), navigation and location technology at UCL.
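Groves's "errors average out" point is just the statistics of independent errors: the spread of an average of n independent measurements falls roughly as 1/sqrt(n). A quick Monte Carlo sketch makes this concrete; the 10-metre error size is an arbitrary illustrative figure, not a claim about real GNSS error budgets.

```python
# Monte Carlo check: averaging n independent errors shrinks the spread
# of the result by roughly 1/sqrt(n). Error magnitudes are illustrative.
import random
import statistics

random.seed(42)

def position_error_spread(n_signals, trials=5000, sigma=10.0):
    """Std-dev of the mean of n independent errors of sigma metres each."""
    means = [
        statistics.fmean(random.gauss(0, sigma) for _ in range(n_signals))
        for _ in range(trials)
    ]
    return statistics.stdev(means)

for n in (2, 4, 8, 16):
    print(n, round(position_error_spread(n), 2))
# the spread shrinks by ~1/sqrt(2) each time the signal count doubles
```

Doubling the number of visible satellites therefore buys roughly a 30 per cent reduction in this component of the error, which is the statistical case for multi-constellation receivers.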
Better GNSS integrity would enable new applications, such as road-user charging, enforcing bail conditions and pay-as-you-drive insurance. 'Clearly, if position information might be used as legal evidence, it has to be reliable,' says Groves.
The delayed arrival of Galileo and the resurrection of GLONASS have complicated matters for receiver makers. Galileo was designed to offer the simplest possible upgrade path from GPS to a dual-constellation system. Agreements were made to put the carrier frequencies of the main open services in the same part of the spectrum as GPS, at around 1575MHz, so receivers could share the same radio, analogue components and antenna. Both systems also send their signals using a spread-spectrum code-division multiple-access (CDMA) approach. GLONASS uses a frequency-division multiple-access coding technique (FDMA) and a main open-service carrier frequency of 1602.2MHz.
Thursday, 25 June 2009
But imagine that you have a navigation tool or gadget which acts as your own personal travel guide. It has satellite navigation, so when you get into your car it can direct you to where you want to go. It can choose the most carbon-efficient route and make sure you avoid crowded town centres, traffic jams and road works. It can let you know where the next petrol station is, and whether there is an Italian restaurant near your hotel. Before you arrive you will know which of the town car parks have spaces left. And when you've finally parked the car, take your guide with you and it will direct you, on foot, to your final destination.
For anyone who has found themselves stuck in a traffic jam, or has been unable to find a car park in a busy town centre, or has got lost on foot, it sounds too good to be true. Yet the technology to make it happen is already here. So why aren't we all carrying such a device in our pockets?
So why isn't this universal travel widget already at hand? One reason is that several different worlds have to collide and co-operate. First, there is intense competition, and plenty of confusion, over the platforms such a device could be built on. We have proprietary platforms like TomTom and Garmin, and then at least five major mobile phone operating systems.
Competition between these platforms has bred some mutual suspicion, and beyond that, the mobile companies have yet to fully recognize the potential of phones as navigation devices.
You can argue that many mobile phones are already GPS-enabled, but in my opinion this doesn't necessarily make them effective at navigation. Try using your BlackBerry as a navigation device, for instance, and you'll find the battery quickly drains. The mobile phone world is slowly coming to terms with the needs of navigation on mobiles, such as better battery life and bigger screens. In fact, GPS alone doesn't offer the precision needed to navigate pedestrians, and to be useful it needs to be combined with another positioning service such as Wi-Fi. This has been done with the iPhone, for example.
The accuracy and granularity of the data used in satellite navigation systems is critical and has to keep improving. The real problem lies in integration: the data needed to provide a coherent information service to a navigation device is held by different organisations in a number of different places. Some companies provide particular kinds of location-based information, such as ATMs, speed cameras, train times or tourist sites, but to my knowledge no company offers everything.
Combining all that information and delivering it through a single device at a single point in time isn't going to be easy. The challenges are not solely technical: there's a data aggregation problem in bringing it all together, including highway changes and updates from local authorities, and then there's the physical problem of gathering it all up in the first place.
Even if the above issues are solved, a major part of the problem remains: revenue. How would one make money out of an integrated satnav device? There's a difference between what can be done technically and a viable product that can be sold. How do you turn that into something that fits into a business model?
Organisations that hold valuable data rarely want to give it away for free, and licences to reuse commercial or government mapping data are expensive. Similarly, there is no incentive for the Highways Agency or local authorities, for example, to share information about traffic conditions. Even the government website Transport Direct, which provides free up-to-date transport information, has restrictions on the integration of its content with other services.
By now you can see the scale of the problem. It's a mammoth task to bring all this information together in one place: everyone wants their pound of flesh, because everyone has developed their own data infrastructure, and it's very difficult to get them all to agree.
In spite of all these hiccups, I hold the opinion that demand for an all-in-one travel service almost certainly exists. People simply want an integrated device that works across different contexts: at home, at work, and on the move.
It's evident from the above that the emergence of a genuinely integrated solution will depend either on a government initiative forcing public sector organisations such as the Highways Agency, Transport for London and local authorities to collaborate, or on a private sector organisation taking a commanding lead in developing location technologies.
Google is one such company, steadily putting the pieces in place through a whole series of initiatives. The best example is Google Maps, now readily available on all major mobile platforms and integrated with traffic data from the Highways Agency. On top of this, the Google Maps application programming interface (API) allows third parties to build their own applications.
Google is undoubtedly leading by example in its initiatives to serve customers as well as possible. It clearly knows what customers want, which is no small achievement in the current economic climate.
Location has always been a fundamental framework for our lives, and we will inevitably embrace tools that help us manage it. I envisage a society in 20 years' time revolutionised by ready access to all this location-based information.
Wednesday, 17 December 2008
Advances in information technology are fundamentally changing the way military conflicts are resolved. The ability to transmit detailed information quickly and reliably to and from all parts of the globe will help streamline military command and control and ensure information superiority, enabling faster deployment of highly mobile forces capable of adapting quickly to changing conditions in the field. Satellite communications play a pivotal role in providing the interoperable, robust, "network-centric" communications needed for future operations.
Military satellite communications (or milsatcom) systems are typically categorized as wideband, protected, or narrowband. Wideband systems emphasize high capacity. Protected systems stress antijam features, covertness, and nuclear survivability. Narrowband systems emphasize support to users who need voice or low-data-rate communications and who also may be mobile or otherwise disadvantaged (because of limited terminal capability, antenna size, environment, etc.).
For wideband communication needs, the Wideband Gapfiller Satellite program and the Advanced Wideband System will augment and eventually replace the Defense Satellite Communications System (DSCS). These satellites will transmit several gigabits of data per second—up to ten times the data flow of the satellites being replaced. Protected communications will be addressed by a global extremely high frequency (EHF) system, composed of the Advanced Extremely High Frequency System and Advanced Polar System. These systems are expected to provide about ten times the capacity of current protected satellites (the Milstar satellites). Narrowband needs are supported by the UFO (Ultrahigh-frequency Follow-On) constellation, which will be replaced by a component of the Advanced Narrowband System.
The EHF band (30-300 GHz) is a relatively lightly used part of the electromagnetic spectrum, and for good reason: atmospheric attenuation is the biggest problem faced in this band, especially around 60 GHz. The frequencies are nevertheless viable for short-distance terrestrial communication links, such as microwave Internet and telecommunication links, which already operate in this band. Millimetre-wave radar, probably best known as the radar that can see through your clothes but not your skin, also operates here.
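The rough arithmetic behind the 60 GHz problem can be sketched with the standard free-space path loss formula. This is a back-of-envelope illustration, not a link budget: the ~15 dB/km figure for sea-level oxygen absorption near 60 GHz and the ~5 km effective dense-atmosphere depth are approximate textbook values, and the distances are illustrative.

```python
# Free-space path loss (FSPL) in dB: 20*log10(4*pi*d*f/c).
# Used here to compare a GEO satellite path with a short terrestrial
# hop at 60 GHz. Atmospheric figures in comments are approximate.
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB for a given distance and frequency."""
    c = 299_792_458.0
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

geo = 35_786e3  # GEO altitude in metres
print(round(fspl_db(geo, 60e9), 1))    # ~219 dB of free-space loss alone
# On top of that, a vertical path through the dense lower atmosphere
# (~5 km effective) at roughly 15 dB/km adds tens of dB more near the
# 60 GHz oxygen absorption peak, which is why satellite links avoid it.
print(round(fspl_db(1_000, 60e9), 1))  # a 1 km terrestrial hop: ~128 dB
```

The same absorption that wrecks a ground-to-space link is harmless over a satellite-to-satellite path, where there is no atmosphere at all, which is why the article notes that inter-satellite links are where EHF shines.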
An increasing number of military and civil satellite systems are using this band for uplink, downlink and inter-satellite communication, designed to avoid the problematic frequencies that are most susceptible to attenuation while accepting an increased overall atmospheric loss. Inter-satellite communication is really where EHF equipment shines: no atmosphere, small antennas, high data rates.
Civilian systems currently operate around the Ku band (Intelsat), providing data rates of up to 2-4 Mbps (14 GHz uplink, 12 GHz downlink); however, these rates have yet to trickle down into everyday users' hands for remote and mobile Internet access. More commonly, an aggregator buys access to such a link and portions out local Internet access from it. Systems like this serve remote Australian territories such as the Cocos and Christmas Islands, and formed the backbone of Boeing's stillborn Connexion in-flight Internet service. High ongoing access costs (essentially a share of the overall cost of the satellite) and limited access slots keep the technology away from everyday use for now. Militaries and governments around the globe also lease access on these circuits when they need the added capability, with Intelsat and Inmarsat systems being used in the first Gulf War.
Advanced EHF is designed to provide 24-hour coverage from 65° North to 65° South across the K and Ka sub-bands, and, when combined with the prototyped Extended Data Rate (XDR) terminals and systems, will offer data rates of up to 8.2 Mbps for around 4,000 terminals in concurrent use per satellite footprint (whether that scales to 12,000 terminals in concurrent use globally isn't clear from the source material).
Within the tri-satellite constellation, inter-satellite EHF links will allow terminals on opposite sides of the globe to communicate in near real-time without the use of a terrestrial link. Combined with smaller, directional antennas and the various options for anti-jamming technology, it represents a significant military capability for the US.
Plans are already being drawn up for the Transformational Satellite Communications System (T-Sat), which will replace Advanced EHF starting sometime in 2013, although it is already facing funding troubles. This could be problematic, with Advanced EHF still struggling to reach full capability and its final launch not scheduled until April 2010. Dropping the fourth satellite of the Advanced EHF constellation has been planned to give the USAF time to implement T-Sat more rapidly.
If GPS and remote imaging (think Google Earth) have proven anything, it is that technology initially developed for military purposes, and extremely expensive for initial civil use, will eventually reach the point where it forms part of our daily lives without us ever being conscious of the massive investment to get to that point.