Tuesday 26 August 2008

Data revenues can go even higher

Over the past few years the telecoms world has seen a number of emerging technologies with a great degree of innovation, coupled with intensive research. In 1999, when I completed my engineering degree in electronics and communication, most of the telecom companies in India were only interested in GSM/GPRS. At that time I had no idea how the technology would evolve, as it has over the years.

Initially it was GPRS, which then led to the major shift towards 3G. Once 3G found its place, further developments were carried out to improve the user experience while on the move. This sole idea of giving the user the best led to the emergence of new technologies like HSDPA, HSUPA, HSPA+ and LTE. Clearly, improved data rates were the key factor behind the introduction of each technology from GPRS to LTE.

Everybody in the industry realised that if they were to win customers and compete with fixed technology, they would have to provide better data rates while the user is on the move. To this day, the vendors and operators, together with 3GPP, have worked very hard to come forward with technologies like HSPA+ which can serve plenty of megabits per second to users.
Although the wireless operators insist that we are still in the early stages of wireless data adoption, data revenues already play a major part in companies' overall revenues. When operators recently announced their Q2 results, they reported that data revenues account for nearly 25 percent of their average revenue per user (ARPU). Verizon Wireless, the data leader in the US, is the prime example, with 24.4 percent of its $51.53 ARPU coming from data. AT&T is a close second, with 22.9 percent of its ARPU coming from data.
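As a quick sanity check of those figures, the dollar value of data ARPU follows directly from the reported percentages (illustrative arithmetic only, using the numbers quoted above):

```python
# Rough check of the data-ARPU figures quoted above (illustrative arithmetic only).
verizon_total_arpu = 51.53   # reported total ARPU in USD per subscriber per month
verizon_data_share = 0.244   # 24.4% of ARPU reported as coming from data

data_arpu = verizon_total_arpu * verizon_data_share
print(f"Verizon data ARPU: ${data_arpu:.2f} per subscriber per month")
# -> roughly $12.57 of the $51.53 ARPU comes from data services
```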

During the earnings calls with analysts, both of the above operators, together with the likes of Vodafone, talked about the continued growth potential for data. There is a clear trend of operators leaving no stone unturned in order to provide the highest possible data rates to their users. Operators are working feverishly to upgrade their networks, and competition for the better user experience is intensifying. Young users and businesses, each demanding high data rates for their own reasons, are the main targets for the companies.

These days one can easily get access to mobile broadband for a reasonable monthly payment. There are many competitive deals available in the market to lure customers into browsing and emailing while on the move. There is no doubt that operators are successfully adding customers and thereby increasing their revenues, much of which is generated by data use. Verizon's data revenue grew 45 percent year over year. AT&T's data revenue grew 52 percent year over year. Vodafone's and T-Mobile's data revenue, too, grew by more than 50% over the last couple of years. But I'm wondering how high data revenues can really climb. Is this strong growth rate sustainable?

Executives from telecom giants like Vodafone and AT&T predict there is still much more growth to come as consumers upgrade to integrated devices and smartphones that can take advantage of the 3G network. The companies say that nearly 20% of their customers have either upgraded or are in the process of upgrading to an integrated device. Meanwhile, Verizon recently said that 60 percent, or 40.5 million, of its retail customers have upgraded to 3G data-capable devices.

Analysts believe that the likes of Vodafone, T-Mobile, AT&T and Verizon are the clear leaders in monetizing data, and that they will continue to lead the industry in data ARPU as they increase the number of data applications and data-centric devices.

I think the key to sustaining this growth rate lies not in the number of data-capable devices in consumers' hands but in the availability of compelling data applications at reasonable price points. Without a continued push for better, more user-friendly applications, data revenues will not be able to sustain their current growth trajectory.

Vodafone, for example, is already taking the necessary steps in that direction and is looking to boost the usage of 3G data. Vodafone has announced an agreement with the laptop vendor Lenovo that will see its new X200 computer pre-installed with a Vodafone SIM and supporting software. The broadband connectivity comes at no extra cost and, when activated by the purchaser of the laptop, will offer the user a 30-day free trial. "The connection manager will ask for your name and email, but no bank details," said Alec Howard, head of PC connectivity at Vodafone. "Users will be prompted to take out a contract at the end of the free trial and the prices are around £12 a month for broadband, with automatic roaming in Europe at £8.50 a day." But like any other product, this one also has a disadvantage. Users dissatisfied with the Vodafone service will struggle if they want to connect to another mobile operator: they will have to install a new SIM and download and configure a new connection manager, since the built-in software only works with Vodafone.

I certainly believe that this embedded 3G initiative will significantly lower the cost of built-in mobile broadband technology across the entire range of laptops. I have myself used a Dell laptop with an embedded data card, where you just insert the right SIM and then connect to wireless broadband with the help of a connection manager. I think embedded modems are cool and fun to use. There is no doubt in my mind that the use of embedded modems for mobile broadband connectivity is set to increase rapidly in the next few years, with sales estimated to grow at a rate of well above 80 per cent from 2008 to 2012. Laptops with an embedded modem are one of the data applications which will enhance the user experience and hence lead to an increase in ARPU.

Most vendors are also working to enhance their handset architectures with richer multimedia functionality. Nokia surprised analysts with better-than-expected second-quarter earnings. Nokia thinks in line with some of the operators and firmly believes that the global handset market could grow by more than its previous estimate of 10 percent in 2008.

For its new devices, Nokia has concentrated a lot on the services front, enriching customers with handsets supporting next-generation multimedia services, e.g. a music service supported by Sony BMG Entertainment. The Nokia Music Store is now available in 10 markets and the company expects to have 14 stores open by year-end. In addition, the N-Gage mobile games service, which became available during the quarter, has had more than 406,000 downloads.

So, in my view, if companies are as innovative as Nokia has been, there is every chance of pushing data rates to new highs. Vendors, with excellent architectures and a rich set of data applications, can definitely push data throughput and hence contribute to higher data revenues.

As each day passes we are seeing new handsets with amazing designs and new architectures. These handsets are designed to perform faster and can support very high data rates. Today's youth can play online games on these devices, watch live TV, send and receive multimedia messages and do much more. Business users can exchange email while on the move. The rollout of HSPA+ by vendors will further enhance the data user experience. The number of HSPA subscribers is growing many times over, at a rate of around 4 million subscribers a year. Companies like AT&T are aggressive with their HSPA rollout plans, and AT&T looks set to roll out HSPA together with the 3G iPhone.

Very high speeds of up to 20 Mbps in a 5 MHz channel have already been achieved with HSPA, and Qualcomm is one of the many to have demonstrated this.

At the moment things look very promising, and I strongly believe the industry will keep coming up with bright ideas to generate increased data revenues. LTE is another step towards more revenue generation, with an enhanced user experience in view. Let's see how high data throughput, and with it data revenues, will go.

Sunday 24 August 2008

Need for femtocells -> from YouTube

Someone sent me this link from YouTube. Even though it is more of a marketing presentation from Soundpartners for a market report, it gives a good idea, from the operators' point of view, of why they will be looking at femtocells to strengthen their market position and as a way of optimising their networks.

Friday 22 August 2008

802.11n and 4G...

IEEE 802.11n is a proposed amendment to the IEEE 802.11-2007 wireless networking standard to significantly improve network throughput over previous standards such as 802.11b and 802.11g, increasing the maximum raw (PHY) data rate from 54 Mbit/s to 600 Mbit/s. Most devices today support a PHY rate of 300 Mbit/s, using 2 spatial streams in a 40 MHz channel. Depending on the environment, this may translate into a user throughput (TCP/IP) of 100 Mbit/s.
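As a rough sketch of where these headline numbers come from, the 802.11n PHY rate scales with the number of spatial streams, the channel width (which fixes the number of data subcarriers), the modulation order, the coding rate and the guard interval. The snippet below is illustrative only, but it uses the standard 802.11n OFDM parameters and reproduces the commonly quoted 150/300/600 Mbit/s figures:

```python
# Rough 802.11n PHY data-rate calculation (illustrative sketch using the
# standard 802.11n OFDM parameters for the highest MCS per stream).
def phy_rate_mbps(spatial_streams, channel_mhz=40, bits_per_subcarrier=6,
                  coding_rate=5/6, short_gi=True):
    data_subcarriers = {20: 52, 40: 108}[channel_mhz]  # data subcarriers per OFDM symbol
    symbol_time_us = 3.6 if short_gi else 4.0          # symbol duration incl. guard interval
    bits_per_symbol = data_subcarriers * bits_per_subcarrier * coding_rate
    return spatial_streams * bits_per_symbol / symbol_time_us  # Mbit/s

print(phy_rate_mbps(1))  # ~150 Mbit/s: one stream, 40 MHz, 64-QAM 5/6, short GI
print(phy_rate_mbps(2))  # ~300 Mbit/s: the typical "2 spatial streams at 40 MHz" device
print(phy_rate_mbps(4))  # ~600 Mbit/s: the headline maximum with 4 spatial streams
```

In practice the usable TCP/IP throughput is well below the PHY rate (roughly a third of it, which matches the ~100 Mbit/s figure above).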

According to the book "Wi-Fi, Bluetooth, ZigBee and WiMax":

802.11n is the fourth generation of wireless LAN technology:
  • First generation (IEEE 802.11) since 1997 (WLAN/1G)
  • Second generation (IEEE 802.11b) since 1998 (WLAN/2G)
  • Third generation (802.11a/g) since 2000 (WLAN/3G)
  • Fourth generation (IEEE 802.11n) (WLAN/4G)

The distinguishing features of 802.11n are:

  • Very high throughput (some hundreds of Mbps)
  • Long distances at high data rates (a range equivalent to IEEE 802.11b, but at 500 Mbps)
  • Use of robust technologies (e.g. multiple-input multiple-output [MIMO] and space-time coding).

In the N option, the real data throughput is estimated to reach a theoretical 540 Mbps (which may require an even higher raw data rate at the physical layer). It should be up to 100 times faster than IEEE 802.11b and well over ten times faster than IEEE 802.11a or IEEE 802.11g. IEEE 802.11n will probably also offer a better operating distance than current networks. IEEE 802.11n builds upon previous IEEE 802.11 standards by adding MIMO. MIMO uses multiple transmitter and receiver antennas to allow for increased data throughput through spatial multiplexing, and increased range by exploiting spatial diversity and powerful coding schemes. The N system is strongly based on the IEEE 802.11e QoS specification to improve bandwidth performance. The system supports baseband widths of 20 or 40 MHz.

Note that both the 802.11n PHY and the 802.11n MAC enhancements are required to achieve 540 Mbps.

To achieve maximum throughput a pure 802.11n 5 GHz network is recommended. The 5 GHz band has substantial capacity due to many non-overlapping radio channels and less radio interference as compared to the 2.4 GHz band. An all-802.11n network may be impractical, however, as existing laptops generally have 802.11b/g radios which must be replaced if they are to operate on the network. Consequently, it may be more practical to operate a mixed 802.11b/g/n network until 802.11n hardware becomes more prevalent. In a mixed-mode system, it’s generally best to utilize a dual-radio access point and place the 802.11b/g traffic on the 2.4 GHz radio and the 802.11n traffic on the 5 GHz radio.


A lot of phones now come with built-in WiFi (802.11a/b/g), and WiFi is a must on laptops or they won't sell. The main difference in 802.11n, compared to previous generations of 802.11, is the presence of MIMO. The 802.11 family uses OFDM, which is the same technology being adopted by LTE. New LTE handsets will have the advantage of easily integrating this 802.11n technology, and the same antennas can be reused. In fact, the same applies to WiMAX, as it supports MIMO and OFDM. Of course, we will have problems if they use quite different frequencies, as antennas are optimised for a range of frequencies; this is something that remains to be seen.

In the news:

MIT and a medical center based in Alabama are beginning to deploy faster wireless 802.11n access points from Cisco Systems Inc. In more than 100 buildings on MIT's Cambridge, Mass., campus, as many as 3,200 access points running older 802.11a/b/g protocols will be replaced with 802.11n devices in the next 12 to 16 months, said Chris Murphy, a networking engineer at the university. Murphy said MIT, with more than 10,000 students and 11,000 staff members, has a "very, very wide variety" of client devices, from handhelds to laptops. Many of the laptops probably support the 802.11n protocol, he said. Some MIT staffers have been using voice-over-IP wireless handsets and have experienced poor coverage with the older Wi-Fi technology, but they said they have had full signal strength within the range of the new 802.11n access points, he added. With 802.11n, the university could eventually provide IP television, which requires a lot of bandwidth, Murphy said.

Using 802.11n technology, Lapham said he was able to transmit a gigabyte of data in less than two minutes. Currently, the 370-bed medical center has about 450 access points on older protocols. Devices used on the wireless network include 180 laptops, which are used primarily for transmitting bedside patient data. The hospital also supports 100 VoIP wireless phones and various medical devices.

Wi-Fi is expected to be available in 99 per cent of North American universities by 2013, according to research released by industry analyst ABI Research this week. Much of that penetration will be in the form of 802.11n equipment: higher education is clearly the number one market for early adopters of 802.11n, the company said.

"ABI Research expects 802.11n uptake – which is today fairly small in the education market – to ramp up steeply to quite a high rate of penetration," said ABI Research vice president Stan Schatt. There are several reasons for this. ABI said many students now take a campus Wi-Fi network as a given, and many of their shiny new laptops will be 'n'-compatible. Universities also have great bandwidth demands, as lecture halls may need to serve a large number of users with multimedia content at any given time, and 802.11n's greater speed and capacity can address that need. Moreover, said Schatt, "Universities are breaking new ground by using video over Wi-Fi in a number of innovative ways. This is driving the adoption of high speed 802.11n. Students in the near future (at least the diligent ones) will be just as likely to watch their favourite professor's lectures on their laptops as they will be to view 'America's Next Top Model'."


Thursday 21 August 2008

Revised paper on “4G” by 3G Americas

3G Americas have published a revised paper on Defining “4G”: Understanding the ITU Process for IMT-Advanced.

3G Americas initially created this white paper one year ago to provide a clear understanding of the work in progress at the ITU, the sole organization responsible for determining the specifications for IMT-Advanced. The current paper covers the considerable progress since made by the ITU in establishing a basis for what should be included in an IMT-Advanced system.


While speculation about 4G technologies has been going on, the ITU is close to releasing a full set of documentation for this definition. It has held ongoing consultations with the global community over many years on this topic in Working Party 8F, under the scope of a work item known as Question ITU-R 229-1/8, "Future development of IMT-2000 and systems beyond IMT-2000." Following a restructuring of ITU-R at the end of 2007, this work is being addressed under the new Study Group 5 umbrella (replacing the former Study Group 8) by Working Party 5D, the new name for the former WP 8F.

This work in WP 8F, and now WP 5D, has woven together a definition, recipe and roadmap for the future beyond 3G that comprises a balance among a Market and Services View, a Technology View and a Spectrum View. These, along with regulatory aspects, are the key elements for business success in wireless.

By mid-2008, ITU-R had advanced beyond the vision and framework and developed a set of requirements by which technologies and systems can, in the near future, be determined to be part of IMT-Advanced and, in doing so, earn the right to be considered 4G.

During 2008 and through 2009, ITU-R will hold an open call for the "first invitation" of 4G (IMT-Advanced) candidates. Subsequent to the close of the submission period for the "first invitation", an assessment of those candidate technologies and systems will be conducted under the established ITU-R process, guidelines and timeframes for this IMT-Advanced "first invitation". The culmination of this open process will be a 4G, or IMT-Advanced, family. Such a 4G family, in adherence to the principles defined for acceptance into this process, is globally recognised as one that can grow to include all aspects of a marketplace that will arrive beyond 2010, thus complementing and building upon an expanding and maturing 3G business.

The paper is available to download from here.

The ITU-R Radiocommunication Bureau has established an "IMT-Advanced" web page (http://www.itu.int/ITU-R/go/rsg5-imt-advanced/) to facilitate the development of proposals and the work of the evaluation groups. The IMT-Advanced web page provides details of the process for the submission of proposals, and will include the RIT and SRIT submissions, evaluation group registration and contact information, evaluation reports and other relevant information on the development of IMT-Advanced.

Wednesday 20 August 2008

Ofcom's 2008 Comms Market report

Dean Bubley posted this on Forum Oxford and I thought it was worth spreading around.


Ofcom's just released a huge new report on the current state of the industry, incorporating telecoms, broadcasting and related services. Some interesting statistics:
  • Quite a lot of discussion of the resilience of fixed-line comms in the face of the mobile onslaught. Rather than direct fixed-mobile substitution, it appears that the UK sees more mobile-initiated incremental use of voice. Fixed minutes have dropped about 17bn minutes in total over 6 years, but mobile call volumes have risen by 38bn minutes. The UK outbound call total is still around 60/40 fixed:mobile, and 88% of homes still have a fixed line.
  • The proportion of mobile-only households has been pretty static for the past few years, currently at 11%. This is considerably lower than elsewhere in Europe (eg 37% in Italy), and is possibly reflecting the prevalence of ADSL. Most mobile-only users are from lower socioeconomic groups.
  • 44% of UK adults use SMS daily, against 36% using the Internet
  • More than 100k+ new mobile broadband connections per month in the UK in H1 2008, with the rate of sign-up accelerating. 75% of dongle users are now using their mobile connection at home.
  • Nearly half of adults with home broadband use WiFi
  • 11% of UK mobile phone owners use the device to connect to the Internet, and 7% use it to send email.
  • VoIP usage appears to have fallen from 20% of consumers in late 2006, to 14% in early 2008. However, I suspect that this masks the fact that many instances of VoIP (eg BT's broadband circuit-replacement service, or corporate IP-PBXs), don't make it obvious to the user.
  • Over two-thirds of mobile broadband users also have fixed-line broadband
  • UK mobile subscribers send an average 67 SMS per month (or 82 / month per head, taking account of multiple subs-per-person). MMS use is only 0.37 messages per user per month.
  • Slight increase in overall fixed-line subscriptions in 2007 - attributed to business lines.
  • Overall UK non-SMS mobile data revenues were flat in 2007 vs 2006 at £1bn. I reckon that's because the data pre-dates the big rise in mobile dongle sales, and also reflects price pressures on things like ringtones. Ofcom also attributes this to adoption of flatrate data plans vs. pay-per-MB.
  • UK prepay mobile ARPU has been flat at £9 / month for the last 4 years. That's a big issue for operators wanting to sell data services to prepay subs in my view.
  • 17% of mobile subscriptions in the UK were on 3G at end-2007, although there's not much detail on the actual usage of 3G for non-voice applications.
  • Overall, UK households allocate 3.3% of total spending to telecom services. That's been flat since 2003 - ie the slice of the pie isn't getting any bigger relative to food/rent/entertainment/travel etc.
  • 94% of new mobile subscriptions are bundled with handsets.
  • 11% of UK adults have >1 SIM card. Among 16-24yo users, this rises to 16%. There's an estimate that of the second devices in use in the UK, 1m are 3G dongles, 0.7m are BlackBerries or similar, and 8m are genuine "second handsets". There are also another 8m "barely active" devices that are used as backups, or legacy numbers that get occasional inbound calls or SMS.

Some other interesting key points that are available here:

  • Communications industry revenue (based on the elements monitored by Ofcom) increased by 4.0% to £51.2bn in 2007, with telecoms industry revenue the fastest growing component, up 4.1% on the year.
  • Mobile telephony (including an estimate for messaging) accounted for 40% of the total time spent using telecoms services, compared to 25% in 2002. However, much of this growth has come about as a result of an increase in the overall number of voice call minutes (from 217 billion in 2002 to 247 billion in 2007) rather than because of substitution away from fixed voice, which still accounted for 148 billion minutes last year, down only 10% from 165 billion minutes in 2002.
  • The most popular internet activity among older people is ‘communication’ (using email, instant messaging and chat rooms for example); 63% of over-65s say they communicate online, compared to 76% of all adults.
  • The majority of children aged 5-7 have access to the internet and most children aged 8-11 have access to a mobile phone. Children are more likely to use the internet for instant messaging than for email.
  • Television is particularly important to older people. Sixty-nine per cent of those aged 65-74 say it is the media activity that they would miss most (compared to 52% of all adults) and this rises to 77% among the over 75s. Older people are also more likely to say they miss newspapers and magazines – 10% of 65-74s and 7% of over 75s, compared to 5% of all adults.
  • The converged nature of mobile handsets became apparent during 2007, with 41% of mobile phone users claiming to use their handset for taking pictures and 15% uploading photos to their PC. Nearly one in five (17%) also claimed that they used their phone for gaming.

Tuesday 19 August 2008

Nokia Eco Sensor Concept Mobile

Though this is not new, I hadn't seen it anywhere and only found it recently while working on a report.

This visionary design concept is a mobile phone and a compatible sensing device that will help you stay connected to your friends and loved ones, as well as to your health and local environment. You can also share the environmental data your sensing device collects and view other users' shared data, thereby increasing your global environmental awareness.

The concept consists of two parts – a wearable sensor unit which can sense and analyze your environment, health, and local weather conditions, and a dedicated mobile phone.

The sensor unit will be worn on a wrist or neck strap made from solar cells that provide power to the sensors. NFC (near field communication) technology will relay information by touch from the sensors to the phone, or to other devices that support NFC technology.

Both the phone and the sensor unit will be as compact as possible to minimize material use, and those materials used in the design will be renewable and/or reclaimed. Technologies used inside the phone and sensor unit will also help save energy.

To help make you more aware of your health and local environmental conditions, the Nokia Eco Sensor Concept will include a separate, wearable sensing device with detectors that collect environment, health, and/or weather data.

You will be able to choose which sensors you would like to have inside the sensing device, thereby customizing the device to your needs and desires. For example, you could use the device as a "personal trainer" if you were to choose a heart-rate monitor and motion detector (for measuring your walking pace).
The Nokia Eco Sensor Concept is built upon the underlying principles of waste reduction. Emphasis will be placed on material use and reuse in the phone's construction.

To complete the Nokia Eco Sensor Concept, the phone and detector units will be optimized for lower energy consumption than phones in 2007 in both the manufacturing process and use. Alternative energy sources, such as solar power, will fuel the sensor unit’s power usage.

Please note that this is a concept phone, so you won't be seeing it in a shop near you anytime soon.

Monday 18 August 2008

4G: Where are we now?

Last month I read news about WiMAX leading the world of 4G, and last week I read about an American carrier selecting LTE as its 4G technology of choice. Since the ITU has decided that it won't be using the term 4G in future, preferring IMT-Advanced (or LTE-Advanced), I guess '4G' is up for grabs.
The main driver for '4G' is data. Recently, carriers have become aggressive and started offering some decently priced 'Wireless Broadband' data plans. Rather than confuse people with HSDPA, etc., they have decided to use the terms 'Wireless Broadband' or 'Mobile Broadband'. Personally, I find that both terms have managed to confuse some people, who associate Mobile Broadband with Internet access on the mobile and Wireless Broadband with broadband over WiFi.

Andrew Seybold makes some valid points in an article in Fierce Wireless. One of the things he points out is that while LTE may tout higher data rates than the others, those rates are only possible in 20 MHz of spectrum. In the real world this much spectrum is near impossible to obtain. If that spectrum advantage is removed, then HSPA+, LTE, EV-DO Rev B and WiMAX have nearly the same data rates and performance.
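To illustrate his point with some rough numbers: peak rates scale roughly linearly with the allocated bandwidth, so a headline figure quoted for a 20 MHz LTE carrier shrinks dramatically once it has to fit into a 5 MHz allocation. The 150 Mbit/s figure below is just an example of a commonly quoted 2x2 MIMO LTE downlink number, not a measured result:

```python
# Illustrative only: peak rate scales roughly linearly with bandwidth, so a
# headline rate quoted for 20 MHz looks far less impressive in 5 MHz of spectrum.
def scale_peak_rate(peak_mbps, quoted_bw_mhz, available_bw_mhz):
    return peak_mbps * available_bw_mhz / quoted_bw_mhz

lte_peak_20mhz = 150.0  # example 2x2 MIMO downlink figure often quoted for 20 MHz LTE
print(scale_peak_rate(lte_peak_20mhz, 20, 5))
# -> ~37.5 Mbit/s in a 5 MHz carrier, i.e. in the same ballpark as HSPA+
```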

For HSPA+ the existing infrastructure can be reused and a software upgrade would suffice whereas for LTE new infrastructure would be required. NTT DoCoMo has fully committed to being the first LTE network operator and others are raising their hands. He thinks that nationwide LTE networks would only be available around 2014.

While I agree with this analysis completely, I think what will dictate operators' transformation from 3G+ to LTE is the uptake of data on their networks. The biggest advantage of LTE is that it is able to operate in both TDD and FDD modes. Operators that have traditionally used FDD will change their loyalty to TDD so that they can use asymmetric data transfer. This can provide more capacity when some special event is taking place (football finals, reality-show results, etc.) and users are mainly interested in receiving information rather than sending any. Operators with paired spectrum can even use the two bands separately in TDD mode.
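A minimal sketch of why TDD appeals for this kind of asymmetric, event-driven traffic: the downlink/uplink split is simply a frame-configuration choice, so airtime can be skewed towards the downlink when everyone is receiving rather than sending. The slot counts below are illustrative, not taken from any particular LTE TDD configuration:

```python
# Sketch: in TDD the downlink/uplink split is a frame-configuration choice,
# so capacity can be skewed towards the downlink for receive-heavy events.
# Slot counts are illustrative, not a specific LTE TDD configuration.
def downlink_share(dl_slots, ul_slots):
    return dl_slots / (dl_slots + ul_slots)

print(downlink_share(5, 5))  # symmetric split: 50% of airtime for the downlink
print(downlink_share(8, 2))  # asymmetric split: 80% of airtime for the downlink
```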

GigaOm has a list of American operators that are involved in 4G, and the list is quite interesting:
  • AT&T: USA's largest network in terms of subscribers, AT&T plans to use LTE to upgrade to 4G, but not for a long, long time. For now it’s content with its current 3G network. It will upgrade to HSPA+ in 2009 and 2010. Eventually it will go to LTE, but won’t begin testing until 2010 or 2011 with full deployment coming after that.
  • Verizon Wireless: Verizon is already testing LTE equipment from several vendors, with plans to roll out the network in 2010 and have most of the country covered by 2012; Verizon’s would likely be the first full U.S. deployment of the LTE technology.
  • Sprint-Nextel: The outlier in the whole transition to 4G, Sprint is going with WiMAX rather than LTE. After a number of delays, the company is set to launch its network in September. By the end of the year it will join with Clearwire to operate a nationwide WiMAX network under the Clearwire brand.
  • T-Mobile: T-Mobile is still launching its 3G coverage, so its 4G networks may take a while to come to fruition. The carrier’s German parent appears to favor LTE.
  • Metro PCS: This budget carrier plans to use LTE but it doesn’t yet have a time frame for deployment, pointing out that its customers aren’t heavy data users yet.
  • U.S. Cellular: The company is unsure of its deployment plans but it would likely choose to follow the rest of the industry with LTE. As for deployment, the time frame isn’t set.
  • Leap Wireless: Recently said it had not made a decision or public comment about its 4G plans.

The picture is a bit different here in the UK because all the operators are going for LTE. Some ISPs may be tempted to move to WiMAX, as they would get economies of scale. There is also news of BT (the largest landline phone provider) planning to roll out a nationwide WiMAX network in the 2.6 GHz spectrum. If BT is able to fulfil this ambition, it could be a big win for consumers.

Sunday 17 August 2008

Femtocell success reliant on handset innovations

Femtocells are one of the emerging technologies in telecoms. The success of femtocells cannot yet be predicted, and the femtocell industry has had its share of good and bad moments. There is no doubt in my mind that femtocells, together with WiMAX and LTE, are the most talked-about technologies in telecoms these days.

In the past few weeks I have been hearing about the challenges faced in deploying femtocells. Getting the right handsets for femtocells is key to their success.
I must say that the hype surrounding the mass deployment of femtocells has been doused with cold water by a new study into the need for handset vendors to quickly transform their devices to support the technology. According to the report, published by Research & Markets (R&M), the femtocell industry is basing its optimism on the notion that subscribers will use their cell phones differently when in range of femtocells. There will be different applications and behavioural patterns when people are at home, perhaps content backups, podcasts or even advertiser-sponsored TV programming. The mobile phone may need to be linked to the TV, PC, HiFi or other items of domestic technology, claims R&M.

I have seen some reports which suggest that although currently available handsets will work with femtocells, they are not optimised to support this new 'in home' activity. The question that remains in my mind is how the handset will identify a femtocell as opposed to any other, stronger, non-femtocell cell that is available. The phone needs to be aware of the femtocell, ideally both in the radio and in the application platform. I firmly believe that we will need a new handset architecture to solve this problem. But changing how the handset industry approaches this challenge could take 2-3 years, given that it takes that long to implement a new handset architecture, and around the same time again before new cell phone technology reaches a broad range of devices. The handset industry also needs to be aware of where we will be in terms of wireless technology in 2-3 years' time; we might be entering the era of LTE by then.
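As a purely hypothetical sketch of what such an optimisation might look like, a handset could prefer a known 'home' femtocell over stronger macro cells, as long as the femto's signal is above a usable threshold. The cell identifiers and threshold below are invented for illustration; real cell selection is governed by 3GPP procedures, not by anything this simple:

```python
# Hypothetical sketch: prefer a known "home" femtocell over stronger macro cells,
# provided its signal is usable. Cell IDs and threshold are invented for
# illustration; real selection/reselection follows 3GPP procedures.
HOME_FEMTO_CELL_IDS = {"home-femto-001"}
MIN_USABLE_SIGNAL_DBM = -100

def pick_cell(measurements):
    """measurements: list of (cell_id, signal_dbm) tuples from a scan."""
    usable = [m for m in measurements if m[1] >= MIN_USABLE_SIGNAL_DBM]
    home_femtos = [m for m in usable if m[0] in HOME_FEMTO_CELL_IDS]
    candidates = home_femtos or usable
    return max(candidates, key=lambda m: m[1], default=None)

print(pick_cell([("macro-042", -75), ("home-femto-001", -85)]))
# -> ('home-femto-001', -85): the weaker but "home" femtocell wins
```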

But there are some more issues which the femtocell industry should be aware of. Some of the issues identified by Research and Markets are:
  • In dense deployments of femtocells, handsets can spend too much time and power attempting to connect at locations that are not their own "home zone."
  • The new 3GPP Release 8 specifications contain various modifications to enable handsets to work better with femtocells, but the first R8-compliant phones will likely be shipped at the end of 2010.
  • The usage of handsets on femtocells may identify unexpected side-effects, relating to faster/cheaper data connections. This may impact elements of design such as memory allocation and power management.
  • Various suggestions have been made for 'femto-zone' services, but there is no standardised way for handset applications to know they are attached to a femtocell.

Looking at the above issues, things may not sound very favourable for femtocell deployment and commercialisation. However, operators are always looking for new means of generating revenue streams. Femtocells are definitely one of those new means and a possible opportunity for operators to generate more revenue. Revenue can be generated from advertisers and other third parties by enabling the provision of 'at home' services via femtocells.

Research and Markets claims that there could be a demand for at least 48 million femto-aware handsets to be sold to femtocell owners in 2013. However, with more optimistic forecasts, and especially if shared femtocell models become popular, there could potentially be a demand for up to 300 million femto-aware handsets per year in 2013.

Although the above figures look very encouraging, the femtocell industry is still very cautious in its approach to massive investment. It is currently focusing on the short term: getting initial trials in place, developing standards and securing commitments for early commercial deployment. These initial efforts are critical for validating the market and raising the profile of the femtocell concept; if the industry can do that, it will stimulate finance and investment in femtocells.

One of the central marketing propositions is that femtocells can work with normal 3G handsets. If this is true, then subscribers can get femtocell service without needing expensive upgrades to their existing phones.

But while focus is good and the industry does not want unnecessary distractions, there is a risk of medium-term failure if certain future problems are not addressed early enough, even if doing so muddies the waters of the short-term marketing message. Already, femtocell proponents are talking up mass-market business models that go beyond simple indoor coverage and macro-network offload. They are talking about tens of millions of subscribers, and new "in-home" services for users that exploit fast and cheap local mobile connectivity.

It is at that stage that the issue of the right handsets for the femtocell industry comes into the picture once again, and handset innovation becomes even more important for the industry. As I have mentioned above, the handset design should be able to differentiate between the femtocell and the regular macro-cell environment. In part, this relates to complexities in managing the radio environment and mobility between femtocell and macrocell networks. This is easier said than done, and various optimisations are desirable, especially when dense deployments of femtos occur. These drive changes in areas such as the way the phone "selects" cells on which to register. There may also need to be ways to offer provisioning and "guest access" on femtocells from the handset UI, but this cannot be considered a complete solution, as users will see it as an unnecessary exercise.

In my view the medium-term hopes of the industry also reflect the notion that people will use their cellphones differently when in range of femtos, so the problem for the femtocell industry doesn't end with registering to the right cell. There will be different applications and behavioural patterns when people are at home, perhaps content backups, podcasts or even advertiser-sponsored TV programming. The mobile phone may need to be linked to the TV, PC, HiFi or other items of domestic technology. This shows that the road ahead is really tough, and it again highlights the degree of innovation that will be required in handset design for phones to work well in the femtocell environment.

Some reports suggest that standard phones can work with femtocells, but they are not optimised. Certain applications may only work when the phone is within femto range, but they need to know when that is. Yes, some services can be notified by the core network that the user is "at home", but that approach doesn't scale to a wide base of operators, application developers and handset/OS vendors. The phone needs to be "aware" of the femtocell, ideally both in the radio and in the application platform.

Changing such elements is not quick. The handset industry is much more complex and slow-moving than many in the wider wireless business understand. It often takes 2-3 years for changes in handset architecture to reach commercially sold handsets, and another 2-3 years to reach a broad range of devices and reasonable penetration within the user base.

There is definitely a perception that the femtocell industry needs to be much more open-minded about the need to modify and optimise handsets, and to be alert to the huge time and effort this will take. Other mobile developments like UMA and IMS have suffered in the past from a lack of focus on this issue. Although many femto advocates fear distractions could delay immediate market acceptance, early consideration of these "2nd order" problems is necessary for longer-term success.

What I have seen is that there are significant efforts to make femtos a success and overcome the difficulties. The new 3GPP Release 8 specifications contain various modifications to enable handsets to work better with femtos (called Home NodeBs). Various suggestions have been made for "femto-zone" services, but there is no standardised way for handset applications to "know" they are on the femto. Although there are workarounds, with the network notifying the application when the phone is attached to the femto, this approach is not easily scalable to the wider base of developers or operators. At the moment the best solution suggested is for the handset's "connection manager" software to explicitly recognise femtocell access as a new and specific type of bearer.
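As a minimal sketch of that last suggestion, assuming a hypothetical connection-manager API (all names below are invented for illustration, not any real handset platform), exposing the femtocell as a distinct bearer type would let applications adapt their behaviour, for example only starting bulky content backups when attached to the home femto:

```python
# Hypothetical sketch of "femtocell as a distinct bearer type" in a connection
# manager; all names are invented for illustration, not a real handset API.
from enum import Enum

class Bearer(Enum):
    MACRO_3G = "macro_3g"
    FEMTOCELL = "femtocell"
    WIFI = "wifi"

def should_start_backup(current_bearer: Bearer) -> bool:
    # Only use fast, cheap "at home" connectivity for bulky transfers.
    return current_bearer in (Bearer.FEMTOCELL, Bearer.WIFI)

print(should_start_backup(Bearer.FEMTOCELL))  # True
print(should_start_backup(Bearer.MACRO_3G))   # False
```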

There is no doubt in my mind that operators could benefit from new revenue streams from advertisers and other third parties by enabling the provision of "at home" services via femtocells.
Using baseline forecasts, there should be a demand for at least 48m femto-aware handsets to be sold to femtocell owners in 2013. However, with more optimistic forecasts, and especially if "shared" femtocell models become popular, there could potentially be a demand for up to 300m femto-aware handsets per year in 2013.

Friday 15 August 2008

What is this MEMS and why is it required for 4G?

Mobile technology is evolving at an amazing speed, going from 3G to 3.5G in around five years and now going from 3.5G to 4G in less than five years. According to Analysys, annual global mobile handset shipments will reach 1.5 billion in 2011, and converged-function handsets will become a mainstream product with more than 30% share in developed markets by 2011. There is constant pressure on handset manufacturers to reduce power consumption and chipset size while at the same time driving down the cost of the device.

RF micro-electro-mechanical systems (RF-MEMS) is a semiconductor technology that allows micro-scale moving mechanical devices to be integrated with electrical transistors on silicon wafers. RF-MEMS technology can be utilized to make high-frequency components whose RF characteristics can be adjusted during operation, allowing for the first time reconfiguration of radio hardware under software control. The ability to reconfigure operating characteristics in real time results in a substantial reduction in the required number of discrete components for a given set of functions, significantly relieving pressure on the handset product developer.
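As a toy illustration of what "reconfiguration under software control" means in practice, switching an RF-MEMS variable capacitor between states changes the resonant frequency of an LC matching or filter network via f = 1/(2π√(LC)). The component values below are illustrative only, not from any real handset design:

```python
# Toy illustration of RF reconfiguration: switching an RF-MEMS variable capacitor
# between states retunes an LC network, f = 1 / (2*pi*sqrt(L*C)).
# Component values are illustrative, not from any real handset design.
from math import pi, sqrt

def resonant_freq_hz(l_henry, c_farad):
    return 1.0 / (2 * pi * sqrt(l_henry * c_farad))

L = 5e-9  # 5 nH inductor
for c in (1.0e-12, 1.5e-12, 2.5e-12):  # capacitor states selected in software
    print(f"C = {c * 1e12:.1f} pF -> f0 = {resonant_freq_hz(L, c) / 1e9:.2f} GHz")
# -> roughly 2.25, 1.84 and 1.42 GHz: one tunable front end covering several bands
```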

While the electronics are fabricated using integrated circuit (IC) process sequences (e.g., CMOS, Bipolar, or BICMOS processes), the micromechanical components are fabricated using compatible "micromachining" processes that selectively etch away parts of the silicon wafer or add new structural layers to form the mechanical and electromechanical devices.

(Image: an early micromotor built in the SUMMiT technology, with a microscopic dust mite shown on top for size comparison.)

There are several different broad categories of MEMS technologies:
  • Bulk Micromachining
  • Surface Micromachining
  • LIGA
  • Deep Reactive Ion Etching
  • Integrated MEMS Technologies

Details available here.

MEMS is not really a new technology. It has been around since the 1960s, but only recently has it become commercially feasible. The Samsung watch phone was the first phone to have a commercial MEMS circuit, and MEMS is now being used in a variety of devices, not just mobiles. Those who watched the opening ceremony of the 2008 Beijing Olympics would have seen the different-coloured torch display; that 'waving torch' used MEMS circuitry.

Scientists are also working on making MEMS intelligent and they are looking at microorganisms for ideas. The integration of microorganisms with MEMS, resulting in “biotic-MEMS,” is a hot topic for scientists designing micron-level machines. Recently, researcher Xiaorong Xiong of Intel, microbiologist Mary Lidstrom, and electrical engineer Babak Parviz (both of the University of Washington) have catalogued a large number of the most promising microorganisms for different areas of MEMS systems. They show that many of these microorganisms can offer capabilities beyond the limits of conventional MEMS technology.

Finally, from EE Times:

French research and strategy consulting company Yole Développement (Lyon, France) provides an analysis of MEMS components for cell phone applications, as it expects this market to represent $2.5 billion in 2012. In its latest report, entitled MEMS for Cell Phones, Yole stated that the cell phone industry represents a complex challenge for MEMS but also its greatest opportunity for growth in the next five years.

According to the market research firm, silicon microphones and FBAR/BAW filters have experienced "incredible growth" since their introduction in 2003 and are now entering the maturity stage. MEMS accelerometers are in "a strong development stage", and MEMS products such as gyroscope, microdisplay, micro autofocus and micro zoom are at the emerging stage.

Yole also mentioned products that are not yet in the emerging stage. Among them are pressure sensors, micromirror, RF switch/varicaps, oscillators, and micro-fuel cells.

Yole reported that, for the year 2007, cumulative sales reached $440 million for three MEMS products in cell phone applications, namely silicon microphones, FBAR/BAW filters and accelerometers.

As a conclusion, Yole said it anticipates MEMS will become a key driver for innovation in the cell phone industry, and new cell phone features will represent 60 percent of the total MEMS market by 2012.

Interested people can also read:

RF-MEMS for Wireless Communications, Jeffrey L. Hilbert, WiSpry, Inc. - IEEE Communications Magazine • August 2008

Tuesday 12 August 2008

IMS: Reality check

IMS is another technology not doing too well at present. I came across this report by the Yankee Group, "IMS Market Update: The Honeymoon Is Over, Now What?", and it answers some of the questions about why IMS is not as popular as people expected it to be 3-4 years ago.
Some of the promises made by IMS were:
  • New IMS based apps, hence increased ARPU
  • Simplified network design, hence lower OPEX
  • Platform for killer services
  • Components interchangeable
  • Plug and Play environment for access networks

But IMS has not found success because:

  • IMS Standards are in flux
  • Everything is quite complex and not very clear
  • OSS/BSS integration is very complicated

An article in Cable 360 has some up-to-date market details:

Based on a report completed in January by ABI Research, Ericsson is the market leader in providing IMS infrastructure, followed by Alcatel-Lucent and Nokia Siemens. The other vendors in this ranking include Motorola (4), Huawei Technologies (5), Cisco Systems (6), Nortel Networks (7), Acme Packet (8), Thomson (9) and Tekelec (10).

Bundling is increasingly the way that IMS is sold. "Huawei combined a lot of their wireless offerings with IMS," said ABI senior research analyst Nadine Manjaro. "Whatever their contracts were, they had an IMS element."

"Previously (IMS) was more fixed," she said. "IMS is difficult to integrate. (So) one trend is combining IMS with infrastructure and 3G deployment and managed services," she said.
Increasingly critical to versions 6, 7 and 8 of the Third Generation Partnership Project (3GPP), IMS will become more tightly linked with mobile technologies, Manjaro predicted. The overall context remains telephony-focused.

"You see highly voice-over-IP related deployments of IMS globally," she said.

My understanding is that, with the tight squeeze on the financial markets, everyone is trying to spend as little as they can. This means that end users are shying away from extra features and services that cost them money, and operators are shying away from investing in new technologies or upgrading their infrastructure. Even though the investment in IMS could be significant, it can provide long-term benefits which may distinguish an operator from the others and provide a cutting edge.

Another thing is that IMS technologists (and not just them, others as well) should ensure that all the technical problems are ironed out and then start promoting the technology to everyone. People are already confused enough about HSDPA and 4G, and we need to prepare them to look forward to IMS.