Sunday, August 31, 2008

Femtocells With LTE and their commercialization

Over the past few months LTE has been gaining real momentum and the LTE camp is expanding. Companies that have decided on LTE as their 4G technology are doing everything possible to make it a big success.

Femtocells are another of the most talked-about technologies these days. In the past year alone femtocells have gained a lot of strength and are already moving towards commercialization. Giants like Verizon, T-Mobile and Sprint have already announced that they will offer femtocell products and service plans sometime this year. A few big announcements like these should give the femtocell market some good momentum. Some of you might already be aware that Qualcomm made a significant but undisclosed investment in ip.access' Oyster 3G system, which uses the residential broadband connection to deliver a 3G signal in the home. The move is seen as validating the femtocell concept, especially since Qualcomm is so adept at making the right technology investments.

With work on LTE in full swing and femtocells strengthening their position, the industry is very comfortable with the idea of having femtocells in LTE.

Analysts consider LTE a major boost to the future success of femtocells. To take femtocells further with LTE and make them a big success, joint testing of a reference design against the LTE standard was proposed.

Taking the proposal seriously, the joint testing was conducted by picoChip, a UK-based femtocell silicon developer; mimoOn, a German SDR specialist; and the test equipment vendor Agilent Technologies. The objective of the test was to verify that the femtocell reference design met the requirements of the LTE standard, as measured by Agilent's recently developed 3GPP LTE modulation analysis option.

The joint testing generated enough confidence in the industry to back the idea of femtocells on LTE. On the back of this testing, picoChip and mimoOn, which have been co-operating on the reference design for the past 12 months, recently announced the availability of what they suggest are the first LTE femtocell and picocell reference designs, the PC8608 Home eNodeB and the PC8618 eNodeB respectively. The designs are based on the same hardware platforms as picoChip's WiMAX products.

Going further, picoChip unveiled its first reference designs for LTE femtocells and picocells, which will enable the company's existing femtocell customers, including ip.access and Ubiquisys, to upgrade to LTE.

3GPP is well aware of all these developments and is busy developing specifications for femtocells in LTE.
To end the squabbling, the Third Generation Partnership Project (3GPP) has adopted an official architecture for 3G femtocell home base stations and started work on a new home base station standard.

3GPP wants to have the new standard done by the end of this year, which appears to be an aggressive schedule given that vendors have taken various approaches to building a femtocell base station.


The agreed architecture follows an access network-based approach, leveraging the existing Iu-cs and Iu-ps interfaces into the core service network. The result is a new interface called Iu-h.

The architecture defines two new network elements, the femtocell and the femtocell gateway. Between these elements is the new Iu-h interface. This solution was backed by Alcatel-Lucent, Kineto Wireless, Motorola and NEC.
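A simplified view of that architecture (my own sketch, not an official 3GPP figure) would look something like this:

    Handset --(air interface)--> Femtocell (Home NodeB)
                                      |
                                    Iu-h  (carried over the home broadband connection)
                                      |
                                Femtocell Gateway (HNB-GW)
                                      |
                                Iu-cs / Iu-ps
                                      |
                                Operator Core Network

The gateway aggregates traffic from many home units and presents it to the core network over the familiar Iu interfaces, which is what allows the core network to remain largely unchanged.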

However, with every new standard the existing architecture comes under review. With this new standard, all the femtocell vendors who had their own designs in place must go back and change their access point and network gateway equipment to comply with the new interface. I think in doing so vendors can bring themselves in line with the global standard.
All femtocell vendors will have to make changes to their access points. Alcatel-Lucent, Motorola, NEC, and those that already use Kineto's GAN approach, such as Ubiquisys, will have the least work to do. Ubiquisys has already announced that it will have products ready that support the new standard by December of this year.

Now that the standard has been decided, companies can base their designs on it and think about introducing products to the market.

T-Mobile is moving fast in that direction and has chosen two German cities, Cologne and Bonn, to test the commercial feasibility of 3G femtocells. The operator will be the first to conduct trials of the technology in Germany, although numerous trials have already taken place elsewhere in Europe.

While T-Mobile demonstrated femtocells at the giant CeBIT exhibition earlier this year, this trial is aimed at testing how consumers react to the plug-and-play characteristics of femtocells. Having received positive feedback from earlier tests, T-Mobile is now continuing to explore deep indoor coverage and to enhance in-building femtocell coverage for UMTS and HSPA (High Speed Packet Access). This should boost both data transmission and telephony.

Results from T-Mobile's earlier tests suggest there might be a limited commercial deployment of femtocells later in the year. T-Mobile is reported as seeing 'a lot of potential' in femtocells.

Femtocells are widely perceived as a way for mobile operators to boost in-building 3G coverage without the high costs associated with expanding their macro networks. Femtocells are very much the hot topic of the mobile industry at present and are expected to have a high profile at the forthcoming Mobile World Congress in Barcelona, Spain. Femtocells also present another front for revenues, and companies are investing in them.
In March of this year the T-Mobile Venture Fund made a strategic investment in Ubiquisys, a developer of 3G femtocells, joining Google and original investors Accel Partners, Atlas Venture and Advent Venture Partners.

Cisco and Intel recently invested in femtocell company ip.access and Qualcomm has put money into Airvana.

T-Mobile said it plans to test Ubiquisys' femtocell technology in trials in Germany, the Netherlands and the U.K. in the coming months. Meanwhile, its U.S. subsidiary is using in-home WiFi hotspots as an alternative to a femtocell solution for improving coverage in the home. Once T-Mobile launches its 3G network in the U.S. we could see both femtocells and WiFi.
However, I am not sure whether T-Mobile used devices provided by Ubiquisys in its latest German trial.

The commercial deployment of femtocells has taken another step forward following the Femto Forum's adoption of a worldwide standard that defines the real-time management of femtocells within households. Members of the Forum have agreed to implement the Broadband Forum's TR-069 CPE WAN management protocol, which was defined in 2004 for the broadband community and is already in use on around 30 million devices. The point of TR-069 is to enable CPE devices to be deployed easily and configured reliably and, more importantly, in high volumes, something that has worried operators planning to position the femtocell as user-installable. The Femto Forum claims that TR-069 has proven itself to give consumers easy installation and self-provisioning, while enabling the operator to run diagnostics and conduct remote firmware and service upgrades across millions of end-user devices in a cost-effective manner. The two organisations now plan to define extensions to TR-069 to add femtocell-specific capabilities to the standard.
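To give a rough idea of what that remote management looks like in practice, here is a minimal, hypothetical sketch in Python of the kind of zero-touch provisioning session TR-069 enables between a femtocell and the operator's auto-configuration server (ACS). It only models the shape of the exchange; it is not the real SOAP/CWMP messaging, and the parameter names and URL are made up for illustration, not taken from the TR-069 data model.

    # Illustrative sketch only: models the shape of a TR-069 style session,
    # not the real SOAP/XML CWMP protocol or its standard parameter names.

    class FemtocellCPE:
        def __init__(self, serial):
            self.serial = serial
            self.params = {}            # device configuration store
            self.firmware = "1.0.0"

        def inform(self):
            # On boot (or periodically) the device contacts the ACS and
            # reports who it is and why it is calling.
            return {"DeviceId": self.serial, "Event": "BOOTSTRAP",
                    "FirmwareVersion": self.firmware}

        def set_parameter_values(self, values):
            # The ACS pushes configuration (e.g. radio settings) to the device.
            self.params.update(values)

        def download(self, url, version):
            # The ACS instructs the device to fetch and apply new firmware.
            self.firmware = version
            return f"downloaded {url}, now running {version}"

    class AutoConfigServer:
        def provision(self, cpe):
            report = cpe.inform()
            print("Inform received from", report["DeviceId"])
            # Zero-touch provisioning: the operator decides the settings,
            # the user just plugs the box into their broadband router.
            cpe.set_parameter_values({"MaxTxPower_dBm": 10,
                                      "AllowedIMSIs": ["234150000000001"]})
            if report["FirmwareVersion"] < "1.1.0":
                print(cpe.download("https://acs.example.net/fw/1.1.0.img", "1.1.0"))

    AutoConfigServer().provision(FemtocellCPE(serial="HNB-000123"))
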
It is an exciting time for the femtocell industry, with commercialization in sight. Hopes are even higher that femtocells in LTE will provide even better services to customers.

Wednesday, August 27, 2008

HSPA: Milestone and bold predictions

GSMA announced last week that the number of HSPA mobile subscribers has reached 50 million. The number of HSPA subscribers last year at around this time was 11 million.

An old slide predicting the rise of HSPA subscribers can be seen above. I don't think the number of subscribers reached 20 million in 2007 as predicted, but they will definitely number more than 60 million by the end of 2008. Around 4 million people are becoming HSPA subscribers every month.
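A quick back-of-the-envelope check, using my own arithmetic on the figures above:

    # Rough projection from the GSMA figures quoted above
    subscribers_aug_2008 = 50e6      # reported last week
    monthly_adds = 4e6               # roughly 4 million new HSPA subscribers a month
    months_left_in_2008 = 4          # September to December

    year_end_estimate = subscribers_aug_2008 + monthly_adds * months_left_in_2008
    print(f"Year-end 2008 estimate: {year_end_estimate / 1e6:.0f} million")  # ~66 million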

GSMA also has a site dedicated to HSPA, where it maintains a live counter of the number of HSPA subscribers worldwide.
Here are some HSPA-related statistics from the GSMA website:
  • 267 operator commitments in 111 countries
  • 191 commercial deployments in 89 countries
  • All EU countries have commercial HSPA deployments
  • 747 HSPA devices from 114 suppliers including:
    > 281 mobile handsets
    > 68 data cards
    > 120 notebooks
    > 40 wireless routers
    > 72 USB modems
    > 39 embedded modules
GSMA has also claimed that mobile will reach speeds of 100 Mbps before landlines do. This was in response to BT announcing that its FTTx technology will be available in 10 million homes by 2012. The fibre technology's initial speeds will be 40 Mbps, rising to 60 Mbps later on, and some time (quite far) in the future it will eventually reach 1000 Mbps.

Finally, global mobile broadband connections in the first quarter of this year rose by 850% from the same period last year, according to Herns Pierre-Jerome, director for wireless broadband technologies at Qualcomm. The rapid growth, he said, illustrated how mobile broadband had become a mainstream data application of third-generation (3G) mobile phone technology, driven primarily by evolution-data optimised (EV-DO) and high-speed packet access (HSPA) systems.

The number of 3G subscribers globally totals 670 million in a market of 3.5 billion mobile users. The figure is forecast to reach 1.6 billion over the next four years, fuelled by declining costs of network equipment and devices. Telecom vendors and operators are expected to realise revenue of US$114 billion from 3G equipment and $394 billion from 3G services. Qualcomm earned $10 billion in revenue in 2007, out of overall industry revenue of $352 billion.

Tuesday, August 26, 2008

Data revenues can go even higher

Over the past few years the telecoms world has seen a number of emerging technologies, with a great degree of innovation coupled with intensive research. In 1999, when I completed my engineering degree in electronics and communication, most of the telecom companies in India were only interested in GSM/GPRS. At that time I had no idea how the technology would evolve, as it has over the years.

Initially it was GPRS, which then led to the major shift towards 3G. Once 3G found its place, further developments were carried out to improve the user experience on the move. This idea of giving the user the best led to the emergence of new technologies like HSDPA, HSUPA, HSPA+ and LTE. Clearly, improved data rates were the key factor behind the introduction of each technology from GPRS to LTE.

Everybody in the industry realised that to win customers and compete with fixed technology they would have to provide better data rates while the user is on the move. Vendors and operators, together with 3GPP, have worked very hard to come up with technologies like HSPA+ that can serve plenty of megabits per second to users.
Although the wireless operators insist that we are still in the early stages of wireless data adoption, data revenues are already playing a major part in the companies' overall revenues. When operators recently announced their Q2 results, they reported that data accounts for nearly 25 percent of their average revenue per user (ARPU). Verizon Wireless, the data leader in the US, is the prime example, with 24.4 percent of its $51.53 ARPU coming from data. AT&T is a close second, with 22.9 percent of its ARPU coming from data.
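In dollar terms, a quick calculation from the figures quoted above:

    # Verizon Wireless figures quoted above: $51.53 total ARPU, 24.4% from data
    total_arpu = 51.53
    data_share = 0.244
    data_arpu = total_arpu * data_share
    print(f"Verizon data ARPU: ${data_arpu:.2f} per user per month")  # about $12.57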

During earnings calls with analysts, both of the above operators, together with the likes of Vodafone, talked about the continued growth potential for data. There is a clear trend of operators leaving no stone unturned to provide the highest possible data rates to their users. Operators are working feverishly to upgrade their networks, and the competition to deliver a better user experience is intensifying. Young people and businesses, who demand high data rates for their own reasons, are the companies' main targets.

These days one can easily get access to mobile broadband for a reasonable monthly payment. There are many competitive deals in the market to lure customers into browsing and emailing on the move. There is no doubt that operators are successfully adding customers and thereby increasing revenues that are mostly generated by data use. Verizon's data revenue grew 45 percent year over year, AT&T's grew 52 percent, and Vodafone's and T-Mobile's data revenues also grew by more than 50% over the last couple of years. But I'm wondering how high data revenues can really climb. Is this strong growth rate sustainable?

Executives from telecom giants like Vodafone and AT&T predict there is still much more growth to come as consumers upgrade to integrated devices and smartphones that can take advantage of 3G networks. The companies say that nearly 20% of their customers have either upgraded or are in the process of upgrading to an integrated device. Meanwhile, Verizon recently said that 60 percent, or 40.5 million, of its retail customers have upgraded to 3G data-capable devices.

Analysts believe that the likes of Vodafone, T-Mobile, AT&T and Verizon are the clear leaders in monetizing data and that they will continue to lead the industry in data ARPU as they increase the number of data applications and data-centric devices.

I think the key to sustaining this growth rate lies not in the number of data-capable devices in consumers' hands but in the availability of compelling data applications at reasonable price points. Without a continued push for better, more user-friendly applications, data revenues will not be able to sustain their current growth trajectory.

Vodafone, for example, is already taking steps in that direction and is looking to boost the usage of 3G data. It has announced an agreement with the laptop vendor Lenovo that will see Lenovo's new X200 computer pre-installed with a Vodafone SIM and supporting software. The broadband connectivity comes at no extra cost and, when activated by the purchaser of the laptop, will offer the user a 30-day free trial. "The connection manager will ask for your name and email, but no bank details," said Alec Howard, head of PC connectivity at Vodafone. "Users will be prompted to take out a contract at the end of the free trial and the prices are around £12 a month for broadband, with automatic roaming in Europe at £8.50 a day." But like any other product this also has a disadvantage: users dissatisfied with the Vodafone service will struggle if they want to connect to another mobile operator, as they will have to install a new SIM and download and configure a new connection manager instead of using the built-in software, which only works with Vodafone.

I certainly believe that this embedded 3G initiative will significantly lower the cost of built-in mobile broadband technology across the entire range of laptops. I myself have used a Dell laptop with an embedded data card, where you just have to insert the right SIM and then connect to wireless broadband with the help of a connection manager. I think embedded modems are cool and fun to use. There is no doubt in my mind that the use of embedded modems for mobile broadband connectivity is set to increase rapidly in the next few years, with sales estimated to grow at a rate of well above 80% from 2008 to 2012. Laptops with embedded modems are one of the data applications that will enhance the user experience and hence lead to increased ARPU.

Most vendors are also working to enhance handset architectures with richer multimedia functionality. Nokia surprised analysts with better-than-expected second-quarter earnings. Nokia thinks in line with some of the operators and firmly believes that the global handset market could grow by more than its previous estimate of 10 percent in 2008.

For its new devices, Nokia has concentrated a lot on the services front, giving customers handsets that support next-generation multimedia services, for example a music service with Sony BMG Entertainment. The Nokia Music Store is now available in 10 markets and the company expects to have 14 stores open by year-end. In addition, the N-Gage mobile games service, which became available during the quarter, has had more than 406,000 downloads.

So in my view, if companies are as innovative as Nokia has been, there is every chance of pushing data rates to new highs. Vendors, with their excellent architectures and rich data applications, can definitely push data throughput and hence contribute to higher data revenues.

As every day passes we are seeing new handsets with amazing designs and new architectures. These handsets are designed to perform faster and can support very high data rates. Today's youth can play online games on these devices, watch live TV, send and receive multimedia messages and much more. Business users can exchange email on the move. The rollout of HSPA+ by vendors will further enhance the data user's experience. The number of HSPA subscribers is growing many times over, at a rate of around 4 million a month. Companies like AT&T are aggressive in their HSPA rollout plans, and AT&T looks set to roll out HSPA together with the 3G iPhone.

Very high speeds of up to 20 Mbps in a 5 MHz channel have already been achieved with HSPA, and Qualcomm is one of many to have proved this.
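That 20+ Mbps figure follows from the HSDPA numerology once 64-QAM is used. A rough back-of-the-envelope calculation, ignoring channel-coding and control overhead, looks like this:

    # Rough HSDPA (HSPA+) downlink peak-rate estimate in a 5 MHz WCDMA carrier,
    # ignoring channel coding and control overhead.
    chip_rate = 3.84e6          # WCDMA chips per second
    spreading_factor = 16       # HS-PDSCH spreading factor
    codes = 15                  # maximum number of parallel HS-PDSCH codes
    bits_per_symbol = 6         # 64-QAM

    symbol_rate_per_code = chip_rate / spreading_factor       # 240 ksymbols/s
    peak_rate = symbol_rate_per_code * codes * bits_per_symbol
    print(f"~{peak_rate / 1e6:.1f} Mbps raw")                  # ~21.6 Mbps

The standardised peak (around 21 Mbps) comes out a little lower once the overheads are taken into account.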

At the moment things look very promising and I strongly believe the industry will keep coming up with bright ideas to generate increased data revenues. LTE is another step towards more revenue generation with an enhanced user experience in view. Let's see how high data throughput, and with it data revenues, will go.

Sunday, August 24, 2008

Need for femtocells -> from Youtube

Someone sent me this link from YouTube. Even though it is more like a marketing presentation from Soundpartners for a market report, it gives a good idea, from the operators' point of view, of why they will be looking at femtocells to strengthen their market position and as a way of optimising their networks.

Friday, August 22, 2008

802.11n and 4G...

IEEE 802.11n is a proposed amendment to the IEEE 802.11-2007 wireless networking standard to significantly improve network throughput over previous standards, such as 802.11a and 802.11g, with a significant increase in raw (PHY) data rate from 54 Mbit/s to a maximum of 600 Mbit/s. Most devices today support a PHY rate of 300 Mbit/s, with the use of 2 spatial streams at 40 MHz. Depending on the environment, this may translate into a user throughput (TCP/IP) of 100 Mbit/s.
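Those headline numbers fall out of the 802.11n numerology. Here is a rough sketch of where the 300 and 600 Mbit/s figures come from, assuming the 40 MHz, 64-QAM rate-5/6, short-guard-interval case (the exact values live in the draft standard's MCS tables):

    # Approximate 802.11n PHY rate:
    # data subcarriers x bits per subcarrier x coding rate / OFDM symbol time,
    # multiplied by the number of spatial streams.
    data_subcarriers = 108        # 40 MHz channel
    bits_per_subcarrier = 6       # 64-QAM
    coding_rate = 5 / 6
    symbol_time = 3.6e-6          # 3.2 us symbol + 0.4 us short guard interval

    per_stream = data_subcarriers * bits_per_subcarrier * coding_rate / symbol_time
    print(f"1 stream:  {per_stream / 1e6:.0f} Mbit/s")      # ~150
    print(f"2 streams: {2 * per_stream / 1e6:.0f} Mbit/s")  # ~300 (typical devices today)
    print(f"4 streams: {4 * per_stream / 1e6:.0f} Mbit/s")  # ~600 (maximum)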

According to the book "WI-Fi, Bluetooth, Zigbee and Wimax":

802.11n is the fourth generation of wireless LAN technology:
  • First generation (IEEE 802.11) since 1997 (WLAN/1G)
  • Second generation (IEEE 802.11b) since 1998 (WLAN/2G)
  • Third generation (802.11a/g) since 2000 (WLAN/3G)
  • Fourth generation (IEEE 802.11n) (WLAN/4G)

The distinguishing features of 802.11n are:

  • Very high throughput (some hundreds of Mbps)
  • Long distances at high data rates (equivalent to IEEE 802.11b at 500 Mbps)
  • Use of robust technologies (e.g. multiple-input multiple-output [MIMO] and space-time coding).

In the N option, the real data throughput is estimated to reach a theoretical 540 Mbps (which may require an even higher raw data rate at the physical layer), and should be up to 100 times faster than IEEE 802.11b, and well over ten times faster than IEEE 802.11a or IEEE 802.11g. IEEE 802.11n will probably offer a better operating distance than current networks. IEEE 802.11n builds upon previous IEEE 802.11 standards by adding MIMO. MIMO uses multiple transmitter and receiver antennae to allow for increased data throughput through spatial multiplexing, and increased range by exploiting spatial diversity and powerful coding schemes. The N system is strongly based on the IEEE 802.11e QoS specification to improve bandwidth performance. The system supports channel bandwidths of 20 or 40 MHz.

Note that both the 802.11n PHY and the 802.11n MAC enhancements are required to achieve 540 Mbps.

To achieve maximum throughput a pure 802.11n 5 GHz network is recommended. The 5 GHz band has substantial capacity due to many non-overlapping radio channels and less radio interference as compared to the 2.4 GHz band. An all-802.11n network may be impractical, however, as existing laptops generally have 802.11b/g radios which must be replaced if they are to operate on the network. Consequently, it may be more practical to operate a mixed 802.11b/g/n network until 802.11n hardware becomes more prevalent. In a mixed-mode system, it’s generally best to utilize a dual-radio access point and place the 802.11b/g traffic on the 2.4 GHz radio and the 802.11n traffic on the 5 GHz radio.


A lot of phones now come with built-in WiFi (802.11a/b/g), and WiFi is a must on laptops or they won't sell. The main difference in 802.11n compared to previous generations of 802.11 is the presence of MIMO. The 802.11 family uses OFDM, which is the same technology being adopted by LTE. New LTE handsets will have the advantage of integrating 802.11n technology easily, and the same antennas could be reused. In fact the same applies to WiMAX, as it supports MIMO and OFDM. Of course there will be problems if they use quite different frequencies, as antennas are optimised for a range of frequencies; this is something that remains to be seen.

In the news:

MIT and a medical center based in Alabama are beginning to deploy faster wireless 802.11n access points from Cisco Systems Inc. In more than 100 buildings on MIT's Cambridge, Mass., campus, as many as 3,200 access points running older 802.11a/b/g protocols will be replaced with 802.11n devices in the next 12 to 16 months, said Chris Murphy, a networking engineer at the university. Murphy said MIT, with more than 10,000 students and 11,000 staff members, has a "very, very wide variety" of client devices, from handhelds to laptops. Many of the laptops probably support the 802.11n protocol, he said. Some MIT staffers have been using voice-over-IP wireless handsets and have experienced poor coverage with the older Wi-Fi technology, but they said they have had full signal strength within the range of the new 802.11n access points, he added. With 802.11n, the university could eventually provide IP television, which requires a lot of bandwidth, Murphy said.

Using 802.11n technology, Lapham said he was able to transmit a gigabyte of data in less than two minutes. Currently, the 370-bed medical center has about 450 access points on older protocols. Devices used on the wireless network include 180 laptops, which are used primarily for transmitting bedside patient data. The hospital also supports 100 VoIP wireless phones and various medical devices.

Wi-Fi is expected to be available in 99 per cent of North American universities by 2013, according to research released by industry analyst ABI Research this week. Much of that penetration will be in the form of 802.11n equipment: higher education is clearly the number one market for early adopters of 802.11n, the company said.

"ABI Research expects 802.11n uptake – which is today fairly small in the education market – to ramp up steeply to quite a high rate of penetration," said ABI Research vice president Stan Schatt. There are several reasons for this. ABI said many students now assume a campus Wi-Fi network as a given, and many of their shiny new laptops will be 'n'-compatible. Universities also have great bandwidth demands, as lecture halls may need to serve a large number of users with multimedia content at any given time, and 802.11n's greater speed and capacity can address that need. Moreover, said Schatt, "Universities are breaking new ground by using video over Wi-Fi in a number of innovative ways. This is driving the adoption of high speed 802.11n. Students in the near future (at least the diligent ones) will be just as likely to watch their favourite professor's lectures on their laptops as they will be to view 'America's Next Top Model'."


Thursday, August 21, 2008

Revised paper on “4G” by 3G Americas

3G Americas have published a revised paper on Defining “4G”: Understanding the ITU Process for IMT-Advanced.

3G Americas initially created this white paper one year ago to provide clear understanding regarding the work-in-progress by the ITU, the sole organization responsible for determining the specifications for IMT-Advanced. The current paper updates the considerable progress made by the ITU, establishing a basis for what should be included in an IMT-Advanced system.


While speculation has been going on about 4G technologies, ITU is close to releasing a full set of documentation for this definition. It has held ongoing consultations with the global community over many years on this topic in Working Party 8F under the scope of a work item known as Question ITU-R 229-1/8 “Future development of IMT-2000 and systems beyond IMT-2000.” Following a year-end 2007 restructure in ITU-R, this work is being addressed under the new Study Group 5 umbrella (replacing the former Study Group 8) by Working Party 5D which is the new name for the former WP 8F.

This work in WP 8F, and now WP 5D, has woven together a definition, recipe, and roadmap for the future beyond 3G that is comprised of a balance among a Market and Services View, a Technology View, and a Spectrum View. These, along with Regulatory aspects, are the key elements for business success in wireless.

By mid-2008, ITU-R advanced beyond the vision and framework and developed a set of requirements by which technologies and systems can, in the near future, be determined as a part of IMT- Advanced and in doing so, earn the right to be considered 4G.

During 2008 and through 2009, ITU-R will hold an open call for the “first invitation” of 4G (IMT-Advanced) candidates. Following the close of the submission period for the “first invitation”, an assessment of those candidate technologies and systems will be conducted under the established ITU-R process, guidelines and timeframes for this IMT-Advanced “first invitation”. The culmination of this open process will be a 4G, or IMT-Advanced, family. Such a 4G family, in adherence to the principles defined for acceptance into this process, is globally recognized to be one which can grow to include all aspects of a marketplace that will arrive beyond 2010, thus complementing and building upon an expanding and maturing 3G business.

The paper is available to download from here.

The ITU-R Radiocommunication Bureau has established an “IMT-Advanced” web page (http://www.itu.int/ITU-R/go/rsg5-imt-advanced/) to facilitate the development of proposals and the work of the evaluation groups. The IMT-Advanced web page provides details of the process for the submission of proposals, and will include the RIT and SRIT submissions, evaluation group registration and contact information, evaluation reports and other relevant information on the development of IMT-Advanced.

Wednesday, August 20, 2008

Ofcom's 2008 Comms Market report

Dean Bubley posted this on Forum Oxford and I thought it was worth spreading around.


Ofcom's just released a huge new report on the current state of the industry, incorporating telecoms, broadcasting and related services. Some interesting statistics:
  • Quite a lot of discussion of the resilience of fixed-line comms in the face of the mobile onslaught. Rather than direct fixed-mobile substitution, it appears that the UK sees more mobile-initiated incremental use of voice. Fixed minutes have dropped about 17bn minutes in total over 6 years, but mobile call volumes have risen by 38bn minutes. The UK outbound call total is still around 60/40 fixed:mobile, and 88% of homes still have a fixed line.
  • The proportion of mobile-only households has been pretty static for the past few years, currently at 11%. This is considerably lower than elsewhere in Europe (eg 37% in Italy), and is possibly reflecting the prevalence of ADSL. Most mobile-only users are from lower socioeconomic groups.
  • 44% of UK adults use SMS daily, against 36% using the Internet
  • More than 100k+ new mobile broadband connections per month in the UK in H1 2008, with the rate of sign-up accelerating. 75% of dongle users are now using their mobile connection at home.
  • Nearly half of adults with home broadband use WiFi
  • 11% of UK mobile phone owners use the device to connect to the Internet, and 7% use it to send email.
  • VoIP usage appears to have fallen from 20% of consumers in late 2006, to 14% in early 2008. However, I suspect that this masks the fact that many instances of VoIP (eg BT's broadband circuit-replacement service, or corporate IP-PBXs), don't make it obvious to the user.
  • Over two-thirds of mobile broadband users also have fixed-line broadband
  • UK mobile subscribers send an average 67 SMS per month (or 82 / month per head, taking account of multiple subs-per-person). MMS use is only 0.37 messages per user per month.
  • Slight increase in overall fixed-line subscriptions in 2007 - attributed to business lines.
  • Overall UK non-SMS mobile data revenues were flat in 2007 vs 2006 at £1bn. I reckon that's because the data pre-dates the big rise in mobile dongle sales, and also reflects price pressures on things like ringtones. Ofcom also attributes this to adoption of flat-rate data plans vs. pay-per-MB.
  • UK prepay mobile ARPU has been flat at £9 / month for the last 4 years. That's a big issue for operators wanting to sell data services to prepay subs in my view.
  • 17% of mobile subscriptions in the UK were on 3G at end-2007, although there's not much detail on the actual usage of 3G for non-voice applications.
  • Overall, UK households allocate 3.3% of total spending to telecom services. That's been flat since 2003 - ie the slice of the pie isn't getting any bigger relative to food/rent/entertainment/travel etc.
  • 94% of new mobile subscriptions are bundled with handsets.
  • 11% of UK adults have >1 SIM card. Among 16-24yo users, this rises to 16%. There's an estimate that of the second devices in use in the UK, 1m are 3G dongles, 0.7m are BlackBerries or similar, and 8m are genuine "second handsets". There's also another 8m "barely active" devices that are used as backups, or legacy numbers that get occasional inbound calls or SMS

Some other interesting key points that are available here:

  • Communications industry revenue (based on the elements monitored by Ofcom) increased by 4.0% to £51.2bn in 2007, with telecoms industry revenue the fastest growing component, up 4.1% on the year.
  • Mobile telephony (including an estimate for messaging) accounted for 40% of the total time spent using telecoms services, compared to 25% in 2002. However, much of this growth has come about as a result of an increase in the overall number of voice call minutes (from 217 billion in 2002 to 247 billion in 2007) rather than because of substitution away from fixed voice, which still accounted for 148 billion minutes last year, down only 10% from 165 billion minutes in 2002.
  • The most popular internet activity among older people is ‘communication’ (using email, instant messaging and chat rooms for example); 63% of over-65s say they communicate online, compared to 76% of all adults.
  • The majority of children aged 5-7 have access to the internet and most children aged 8-11 have access to a mobile phone. Children are more likely to use the internet for instant messaging than for email.
  • Television is particularly important to older people. Sixty-nine per cent of those aged 65-74 say it is the media activity that they would miss most (compared to 52% of all adults) and this rises to 77% among the over 75s. Older people are also more likely to say they miss newspapers and magazines – 10% of 65-74s and 7% of over 75s, compared to 5% of all adults.
  • The converged nature of mobile handsets became apparent during 2007, with 41% of mobile phone users claiming to use their handset for taking pictures and 15% uploading photos to their PC. Nearly one in five (17%) also claimed that they used their phone for gaming.

Tuesday, August 19, 2008

Nokia Eco Sensor Concept Mobile

Though this is not new, I hadn't seen it anywhere until I found it recently while working on a report.

This visionary design concept is a mobile phone and a compatible sensing device that will help you stay connected to your friends and loved ones, as well as to your health and local environment. You can also share the environmental data your sensing device collects and view other users' shared data, thereby increasing your global environmental awareness.

The concept consists of two parts – a wearable sensor unit which can sense and analyze your environment, health, and local weather conditions, and a dedicated mobile phone.

The sensor unit will be worn on a wrist or neck strap made from solar cells that provide power to the sensors. NFC (near field communication) technology will relay information by touch from the sensors to the phone or to other devices that support NFC.

Both the phone and the sensor unit will be as compact as possible to minimize material use, and those materials used in the design will be renewable and/or reclaimed. Technologies used inside the phone and sensor unit will also help save energy.

To help make you more aware of your health and local environmental conditions, the Nokia Eco Sensor Concept will include a separate, wearable sensing device with detectors that collect environment, health, and/or weather data.

You will be able to choose which sensors you would like to have inside the sensing device, thereby customizing the device to your needs and desires. For example, you could use the device as a “personal trainer” if you were to choose a heart-rate monitor and motion detector (for measuring your walking pace).
The Nokia Eco Sensor Concept is built upon underlying principles of waste reduction. Emphasis will be placed on materials use and reuse in the phone's construction.

To complete the Nokia Eco Sensor Concept, the phone and detector units will be optimized for lower energy consumption than phones in 2007 in both the manufacturing process and use. Alternative energy sources, such as solar power, will fuel the sensor unit’s power usage.

Please note that this is a concept phone, so you won't be seeing it in a shop near you any time soon.

Monday, August 18, 2008

4G: Where are we now.

Last month I read news about WiMAX leading the world of 4G, and last week I read about an American carrier selecting LTE as its 4G technology of choice. Since the ITU has decided that it won't be using the term 4G in future, preferring IMT-Advanced or LTE-Advanced, I guess the term 4G is up for grabs.
The main driver for '4G' is data. Recently carriers have become aggressive and started offering some decently priced 'Wireless Broadband' data plans. Rather than confuse people with HSDPA and the like, they have decided to use the term 'Wireless Broadband' or 'Mobile Broadband'. Personally, I think both terms have managed to confuse some people, who associate Mobile Broadband with Internet access on a mobile and Wireless Broadband with broadband over WiFi.

Andrew Seybold makes some valid points in an article in Fierce Wireless. One of the things he points out is that while LTE may tout higher data rates than the others, those rates are only possible in 20 MHz of spectrum, and in the real world that much spectrum is near impossible to obtain. If the spectrum advantage is removed, then HSPA+, LTE, EV-DO Rev B and WiMAX have nearly the same data rates and performance.
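A crude illustration of Seybold's point, using my own numbers and assuming peak rate scales roughly linearly with bandwidth (150 Mbps is the commonly quoted 2x2 MIMO LTE peak in 20 MHz):

    # Rough illustration: LTE peak rate scales roughly with allocated bandwidth.
    peak_20mhz = 150.0                    # Mbps, often quoted for 20 MHz, 2x2 MIMO, 64-QAM
    for bandwidth_mhz in (20, 10, 5):     # allocations an operator might realistically hold
        peak = peak_20mhz * bandwidth_mhz / 20
        print(f"{bandwidth_mhz:>2} MHz -> ~{peak:.0f} Mbps peak")
    # In 5 MHz the headline figure drops to ~37 Mbps, which is in the same
    # ballpark as HSPA+ and the other technologies Seybold mentions.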

For HSPA+, existing infrastructure can be reused and a software upgrade would suffice, whereas LTE would require new infrastructure. NTT DoCoMo has fully committed to being the first LTE network operator, and others are raising their hands. Seybold thinks nationwide LTE networks will only be available around 2014.

While I agree with this analysis, I think what is going to dictate the transformation from 3G+ to LTE for operators is the uptake of data on their networks. One big advantage of LTE is that it can operate in both TDD and FDD modes. Operators that have traditionally used FDD may switch their loyalty to TDD so that they can use asymmetric data transfer. This can provide extra downlink capacity when a special event is taking place (football finals, reality show results, etc.) and users are mostly interested in receiving information rather than sending any. Operators with paired spectrum could even use both bands separately in TDD mode.

GigaOm has a list of the American operators involved in 4G, and the list is quite interesting:
  • AT&T: USA's largest network in terms of subscribers, AT&T plans to use LTE to upgrade to 4G, but not for a long, long time. For now it’s content with its current 3G network. It will upgrade to HSPA+ in 2009 and 2010. Eventually it will go to LTE, but won’t begin testing until 2010 or 2011 with full deployment coming after that.
  • Verizon Wireless: Verizon is already testing LTE equipment from several vendors, with plans to roll out the network in 2010 and have most of the country covered by 2012; Verizon’s would likely be the first full U.S. deployment of the LTE technology.
  • Sprint-Nextel: The outlier in the whole transition to 4G, Sprint is going with WiMAX rather than LTE. After a number of delays, the company is set to launch its network in September. By the end of the year it will join with Clearwire to operate a nationwide WiMAX network under the Clearwire brand.
  • T-Mobile: T-Mobile is still launching its 3G coverage, so its 4G networks may take a while to come to fruition. The carrier’s German parent appears to favor LTE.
  • Metro PCS: This budget carrier plans to use LTE but it doesn’t yet have a time frame for deployment, pointing out that its customers aren’t heavy data users yet.
  • U.S. Cellular: The company is unsure of its deployment plans but it would likely choose to follow the rest of the industry with LTE. As for deployment, the time frame isn’t set.
  • Leap Wireless: Recently said it had not made a decision or public comment about its 4G plans.

The picture is a bit different here in the UK because all the operators are heading towards LTE. Some ISPs may be tempted to move to WiMAX, as they would benefit from economies of scale. There is also news of BT (the largest landline phone provider) planning to roll out a nationwide WiMAX network in the 2.6 GHz spectrum. If BT is able to fulfil this ambition, it could be a big win for consumers.

Sunday, August 17, 2008

Femtocell success reliant on handset innovations

Femtocells are one of the emerging technologies in telecoms. Their success cannot yet be predicted, and the femtocell industry has had its share of good and bad moments. There is no doubt in my mind that femtocells, together with WiMAX and LTE, are the most talked-about technologies in telecoms these days.

In the past few weeks I have been hearing about the challenges faced in deploying femtocells. Getting the right handsets is key to the success of femtocells.
I must say that the hype surrounding the mass deployment of femtocells has been doused with cold water by a new study into the need for handset vendors to quickly transform their devices to support the technology. According to the report, published by Research & Markets (R&M), the femtocell industry is basing its optimism on the notion that subscribers will use their cell phones differently when in range of femtocells. There will be different applications and behavioural patterns when people are at home, perhaps content backups, podcasts or even advertiser-sponsored TV programming. The mobile phone may need to be linked to the TV, PC, HiFi or other items of domestic technology, claims R&M.

I have seen some reports which suggest that although currently available handsets will work with femtocells, they are not optimised to support this new 'in home' activity. The question that remains in my mind is how the handset will identify the femtocell as opposed to any other, possibly stronger, non-femtocell cell that is available. The phone needs to be aware of the femtocell, ideally both in the radio and in the application platform. I firmly believe we will need a new handset architecture to solve this problem. But changing how the handset industry approaches this challenge could take 2-3 years, given that it takes that long to implement a new handset architecture, and around the same time again before new phone technology reaches a broad range of devices. The handset industry also needs to be aware of where we will be in terms of wireless technology in 2-3 years' time; we might be entering the LTE era by then.

But there are some more issues the femtocell industry should be aware of. Some of the issues identified by Research & Markets are:
  • In dense deployments of femtocells, handsets can spend too much time and power attempting to connect at locations that are not their own "home zone."
  • The new 3GPP Release 8 specifications contain various modifications to enable handsets to work better with femtocells, but the first R8-compliant phones will likely be shipped at the end of 2010.
  • The usage of handsets on femtocells may identify unexpected side-effects, relating to faster/cheaper data connections. This may impact elements of design such as memory allocation and power management.
  • Various suggestions have been made for 'femto-zone' services, but there is no standardised way for handset applications to know they are attached to a femtocell.

Looking at the above issues, things may not sound very favourable for femtocell deployment and commercialisation. However, operators are always looking for new means of generating revenue streams, and femtocells are definitely one of those means and a possible opportunity for operators to generate more revenue. Revenue can be earned from advertisers and other third parties by enabling the provision of 'at home' services via femtocells.

Research & Markets claims there could be demand for at least 48 million femto-aware handsets to be sold to femtocell owners in 2013. However, with more optimistic forecasts, and especially if shared femtocell models become popular, there could potentially be demand for up to 300 million femto-aware handsets per year in 2013.

Although these figures look very encouraging, the femtocell industry is still very cautious in its approach to massive investment. It is currently focusing on the short term: getting initial trials in place, developing standards, and securing commitments for early commercial deployments. These initial efforts are critical for validating the market and raising the profile of the femtocell concept; if the industry can do that, it will stimulate finance and investment in femtocells.

One of the central marketing propositions is that femtocells work with normal 3G handsets. If this is true, subscribers can get service from femtocells without needing expensive upgrades to their existing phones.

But while focus is good and the industry does not want unnecessary distractions, there is a risk of medium-term failure if certain future problems are not addressed early enough, even if this muddies the waters of the short-term marketing message. Already, femtocell proponents are talking up mass-market business models that go beyond simple indoor coverage and macro-network offload. They are talking about tens of millions of subscribers, and new "in-home" services for users that exploit fast and cheap local mobile connectivity.

It is at that stage that the issue of the right handsets for the femtocell industry comes into the picture once again, and handset innovation becomes even more important. As I have mentioned above, handset designs should be able to differentiate between the femtocell and the normal macrocell environment. In part this relates to the complexities of managing the radio environment and mobility between femtocell and macrocell networks. This is easier said than done, and various optimisations are desirable, especially where femtocells are densely deployed. These drive changes in areas such as the way the phone "selects" cells on which to register. There may also need to be ways to offer provisioning and "guest access" on femtocells from the handset UI, although this alone cannot be considered a solution, as users will see it as an unnecessary exercise. In my view the medium-term hopes of the industry also reflect the notion that people will use their phones differently when in range of femtocells, so the problem does not end with registering on the right cell. There will be different applications and behavioural patterns when people are at home, perhaps content backups, podcasts or even advertiser-sponsored TV programming, and the mobile phone may need to be linked to the TV, PC, HiFi or other items of domestic technology. This shows that the road ahead is really tough, and it again highlights the degree of innovation that will be required in handset design to work properly in the femtocell environment.

Some reports suggest that standard phones can work with femtocells, but they are not optimised. Certain applications may only work when the phone is within femto range but they need to know when that is. Yes, some services can be notified by the core network that the user is "at home", but that approach doesn't scale to a wide base of operators, application developers and handset/OS vendors. The phone needs to be "aware" of the femtocell, ideally both in the radio and the application platform.




Changing such elements is not quick. The handset industry is much more complex and slow-moving than many in the wider wireless business understand. It often takes 2-3 years for changes in handset architecture to reach commercially sold handsets, and another 2-3 years to reach a broad range of devices and reasonable penetration within the user base.

There is definitely a perception that the femtocell industry needs to be much more open-minded about the need for modifying and optimising handsets, and alert to the huge time and effort this will take to achieve. Other mobile developments like UMA and IMS have suffered in the past from a lack of focus on this issue. Although many femto advocates fear distractions could delay immediate market acceptance, early consideration of these "2nd order" problems is necessary for longer-term success.

What I have seen is that significant efforts are being made to make femtocells a success and to overcome the difficulties. The new 3GPP Release 8 specifications contain various modifications to enable handsets to work better with femtocells (called Home NodeBs). Various suggestions have been made for "femto-zone" services, but there is no standardised way for handset applications to "know" they are on the femtocell. There are workarounds, with the network notifying the application when the phone is attached to the femtocell, but this approach is not easily scalable to the wider base of developers or operators. At the moment the best solution suggested is for handset "connection manager" software to explicitly recognise femtocell access as a new and specific type of bearer.
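To make that last suggestion concrete, here is a tiny hypothetical sketch in Python (not from any standard or real handset API; the class and cell ID names are made up) of what a connection manager exposing 'femtocell' as its own bearer type might look like, so that applications could simply ask which bearer they are on:

    from enum import Enum

    # Hypothetical bearer types a handset connection manager might expose.
    class Bearer(Enum):
        MACRO_3G = "macro-3g"
        FEMTOCELL = "femtocell"
        WIFI = "wifi"

    class ConnectionManager:
        """Illustrative only: maps the serving cell to a bearer type so an
        application can ask 'am I at home on the femto?' without the core
        network having to notify every application individually."""

        def __init__(self, home_cell_ids):
            self.home_cell_ids = set(home_cell_ids)   # provisioned femto cell IDs

        def bearer_for(self, serving_cell_id, on_wifi=False):
            if on_wifi:
                return Bearer.WIFI
            if serving_cell_id in self.home_cell_ids:
                return Bearer.FEMTOCELL
            return Bearer.MACRO_3G

    # An application could then enable 'femto-zone' features only at home:
    cm = ConnectionManager(home_cell_ids={"HNB-CELL-42"})
    if cm.bearer_for("HNB-CELL-42") is Bearer.FEMTOCELL:
        print("Start content backup / home media sync")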

There is no doubt in my mind that operators could benefit from new revenue streams from advertisers & other third parties by enabling the provision of "at home" services via femtocells.
Using baseline forecasts, there should be demand for at least 48 million femto-aware handsets to be sold to femtocell owners in 2013; with more optimistic forecasts, and especially if "shared" femtocell models become popular, demand could potentially reach up to 300 million femto-aware handsets per year in 2013.