Wednesday 16 July 2008
Momentum Building for UMTS 900MHz
According to a recent paper published by the GSA, momentum is building for introducing UMTS 900, i.e. WCDMA/HSPA systems in the 900 MHz band used today by GSM/EDGE networks, to help operators extend voice, data and mobile broadband coverage by leveraging the advantages of lower frequencies. UMTS 900 is on the roadmap of several manufacturers. Three commercial UMTS 900 systems have launched, and 20 user devices have been announced by 6 manufacturers.
I have blogged about the 900 MHz band in the past. Technical specifications for WCDMA-HSDPA in the 900 MHz band (UMTS 900) were completed by 3GPP in December 2005. The 900 MHz band, denoted Band VIII, is defined as paired bands in the range 880 to 915 MHz (uplink) and 925 to 960 MHz (downlink).
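As a quick illustration of what that pairing means in practice, here is a minimal Python sketch of the Band VIII arrangement described above. It assumes only the band edges quoted in the paragraph and the standard 45 MHz uplink/downlink duplex separation for the 900 MHz band, and it ignores the carrier raster and channel-number (UARFCN) details.

```python
# Band VIII (UMTS 900) paired spectrum: 880-915 MHz uplink, 925-960 MHz
# downlink, i.e. a fixed 45 MHz duplex separation. Frequencies in MHz.
UL_BAND = (880.0, 915.0)
DL_BAND = (925.0, 960.0)
DUPLEX_SEPARATION_MHZ = 45.0

def paired_downlink(f_ul_mhz: float) -> float:
    """Return the downlink carrier centre paired with a given uplink carrier."""
    if not UL_BAND[0] <= f_ul_mhz <= UL_BAND[1]:
        raise ValueError("uplink carrier outside Band VIII")
    f_dl = f_ul_mhz + DUPLEX_SEPARATION_MHZ
    assert DL_BAND[0] <= f_dl <= DL_BAND[1]
    return f_dl

print(paired_downlink(897.5))   # -> 942.5 (MHz)
```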
A Manx Telecom trial confirmed roughly 30% better in-building penetration compared with 2100 MHz, and around 40% better deep-indoor penetration. With HSDPA, throughput could increase by 10%, raising overall network capacity by 5%. A key finding was the ability to hand over calls between base stations operating at different frequencies. The trial confirmed that a GSM 900 operator could re-use its sites for UMTS without having to redesign and re-deploy the network, significantly reducing operational costs.
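The coverage advantage of the lower band is easy to see from first principles. The sketch below compares free-space path loss at 900 MHz and 2100 MHz; it is only a back-of-the-envelope check (real in-building gains also depend on wall materials, antennas and clutter), and the 1 km distance is an arbitrary illustration.

```python
import math

def fspl_db(f_mhz: float, d_km: float) -> float:
    """Free-space path loss: 32.45 + 20*log10(f_MHz) + 20*log10(d_km)."""
    return 32.45 + 20 * math.log10(f_mhz) + 20 * math.log10(d_km)

d = 1.0  # km, illustrative distance only
delta = fspl_db(2100, d) - fspl_db(900, d)
print(f"900 MHz link budget advantage over 2100 MHz: {delta:.1f} dB")
# -> about 7.4 dB in free space, before any in-building penetration losses
```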
Saturday 12 July 2008
Will WiMax and LTE find happiness together?
By now most of you are probably coming round to the idea that LTE and WiMax could work together. In past blogs I have stressed this point and tried to highlight some of the common ground emerging between the two. There is no doubt, though, that the two technologies still struggle to find happiness together on a common platform.
From a software-defined radio (SDR) perspective, the opportunity for LTE and WiMax to reach a settlement is even more enticing. Flexibility, gate reuse and programmability seem to be the answers to the WiMax-LTE multimode challenge, and that might spell SDR.
With today's advanced technology there are many multimode solutions built around SDR.
So will WiMax and LTE find happiness in Multimode SDR?
While it is true that multimode solutions via SDR have a well-deserved reputation for being expensive and overhyped, it is just as true that telecom chip designers are already adopting SDR techniques. They need to, simply to accommodate changes to ever-evolving standards.
The classic definition of SDR is an array of general-purpose processors running virtually all functions in software. Achieving this is time-consuming and expensive, and running everything in software may not hit the price/performance targets of high-data-rate technologies such as WiMax and LTE.
But then, chip technology has never been better than it is today. Innovative hardware architectures can make things simpler and pave the way for SDR.
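To make the appeal concrete, here is a toy Python sketch of what "one programmable chain, many standards" looks like at the numerology level: the same FFT engine is simply re-parameterised per mode. The 10 MHz LTE and mobile WiMax figures are the commonly quoted ones; the structure itself is purely illustrative and nothing like a real baseband implementation.

```python
import numpy as np

# Commonly quoted 10 MHz numerologies (illustrative):
# LTE uses 15 kHz subcarriers, mobile WiMax (802.16e) about 10.94 kHz.
OFDM_MODES = {
    "lte_10mhz":   {"fft": 1024, "subcarrier_spacing_hz": 15_000},
    "wimax_10mhz": {"fft": 1024, "subcarrier_spacing_hz": 10_940},
}

class OfdmDemodulator:
    """One shared OFDM engine, re-parameterised per standard."""
    def __init__(self, mode: str):
        cfg = OFDM_MODES[mode]
        self.fft_size = cfg["fft"]
        self.sample_rate = cfg["fft"] * cfg["subcarrier_spacing_hz"]

    def demodulate_symbol(self, samples: np.ndarray) -> np.ndarray:
        assert len(samples) == self.fft_size
        return np.fft.fft(samples) / self.fft_size   # same gates/code serve both modes

lte = OfdmDemodulator("lte_10mhz")       # 15.36 Msps
wimax = OfdmDemodulator("wimax_10mhz")   # ~11.2 Msps
```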
New architectures are presented to the telecoms world on a regular basis, and one early entry is from Wavesat, which has a long history of designing OFDMA chips. The company has inked agreements with Compal Communications, a mobile-products ODM, to develop mobile WiMax products, and with Willcom, a Japanese telecom company, to develop XG-PHS broadband wireless products using Wavesat's Odyssey 8500 chip set. (http://www.eetimes.com/news/latest/showArticle.jhtml?articleID=208403496&pgno=3)
Wavesat has presented this chip set to both the LTE and WiMax camps. According to Wavesat, it is in reality a 4G platform that can implement any OFDM-based technology and can thus carry both WiMax and LTE on its shoulders. The Odyssey 8500 itself is built on eight DSP cores.
But Wavesat is not the only company working towards SDR as a common solution for LTE and WiMax. Coresonic AB also has a multimode platform based on a new architecture: single instruction stream, multiple tasks (SIMT). According to Coresonic CEO Rich Clucas, SIMT can achieve the performance of a very long instruction word (VLIW) architecture, but with lower control overhead and much lower program memory usage.
Most of the big guns in the industry have acknowledged that multimode baseband solutions for LTE and WiMax are challenging, but designing the front-end chip is truly daunting for several reasons, not least the wide spectrum covered by the two standards, spanning several gigahertz in total. LTE would likely support bands from 900 MHz to 1,900 MHz, while WiMax has had to scramble to find available spectrum and, depending on region, may operate from 2.3 GHz to 3.5 GHz.
BitWave Semiconductor's programmable RF transceiver promises a way through the multimode thicket. Prototypes of BitWave's Softransceiver RFIC are already in the hands of selected ODMs, and handsets and femtocells that incorporate the technology should launch next year. BitWave's technology digitally tunes passive circuit elements to make analog functions such as LNAs, filters and mixers programmable.
With these new technologies in play, a little harmonization will go a long way. Everybody in the industry knows that LTE is still very much in its development stage, and WiMax is not standing still either: the 802.16m task group is working to complete improvements, such as hand-offs, that will make it look a lot like cellular. So even though there is an air of peace and togetherness between the two camps, they are still looking to outdo each other. Both the LTE and WiMax camps are burning the midnight oil to achieve the perfect solution and, if possible, to go it alone.
The WiMax camp knows very well that its technology is proven and at an advanced stage. It also knows it can go places in the two to three years it will take just to bring the LTE standard to commercial viability.
There is no doubt that the LTE camp is worried WiMax might chew up traditional cellular market share by the time LTE becomes commercially available. In my view LTE and WiMax will merge down the road, but I think it will be the LTE folks doing the adapting. WiMax is here and will dominate. It is already dominating, despite the puff fantasies of media reports to the contrary.
Advanced 3GPP Interference Aware Receivers
Receiver structures in UEs and Node Bs are constantly being improved as products evolve and more complex features are added to HSPA. The result is improved system performance and higher user data rates. This trend is reflected in constantly changing UE receiver requirements in 3GPP. In 2006, 3GPP studied further improved minimum performance requirements for UMTS/HSDPA UEs. These enhanced performance requirements are release-independent (i.e. they also apply to a Rel-6 terminal with an advanced receiver).
Interference-aware receivers, referred to as type 2i and type 3i, were defined as extensions of the existing type 2 and type 3 receivers, respectively. The basic receiver structure is an LMMSE sub-chip-level equalizer which takes into account not only the channel response matrix of the serving cell, but also the channel response matrices of the most significant interfering cells.
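For readers who prefer structure to prose, below is a minimal numpy sketch of the idea behind a type 3i equalizer: the chip-level covariance used to form the LMMSE weights is built from the serving cell's channel convolution matrix plus explicit terms for the dominant interfering cells, instead of lumping other-cell interference into white noise. All sizes, tap values and the single-antenna setup are illustrative assumptions; a real type 3i receiver has two receive branches and estimates these quantities from pilots.

```python
import numpy as np

def channel_matrix(h, span):
    """span x (span+L-1) Toeplitz matrix mapping transmitted chips to received samples."""
    L = len(h)
    H = np.zeros((span, span + L - 1), dtype=complex)
    for i in range(span):
        H[i, i:i + L] = h[::-1]
    return H

def lmmse_weights(h_serving, interferers, noise_var, span=16):
    """LMMSE chip equalizer weights. 'interferers' is a list of (taps, relative_power);
    modelling them explicitly is the type 3i twist, dropping them gives a type 3-like solution."""
    Hs = channel_matrix(np.asarray(h_serving), span)
    R = Hs @ Hs.conj().T                               # serving-cell contribution
    for taps, power in interferers:
        Hi = channel_matrix(np.asarray(taps), span)
        R += power * (Hi @ Hi.conj().T)                # explicit other-cell terms
    R += noise_var * np.eye(span)                      # thermal noise
    centre = (span + len(h_serving) - 1) // 2
    return np.linalg.solve(R, Hs[:, centre])           # w = R^-1 h for the centre chip

# toy 3-tap channels, two dominant interferers at -3 dB and -6 dB (illustrative only)
w = lmmse_weights([0.8, 0.5, 0.2],
                  [([0.6, 0.4, 0.3], 0.5), ([0.5, 0.3, 0.2], 0.25)],
                  noise_var=0.1)
```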
This type of receiver attempts to cancel the interference that arises from users operating outside the serving cell, also referred to as other-cell interference. Interference models/profiles were developed for this other-cell interference in terms of the number of interfering Node Bs to consider and their powers relative to the total other-cell interference power, the latter ratios being referred to as Dominant Interferer Proportion (DIP) ratios. For the purposes of this study item it was determined that five interfering Node Bs should be taken into account in the interference models. DIP ratios were defined based on three criteria: median values of the corresponding cumulative distribution functions, weighted average throughput gain, and field data. Of these, the 'weighted average' criterion was felt to offer a compromise between the conservative median-value criterion and the more optimistic field-data criterion. In addition, two network scenarios were defined, one based solely on HSDPA traffic (HSDPA-only) and the other on a mixture of HSDPA and Rel-99 voice traffic (HSDPA+R99).
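In other words, each DIP ratio is simply an interfering Node B's received power expressed as a fraction of the total other-cell interference power I_oc. A two-line sketch, with made-up powers purely for illustration:

```python
def dip_ratios(interferer_powers):
    """DIP_i = P_i / I_oc for each interfering Node B, strongest first (linear powers)."""
    p = sorted(interferer_powers, reverse=True)
    i_oc = sum(p)
    return [pi / i_oc for pi in p]

# five modelled interferers with illustrative linear powers
print(dip_ratios([2.0, 1.0, 0.5, 0.3, 0.2]))  # -> [0.5, 0.25, 0.125, 0.075, 0.05]
```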
HSDPA throughput estimates were then developed using link-level simulations, which included the other-cell interference models plus Orthogonal Channel Noise Simulator (OCNS) models for the serving and interfering cells, based on the two network scenarios considered. The two-branch reference receiver, referred to as a type 3i receiver, was found to offer significant gains in throughput primarily at or near the cell edge. Link-level results were developed for a wide range of operating conditions, covering factors such as transport format, network scenario, modulation and channel model. For example, with DIP ratios based on the weighted average, the gains ranged from a factor of 1.2 to 2.05 for QPSK H-SET6 PB3, and from 1.2 to 3.02 for VA30, for network geometries of -3 and 0 dB. This complements the performance of existing two-branch equalizers (type 3), which typically provide gain at high geometries, and thus the combination of the two will lead to a much better user experience over the entire cell.
In addition, a system-level study indicated that a type 3i receiver provided coverage gains ranging from 20-55% for mildly dispersive channels and 25-35% for heavily dispersive channels, the exact value depending on user location. A second system-level study divided users into two groups depending on their DCH handover state: the first group collected users in soft handover (between cells), the second users in softer handover (between sectors of the same cell). The results of this second study indicate that the type 3i receiver will benefit users in both groups, increasing their throughput by slightly over 20%. With regard to implementation, it was felt that the type 3i receiver is based upon known and mature signal processing techniques, and thus the complexity is kept low. With two-branch, equalizer-based receivers already available in today's marketplace, it appears quite feasible to develop a two-branch equalizer with interference cancellation/mitigation capabilities. Given all of the above, 3GPP concluded that two-branch interference cancellation receivers are feasible for HSDPA, and a work item has been created to standardize the performance requirements for the type 3i receiver.
More on this topic is available in the following:
- 3GPP TR 25.963 V7.0.0: Feasibility study on interference cancellation for UTRA FDD User Equipment (UE)
- Signal Processing for Wireless Communications, by Joseph Boccuzzi
- Simulation results can also be obtained from reports here.
Qualcomm to make Femto-Intelligent Handsets
The last time I read about Qualcomm buying a stake in femtocell maker ip.access, I overlooked it, but now the rumour is that they want to be the leaders in femto-intelligent handsets. This fits well with the femto-optimised handsets plan that I blogged about.
Ovum had an interesting analysis of the original news:
It's easy to see, though, how femtocells fit into Qualcomm's vision of the future of the wireless world. It's not too much of an exaggeration to say the company believes that the WCDMA radio technology is so good that no other is needed; curiously, ip.access's other industry investors, including Cisco and Motorola, are much more closely associated with the opposite position. Some years ago Qualcomm engaged in a sustained rubbishing of WiFi as a public access technology, and more recently it has given WiMAX the same treatment. Femtocells could be a route towards using WCDMA in home and other internal networking.
The reality may be more prosaic. According to Qualcomm Ventures' self-description, its “aim is to support Qualcomm's mission of enabling and fostering 3G (WCDMA) and wireless Internet markets through strategic investments in privately owned startup ventures.” With femtocells at least a plausible runner in the medium term in the future development of public mobile networks, not investing anywhere in the technology might almost be deemed remiss. In Qualcomm's own words, both it and ip.access “have developed key intellectual property for femtocells, share the same vision for femtocells, and can work together to make the most effective use of these ideas for the benefit of the whole femtocell industry.”
From ip.access's perspective, the deal is somewhat more straightforward. It believes that Qualcomm is buying in because of its technological excellence and 'traction' with operators - it claims involvement in a number of femtocell trials with major MNOs, though none of these has been publicly disclosed.
For ip.access, the money Qualcomm is bringing is less interesting than the relationships and influence - especially with standards organisations - that it adds. Privately held, the company already lists Cisco, Intel Capital, ADC and Motorola Ventures among its owners. Neither party is disclosing the extent of Qualcomm's investment, or whether it will have any significant impact on the balance of shareholdings.
We are still mildly sceptical about the femtocell business case; even though ip.access provides some interesting data on the impact that domestic femtocells can have on the capacity requirements for the macro network. But it's clear that Qualcomm's decision to buy into ip.access makes the proposition rather more credible.
Labels:
Femtocells,
Mobile Phones and Devices,
Qualcomm
Wednesday 9 July 2008
Updated Paper on 3GPP Rel 7 and Rel 8 from 3G Americas
3G Americas, a wireless industry group supporting the GSM family of technologies in the Americas, has provided updates to its popular white paper titled UMTS Evolution from 3GPP Release 7 to Release 8: HSPA and SAE/LTE that explains the leading evolutionary roadmap for the GSM family of technologies to 3G and beyond. Globally, the demand for wireless data services is driving the growth of 3G UMTS/HSPA technology with more than 200 commercial HSDPA networks today and subscriptions to UMTS/HSPA estimated at over 236 million by Informa Telecoms and Media. With more than 3.1 billion subscriptions for the GSM family of technologies worldwide today, the potential for third generation HSPA technology is forecast to reach 1.39 billion subscriptions by year end 2012 and 1.8 billion by year end 2013 according to Informa.
UMTS Evolution from 3GPP Release 7 to Release 8: HSPA and SAE/LTE offers a further review of 3GPP Release-7 (Rel-7) upon its completion in the technology standardization process and an introduction to the improved features of 3GPP Release 8 (Rel-8). The paper explores the growing demands for wireless data and successes for a variety of wireless applications, the increasing Average Revenue per User (ARPU) for wireless services by operators worldwide, recent developments in 3GPP technologies by several leading manufacturers, and 3GPP technology benefits and technical features.
Upon the finalization of the Rel-8 standard later this year, 3G Americas will publish a new white paper on the 3GPP standards that will include the completion of Rel-7 HSPA+ features, voice over HSPA, SAE/EPC (Evolved Packet Core) specification and Common IMS among other new developments and features. Since HSPA+ enhancements are fully backwards compatible with Rel-99/Rel-5/Rel-6, the upgrade to HSPA+ has been made smooth and evolutionary for GSM operators. Additionally, Rel-7 standardizes Evolved EDGE with continuing development in Rel-8 which will improve the user experience across all wireless data services by reducing latency and increasing data throughput and capacity. Finalization of the Rel-8 standard by the end of this year will further progress market interest in commercial deployment of LTE. Leading operators worldwide are announcing their plans to deploy LTE as early as 2010 with trials already occurring today.
The popular white paper UMTS Evolution from 3GPP Release 7 to Release 8: HSPA and SAE/LTE was written collaboratively by members of 3G Americas and is available for free download here.
Sunday 6 July 2008
The case for Femto-optimised handsets
Dean Bubley from Disruptive Wireless has come out with a report arguing the case for femto-aware handsets. The following is an extract from the report summary:
Already, femto proponents are talking up mass-market business models that go beyond simple indoor coverage and macro-network offload. They are talking about tens of millions of subscribers, and new "in-home" services for users that exploit fast and cheap local mobile connectivity.
But this is based on the notion that people will use their cellphones differently when in range of femtos. There will be different applications and behaviour when people are at home, perhaps content backups, podcasts or even advertiser-sponsored TV programming. The mobile phone may need to be linked to the TV, PC, hi-fi or other items of domestic technology.
This report argues that if the phone will be used differently, it needs to be designed differently as well. Standard phones can work with femtocells, but they are not optimised. The phone needs to be “aware” of the femtocell, ideally both in the radio and the application platform.
The report looks at all the various "layers" of a typical phone, and examines how the advent of femtocells will drive changes and optimisations:
- Physical design & form-factor of the handset
- Radio layer & protocol stack
- Internal hardware - memory, power management etc
- Handset operating system & connection manager
- New femtocell-related applications & capabilities
The study includes forecasts for the overall femtocell market, and scenarios examining how femtocell-aware handsets may evolve. It examines the value chain of the phone design and manufacturing industry, and discusses the role of component suppliers, OS specialists and industry bodies.
More info here. The following is from an article in Tech World:
The devices could do useful jobs such as handling large media files on phones, but these applications won't work well unless the phone has a reliable way of knowing whether it is on the femto or the "macro" network, said Dean Bubley of Disruptive Analysis. Unfortunately, vendors' efforts to make femtocells work seamlessly with all existing phones have resulted in a definition which makes the femto look exactly like a macrocell to the handset.
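Just to illustrate what "femto-aware" could mean in software, here is a purely hypothetical Python sketch of a connection-manager policy that enables heavy "at home" jobs only when the phone is camped on a known home cell. Every name, identifier and API in it is invented for illustration; it does not reflect any real handset stack or femtocell standard.

```python
class JobScheduler:
    """Toy stand-in for a handset's background-job scheduler (hypothetical)."""
    def __init__(self):
        self.enabled = set()
    def enable(self, job):
        self.enabled.add(job)
    def disable(self, job):
        self.enabled.discard(job)

# Hypothetical whitelist of the user's home femtocell identities.
HOME_FEMTO_CELL_IDS = {"310-410-0xA1B2C3"}

def on_cell_change(serving_cell_id: str, scheduler: JobScheduler) -> None:
    """Imagined hook called whenever the phone camps on a new cell."""
    at_home = serving_cell_id in HOME_FEMTO_CELL_IDS
    for job in ("media_backup", "podcast_prefetch"):
        if at_home:
            scheduler.enable(job)    # cheap, fast local connectivity via the femto
        else:
            scheduler.disable(job)   # treat the macro network as scarce/metered

s = JobScheduler()
on_cell_change("310-410-0xA1B2C3", s)
print(s.enabled)   # -> {'media_backup', 'podcast_prefetch'}
```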
"If femtos change user behavior you will need to change the handsets," said Bubley of Disruptive Analysis, who warns in a report that femto-aware handsets will be required.
Femto-aware handsets will be important after 2010, said Vedat Eyuboglu, chief technology officer at femto maker Airvana, which supplies silicon modules to Thomson, a maker of home gateways including BT's Home Hub.
However, Bubley warned that the actual phones might get forgotten, or take too long to develop: "The debate around femto-aware phones may get mired in discussions about interference management and 3GPP R8 tweaks to the interfaces involved," he said. "It takes two years to alter protocol stacks and hardware - we won't have femto-optimized phones until at least the end of 2010."
The handset issue could be addressed by industry body the Femto Forum, said Forum chair Simon Saunders, announcing the Forum's new relationship with the Next Generation Mobile Network Alliance (NGMNA), a body helping specify requirements for WiMax and LTE systems.
"As well as phones, femtocells will be used by devices including dongles and ultra-mobile PCs," said Saunders. "They do not have such long development cycles as mobile phones."
The NGMNA is developing recommendations for a cost-optimized indoor node, and for self-organizing networks, both of which can be met by femtocell designs that the Femto Forum will help develop. The two bodies will promote joint solutions and submit them to standards bodies.
Labels:
Femtocells,
Mobile Phones and Devices
Saturday 5 July 2008
Mobile TV! Still no joy
Mobile TV is floundering in one of its bastions, Korea (the other being Japan). The following is from a report on telecoms.com:
With mobile TV services in the flagship market of South Korea floundering and with few signs that operators anywhere else have found a successful formula for launching such services, most operator and vendor delegates at the recent CommunicAsia Summit in Singapore struggled to find enthusiasm for the fledgling industry.
Some operators and vendors say that mobile TV should be subscription-based, to offer a reliable revenue stream; others say an ad-supported model is the most viable option; and still others argue that a combined pay/advertising approach is the way forward.
Figures from South Korea seem to suggest that both pay-based and ad-supported models have critical weaknesses, which would also apply in other markets in the region. A lot more experimentation and creativity from operators might be required to find the right model.
Those promoting the idea of a pay-based service say that only by charging for content can a business model work. They say operators must team up with content firms to acquire premium content - most particularly sports - that people will be willing to pay a monthly fee to view or even pay for on a per-view basis.
But this line of thinking seems flawed, given that there is a limited amount of blue-chip content for which people will be prepared to pay, most notably live sports events - such as English Premier League soccer games - or highlights of them.
The problem is, of course, that content-rights holders have become adept at exacting a premium price for key sports rights, meaning that mobile TV operators would have to recoup their heavy capital investment by charging high subscription fees.
This is a problem, since the high churn rate experienced by TU Media in South Korea seems to suggest that mobile TV subscribers are extremely price-sensitive.
TU Media subscribers pay just KRW13,000 ($12.60) a month for the service but have been leaving in droves after their initial one-year contracts finish, forcing the firm to offer significantly reduced subscription rates to keep subscribers from deserting the service.
TU Media's experience suggests that mobile TV subscribers will be willing to pay only so much for services and that although blue-chip sports content has a crucial role to play, operators must find a way to acquire the content without paying excessive prices.
On the advertising side of the debate, many delegates at CommunicAsia argued that an ad-based strategy would work best for mobile TV platforms but that operators would have to be extremely creative in their approach.
There is no magic bullet that will provide a successful business model, but there seems to be a reasonable possibility that an attractive model can be built if operators can match the largely young and technology-friendly subscribers viewing mobile TV on their handsets with advertisers desperate to reach such a market.
Intriguingly, conference delegates also discussed the possibility that broadcast-type mobile TV services might never fully take off in the region and that Multimedia Broadcast Multicast Service (MBMS) video streaming over high-speed HSPA and future LTE networks would dominate the market.
The debate has strong proponents on both sides. Many vendors back an MBMS approach, saying that experience shows that broadcast-style services are not what users are demanding and that the more-narrowly targeted VOD-style content being offered on HSPA networks is already proving hugely popular.
The pro-MBMS argument also runs that with HSPA/LTE networks already in place and offering voice, data and video services, why go to the expense of deploying a terrestrial or satellite-based mobile TV network, especially with the expense involved in creating high-quality in-building reception?
Although this is a persuasive argument, it has shortfalls, most notably the fact that even LTE networks will still be point-to-point networks and will be unequipped to operate as point-to-multipoint services, which a full broadcast mobile TV service would require.
The broadcast-mobile-TV lobby argues strongly that the core strengths of broadcast-based networks cannot be replicated by even high-speed mobile networks, which would not be able to support the huge demand that's sure to arise for broadcasts of live sports and important news events.
In reality, the MBMS-vs.-broadcast-mobile-TV debate is spurious, given that both technologies are going to be on the market, and it will be users who determine which is the more successful.
At this early stage, it looks likely that subscribers and operators will use high-speed, quality video streaming for VOD-based "snacking" on content and that full broadcast mobile TV will be used for some live events, for which only a broadcast-style service can supply the quality of service required.
Korean Insight has an interesting section on Mobile TV (but no blogs on this topic for some time). A blog on this topic last year says a lot:
When TU Media started operations in mid 2005 it tried to acquire simultaneous re-transmission rights from broadcasters, meaning S-DMB viewers would be able to watch popular dramas and shows at the same time as fixed TV. This content is considered the most popular on both fixed and mobile TV. However, broadcasters had been reluctant to share it because they wanted to use it for their own T-DMB services, which is why S-DMB had to focus on other content such as sports and news. The lack of "killer" content from fixed TV hindered S-DMB's development. To date it has acquired approximately 1.26 million subscribers, but according to TU Media it needs approximately 2.5 million subscribers to be profitable.
T-DMB is also struggling to build a profitable business. Despite more than seven million T-DMB devices in Korea, advertising revenues are marginal. This is partly the result of very restrictive legislation on advertising, but broadcasters have also failed to develop an attractive mobile advertising value proposition for advertisers.
Consumers have embraced this new medium, and it is very likely that broadcasters will take mobile TV more seriously and endeavour to make mobile TV advertising more attractive to advertisers. By 2012 more than 20 million T-DMB devices are expected, so mobile TV has a future in Korea.
LTE And WiMax Together?
In my last blog I talked about LTE and WiMax finally finding peace with each other and the early signs of the two having a future together. As I have said before, I have always believed that the two technologies are, at their core, not very different, and I certainly support the notion that the industry can benefit a lot from the two working side by side.
But, as always, when I was discussing this with some friends in the industry, they questioned the similarity between the two technologies.
So how similar, or different, are they?
Whenever the similarity between LTE and WiMax is discussed, we conclude that the single most important one is orthogonal frequency division multiplexing (OFDM) signalling. Both technologies also employ Viterbi and turbo accelerators for forward error correction. From a chip designer's perspective, that makes extensive reuse of gates highly likely if one had to support both schemes in the same chip or chip set. From a software-defined radio (SDR) perspective, the opportunity is even more enticing: flexibility, gate reuse and programmability seem to be the answers to the WiMax-LTE multimode challenge, and that might spell SDR.
So to start with I concentrated on OFDMA and did some research to find out just how similar the two technologies really are in this respect.
Most of the articles and discussions show that LTE and WiMax may be two peas in an OFDM pod, but they are not twins. Here are three significant differences:
1. Both use orthogonal frequency division multiple access (OFDMA) in the downlink. But WiMax optimizes for maximum channel usage by processing all the information in a wide channel. LTE, on the other hand, organizes the available spectrum into smaller chunks.
WiMax pays a price for high channel utilization, however, because processing that much information might require a 1,000-point fast Fourier transform. LTE can get by with a 16-point FFT. This translates into higher power consumption, because it's difficult to design fixed-function WiMax hardware that is also efficient in LTE designs. An architecture that exploits the principles of SDR, however, could reconfigure its FFT function for better power efficiency.
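Whatever the exact transform lengths end up being (in practice they scale with the channel bandwidth in both systems), the underlying point before moving to the next difference is that FFT cost grows roughly as N log N, so processing a wider channel in one transform costs more cycles and therefore more power. A rough sketch, with illustrative sizes only:

```python
import math

def fft_complex_multiplies(n: int) -> float:
    """Rough radix-2 FFT cost: about (N/2) * log2(N) complex multiplies per symbol."""
    return n / 2 * math.log2(n)

# Illustrative transform sizes only; real lengths depend on channel bandwidth.
for n in (128, 512, 1024, 2048):
    print(f"{n:5d}-point FFT ~ {fft_complex_multiplies(n):7.0f} complex multiplies/symbol")
```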
2. LTE uses single-carrier frequency division multiple access (SC-FDMA) for uplink signaling, while WiMax sticks with OFDMA. A major problem with OFDM-based systems is their high peak-to-average power ratios. The average power spec cited in marketing presentations does not show the whole picture. Unfortunately, the system's power amplifier has to be designed to handle peak power--and the PA is the single-largest power consumer in a handset.
LTE opted for the SC-FDMA specifically to boost PA efficiency. "If you can improve the efficiency from 5 percent up to 50 percent simply by changing modulation schemes, then you save a lot of battery time," said Anders Nilsson, principal system architect at multimode specialist Coresonic AB. WiMax's OFDMA has a peak-average ratio of about 10 dB, while LTE's SC-FDMA's peak-average ratio is about 5 dB.
The difference also affects the baseband chip, Nilsson added, because of the need to support two modulation schemes in the uplink. Programmable solutions are flexible enough to reuse gates and keep power low in LTE mode.
Regarding the PAPR issue (Peak to Average Power Ratio), I found the following tutorial interesting
http://to.swang.googlepages.com/peaktoaveragepowerratioreduction
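To see the effect for yourself, here is a small numpy experiment comparing the peak-to-average power of plain OFDMA symbols with DFT-spread (SC-FDMA-style) symbols. It is a toy: QPSK only, contiguous subcarrier mapping, no oversampling or pulse shaping, and the subcarrier counts are arbitrary, so the absolute numbers will not match the 10 dB / 5 dB figures quoted above, but the DFT-spread signal should come out clearly lower.

```python
import numpy as np

rng = np.random.default_rng(0)

def papr_db(x):
    """Peak-to-average power ratio of a sampled waveform, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

def qpsk(n):
    bits = rng.integers(0, 2, (n, 2))
    return ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)

N_FFT, N_USED, TRIALS = 1024, 600, 2000      # illustrative sizes only
ofdma, dft_spread = [], []
for _ in range(TRIALS):
    s = qpsk(N_USED)
    X = np.zeros(N_FFT, dtype=complex)
    X[:N_USED] = s                            # plain OFDMA: symbols straight onto subcarriers
    ofdma.append(papr_db(np.fft.ifft(X)))
    X = np.zeros(N_FFT, dtype=complex)
    X[:N_USED] = np.fft.fft(s)                # SC-FDMA style: DFT-precode before mapping
    dft_spread.append(papr_db(np.fft.ifft(X)))

print(f"median PAPR  OFDMA: {np.median(ofdma):.1f} dB   DFT-spread: {np.median(dft_spread):.1f} dB")
```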
3. Although both the IEEE 802.16e standard and the evolving LTE standard support frequency division duplexing (FDD) and time division duplexing (TDD), WiMax implementations are predominantly TDD. LTE seems to be heading in the FDD direction because it gives true full-duplex operation: paired channels are used for uplink and downlink. LTE can therefore quote a better spec for downlink data rates, albeit at the cost of placing very severe latency requirements on forward error correction. The bottom line is that the WiMax radio is much simpler.
These differences make designing a chip or chip set to support both standards more difficult, but they also have network infrastructure consequences that might be more easily resolved by harmonization instead of competition. Certainly, from the handset designer's perspective, there is no clear winner.
The battery life and power efficiency of the chip or chip set are critical to market success, said Fannie Mlinarsky, an independent consultant specializing in wireless testing and design. Power is a big issue for WiMax and LTE because megabit-per-second capability means running the DSP hard and making the chips more power hungry.
Thursday 3 July 2008
Rumors: Femtocells being rolled out from this month
Sprint's indoor coverage-extending femtocell device, Airave, will be rolled out nationwide in the USA on July 15, according to early reports. Airave is a device which connects to any cable modem or DSL router with an open port, and generates a signal to which mobile phones can connect. It allows up to three simultaneous voice connections within a 5,000 square foot coverage area.
Additionally, Airave comes with a 20-foot-long antenna (that can't be right) that must be stationed near a window so its onboard GPS receiver can get a fix. The GPS is necessary for the device to function, because it determines whether the unit is within Sprint-licensed territory. If the device is used in an area that does not offer some degree of native Sprint coverage, it will not work.
This is a bold move from Sprint, considering another story from Unstrung titled "Operators Feel Femto Frustration":
A lack of standards, unresolved technical issues, and unclear business cases are conspiring to push operator plans for commercial femtocell launches into next year. That’s the message from major mobile operators at the recent Femtocell Europe 2008 conference. While operators are still enticed by the potential for cost savings, capacity increases, and new service revenues that the mini home base stations promise, they gave a realistic account of their own plans for femtocells here, and they said they anticipate commercial deployments in the market sometime in 2009.
The lack of a definitive standard for femtocells is a sticking point for operators and has even caused some to postpone their vendor selections. SFR, for one, has delayed its femtocell RFQ (request for quotation) because the standards are not yet defined.
While equipment suppliers have taken a big step recently to agree on a framework for defining a femtocell standard -- thanks to vendor compromises and the Femto Forum Ltd. facilitating a consensus -- the hard work to hash out a standard at the 3GPP is just beginning.
T-Mobile International AG’s head of RAN strategy, Zhongrong Liu, said the femtocell standardization process was hampered by “lack of resource or focus from big vendors,” and he urged them to send more delegates to the 3GPP working groups.
O2 (UK) Ltd. has found in recent femtocell tests that the devices were not hitting HSDPA data speeds. Chris Fenton, director of convergence policy at O2, said that the femtocells the operator tested recently got to just 700 kbit/s on the downlink. “There is some work to do to get us to the 3.6 Mbit/s and 7.2 Mbit/s,” he said. “We’re testing early boxes, though -- I think it’s really close, so that’s OK.”
Operators admit that they are not yet certain of what the service proposition for home base stations should be or how they’ll make money from the devices. Some look to femtocells simply to improve indoor coverage, which could have a big effect on churn. AT&T’s Gordon Mansfield, director of radio access network planning, said that poor coverage is the number one cause of churn. “In the U.S. market, we’ve got these challenges,” he said.
Mansfield said that early femto deployments would be aimed at improving indoor coverage and that future deployments could be aimed at “integrating three screens in the home,” or, in other words, tying the mobile phone into consumers’ home networks.
Another article in Unstrung last month had an interview with a Vodafone visionary on how Vodafone dreams of the Metro Femto:
Femtocells could one day proliferate in metropolitan areas at bus stops, on lamp posts, or on buildings, if Vodafone Group plc's vision for a hotspot deployment of femto access points becomes reality. The giant mobile operator's head of new technologies and innovation, Kenny Graham, proposed taking the mini home base stations out of the home/office and onto the streets at the Femtocells Europe 2008 conference in London Wednesday morning. Graham (a.k.a. the Vodafone Visionary) reckons the same attributes that make femtocells ideal for deployment in homes and offices -- localized coverage, improved performance, self-configuration, self-optimization, and low cost -- can be of use outside, too. He dubbed this kind of deployment a “metrozone.”
Meanwhile, operators are working hard to make sure the cost of femtocells hits rock bottom:
Telecoms operators are pushing to get the cost of a 3G femtocell in the home down to €40 -- well below the current $99 (€63) target. "For the femtocell to be economically viable compared with the macrocells, it has to be less than €40 in the total cost of ownership per access point," said Thierry Berthouloux, head of network evolution at French mobile phone operator SFR, speaking at the European Femtocell conference. "It is not €100, that is far too expensive, and that is really the challenge," said Berthouloux.
Tuesday 1 July 2008
NFC: Near-Field Communication
Saw this posted on Forum Oxford. Very good introductory article on NFC:
Near-field Communication (NFC) is characterized as a very short-range radio communication technology with a lot of potential, especially when applied to mobile handsets. Imagine yourself using your cellphone to interact with posters, magazines, and even with products while at the store, and with such interaction initiating a request or search for related information in real-time. Other usages of NFC include the electronic wallet to make payments using your handset, the same way you do with your credit card. With NFC all this is possible. But NFC is still a young technology. That said, NFC-enabled handsets are being introduced into the market, and deployments and pilots around the world are occurring.
Near-field Communication or NFC is a standard defined by the NFC Forum, a global consortium of hardware, software and application vendors, credit card companies, banks, network providers, and others who are interested in the advancement and standardization of this promising technology.
NFC is a short-range radio technology that operates on the 13.56 MHz frequency, with data transfers of up to 424 kilobits per second. NFC communication is triggered when two NFC-compatible devices are brought within close proximity, around four centimeters. Because the transmission range is so short, NFC-based transactions are inherently secure; more on this shortly.
When compared to other short-range radio technologies, NFC is extremely short-ranged and what I call people-centric. Some of the other short-range communication technologies have similar characteristics, for example RFID, while others are completely different yet complementary to NFC, for example Bluetooth and Infrared. A good example of such complementarity is the combination of NFC and Bluetooth, where NFC is used for pairing (authenticating) a Bluetooth session that is then used for the transfer of data.
Complete Article at Sun Developer Network.
More about NFC at NFC Forum homepage.
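As a small illustration of the kind of payload a "tap" actually carries, here is a hand-built NFC Forum Text record in Python. It is a minimal sketch based on the NDEF record layout (flags/TNF byte, type, payload) rather than on any particular handset API, so treat the details as illustrative.

```python
def ndef_text_record(text: str, lang: str = "en") -> bytes:
    """Build a single short NDEF 'Text' record (well-known type 'T')."""
    # Payload: status byte (UTF-8, language-code length) + language code + text
    payload = bytes([len(lang)]) + lang.encode("ascii") + text.encode("utf-8")
    header = 0xD1                 # MB | ME | SR flags set, TNF = 0x01 (well-known)
    record_type = b"T"
    return bytes([header, len(record_type), len(payload)]) + record_type + payload

record = ndef_text_record("Hello from a smart poster")
print(len(record), "bytes:", record.hex())
# At 424 kbit/s a record this size moves in under a millisecond; the tap time
# is dominated by bringing the devices into range, not by the data transfer.
```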