Sunday, August 31, 2008

NRS, ENUM and NGN


Most mobile-savvy people work like this: they have a bundle of free minutes on their mobile, which they use for calling people locally or nationally, and they have VoIP clients like Skype, which they use for calling people on similar services locally, nationally and internationally. There is a constant juggle between mobile numbers and VoIP numbers. What if we could use our own number with a VoIP client, so that regardless of whom you are calling, if they have a similar VoIP service on their side you get a free call, and if they don't you use your inclusive minutes or get charged? ENUM can solve this.

According to a Yankee Group report titled "ENUM Will Be Reinvented as a Strategic NGN Element", in spite of its early struggles, ENUM, short for Electronic Numbering or Telephone Number Mapping, is well positioned to provide a fundamental underpinning of the Anywhere Network™ as it relates to the efficient routing of any IP-based service across operator domains. It is in this new role that ENUM evolves from its rather meager beginnings to a strategic role in the transition to IP.


I wrote about ENUM some time back when it was mentioned as a magical entity at one of the conferences. Since then I have managed to find the Nominet presentation that was discussed at the conference. In fact, there was a conference on ENUM in London organised by Nominet. If ENUM is still not clear from my earlier blog post, then please check the Technology Inside blog here.
Let us discuss again why ENUM is important:

Imagine the NHS has 500 telephone numbers that it operates as 0800 freephone numbers to allow customers (patients) to contact various local departments. The cost of each minute of every call is borne by the NHS, so ultimately by the British taxpayer. Now suppose the NHS also has VoIP connectivity and decides to advertise its 0800 numbers through DNS using ENUM. Subsequently, every time someone using VoIP decides to call any of those 0800 numbers, their VoIP provider will find the 0800 number in the ENUM DNS listings for the NHS and will connect the caller to the medical department using VoIP alone - at no cost to either party (usually).
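
To make the mechanism concrete, here is a minimal sketch of what that lookup involves: the dialled E.164 number is reversed, dotted and placed under e164.arpa, and the NAPTR records returned for that domain advertise a SIP URI that the VoIP provider can route to directly. The sketch assumes the dnspython package is available; the number, domain and URI shown are purely illustrative.

    import dns.resolver

    def e164_to_enum_domain(number: str) -> str:
        """'+44 800 123 4567' -> '7.6.5.4.3.2.1.0.0.8.4.4.e164.arpa'"""
        digits = number.lstrip("+").replace(" ", "")
        return ".".join(reversed(digits)) + ".e164.arpa"

    domain = e164_to_enum_domain("+44 800 123 4567")
    try:
        for record in dns.resolver.resolve(domain, "NAPTR"):
            # A typical answer carries a regexp that maps the number to a URI such as
            # "sip:reception@nhs-department.example", which is then dialled over IP.
            print(record.regexp)
    except dns.resolver.NXDOMAIN:
        print("No ENUM entry found - fall back to the PSTN and pay per minute")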

Siemens have a very good presentation that shows different uses of ENUM.

Clearly with this approach there is scope for financial savings. That said, there remains considerable work needed to achieve even this small goal, let alone the potential options further down the road.


In case you were wondering, ENUM is an international standard being implemented by individual countries separately through their respective governments. The UK Government, through the regulator Ofcom, has assigned the design, implementation and ongoing administration of the project to UKEC who, in turn, have contracted much of the work to Nominet. Nominet administer and maintain the .uk ccTLD - when you buy any domain ending in .uk it is ultimately sold by Nominet, although almost always through a reseller ("registrar") like GoDaddy.

GSMA and NeuStar have been working with leading operators to provide a standards-based solution to this problem. This solution is the Number Resolution Service, or NRS.

NRS is a GSMA Managed Service operated by NeuStar. The service facilitates IP interoperability by translating telephone numbers to IP-based addresses. Interoperability is particularly important in facilitating the uptake of emerging services such as MMS, IMS and Packet Voice.

Based on Carrier ENUM, NRS is available to mobile operators, fixed network operators, and related service providers. The service is currently being piloted with a number of operators, with commercial availability scheduled for the autumn of 2008.

As next generation IP-based services proliferate, operators can utilise NRS to position new services behind the telephone number already used by subscribers. Whenever a telephone number is used to identify an end user, the NRS service will facilitate the discovery of a URI containing information specific to the service being provided.

NRS is provided as an off the shelf managed service, interoperable on a global basis, providing all the facilities and features necessary to implement an operator’s interconnect policies. Pricing is based on a cost effective “pay as you go” model with no up front capital investment required. NRS thus helps lower the entry barrier for new services and promotes innovation by simplifying the product development and implementation process.

ENUM is going to be hated by the CPs because it will lower the per-minute revenue they are getting at the moment, but it is definitely going to provide new opportunities (and competition). At the same time, customers will love it because they will get loads of free calls and won't have to worry too much about installing different VoIP clients on their phones. At present it is still in the initial stages, with everyone waiting for others to adopt it first, but ENUM is here to stay.

Abbreviations:
  • ENUM - tElephone NUmber Mapping (I have also seen Electronic NUMbering)
  • CP - Communications Provider
  • SBC - Session Border Controller
  • NHS - National Health Service (in UK)
  • NRS - Number Resolution Service
  • GSMA - GSM Association

Femtocells With LTE and their commercialization

Over the past few months LTE has been gaining real momentum and the LTE camp is expanding. Companies that have decided to adopt LTE as their 4G technology are doing everything possible to make LTE a big success.

Femtocells are another of the most talked-about technologies these days. In the past year alone femtocells have gained a lot of strength and are already in the process of commercialisation. Giants like Verizon, T-Mobile and Sprint have already announced that they will offer femtocell products and service plans sometime this year. A few big announcements like these should give the femtocell market some good momentum. Some of you might already be aware that Qualcomm made a significant (undisclosed) investment in ip.access' Oyster 3G system, which uses the residential broadband connection to deliver a 3G signal in the home. The move is seen as validating the femtocell concept, especially since Qualcomm is so adept at making the right technology investments.

With work on LTE in full progress and femtocells strengthening their ground, the industry is very comfortable with the idea of having femtocells in LTE.

Analysts consider LTE a major boost to the future success of femtocells. In order to take femtocells further with LTE and make them a big success, joint testing of a reference design against the LTE standard was proposed.

Taking the proposal seriously, the joint testing was conducted by picoChip, a UK-based femtocell silicon developer; mimoOn, a German SDR specialist; and the test equipment vendor Agilent Technologies. The objective of the test was to verify that the femtocell reference design met the requirements of the LTE standard, as measured by Agilent's recently developed 3GPP LTE modulation analysis option.

The joint testing generated enough confidence in the industry to support the idea of having femtocells on LTE. Based on the joint testing, picoChip and mimoOn, which have been co-operating on the reference design for the past 12 months, recently announced the availability of what they suggest are the first LTE femtocell and picocell reference designs, the PC8608 Home eNodeB and PC8618 eNodeB respectively. The design is based upon the same hardware platforms as picoChip's WiMAX products.

Going further, picoChip unveiled its first reference designs for LTE femtocells and picocells, which will enable the company's existing femtocell customers, including ip.access and Ubiquisys, to upgrade to LTE.

3GPP is well aware of all the developments around femtocells and is busy developing the specifications for femtocells in LTE.
To end the squabbling, the Third Generation Partnership Project (3GPP) has adopted an official architecture for 3G femtocell home base stations and started work on a new standard for home base stations.

The 3GPP wants to have the new standard done by the end of this year, which appears to be an aggressive schedule given that vendors have taken various approaches to building a femtocell base station.


The agreed architecture follows an access-network-based approach, leveraging the existing Iu-cs and Iu-ps interfaces into the core service network. The result is a new interface called Iu-h.

The architecture defines two new network elements, the femtocell and the femtocell gateway. Between these elements is the new Iu-h interface. This solution was backed by Alcatel-Lucent, Kineto Wireless, Motorola and NEC.

However, with every new standard the old or existing architecture comes under review. With this new standard, all of the femtocell vendors who had their own designs in place must go back and change their access point and network gateway equipment to comply with the new standard interface. I think in doing so vendors can bring themselves in line with the global standard.
All femtocell vendors will have to make changes to their access points. Alcatel-Lucent, Motorola, NEC, and those that already use Kineto's GAN approach, such as Ubiquisys, will have the least work to do. Ubiquisys has already announced that it will have products ready that support the new standard by December of this year.

Now that the standard has been decided, companies can work on designs based on it and think about introducing products to the market.

T-Mobile is moving fast in that direction and has chosen two German cities, Cologne and Bonn, to test the commercial feasibility of 3G femtocells. The operator will be the first to conduct trials of the technology in Germany, albeit numerous trials have already taken place elsewhere in Europe.

While T-Mobile demonstrated femtocells at the giant CeBIT exhibition earlier this year, this trial is aimed at testing how consumers react to the plug-and-play characteristics of femtocells. Having achieved positive feedback from earlier tests, T-Mobile is now continuing to explore deep indoor coverage and to enhance in-building femtocell coverage for UMTS and HSPA (High Speed Packet Access). This will definitely boost both data transmission and telephony.

T-Mobile’s earlier results from the above tests suggest there might be a limited commercial deployment of femtocells later in the year. T-Mobile is reported as seeing femtocells having 'a lot of potential'.

Femtocells are widely perceived as a solution for mobile operators to boost in-building 3G coverage without the high costs associated with increasing the size of their macro networks. Femtocells are very much the hot topic of the mobile industry at present and are expected to have a high profile at the forthcoming Mobile World Congress in Barcelona, Spain. Femtocell does present another front for revenues and companies are investing in femtocells.
In March of this year the T-Mobile Venture Fund made a strategic investment in Ubiquisys, a developer of 3G femtocells, joining Google and original investors Accel Partners, Atlas Venture and Advent Venture Partners.

Cisco and Intel recently invested in femtocell company ip.access and Qualcomm has put money into Airvana.

T-Mobile said it plans to test Ubiquisys' femtocell technology in trials in Germany, the Netherlands and the U.K. in the coming months. Meanwhile, its U.S. subsidiary is using WiFi hotspots in the home as an alternative to a femtocell solution to improve coverage in the home. Once T-Mobile launches its 3G network in the U.S. we could see both femtocells and WiFi.
However I am not sure whether T-Mobile in its latest trial in Germany used devices provided by Ubiquisys.

The commercial deployment of femtocells has taken another step forward following the adoption by the Femto Forum of a worldwide standard that defines the real-time management of femtocells within households. Members of the Forum have agreed to implement the Broadband Forum's TR-069 CPE WAN management protocol, a standard defined in 2004 for the broadband community and already in use in around 30 million devices. The purpose of TR-069 is to enable CPE devices to be deployed easily and configured reliably but, more importantly, in high volumes - something that has worried operators planning to position the femtocell as user-installable. The Femto Forum claims that TR-069 has proven itself to provide consumers with a method of easy installation and self-provisioning, while enabling the operator to run diagnostics and conduct remote firmware and service upgrades across millions of end-user devices in a cost-effective manner. The two groups now plan to define extensions to TR-069 to add femtocell-specific capability to the standard.
It is an exciting time for the femtocell industry, with commercialisation in sight. Industry hopes are even higher that femtocells in LTE will provide even better services to customers.

Tuesday, August 26, 2008

HSPA: Milestone and bold predictions

GSMA announced last week that the number of HSPA mobile subscribers has reached 50 million. The number of HSPA subscribers last year at around this time was 11 million.

An old slide predicting the rise of HSPA subscribers can be seen above. I don't think that the number of subscribers reached 20 million in 2007 as predicted, but they will definitely be more than 60 million by the end of 2008. Around 4 million people are becoming HSPA subscribers every month.
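
A quick back-of-the-envelope check of that estimate, assuming the roughly 4 million net additions per month quoted above simply continue for the rest of the year (the figures are just for illustration):

    subscribers_aug_2008 = 50_000_000   # GSMA figure for August 2008
    monthly_net_adds = 4_000_000        # approximate current run rate
    months_left_in_2008 = 4             # September to December

    projected_end_2008 = subscribers_aug_2008 + monthly_net_adds * months_left_in_2008
    print(f"{projected_end_2008:,}")    # 66,000,000 - comfortably above 60 million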

GSMA also have a site dedicated to HSPA where they maintain a live counter of the number of HSPA subscribers worldwide.
Finally, here are some HSPA related statistical facts from GSMA website:
  • 267 operator commitments in 111 countries
  • 191 commercial deployments in 89 countries
  • All EU countries have commercial HSPA deployments
  • 747 HSPA devices from 114 suppliers including:
    > 281 mobile handsets
    > 68 data cards
    > 120 notebooks
    > 40 wireless routers
    > 72 USB modems
    > 39 embedded modules
GSMA has also claimed that mobile will reach speeds of 100Mbps before landline does. This was in response to BT announcing that FTTx technology would be available in 10 million homes by 2012. With the fibre technology, though, the initial speeds will be 40Mbps, rising to 60Mbps later on, and sometime (quite far) in the future it will eventually reach 1000Mbps.

Finally, Global mobile broadband connections in the first quarter of this year rose by 850% from the same period last year, according to Herns Pierre-Jerome, director for wireless broadband technologies of Qualcomm. The rapid growth, he said, illustrated how mobile broadband had become a mainstream data application of third-generation (3G) mobile phone technology, driven primarily by evolution-data optimised (EV-DO) and high-speed packet access (HSPA) systems.

The number of 3G subscribers globally totals 670 million in a market of 3.5 billion mobile users. The figure is forecast to reach 1.6 billion over the next four years, fuelled by declining costs of network equipment and devices. Telecom vendors and operators are expected to realise revenue of US$114 billion from 3G equipment and $394 billion from 3G services. Qualcomm earned $10 billion in revenue in 2007, out of overall industry revenue of $352 billion.

Data revenues can go even higher

Over the past few years the telecoms world has experienced a number of emerging technologies, with a great degree of innovation coupled with intensive research. In 1999, when I completed my engineering degree in electronics and communication, most of the telecom companies in India were only interested in GSM/GPRS. At that time I had no idea how the technology would evolve, as it has over time.

Initially it was GPRS, and then came the major shift towards 3G. Once 3G found its place, further developments were carried out to improve the user experience while on the move. This sole idea of giving the user the best led to the emergence of new technologies like HSDPA, HSUPA, HSPA+ and LTE. Clearly, improved data rates were the key factor in the introduction of each technology from GPRS to LTE.

Everybody in the industry realised that if they were to win customers and compete with fixed technology, they would have to provide better data rates while the user is on the move. To this day, vendors and operators, together with 3GPP, have worked very hard to come up with technologies like HSPA+ which can serve plenty of megabits per second to users.
Although the wireless operators insist that we are still in the early stages of wireless data adoption, data revenues are already playing a major part in the overall revenues of the companies. When the operators recently announced Q2 results, they reported that data revenues account for nearly 25 percent of their average revenue per user (ARPU). Verizon Wireless, the data leader in the US, is the prime example, with 24.4 percent of its $51.53 ARPU coming from data. AT&T is a close second, with 22.9 percent of its ARPU coming from data.

During the earnings calls with analysts, both of the above operators, together with the likes of Vodafone, talked about the continued growth potential for data. There is a clear trend of operators leaving no stone unturned in order to provide the highest possible data rates to their users. Operators are working feverishly to upgrade their networks and the competition for a better user experience is intensifying. Young people and businesses, who always demand high data rates for their own reasons, are the main targets for the companies.

These days one can easily get access to mobile broadband for a reasonable monthly payment. There are many competitive deals available in the market to lure customers into browsing and emailing while on the move. There is no doubt that operators are successfully adding customers and hence increasing their revenues, mostly generated by data use. Verizon's data revenue grew 45 percent year over year. AT&T's data revenue grew 52 percent year over year. Vodafone and T-Mobile's data revenues also grew by more than 50% over the last couple of years. But I'm wondering how high data revenues can really climb. Is this strong growth rate sustainable?

Executives from telecom giants like Vodafone and AT&T predict there is still much more growth to come as consumers upgrade to integrated devices and smartphones that can take advantage of the 3G network. The companies say that nearly 20% of their customers have either upgraded or are in the process of upgrading to an integrated device. Meanwhile, Verizon recently said that 60 percent, or 40.5 million, of its retail customers have upgraded to 3G data-capable devices.

Analysts believe that the likes of Vodafone, T-Mobile, AT&T and Verizon are the clear leaders in monetizing data and that they will continue to lead the industry in data ARPU as they increase the number of data applications and data-centric devices.

I think the key to sustaining this growth rate lies not in the number of data-capable devices in consumer hands but in the availability of compelling data applications at reasonable price points. Without the continued push for better, more user friendly applications, data revenues are not going to be able to sustain this current growth trajectory.

Vodafone, for example, is already taking the necessary steps in that direction and is looking to boost the usage of 3G data. Vodafone has announced an agreement with the laptop vendor Lenovo that will see its new X200 computer pre-installed with a Vodafone SIM and supporting software. The broadband connectivity comes at no extra cost and, when activated by the purchaser of the laptop, will offer the user a 30-day free trial. "The connection manager will ask for your name and email, but no bank details," said Alec Howard, head of PC connectivity at Vodafone. "Users will be prompted to take out a contract at the end of the free trial and the prices are around £12 a month for broadband, with automatic roaming in Europe at £8.50 a day." But like any other product, this also has a disadvantage. Users dissatisfied with the Vodafone service will struggle if they want to connect to another mobile operator: they will have to install a new SIM and download and configure a new connection manager instead of using the built-in software, which only works with Vodafone.

I certainly believe that this embedded 3G initiative could significantly lower the cost of built-in mobile broadband technology across the entire range of laptops. I have myself used a Dell laptop with an embedded data card, where you just have to insert the right SIM and then connect to wireless broadband with the help of a connection manager. I think embedded modems are cool and fun to use. There is no doubt in my mind that the use of embedded modems for mobile broadband connectivity is set to increase rapidly in the next few years, with sales estimated to grow at a rate of well above 80 per cent from 2008 to 2012. Laptops with an embedded modem are one of the data applications that will enhance the user experience and hence lead to an increase in ARPU.

Most vendors are also working to enhance their handset architectures with improved multimedia functionality. Nokia surprised analysts with better-than-expected second-quarter earnings. Nokia thinks in line with some of the operators and firmly believes that the global handset market could grow by more than its previous estimate of 10 percent in 2008.

For its new devices, Nokia has concentrated a lot on the services front, and hence offered customers handsets supporting next-generation multimedia services, e.g. a music service supported by Sony BMG Entertainment. The Nokia Music Store is now available in 10 markets and the company expects to have 14 stores open by year-end. In addition, the N-Gage mobile games service, which became available during the quarter, has had more than 406,000 downloads.

So, in my view, if companies are as innovative as Nokia has been, there is every chance of pushing data rates to new highs. Vendors, with excellent architectures and a wide range of data applications, can definitely push data throughput and hence contribute to higher data revenues.

As every day passes we are seeing new handsets with amazing designs and new architectures. These handsets are designed to perform faster and can support very high data rates. Today's youth can play online games on these devices, watch live TV, send and receive multimedia messages and do so many other things. Business users can exchange email while on the move. The rollout of HSPA+ by vendors will further enhance the experience of data users. The number of HSPA subscribers is growing manifold, at the rate of around 4 million subscribers a month. Companies like AT&T are aggressive with their HSPA rollout plans, looking to roll out HSPA together with the 3G iPhone.

Very high speeds of up to 20 Mbps in a 5 MHz channel have already been achieved with HSPA, and Qualcomm is one of many to have demonstrated this.

At the moment things look very promising and I strongly believe the industry will keep coming up with bright ideas to generate increased data revenues. LTE is another step towards more revenue generation, with an enhanced user experience in view. Let's see how high data throughput, together with data revenues, will go.

Sunday, August 24, 2008

Need for femtocells -> from Youtube

Someone sent me this link from YouTube. Even though it is more like a marketing presentation from Soundpartners for a market report, it gives a good idea, from the operators' point of view, of why they will be looking at femtocells to strengthen their market position and as a way of optimising their networks.

Thursday, August 21, 2008

802.11n and 4G...

IEEE 802.11n is a proposed amendment to the IEEE 802.11-2007 wireless networking standard to significantly improve network throughput over previous standards, such as 802.11b and 802.11g, with a significant increase in raw (PHY) data rate from 54 Mbit/s to a maximum of 600 Mbit/s. Most devices today support a PHY rate of 300 Mbit/s, with the use of 2 Spatial Streams at 40 MHz. Depending on the environment, this may translate into a user throughput (TCP/IP) of 100 Mbit/s.
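
The headline rates quoted above follow directly from the OFDM parameters in the 802.11n MCS tables. As a rough illustration, assuming the highest per-stream modulation and coding (64-QAM, rate 5/6), a 40 MHz channel and the short guard interval (figures rounded):

    # Back-of-the-envelope 802.11n PHY rate: data subcarriers x bits per subcarrier
    # x coding rate x spatial streams, divided by the OFDM symbol time.
    def phy_rate_mbps(spatial_streams: int,
                      data_subcarriers: int = 108,   # 40 MHz channel
                      bits_per_subcarrier: int = 6,  # 64-QAM
                      coding_rate: float = 5 / 6,
                      symbol_time_us: float = 3.6) -> float:  # short (400 ns) guard interval
        bits_per_symbol = data_subcarriers * bits_per_subcarrier * coding_rate
        return spatial_streams * bits_per_symbol / symbol_time_us

    print(phy_rate_mbps(2))  # ~300 Mbit/s - the common two-stream devices mentioned above
    print(phy_rate_mbps(4))  # ~600 Mbit/s - the maximum with four spatial streams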

According to the book "Wi-Fi, Bluetooth, ZigBee and WiMAX":

802.11n is the fourth generation of wireless LAN technology.
  • First generation (IEEE 802.11) since 1997 (WLAN/1G)
  • Second generation (IEEE 802.11b) since 1998 (WLAN/2G)
  • Third generation (802.11a/g) since 2000 (WLAN/3G)
  • Fourth generation (IEEE 802.11n) (WLAN/4G)

The distinguishing features of 802.11n are:

  • Very high throughput (some hundreds of Mbps)
  • Long distances at high data rates (range equivalent to IEEE 802.11b, but at up to 500 Mbps)
  • Use of robust technologies (e.g. multiple-input multiple-output [MIMO] and space-time coding).

In the N option, the real data throughput is estimated to reach a theoretical 540 Mbps (which may require an even higher raw data rate at the physical layer), and should be up to 100 times faster than IEEE 802.11b, and well over ten times faster than IEEE 802.11a or IEEE 802.11g. IEEE 802.11n will probably offer a better operating distance than current networks. IEEE 802.11n builds upon previous IEEE 802.11 standards by adding MIMO. MIMO uses multiple transmitter and receiver antennas to allow for increased data throughput through spatial multiplexing, and increased range by exploiting spatial diversity and powerful coding schemes. The N system is strongly based on the IEEE 802.11e QoS specification to improve bandwidth performance. The system supports baseband widths of 20 or 40 MHz.

Note that both the 802.11n PHY and the 802.11n MAC will be required to achieve 540 Mbps.

To achieve maximum throughput a pure 802.11n 5 GHz network is recommended. The 5 GHz band has substantial capacity due to many non-overlapping radio channels and less radio interference as compared to the 2.4 GHz band. An all-802.11n network may be impractical, however, as existing laptops generally have 802.11b/g radios which must be replaced if they are to operate on the network. Consequently, it may be more practical to operate a mixed 802.11b/g/n network until 802.11n hardware becomes more prevalent. In a mixed-mode system, it’s generally best to utilize a dual-radio access point and place the 802.11b/g traffic on the 2.4 GHz radio and the 802.11n traffic on the 5 GHz radio.


A lot of phones are coming with built-in WiFi (802.11a/b/g), and WiFi is now a must on laptops or they won't sell. The main difference in 802.11n compared to previous generations of 802.11 is the presence of MIMO. The newer 802.11 variants (a/g/n) use OFDM, which is the same technology being adopted by LTE. New LTE handsets will have the advantage of easily integrating this 802.11n technology, and the same antennas can be reused. In fact, the same is applicable to WiMAX, as it supports MIMO and OFDM. Of course we will have problems if they use quite different frequencies, as antennas are optimised for a range of frequencies; this is something that remains to be seen.

In the news:

MIT and a medical center based in Alabama are beginning to deploy faster wireless 802.11n access points from Cisco Systems Inc. In more than 100 buildings on MIT's Cambridge, Mass., campus, as many as 3,200 access points running older 802.11a/b/g protocols will be replaced with 802.11n devices in the next 12 to 16 months, said Chris Murphy, a networking engineer at the university. Murphy said MIT, with more than 10,000 students and 11,000 staff members, has a "very, very wide variety" of client devices, from handhelds to laptops. Many of the laptops probably support the 802.11n protocol, he said. Some MIT staffers have been using voice-over-IP wireless handsets and have experienced poor coverage with the older Wi-Fi technology, but they said they have had full signal strength within the range of the new 802.11n access points, he added. With 802.11n, the university could eventually provide IP television, which requires a lot of bandwidth, Murphy said.

Using 802.11n technology, Lapham said he was able to transmit a gigabyte of data in less than two minutes. Currently, the 370-bed medical center has about 450 access points on older protocols. Devices used on the wireless network include 180 laptops, which are used primarily for transmitting bedside patient data. The hospital also supports 100 VoIP wireless phones and various medical devices.

Wi-Fi is expected to be available in 99 per cent of North American universities by 2013, according to research released by industry analyst ABI Research this week. Much of that penetration will be in the form of 802.11n equipment: higher education is clearly the number one market for early adopters of 802.11n, the company said.

"ABI Research expects 802.11n uptake – which is today fairly small in the education market – to ramp up steeply to quite a high rate of penetration," said ABI Research vice president Stan Schatt. There are several reasons for this. ABI said many students now assume a campus Wi-Fi network as a given, and many of their shiny new laptops will be 'n'-compatible. Universities also have great bandwidth demands, as lecture halls may need to serve a large number of users with multimedia content at any given time, and 802.11n's greater speed and capacity can address that need. Moreover, said Schatt, "Universities are breaking new ground by using video over Wi-Fi in a number of innovative ways. This is driving the adoption of high speed 802.11n. Students in the near future (at least the diligent ones) will be just as likely to watch their favourite professor's lectures on their laptops as they will be to view 'America's Next Top Model'."

You may also be interested in reading:

Revised paper on “4G” by 3G Americas

3G Americas have published a revised paper on Defining “4G”: Understanding the ITU Process for IMT-Advanced.

3G Americas initially created this white paper one year ago to provide clear understanding regarding the work-in-progress by the ITU, the sole organization responsible for determining the specifications for IMT-Advanced. The current paper updates the considerable progress made by the ITU, establishing a basis for what should be included in an IMT-Advanced system.


While speculation has been going on about 4G technologies, ITU is close to releasing a full set of documentation for this definition. It has held ongoing consultations with the global community over many years on this topic in Working Party 8F under the scope of a work item known as Question ITU-R 229-1/8 “Future development of IMT-2000 and systems beyond IMT-2000.” Following a year-end 2007 restructure in ITU-R, this work is being addressed under the new Study Group 5 umbrella (replacing the former Study Group 8) by Working Party 5D which is the new name for the former WP 8F.

This work in WP 8F, and now WP 5D, has woven together a definition, recipe, and roadmap for the future beyond 3G that is comprised of a balance among a Market and Services View, a Technology View, and a Spectrum View. These, along with Regulatory aspects, are the key elements for business success in wireless.

By mid-2008, ITU-R advanced beyond the vision and framework and developed a set of requirements by which technologies and systems can, in the near future, be determined to be a part of IMT-Advanced and, in doing so, earn the right to be considered 4G.

During 2008 and through 2009, ITU-R will hold an open call for the "first invitation" of 4G (IMT-Advanced) candidates. Subsequent to the close of the submission period for the "first invitation", an assessment of those candidate technologies and systems will be conducted under the established ITU-R process, guidelines and timeframes for this IMT-Advanced "first invitation". The culmination of this open process will be a 4G, or IMT-Advanced, family. Such a 4G family, in adherence to the principles defined for acceptance into this process, is globally recognised to be one which can grow to include all aspects of a marketplace that will arrive beyond 2010, thus complementing and building upon an expanding and maturing 3G business.

The paper is available to download from here.

The ITU-R Radiocommunication Bureau has established an "IMT-Advanced" web page (http://www.itu.int/ITU-R/go/rsg5-imt-advanced/) to facilitate the development of proposals and the work of the evaluation groups. The IMT-Advanced web page provides details of the process for the submission of proposals, and will include the RIT and SRIT submissions, evaluation group registration and contact information, evaluation reports and other relevant information on the development of IMT-Advanced.

Wednesday, August 20, 2008

Ofcom's 2008 Comms Market report

Dean Bubley posted this on Forum Oxford and I thought it was worth spreading around.


Ofcom's just released a huge new report on the current state of the industry, incorporating telecoms, broadcasting and related services. Some interesting statistics:
  • Quite a lot of discussion of the resilience of fixed-line comms in the face of the mobile onslaught. Rather than direct fixed-mobile substitution, it appears that the UK sees more mobile-initiated incremental use of voice. Fixed minutes have dropped about 17bn minutes in total over 6 years, but mobile call volumes have risen by 38bn minutes. The UK outbound call total is still around 60/40 fixed:mobile, and 88% of homes still have a fixed line.
  • The proportion of mobile-only households has been pretty static for the past few years, currently at 11%. This is considerably lower than elsewhere in Europe (eg 37% in Italy), and is possibly reflecting the prevalence of ADSL. Most mobile-only users are from lower socioeconomic groups.
  • 44% of UK adults use SMS daily, against 36% using the Internet
  • More than 100,000 new mobile broadband connections per month in the UK in H1 2008, with the rate of sign-up accelerating. 75% of dongle users are now using their mobile connection at home.
  • Nearly half of adults with home broadband use WiFi
  • 11% of UK mobile phone owners use the device to connect to the Internet, and 7% use it to send email.
  • VoIP usage appears to have fallen from 20% of consumers in late 2006, to 14% in early 2008. However, I suspect that this masks the fact that many instances of VoIP (eg BT's broadband circuit-replacement service, or corporate IP-PBXs), don't make it obvious to the user.
  • Over two-thirds of mobile broadband users also have fixed-line broadband
  • UK mobile subscribers send an average 67 SMS per month (or 82 / month per head, taking account of multiple subs-per-person). MMS use is only 0.37 messages per user per month.
  • Slight increase in overall fixed-line subscriptions in 2007 - attributed to business lines.
  • Overall UK non-SMS mobile data revenues were flat in 2007 vs 2006 at £1bn. I reckon that's because the data pre-dates the big rise in mobile dongle sales, and also reflects price pressures on things like ringtones. Ofcom also attributes this to adoption of flatrate data plans vs. pay-per-MB.
  • UK prepay mobile ARPU has been flat at £9 / month for the last 4 years. That's a big issue for operators wanting to sell data services to prepay subs in my view.
  • 17% of mobile subscriptions in the UK were on 3G at end-2007, although there's not much detail on the actual usage of 3G for non-voice applications.
  • Overall, UK households allocate 3.3% of total spending to telecom services. That's been flat since 2003 - ie the slice of the pie isn't getting any bigger relative to food/rent/entertainment/travel etc.
  • 94% of new mobile subscriptions are bundled with handsets.
  • 11% of UK adults have >1 SIM card. Among 16-24yo users, this rises to 16%. There's an estimate that of the second devices in use in the UK, 1m are 3G dongles, 0.7m are BlackBerries or similar, and 8m are genuine "second handsets". There's also another 8m "barely active" devices that are used as backups, or legacy numbers that get occasional inbound calls or SMS

Some other interesting key points that are available here:

  • Communications industry revenue (based on the elements monitored by Ofcom) increased by 4.0% to £51.2bn in 2007, with telecoms industry revenue the fastest growing component, up 4.1% on the year.
  • Mobile telephony (including an estimate for messaging) accounted for 40% of the total time spent using telecoms services, compared to 25% in 2002. However, much of this growth has come about as a result of an increase in the overall number of voice call minutes (from 217 billion in 2002 to 247 billion in 2007) rather than because of substitution for fixed voice, which still accounted for 148 billion minutes last year, down only 10% from 165 billion minutes in 2002.
  • The most popular internet activity among older people is ‘communication’ (using email, instant messaging and chat rooms for example); 63% of over-65s say they communicate online, compared to 76% of all adults.
  • The majority of children aged 5-7 have access to the internet and most children aged 8-11 have access to a mobile phone. Children are more likely to use the internet for instant messaging than for email.
  • Television is particularly important to older people. Sixty-nine per cent of those aged 65-74 say it is the media activity that they would miss most (compared to 52% of all adults) and this rises to 77% among the over 75s. Older people are also more likely to say they miss newspapers and magazines – 10% of 65-74s and 7% of over 75s, compared to 5% of all adults.
  • The converged nature of mobile handsets became apparent during 2007, with 41% of mobile phone users claiming to use their handset for taking pictures and 15% uploading photos to their PC. Nearly one in five (17%) also claimed that they used their phone for gaming.

Monday, August 18, 2008

Nokia Eco Sensor Concept Mobile

Though this is not new, I hadn't seen it anywhere and found it recently while working on a report.

This visionary design concept is a mobile phone and compatible sensing device that will help you stay connected to your friends and loved ones, as well as to your health and local environment. You can also share the environmental data your sensing device collects and view other users' shared data, thereby increasing your global environmental awareness.

The concept consists of two parts – a wearable sensor unit which can sense and analyze your environment, health, and local weather conditions, and a dedicated mobile phone.

The sensor unit will be worn on a wrist or neck strap made from solar cells that provide power to the sensors. NFC (near field communication) technology will relay information by touch from the sensors to the phone or to other devices that support NFC technology.

Both the phone and the sensor unit will be as compact as possible to minimize material use, and those materials used in the design will be renewable and/or reclaimed. Technologies used inside the phone and sensor unit will also help save energy.

To help make you more aware of your health and local environmental conditions, the Nokia Eco Sensor Concept will include a separate, wearable sensing device with detectors that collect environment, health, and/or weather data.

You will be able to choose which sensors you would like to have inside the sensing device, thereby customizing the device to your needs and desires. For example, you could use the device as a "personal trainer" if you were to choose a heart-rate monitor and motion detector (for measuring your walking pace).
The Nokia Eco Sensor Concept is built upon the underlying principles of waste reduction. Emphasis will be placed on material use and reuse in the phone's construction.

To complete the Nokia Eco Sensor Concept, the phone and detector units will be optimized for lower energy consumption than phones in 2007 in both the manufacturing process and use. Alternative energy sources, such as solar power, will fuel the sensor unit’s power usage.

Please note that this is a concept phone, so you won't be seeing it in a shop near you anytime soon.

Sunday, August 17, 2008

4G: Where are we now?

Last month I read this news about WiMAX leading the world of 4G, and last week I read about an American carrier selecting LTE as its choice of 4G technology. Since the ITU has decided that it won't be using the term 4G in future, preferring IMT-Advanced or LTE-Advanced instead, I guess '4G' is up for grabs.
The main driver for '4G' is data. Recently, carriers have become aggressive and started offering some decently priced 'Wireless Broadband' data plans. Rather than confuse people with HSDPA, etc., they have decided to use the term 'Wireless Broadband' or 'Mobile Broadband'. Personally, I think both terms have managed to confuse some people, who associate Mobile Broadband with Internet access on a mobile and Wireless Broadband with broadband over WiFi.

Andrew Seybold makes some valid points in an article in Fierce Wireless. One of the things he points out is that while LTE may tout higher data rates than the others, those rates are only possible in 20 MHz of spectrum. In the real world, this kind of spectrum is near impossible to obtain. If the spectrum flexibility is removed, then HSPA+, LTE, EV-DO Rev B and WiMAX have nearly the same data rates and performance.

For HSPA+ the existing infrastructure can be reused and a software upgrade would suffice whereas for LTE new infrastructure would be required. NTT DoCoMo has fully committed to being the first LTE network operator and others are raising their hands. He thinks that nationwide LTE networks would only be available around 2014.

While I agree with this analysis completely, I think what is going to dictate this transformation from 3G+ to LTE for the operators will be the uptake of data on a network. The biggest advantage of LTE is that it is able to operate in both TDD and FDD modes. Operators that have traditionally used FDD may switch their loyalty to TDD so that they can use asymmetric data transfer. This can provide more capacity when some special event is taking place (football finals, reality show results, etc.) where users are mostly interested in receiving information rather than sending any. Operators with paired spectrum could use both bands separately in TDD mode.

Gigaom has a list of American operators that are involved in 4G, and the list is quite interesting:
  • AT&T: USA's largest network in terms of subscribers, AT&T plans to use LTE to upgrade to 4G, but not for a long, long time. For now it’s content with its current 3G network. It will upgrade to HSPA+ in 2009 and 2010. Eventually it will go to LTE, but won’t begin testing until 2010 or 2011 with full deployment coming after that.
  • Verizon Wireless: Verizon is already testing LTE equipment from several vendors, with plans to roll out the network in 2010 and have most of the country covered by 2012; Verizon’s would likely be the first full U.S. deployment of the LTE technology.
  • Sprint-Nextel: The outlier in the whole transition to 4G, Sprint is going with WiMAX rather than LTE. After a number of delays, the company is set to launch its network in September. By the end of the year it will join with Clearwire to operate a nationwide WiMAX network under the Clearwire brand.
  • T-Mobile: T-Mobile is still launching its 3G coverage, so its 4G networks may take a while to come to fruition. The carrier’s German parent appears to favor LTE.
  • Metro PCS: This budget carrier plans to use LTE but it doesn’t yet have a time frame for deployment, pointing out that its customers aren’t heavy data users yet.
  • U.S. Cellular: The company is unsure of its deployment plans but it would likely choose to follow the rest of the industry with LTE. As for deployment, the time frame isn’t set.
  • Leap Wireless: Recently said it had not made a decision or public comment about its 4G plans.

The picture is a bit different here in the UK because all the operators are going to LTE. There may be some ISPs tempted to move to WiMAX, as they would get economies of scale. There is also news of BT (the largest landline phone provider) planning to roll out a nationwide WiMAX network in the 2.6GHz spectrum. If BT is able to fulfil that ambition, it could be a big win for consumers.

Femtocell success reliant on handset innovations

Femtocells are one of the emerging technologies in telecoms. Their success cannot yet be predicted, and the industry around femtocells has had its share of good and bad moments. There is no doubt in my mind that femtocells, together with WiMAX and LTE, are the most talked-about technologies in telecoms these days.

In the past few weeks I have been hearing about the challenges faced in deploying femtocells. Getting the right handsets for femtocells is key to their success.
I must say that the hype surrounding the mass deployment of femtocells has been doused with cold water by a new study into the need for handset vendors to quickly transform their devices to support the technology. According to the report, published by Research & Markets (R&M), the femtocell industry is basing its optimism on the notion that subscribers will use their cell phones differently when in range of femtocells. There will be different applications and behavioral patterns when people are at home, perhaps content backups, podcasts or even advertiser sponsored TV programming. The mobile phone may need to be linked to the TV, PC, HiFi or other items of domestic technology, claims R&M.

I have seen some reports which suggest that although currently available handsets will work with femtocells, they are not optimised to support this new 'in home' activity. The question which remains in my mind is how the handset will distinguish a femtocell from any other, stronger, non-femtocell signal available. The phone needs to be aware of the femtocell, ideally both in the radio and in the application platform. I firmly believe that we will need new handset architectures to solve this problem. But changing how the handset industry approaches this challenge could take 2-3 years, given that it takes this amount of time to implement a new handset architecture, and around the same time again before new cell phone technology reaches a broad range of devices. The handset industry also needs to be aware of where we will be in terms of wireless technology in 2-3 years' time. We might be entering the era of LTE by then.

But there are some more issues which the femtocell industry should be aware of. Some of these issues, identified by Research and Markets, are:
  • In dense deployments of femtocells, handsets can spend too much time and power attempting to connect at locations that are not their own "home zone."
  • The new 3GPP Release 8 specifications contain various modifications to enable handsets to work better with femtocells, but the first R8-compliant phones will likely be shipped at the end of 2010.
  • The usage of handsets on femtocells may identify unexpected side-effects, relating to faster/cheaper data connections. This may impact elements of design such as memory allocation and power management.
  • Various suggestions have been made for ‘femto-zone' services--but there is no standardised way for handset applications to know they are attached to a femtocell.

Looking at the above issues, things may not sound very favourable for femtocell deployment and commercialisation. However, operators are always looking for new means and ideas for generating new revenue streams. Femtocells are definitely one of those new means and a possible opportunity for operators to generate more revenue. Revenue can be generated by operators from advertisers and other third parties by enabling the provision of 'at home' services via femtocells.

Research and Markets claimed that there could be a demand for at least 48 million femto-aware handsets to be sold to femtocell owners in 2013. However, with more optimistic forecasts, and especially if shared femtocell models become popular, there could potentially be a demand for up to 300 million femto-aware handsets per year in 2013.

Although the above figures look very encouraging, the femtocell industry is still very cautious in its approach to massive investment. The industry is currently focusing on the short term: getting initial trials in place, developing standards, and securing commitments for early commercial deployment. These initial efforts are critical, allowing the industry to validate the market and raise the profile of the femtocell concept. If the industry can do that, it will stimulate finance and investment in femtocells.

One of the central marketing propositions is that femtocells can work with normal 3G handsets. If this is true, then subscribers can get service from femtocells without needing expensive upgrades to their existing phones.

But while focus is good and the industry does not want unnecessary distractions, there is a risk of medium-term failure if certain future problems are not addressed early enough, even if this muddies the waters of the short-term marketing message. Already, femtocell proponents are talking up mass-market business models that go beyond simple indoor coverage and macro-network offload. They are talking about tens of millions of subscribers, and new 'in-home' services for users that exploit fast and cheap local mobile connectivity.

It is at that stage that the issue of the right handsets for the femtocell industry comes into the picture once again. Handset innovation becomes even more important for the industry. As I have mentioned above, handset designs should be able to differentiate between the femtocell and the macrocell environment. In part, this relates to complexities in managing the radio environment and mobility between femtocell and macrocell networks. This is easier said than done, and hence various optimisations are desirable, especially when dense deployments of femtos occur. These drive changes in areas such as the way the phone "selects" cells on which to register. There may also need to be ways to offer provisioning and "guest access" on femtocells from the handset UI. But this cannot be considered a complete solution, as users will probably consider it an unnecessary exercise. In my view, the medium-term hopes of the industry also reflect the notion that people will use their cellphones differently when in range of femtos. The problem for the femtocell industry doesn't end with solving the problem of registering on the right cell. There will be different applications and behavioural patterns when people are at home, perhaps content backups, podcasts or even advertiser-sponsored TV programming. The mobile phone may need to be linked to the TV, PC, HiFi or other items of domestic technology. This shows that the road ahead is really tough, and it again highlights the degree of innovation that will be required in handset design in order to work precisely in the femtocell environment.

Some reports suggest that standard phones can work with femtocells, but they are not optimised. Certain applications may only work when the phone is within femto range but they need to know when that is. Yes, some services can be notified by the core network that the user is "at home", but that approach doesn't scale to a wide base of operators, application developers and handset/OS vendors. The phone needs to be "aware" of the femtocell, ideally both in the radio and the application platform.




Changing such elements is not quick. The handset industry is much more complex and slow moving than many in the wider wireless business understand. It takes often 2-3 years for changes in handset architecture to reach commercially sold handsets, and another 2-3 years to reach a broad range of devices and reasonable penetration within the user base.

There is definitely a perception that the femtocell industry needs to be much more open minded about the need for modifying and optimising handsets and to be alert to the huge time and effort it will take to achieve. Other mobile developments like UMA and IMS have suffered in the past from a lack of focus on this issue. Although many femto advocates fear distractions could delay immediate market acceptance, early consideration of these "2nd order" problems is necessary for longer-term success.

What I have seen is that there are significant efforts to make femtos a success and overcome the difficulties. The new 3GPP Release 8 specifications contain various modifications to enable handsets to work better with femtos (called Home NodeBs). Various suggestions have been made for "femto-zone" services, but there is no standardised way for handset applications to "know" they are on the femto. Although there are various workarounds, with the network notifying the application when the phone is attached to the femto, this approach is not easily scalable to the wider base of developers or operators. At the moment the best solution suggested is for handset "connection manager" software to explicitly recognise femtocell access as a new and specific type of bearer.
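
To illustrate what that could look like in practice, here is a hypothetical sketch of a connection manager exposing femtocell access as its own bearer type, keyed off the Release 8 Closed Subscriber Group (CSG) identity broadcast by Home NodeBs. The class names, the CSG check and the "content backup" trigger are illustrative assumptions, not any vendor's actual API.

    from dataclasses import dataclass
    from enum import Enum, auto
    from typing import Optional

    class BearerType(Enum):
        MACRO_3G = auto()
        FEMTO_3G = auto()   # femtocell access exposed as its own bearer type
        WIFI = auto()

    @dataclass
    class CellInfo:
        cell_id: int
        csg_id: Optional[int]   # Closed Subscriber Group ID, present for Home NodeBs

    def classify_bearer(cell: CellInfo, home_csg_ids: set) -> BearerType:
        """Tell applications whether the phone is camped on one of its 'home' femtocells."""
        if cell.csg_id is not None and cell.csg_id in home_csg_ids:
            return BearerType.FEMTO_3G
        return BearerType.MACRO_3G

    # An application can then adapt its behaviour without help from the core network:
    bearer = classify_bearer(CellInfo(cell_id=0x1A2B, csg_id=42), home_csg_ids={42})
    if bearer is BearerType.FEMTO_3G:
        print("On the home femtocell - cheap backhaul, start the content backup")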

There is no doubt in my mind that operators could benefit from new revenue streams from advertisers & other third parties by enabling the provision of "at home" services via femtocells.
Using baseline forecasts, there should be a demand for at least 48m femto-aware handsets to be sold to femtocell owners in 2013. However, with more optimistic forecasts, and especially if "shared" femtocell models become popular, there could potentially be a demand for up to 300m femto-aware handsets per year in 2013.

Thursday, August 14, 2008

What is this MEMS and why is it required for 4G?

Mobile technology is evolving at an amazing speed, going from 3G to 3.5G in around 5 years and now from 3.5G to 4G in less than 5 years. According to Analysis, annual global mobile handset shipments will reach 1.5 billion in 2011, and converged-function handsets will become a mainstream product with more than a 30% share in developed markets by 2011. There is constant pressure on handset manufacturers to reduce power consumption and chipset size while driving down the cost of the device.

RF micro-electro-mechanical systems (RF-MEMS) is a semiconductor technology that allows micro-scale moving mechanical devices to be integrated with electrical transistors on silicon wafers. RF-MEMS technology can be utilized to make high-frequency components whose RF characteristics can be adjusted during operation, allowing for the first time reconfiguration of radio hardware under software control. The ability to reconfigure operating characteristics in real time results in a substantial reduction in the required number of discrete components for a given set of functions, significantly relieving pressure on the handset product developer.

While the electronics are fabricated using integrated circuit (IC) process sequences (e.g., CMOS, Bipolar, or BICMOS processes), the micromechanical components are fabricated using compatible "micromachining" processes that selectively etch away parts of the silicon wafer or add new structural layers to form the mechanical and electromechanical devices.

An early micromotor built in the SUMMiT technology. For size comparison a microscopic dust mite is shown on top.

There are several different broad categories of MEMS technologies:
  • Bulk Micromachining
  • Surface Micromachining
  • LIGA
  • Deep Reactive Ion Etching
  • Integrated MEMS Technologies

Details available here.

MEMS is not really a new technology. It has been around since the 1960s, but only recently has it become feasible. The Samsung watch phone was the first phone to have a commercial MEMS circuit, and MEMS is being used in a variety of devices nowadays, not just mobiles. Those who watched the opening ceremony of the 2008 Beijing Olympics would have seen the different-coloured torch display; that 'waving torch' used MEMS circuitry.

Scientists are also working on making MEMS intelligent and they are looking at microorganisms for ideas. The integration of microorganisms with MEMS, resulting in “biotic-MEMS,” is a hot topic for scientists designing micron-level machines. Recently, researcher Xiaorong Xiong of Intel, microbiologist Mary Lidstrom, and electrical engineer Babak Parviz (both of the University of Washington) have catalogued a large number of the most promising microorganisms for different areas of MEMS systems. They show that many of these microorganisms can offer capabilities beyond the limits of conventional MEMS technology.

Finally, from EE Times:

French research and strategy consulting company Yole Développement (Lyon, France) provides an analysis on MEMS components for cell phone applications as it expects this market will represent $2.5 billion in 2012. In its latest report, entitled MEMS for Cell Phones, Yole stated that the cell phone industry represents a complex challenge for MEMS but also its greatest opportunity for growth in the next five years.

According to the market research firm, silicon microphones and FBAR/BAW filters have experienced "incredible growth" since their introduction in 2003 and are now entering the maturity stage. MEMS accelerometers are in "a strong development stage", and MEMS products such as gyroscope, microdisplay, micro autofocus and micro zoom are at the emerging stage.

Yole also mentioned products that are not yet in the emerging stage. Among them are pressure sensors, micromirrors, RF switches/varicaps, oscillators, and micro-fuel cells.

Yole reported that, for the year 2007, cumulative sales reached $440 million for three MEMS products in cell phone applications, namely silicon microphones, FBAR/BAW filters and accelerometers.

In conclusion, Yole said it anticipates MEMS will become a key driver for innovation in the cell phone industry, and new cell phone features will represent 60 percent of the total MEMS market by 2012.

Interested people can also read:

RF-MEMS for Wireless Communications, Jeffrey L. Hilbert, WiSpry, Inc. - IEEE Communications Magazine • August 2008

Tuesday, August 12, 2008

IMS: Reality check

IMS is another technology not doing too well at present. I came across this report by the Yankee Group, "IMS Market Update: The Honeymoon Is Over, Now What?", and it answers some of the questions about why IMS is not as popular as people expected it to be 3-4 years ago.
Some of the promises made by IMS were:
  • New IMS-based apps, hence increased ARPU
  • Simplified network design, hence lower OPEX
  • Platform for killer services
  • Components interchangeable
  • Plug and Play environment for access networks

But IMS has not found success because:

  • IMS Standards are in flux
  • Everything is quite complex and not very clear
  • OSS/BSS integration is very complicated

An article in Cable 360 has some up to date market details:

Based on a report completed in January by ABI Research, Ericsson is the market leader in providing IMS infrastructure, followed by Alcatel-Lucent and Nokia Siemens. The other vendors in this ranking include Motorola (4), Huawei Technologies (5), Cisco Systems (6), Nortel Networks (7), Acme Packet (8), Thomson (9) and Tekelec (10).

Bundling is increasingly the way that IMS is sold. "Huawei combined a lot of their wireless offerings with IMS," ABI Senior Research Analyst Nadine Manjaro said. "Whatever their contracts were, they had an IMS element."

"Previously (IMS) was more fixed," she said. "IMS is difficult to integrate. (So) one trend is combining IMS with infrastructure and 3G deployment and managed services," she said.
Increasingly critical to Releases 6, 7 and 8 of the Third Generation Partnership Project (3GPP) specifications, IMS will become more tightly linked with mobile technologies, Manjaro predicted. The overall context remains telephony-focused.

"You see highly voice-over-IP related deployments of IMS globally," she said.

My understanding is that with the tight squeeze in the financial markets, everyone is trying to spend as little as they can. This means that end users are shying away from extra features and services that cost them money, and operators are shying away from investing in new technologies or upgrading their infrastructure. Even though investment in IMS could be significant, it can provide long-term benefits that may distinguish an operator from others and provide a cutting edge.

Another thing is that the IMS technologists (and why just them, others as well) should ensure that all the technical problems are ironed out and then start promoting the technology to everyone. People are already confused enough about HSDPA and 4G, and we need to prepare them to look forward to IMS.

Monday, August 11, 2008

Mobile TV going into Hibernation!

Earlier this month, there was a report which mentioned that 'Mobile 3.0' (a consortium in Germany) decided to end plans to launch a DVB-H (handheld) network. The failure was blamed to some extent on the wireless service providers, who were not able to get their act together to establish a paid DVB-H infrastructure.

The following is an extract from the article:

Burda and Holtzbrink, both publishing houses, and South African media company Naspers have thrown in the towel and won't launch a DVB-H network in Europe, the reports said.

Their effort wasn't helped when service providers said they plan to introduce mobile TV devices that use the free DVB-T technology. Noting that subscribers aren't likely to favor the idea of paying for TV on top of their often hefty wireless charges, service provider Vodafone has said it favors a mobile TV strategy whereby consumers pay for add-on video services that are offered in conjunction with free mobile TV. Mobile 3.0 had planned to charge monthly fees of as much as $10 to $15.

The Mobile 3.0 group had begun testing a service with nine TV channels and three radio stations.

The German situation isn't likely to influence the delivery of mobile TV in the United States, which is still in its embryonic stage. To date, no major third-party providers of mobile TV have emerged in the United States.

According to a report in MobileBurn published the same day as the above news:

Toshiba announced that it was shutting down Mobile Broadcasting Corp. at the end of March 2009, stating that the company has not gained enough subscribers due in large part to the popularity of the free TV broadcasting that many of Japan's phones are now capable of receiving (and even recording).

The situation is different on many levels in the U.S. The nation's two largest carriers, AT&T and Verizon Wireless, both use Qualcomm's MediaFLO mobile TV standard on their TV-compatible cell phones. While different technically, MediaFLO and DVB-H work on the same basic premise of broadcasting a separate, mobile-optimized digital TV signal over the air that compatible devices can receive. Since AT&T and Verizon more or less control the handset models that are available to their customers, much as is the case with German carriers, the two have been able to steer subscribers into using the MediaFLO system while avoiding competition from devices that could otherwise pick up free broadcast TV signals. Similarly, Sprint offers a streaming TV service on most of its handsets. T-Mobile currently offers no integrated TV support to its customers.

Then we had the bad news about Mobile Tv in Korea:

Some new numbers on mobile TV's non-pickup in Korea... more specifically, TV broadcasting using the digital multimedia broadcasting (DMB) format. The story says DMB, which includes free terrestrial and premium satellite DMB, has an audience of some 13.7 million, according to the latest data. That's up from nine million in December last year. The number of DMB-enabled receivers sold here reached 13.69 million in June.

  • Mobile phones accounted for 48.4 percent of all DMB subscribers.
  • Car navigation systems and other DMB-enabled terminals used in vehicles accounted for 37.8 percent of DMB receivers, followed by portable media players at 9.4 percent and USB devices at 3.8 percent.
  • Laptop computers were the least popular DMB device, accounting for just 0.9 percent of all receivers.

But the overall viewership numbers remain minuscule: according to TNS Media, a local research firm, the overall viewer rating for the day was just 1.172 percent, peaking at 3.585 percent during the commuting hours of 6 to 7 p.m. in the survey. And, even more surprising: male viewers in their 50s proved the largest audience for mobile TV, against the conventional wisdom that tech-savvy youngsters would be watching TV on the go. Viewership was also relatively high among men in their 40s and 30s, but minuscule among women and younger customers.

We have completely stopped hearing anything new on MBMS. There is no news on Mobile TV trials. I think that Mobile TV is going into hibernation and will stay there for some time, until some killer charging models are in place for these kinds of services.

Saturday, August 9, 2008

Is WiMax Slowing Down?



In the past few weeks it has become quite interesting to see where WiMax is heading and what its progress is, if there is any at all. So many articles emerge every day saying that WiMax has leaped further in terms of technology development and commercialization. Although I am not entirely convinced about that, one thing I am pretty sure about is that LTE has to seriously push its plans if it is to have any sort of chance of catching WiMax.

There is no doubt that WiMax, a high-speed wireless service, is gaining momentum worldwide. Latest figures show WiMax equipment sales rose nearly 50 percent in 2007 to US$800 million. WiMAX networks have been deployed in some 80 countries with 2.2 million customers, and growth is expected to continue.

According to the WiMAX Forum, worldwide WiMAX customers will exceed 200 million by 2012, generating US$7.7 billion in equipment sales. With IEEE 802.16 ratified by the ITU last October, many operators see it as a sensible alternative to 3G mobile Internet service.
Although there is a significant divide among equipment manufacturers when it comes to LTE and WiMax, the latter is certainly pushing hard, making life very difficult for the LTE camp. The Indian government's decision to auction the 2.3 and 2.5 GHz frequency bands for broadband data has further encouraged the WiMax camp.

So far, most of the excitement has been generated by equipment manufacturers such as Huawei Technologies, a member of the WiMAX Forum. Huawei is moving fast to take advantage of surging demand, mostly in Europe and North America. So far Huawei has sold 16 commercial networks and 30 trial networks, making it one of the most prolific WiMAX vendors in the world.

Huawei also is engaged heavily in WiMAX R&D activity: it has 1,200 engineers dedicated to WiMAX product development and owns 100 WiMAX-related patents, more than any other company. Huawei is also developing WiMAX terminals, which are expected to become available for sale later this year. The handsets reportedly will work in dual-mode with CDMA, GSM and 3G (WCDMA).

ZTE, another Chinese equipment maker and a senior member of the WiMAX Forum, began OFDM research in 1998. It projects that its sales of WiMAX equipment will reach those of CDMA by 2011 (an estimated US$700 million). ZTE is more enthusiastic about WiMAX and predicts WiMAX will make up 20 percent of the global wireless market by the end of 2009, after commercial handsets become readily available later this year.

While all the manufacturers are getting excited and trying to run as fast as they can, operators are staying mum on whether they will jump on the WiMAX bandwagon. Vodafone, T-Mobile and AT&T are yet to announce any significant trials of WiMax. There are no signs of any massive deployment of WiMax, especially in China and India, which are considered the biggest WiMax markets. China Mobile and China Unicom show no signs of massive deployment or commercial service after small trials two years ago in a half-dozen cities, including Beijing, Shanghai, Wuhan and Shenzhen. Most trials employed 802.16d (fixed access) at 3.5 GHz, a temporary spectrum band for experiments, and used WiMAX as backhaul for businesses to transmit data and video in a campus environment. It is premature to assume large-scale deployment will follow, at least in the immediate future, because the industry is completely consumed by restructuring, which brings with it changes in organization and personnel.

China Mobile, for example, is carrying the torch of TD-SCDMA, a home-grown 3G wireless standard which the government hopes will become a winner someday. For China Mobile, TD-SCDMA is very much a political mandate and it has no option but to make it succeed. If this holds, it is natural for the operator to adopt some kind of LTE for TD-SCDMA, an evolutionary platform for faster speeds and more profitable services.

Questions remain. Operators must weigh WiMax's potential gains against the cost of deploying regional or national networks, and there is no clear-cut answer. While WiMAX can offer significant speed to fixed and mobile devices, which is conducive to more bandwidth-hungry services like video and TV broadcast, the key hurdle is scale. As tests show, a typical 802.16e base station delivers 30Mbps, but actual speed can whittle down to 1.2Mbps or lower per user when "fully loaded" with subscribers.

If speed is compromised, cost will become a serious concern. According to estimates, operator capex for WiMAX will be 20 percent to 50 percent higher than for HSDPA, a software-enabled overlay for sending data over 3G networks. At higher frequencies, such as 3.5 GHz, the number of WiMAX base stations must increase, by as much as 50 percent more than for HSDPA, to cover the same area without signal degradation. This is the last thing operators want after already plunking down billions on 3G networks.
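A quick back-of-the-envelope check of these figures is shown below as a small Python calculation; the 25-user load is my own assumption, chosen only because it reproduces the quoted per-user figure, and the site counts are hypothetical.

```python
# Illustrative arithmetic only; the 30 Mbps and 50% figures are the ones
# quoted above, while the user count and site counts are assumptions.
base_station_mbps = 30.0   # quoted 802.16e base station throughput
active_users = 25          # assumed simultaneous users when "fully loaded"
per_user_mbps = base_station_mbps / active_users
print(f"Per-user throughput when fully loaded: {per_user_mbps:.1f} Mbps")  # 1.2 Mbps

# Coverage comparison at 3.5 GHz: up to 50% more WiMAX sites than HSDPA
hsdpa_sites = 1000         # hypothetical network size
wimax_sites = int(hsdpa_sites * 1.5)
print(f"Sites for the same coverage: HSDPA {hsdpa_sites}, WiMAX up to {wimax_sites}")
```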

So far operators have focused mainly on network and handset performance and not on services, which can have a negative effect on initial growth. Despite all the hype, it is not clear if WiMAX will create the miracle equipment vendors want to see, especially in China, since there simply is a lack of driving force among the governments, operators and the public. If anything, WiMAX will complement 3G, especially in data services for high-end customers, enterprises and government agencies, but its role as a public service will be limited.

Sunday, August 3, 2008

Q2 Revenues for the Major Telecoms Companies

With Quarter 2 (Q2) just gone, everybody concerned with the telecoms industry is waiting for the Q2 results from the major telecoms giants. Looking at the current state of the economy and the entire credit crunch problem, there is some nervousness regarding growth expectations in the telecoms area.

In the past few weeks most of the telecoms giants have announced their Q2 earnings. I must say the results don't look too bad, and I will not be proved wrong when I say that some investors have breathed a sigh of relief. There is no doubt that a few tricky times might lie ahead, but overall the growth forecasts for Q3 are encouraging.

First of all, let me tell you about the major shock of Q2.

Ericsson and Alcatel-Lucent have really got a lot to do after their Q2 results. Although expected, Swedish telecoms giant Ericsson reported a fall of 70 per cent in its second quarter profits. The company blamed the fall on the high cost of shedding staff and falling handset sales at its Sony Ericsson joint venture. Sony Ericsson reported that its second quarter pretax profits were 8 million euros, compared with 327 million euros a year earlier, and announced plans to cut 2000 jobs worldwide.

But Alcatel-Lucent was the most shocking when it announced a humongous $1.7 billion second quarter loss. CEO Patricia Russo and chairman Serge Tchuruk, who together orchestrated AlcaLu's merger two years ago, have already said that they'll step down by year's end.

The size of the loss, the sixth straight quarter of red ink since the merger, has stunned analysts who had expected less painful news. The company said $1.27 billion of the loss was due to a write-down of AlcaLu's North America CDMA technology business.

The company also is trimming its board, dumping former Lucent CEO Henry Schacht.

I must admit that, for one of the giants of the telecoms world, the above results look very disappointing. What is more worrying, especially for Ericsson, is the fact that its sales in Western, Central and Eastern Europe, the Middle East, Africa and Asia Pacific were all down by varying degrees. The only positive news came from the Americas, where its sales in North and Latin America were up.

TeliaSonera, on the other hand, reported a 5.7 percent increase in sales for the second quarter and revenues of $1.3 billion, up some 6 percent from a year ago. TeliaSonera said it expects stable growth for the remainder of 2008, which is indeed good news for investors.
The Nordic telco, which spurned an attempted takeover earlier this year by France Telecom, said margins for broadband services in Sweden were under pressure, hurt by the growth of mobile broadband services. CEO Lars Nyberg told Swedish radio news that he backed the board's earlier decision to reject a bid from France Telecom. "I am glad that we can continue to grow on our own," Nyberg said. TeliaSonera and its associated companies had some 122.9 million customers in the quarter, up 19.3 per cent from the end of the second quarter of 2007.

A first look at AT&T's earnings for Q2 showed that perhaps we did not have to worry about big, old AT&T after all. Many industry analysts had been watching the telco's second quarter numbers for signs of weakness that might portend further weakness for the industry at large, but AT&T issued a report marked by an increase in net income, strong wireless performance and continued progress with U-verse and wireline IP efforts.
For the second quarter, the company's net income hit $3.8 billion, or $0.63 earnings per share, up 34 percent from the same quarter last year. Revenue reached $30.9 billion, an increase of 4.7 percent. Wireless data revenues were up 52 percent, and wireless postpaid subscriber churn came in at 1.1 percent, a quarterly record for AT&T. Wireline IP data revenues increased 16.1 percent. U-verse subscriber additions amounted to 179,000 for the quarter, giving AT&T a total U-verse subscriber tally of 549,000.

There are good results from Vodafone as well. I have always been impressed with Vodafone as a company. Vodafone has always got ideas up its sleeve, and it always comes with some aggression and beats the general trends in the market. Vodafone's Q2 results were again boosted mainly by the emerging markets; the only big disappointment came from Spain.
Some of you might remember that recently, when Vodafone announced that it would feel the effects of the global downturn, its share price went down by 14 per cent, the largest fall in Vodafone's history.
The company reacted to this fall by announcing a £1 billion buyback. However, this warning by the outgoing CEO, Arun Sarin, that the outlook was less than rosy might have disguised some hidden gems within Vodafone's arsenal of revenue generating resources.
Sure, the sharp decline in the Spanish market hit the company badly, as did the lacklustre results from the UK. But both are very mature markets, and consumers in both countries are reining back on all expenditures. The revenue growth high spots were the emerging markets, namely Eastern Europe, the Middle East, Africa, Asia and Pacific, which were up 30.9 per cent, supported by growth from India of more than 50 per cent. Overall, Vodafone reported 19.1 per cent revenue growth to £9.8 billion for the three months to the end of June 2008, and added 8.5 million subscribers.
I always like the aggressiveness that Vodafone shows when it comes to revenue generation. This was once again evident when Vodafone announced a 50 per cent rise in Q2 revenues from its data services. This striking 50% rise came after the number of its customers using the web from mobile devices more than doubled. Data revenue for the quarter stood at £664 million globally, compared with £441 million for the same period last year. The CEO of Vodafone Germany, Friedrich Joussen, was also upbeat, claiming that, although revenue fell slightly in the second quarter, due mainly to regulatory causes, "it won't be long until we see growth again."
There are also growing indications that the company is seriously considering an offer for freenet's DSL business. "We are taking a very close look at it," said Joussen, adding that a purchase couldn't be ruled out. Vodafone Germany doesn't necessarily need acquisitions as its organic growth is strong, Joussen said.

Nokia also posted better than expected results for Q2, after having previously forecast that the cell phone market would grow by 10 per cent. The CFO of Nokia, Rick Simonson, has indicated that, with half a year's visibility, Nokia would be able to post growth of 10 per cent or more. This upbeat assessment comes after the company's good results for the second quarter, which pushed Nokia's share price up by eight per cent, albeit that the share price had plunged about 40 per cent this year amid concerns the mobile industry would suffer as the credit crunch and inflation took their toll.

There is no doubt that everybody is affected by the economic reality, and Nokia and its customers are no different. But in my view Nokia has the ability to play in all markets, where some markets are growing strongly and some not so strongly. That is the beauty of Nokia: it is not trapped in one market only.

During the posting of the Q2 results, Nokia's CEO, Olli-Pekka Kallasvuo, added to Nokia’s optimistic outlook by stating that the company had received good feedback about the broad range of new products it expected to sell in its handset business. Last week, Nokia indicated that increased demand from Russia and India would help it achieve continued growth this year.
LG Electronics is forecasting a tough third quarter despite posting positive results for the second quarter. The handset vendor said it is facing slowing shipments to emerging markets and higher competition in developed markets.

The handset maker, which ranks fourth in the handset market in terms of global shipments, said revenues grew 39 percent from the year-ago quarter to $3.7 billion, which was more than one-third of the parent company's revenue in the second quarter. The division reported an operating profit of $531 million, up nearly 12 percent over the year-ago quarter.
Looking at the above Q2 results, I feel highly optimistic about good times ahead. Verizon Communications supports my view further: it announced that its second quarter earnings rose 12 percent, galvanized by 45.3 percent year-over-year wireless data growth. Verizon Wireless added 1.5 million net customers in Q2, bringing its overall subscriber total to 68.7 million.