
Analyst Angle: 5G – on generations and legacy

In the world of wireless, generations of technology come and go in what seems like the blink of an eye. As much as this rate of progress is wonderful, there are certain aspects of how quickly we change technology that are worrying.

Let’s quickly recap
The very first analog cellular system deployed was NTT’s in Tokyo in 1979, closely followed by the rollout of NMT across the Nordic countries in 1981. AMPS, meanwhile, was more widely deployed, with rollouts in North America in October 1983, the U.K. in 1985 (in its TACS variant, with Ernie Wise famously making “the first call”) and Australia in 1987.

But, despite being much championed as a new wave of progress for wireless, analog cellular never even made it into its teenage years.

In 1991, Radiolinja launched the first GSM network in Finland, beginning the replacement of analog cellular networks with digital. As the 900 MHz band was being used for both 1G analog and 2G digital systems in Europe, the 1G systems were rapidly closed down to make space for their more advanced 2G successors.

Just a decade later, in 2001, NTT DoCoMo launched the first 3G system, dubbed “FOMA.” The very next year, the first 3G networks using the rival CDMA2000 1x EV-DO technology were launched by SK Telecom and KTF in South Korea and by Monet in the U.S. Meanwhile, W-CDMA was launched by Vodafone KK (now Softbank) in Japan and by Three/Hutchison in the U.K. and Italy.

If we count WiMAX and LTE as part of “4G,” they arrived very quickly: Mobile WiMAX was deployed in South Korea in 2007, and LTE was rolled out in Oslo and Stockholm in 2009.

That is four complete generations of wireless in less than 30 years – one human generation – with more already on the horizon. There are whispers of “4.5G” – with a shiny new brand for “the thing after LTE-Advanced” – launching soon. Meanwhile, wireless conferences and journals across the globe are full of discussions regarding “5G” with an expected launch in 2020.

And, of course, all of these are fundamentally incompatible. We have multimode phones and software-defined base stations, but these are “duct tape” integrations: there is no backward compatibility.

Compare this to wireline
Wireline technology was developed in the 1870s and deployed by the 1880s, with humans working as telephone operators. The automated switch wasn’t widely deployed until 1919 – in a way, this was wireline’s 2G. Touch-tone dialing, electronic switches and TDM arrived in the 1960s, arguably the third technology generation, and voice over Internet Protocol only became mainstream in the 2000s.

As a result, wireline technology progressed through four generations in 130 years – a century slower than its wireless cousin.

But what’s really interesting is how robust and resilient wireline technology is. Ancient systems are still in use and handle billions of calls per day: many of those calls may now travel wirelessly for part of their journey, but the basic architecture remains in place. We still regularly use 8-bit coding of the 4 kHz voice channel, half a century after its introduction. You can plug in a rotary-dial analog phone from the 1930s and it will still work today.
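To see just how little that coding has changed, here is a minimal sketch of the arithmetic behind the classic 64 kbit/s digital voice channel, assuming G.711-style PCM (8-bit samples taken at the Nyquist rate of the nominal 4 kHz voice band) – an illustration, not something taken from the article itself:

```python
# Minimal sketch: the arithmetic behind the classic 64 kbit/s voice channel (DS0),
# assuming G.711-style PCM -- an illustration, not drawn from the article.

VOICE_BANDWIDTH_HZ = 4_000                  # nominal analog voice channel: 4 kHz
SAMPLE_RATE_HZ = 2 * VOICE_BANDWIDTH_HZ     # Nyquist rate: 8,000 samples per second
BITS_PER_SAMPLE = 8                         # 8-bit companded PCM

bit_rate_bps = SAMPLE_RATE_HZ * BITS_PER_SAMPLE
print(f"DS0 rate: {bit_rate_bps} bit/s")    # 64000 bit/s, essentially unchanged for ~50 years
```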

As the old adage goes: ‘If it ain’t broke, don’t fix it’
There is something staggering about a technology domain where design decisions taken over a century ago were good enough, and work well enough in practice, to still be in use today.

In many ways the speed of wireless progress has been phenomenally positive. It has brought access to communication to literally billions of people. It has had a monumental impact on the economy; estimates suggest a 10% increase in cellphone penetration increases gross domestic product by 4.3%. It has resulted in an unparalleled increase in human well-being across the globe and transformed whole sections of society.

And that is the paradox: Investment in cellular has been massive, and much of that money has been written off quickly to pave the way for new generations of wireless technology.

So what happens as we move on to 5G?
Carriers now face the issue of turning off older, outdated networks, with some suggesting they expect to start phasing out 2G in the next few years.

“Dead at 30” might be a hippy mantra, but saying goodbye to a technology whose 130-year-old cousin continues to soldier on is truly sad, and arguably premature. GSM is global, cheap, long-range and robust – through GPRS and text messaging it delivers everything that applications such as the “Internet of Things” need. It works.

On the other hand, W-CDMA is almost certainly doomed. Architecturally it is a kludge: separate circuit-switched and packet-switched domains, HSPA bearers, IP over ATM and all the rest. There are many costs and few benefits – after all, anything 3G can do, either GSM or LTE can do far better and cheaper. I don’t doubt it will live on for a long while yet, but it is a dead man walking.

Meanwhile, over the course of its life, LTE will receive an investment of almost $1 trillion – a terrifying amount if history repeats itself, and LTE enjoys as short a lifespan as its predecessors.

Here’s the real question: can 5G really be so much better that it justifies yet another deployment, involving more equipment, more protocol stacks, more management systems, software upgrades, and all the spectrum and interference implications?

How many networks can carriers afford to support at once?

How do they manage the pain for customers when switching off a service that some are quite happy with, and that delivers precisely what they want to pay for?

There are no easy answers, but it is vital that 5G is architected in a way that minimizes these problems. The technology should learn a few lessons from its wired cousin: upgrades, new architectures and new features should all continue to be delivered regularly, but within a compatible and coherent framework – otherwise 5G has no chance of outlasting its ancestors.


Rupert Baines has 25 years’ experience in high-technology marketing and strategy. Most recently he was VP of strategic marketing at Mindspeed, following its acquisition of Picochip, where he had been VP of marketing. Baines was one of the founders of the Small Cell Forum, served on its board from 2007 to 2013 and chaired the marketing working group. In 2013 he received the Forum’s award for Individual Achievement. He has also worked for operators, responsible for new product launches and strategy; for a venture capital fund; and for the technology strategy consultancy Arthur D. Little. Baines has a BSEE and Diploma from Hull University, an MBA from IESE and is a Fellow of the IET.

Editor’s Note: Welcome to Analyst Angle. We’ve collected a group of the industry’s leading analysts to give their outlook on the hot topics in the wireless industry.
