Where might 5G lead for radio? Radio World shared this week’s feature story with Michael LeClair, chief engineer of Boston’s WBUR and former tech editor of Radio World Engineering Extra, who has watched 5G’s development with interest, from a distance, and invited him to comment.
There are so many questions raised by 5G that it’s almost impossible to know where this will lead. We don’t yet have a clear direction defined for what 5G is and isn’t.
From what I’m reading, there are multiple implementations of 5G. What was initially promoted was the concept of using licensed SHF-band channels (3 to 30 GHz) wherever they could be fit in. Those of us using licensed microwave links in broadcasting are familiar with 6, 11 or 23 GHz. These are allocated in channels of 10 to 20 MHz (adjacent channels can be combined when more bandwidth is needed), which work like communication building blocks. Depending on the distance you need and what can be done without interfering with other licensed users, you can build out links capable of 100 Mbps or more. At the higher speeds, dynamic QAM is used to achieve very high modulation rates; the tradeoff is a higher error rate, driven by signal strength, weather conditions and the size of the dishes.
But the promise was 1 Gbps for 5G. Bidirectional. And mobile.
The simplest way to increase the data rate is to increase the channel size. For example, to get 1 Gbps data with a very robust QPSK modulation scheme similar to what we already use in 4G, you would need channels 500 MHz wide. This one channel would utilize more spectrum than the entire radio and TV broadcast bands combined (plus the unlicensed 2.4 GHz band to boot!). It’s more than all the spectrum currently licensed for all wireless carriers combined.
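The 500 MHz figure falls out of simple arithmetic. As a back-of-the-envelope sketch (assuming roughly one symbol per Hz of channel bandwidth, and ignoring coding and framing overhead, so real channels would need somewhat more):

```python
def required_bandwidth_mhz(target_bps, bits_per_symbol):
    """Channel bandwidth (MHz) needed to carry target_bps, assuming
    roughly one symbol per Hz -- a rule of thumb, not a spec value."""
    symbols_per_sec = target_bps / bits_per_symbol
    return symbols_per_sec / 1e6

# QPSK carries 2 bits per symbol, so 1 Gbps needs about 500 MHz:
print(required_bandwidth_mhz(1e9, 2))  # → 500.0
```

Swapping in a denser constellation shrinks the channel proportionally, at the cost of the robustness QPSK provides.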
The only place this kind of spectrum is still available is above 30 GHz, in the EHF range. There is plenty of spectrum for sale up there. There has also been discussion of displacing satellite communications operating in the 4–6 GHz range with mobile data services. If carriers absorbed those frequencies, there would be four channels of 500 MHz bandwidth in every city of the U.S., enough to handle the largest cell carriers of today (Verizon, T-Mobile, AT&T and Sprint).
But that spectrum is already largely in use. That is pushing Ph.D.s and engineers to look at what can be done with transmissions at EHF (30–300 GHz). EHF attenuates very rapidly in the atmosphere; the usable transmission distance might be 100 feet or so. Building out cell service would require roughly 2,500 transmitters per square mile, so even a smaller city would need tens of thousands of transmitters, each with a dark fiber connection to some kind of central (or networked) router. Initial trials of this kind of 5G have taken place in Boston, and so far they work only on street corners. Once you move inside a building or behind any physical structure, they fail.
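The transmitter count is easy to sanity-check. A crude model (my assumption, not the article’s method) tiles a square mile with a grid of sites spaced at the usable range; it lands in the same ballpark as the 2,500 figure:

```python
import math

FEET_PER_MILE = 5280

def transmitters_per_square_mile(range_ft):
    """Rough site count: tile one square mile with a square grid,
    spacing sites at the usable transmission range. Ignores the
    overlap real coverage needs, so treat it as an estimate only."""
    sites_per_side = math.ceil(FEET_PER_MILE / range_ft)
    return sites_per_side ** 2

# At a 100-foot usable range: 53 sites per side, 2,809 per square mile.
print(transmitters_per_square_mile(100))  # → 2809
```

Either way you round, the conclusion holds: a 100-foot range means thousands of sites per square mile, each needing fiber backhaul.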
Imagine how this would affect a product like the Comrex Access. I’ll stick with 4G.
There is a second approach to building 5G with lower frequency channels that are not as susceptible to attenuation in atmosphere. Cell carriers settled on channels in the 600–900 MHz range as being the optimal tradeoff between available bandwidth and data rates for 4G. To do so they have basically “taken” spectrum that was being used by UHF TV, essentially by eminent domain at the federal level. Auctions were used to determine the value of the spectrum.
At lower frequencies, by combining several more “blocks” of bandwidth together it becomes possible to get both a robust transmission system and higher data rates. For example, if I can put together enough blocks of 20 MHz (say five), I can get 800 Mbps using 256 QAM, which is somewhat robust for fixed location connections. Not quite 1 Gbps but still pretty impressive. Data compression would allow the capacity to go well over 1 Gbps but at the cost of overhead processing that may partially nullify the speed boost. This is the second form of 5G. I believe T-Mobile/Sprint is working on this method.
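The 800 Mbps figure follows from the same rule of thumb as before (about one symbol per Hz, overhead ignored), applied to aggregated blocks:

```python
def aggregate_throughput_mbps(blocks, block_mhz, bits_per_symbol):
    """Peak data rate from combining spectrum blocks, assuming roughly
    one symbol per Hz; coding and framing overhead are ignored."""
    total_mhz = blocks * block_mhz
    return total_mhz * bits_per_symbol  # MHz x bits/symbol ~ Mbps

# Five 20 MHz blocks with 256-QAM (8 bits per symbol):
print(aggregate_throughput_mbps(5, 20, 8))  # → 800
```

Dropping back to QPSK over the same 100 MHz would yield only 200 Mbps, which is the robustness-versus-rate tradeoff in a nutshell.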
Again, the four major carriers, if they simply consolidated their spectrum efficiently, could each acquire 100 MHz in every major city of the country (there is substantial spectrum around 1 GHz owned by various companies already).
If these services can be made reliable, I see home or small business Internet access as being much easier to build out wirelessly. Remote studios and broadcasts would no longer need to contract for wired data connections, especially in urban areas.
Remote transmitter sites would be able to use STLs based on wireless data services. Some technology would have to be added to protect these links from congestion and interference that would reduce reliability.
What I don’t see with the SHF/EHF 5G is much disruption to radio beyond the cache streaming services already out there. The reception distance is too short for even someone walking down a city street.
However, with lower-frequency blocks, audio program providers could build a somewhat better real-time mousetrap than they currently have. With some consolidation of older services and of multiple carrier entities, it might be possible to allocate enough spectrum in all the major markets to come close to replicating the near-instant tuning of radio, over distances limited only by tower buildouts (highways would likely be good candidates for full service in rural areas, extending mobile coverage in ways that radio can’t).
FUTURE OF CODECS
Any of these services at such high speeds begins to raise the question of whether super-high-efficiency audio codecs are really needed any longer.
Right now the most popular live streaming speed is 48 kbps mono. Millions of listeners use this on a daily basis for their “radio” feeds. The main reason is cost. As the number of streams multiplies, the amount of data at current rates becomes very expensive to support. It’s also robust enough for mobile services in real time. Cache services like Spotify or YouTube use cached file transfers instead of streaming to cut their costs (caching lets demand be managed more effectively than building streams in real time, and the use of TCP minimizes errors). If the cost of data goes down due to the greater capacity of 5G, it might support higher standard streaming rates like 128 kbps and make cache services less important (though it’s hard to believe YouTube won’t still need to cache files, given the much higher data rate required for video).
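The cost pressure is easy to quantify per listener. A small sketch (my own illustration, not figures from any carrier) of the data volume one listener consumes per hour at each rate:

```python
def gigabytes_per_listener_hour(kbps):
    """Data volume one listener consumes per hour of streaming
    at the given bit rate, in gigabytes (decimal GB)."""
    bits_per_hour = kbps * 1000 * 3600
    return bits_per_hour / 8 / 1e9

# 48 kbps mono versus a 128 kbps stream:
print(gigabytes_per_listener_hour(48))   # → 0.0216
print(gigabytes_per_listener_hour(128))  # → 0.0576
```

Multiply by a million concurrent listeners and the 48 kbps feed alone moves about 21.6 TB per hour, which is why per-gigabyte delivery cost drives the choice of stream rate.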
To be competitive, businesses and ISPs will likely move their benchmark best delivery rates up to 10 Gbps or 100 Gbps over optical paths. Can copper lines still be competitive at those data rates? Office wiring systems are now deploying with 10 Gbps capacities over copper, and 100 Gbps backbones over optical fiber are already a reality.
In brief: The limitations I’ve described, which have been confirmed in initial Boston testing, are so significant it’s hard to see how the wireless carriers could be marketing this service unless they’ve got some serious cards up their sleeves.
There may exist some new concepts for this technology that haven’t been shared yet, such as a localized burst mode with the highest speeds that trims down to “enhanced 4G” for everyone else. Unfortunately, most of the blue-sky thinking has been based on the deployment of nothing less than perfection. It’s why I have my doubts about how it will all work out.
In fairness, “I don’t know what I don’t know.” It’s possible there are other ideas floating around that work better than trying to build something in the 20 GHz range of experimental spectrum.
At the same time, I remember the days when live streaming was supposed to completely displace radio “any day now” (this was in the late 1990s). Those who proselytized this technology takeover have been proven wrong time and again.
Someone back then who bothered to calculate the approximate data resources required for point-to-point streaming to replicate even one top-10 major-market radio station found that streaming in that era couldn’t possibly hope to displace radio broadcasting; it had only a tiny fraction of the capacity necessary to replace a single broadcast station. Similarly, bitcoin, if mined at today’s rate, would by 2025 or so consume 120% of all the electricity on the planet in server farms; it can’t possibly work as a transactional technology for a global financial system. These are ridiculous claims that either entirely ignore, or intentionally distort, the laws of physics for a marketing advantage.
Count me skeptical on 5G for similar reasons.
What do you think about 5G and its possible impact on radio? We invite your opinion. Email [email protected] with “Letter to the Editor” in the subject line.