Commentary: IBOC Has Been a Mistake

If you doubt that IBOC is in trouble, consider the defensive responses my commentary ("Has Anyone Thought This Through?" in the April 20 issue) elicited from some RW readers.

"Jack Hannold's thesis," wrote John Byrns (Reader's Forum, July 19), "is that the FCC made a mistake in its choice of an FM stereo system in the early 1960s, that `the AM stereo mess of the l980s' was a similar mistake, and that this process is repeating yet again in the case of IBOC."

No, those were my supporting premises. My "thesis," stated in the very last sentence, was that "when technologies compete, economic and political considerations can trump technical ones at the FCC."

Just the facts

Byrns continued, "If the author's muddled facts with respect to the FM stereo system choice are any indication, then it is hard to accept his conclusions with respect to AM stereo in the 1980s and IBOC today. A few of these muddled facts, and the truth, follow."

I'm grateful to Mr. Byrns for boldly stating his motive for attacking the Crosby FM stereo system, which hasn't been used in this country, even experimentally, for decades. But unfortunately, it was Mr. Byrns who muddled the facts.

Like the Zenith pilot tone system, Crosby provided L+R mono-compatible audio and an L-R subcarrier. But instead of a noisy AM (actually DSB) subcarrier, Crosby used a wideband FM subcarrier, providing a better signal-to-noise (S/N) ratio in stereo from all but the weakest RF signals.

FM is not entirely free of noise. While AM has a rectangular noise spectrum, with the amplitude of random noise in the demodulated signal constant across the audio spectrum, FM has a triangular noise spectrum, with the amplitude of noise increasing with frequency, i.e., it rises at a rate of 6 dB per octave, or 20 dB per decade.

Thus at 4 kHz, the noise level is 20 dB higher than at 400 Hz, and at 40 kHz it is 40 dB higher. So the L-R subcarrier, whether DSB or FM, is accompanied by a high level of noise.
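For readers who want to check that arithmetic, noise amplitude proportional to frequency works out to exactly 20 dB per decade. A quick Python sketch (the function name is mine, for illustration):

```python
import math

def fm_noise_rise_db(f_ref_hz: float, f_hz: float) -> float:
    """Relative noise level in demodulated FM, where noise amplitude
    rises linearly with frequency (6 dB/octave, 20 dB/decade)."""
    return 20 * math.log10(f_hz / f_ref_hz)

print(fm_noise_rise_db(400, 800))     # one octave up: ~6 dB
print(fm_noise_rise_db(400, 4_000))   # one decade up: 20 dB
print(fm_noise_rise_db(400, 40_000))  # two decades up: 40 dB
```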

FM uses pre-emphasis to overcome high-frequency noise. Sounds above 2 kHz are boosted 6 dB per octave at the transmitter by a pre-emphasis circuit. A complementary de-emphasis circuit in receivers rolls off treble response at the same rate, restoring highs to their proper level while simultaneously reducing high-frequency noise. But this only works in mono, because the DSB subcarrier is not boosted by pre-emphasis, and is thus subject to AM noise when the main carrier signal is weak.
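In the U.S., that pre-emphasis network is standardized at a 75 microsecond time constant, which puts the knee of the boost curve near 2.1 kHz, consistent with the "above 2 kHz" figure. A short sketch of the single-pole boost (the function name is mine):

```python
import math

TAU = 75e-6  # 75 microsecond pre-emphasis time constant (U.S. FM standard)

def preemphasis_boost_db(f_hz: float) -> float:
    """Boost applied by a single-pole 75 us pre-emphasis network."""
    return 20 * math.log10(math.sqrt(1 + (2 * math.pi * f_hz * TAU) ** 2))

corner_hz = 1 / (2 * math.pi * TAU)  # ~2122 Hz: the "2 kHz" knee
print(round(corner_hz))
print(round(preemphasis_boost_db(15_000), 1))  # ~17 dB of boost at 15 kHz
```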

But an FM subcarrier, like the main carrier, benefits from limiting - the elimination of amplitude noise imposed by interference - as long as its amplitude, which remains constant in FM, stays slightly above the peak amplitude of the noise within the subcarrier channel. That yields a much better stereo S/N than Zenith on all but the weakest of weak signals.

Mr. Byrns' argument that the Crosby system suffered distortion "caused by ... filtering off some of the FM subcarrier's sidebands ... in order to fit [it] into the space available ... on the main carrier" is nonsense. In theory, FM sidebands extend to infinity; but beyond a certain point - a point determined by the modulation index - sideband components are so small (and so insignificant) that eliminating them causes no audible distortion.

As long as the modulation level is compatible with the intended receiver, there's no audible distortion. If filtering out components of such minuscule amplitude created audible distortion, all FM receivers would suffer degraded fidelity just from having enough selectivity to reject adjacent-channel interference.
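The "certain point" set by the modulation index is usually estimated with Carson's rule, which says roughly 98 percent of an FM signal's power lies within twice the sum of the peak deviation and the highest modulating frequency. A quick illustration, using standard mono FM broadcast figures:

```python
def carson_bandwidth_hz(peak_deviation_hz: float, max_modulating_hz: float) -> float:
    """Carson's rule: ~98% of FM signal power fits in 2*(deviation + f_max).
    Sideband components beyond this span are small enough to filter off."""
    return 2 * (peak_deviation_hz + max_modulating_hz)

# Mono FM broadcast: 75 kHz peak deviation, 15 kHz top audio frequency
print(carson_bandwidth_hz(75e3, 15e3))  # 180000.0 Hz, i.e. ~180 kHz occupied
```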

And would the leading component makers of 1961 - Fisher, H.H. Scott, Harman-Kardon, Marantz, McIntosh, Dynaco, Heath, et al. - have supported an FM system with more distortion than Zenith, even with a better stereo S/N? Of course not.

As for mono S/N, both Mr. Byrns and Hal Kneller (Reader's Forum, July 19) took some pains to point out that, because the only real limit on main-carrier modulation is the combined total of the L+R audio and the L-R DSB, L+R alone can reach 90 percent modulation. Yes, but that's mono!

L+R modulation over 60 percent can only be achieved on material with relatively little separation; and making every recording fit that mold (by arbitrarily blending those that don't into near mono) entails a big sacrifice in stereo imaging for little gain in mono S/N.

Adjacent-channel interference

Ted Schober (Reader's Forum, Aug. 2), who calls me his "neighbor" though I live 15 miles away, took me to task for not knowing what's available here.

"Jack did not realize," he wrote, "that the very IBOC technology he decries provided him with not one, but two excellent [Philadelphia] classical music stations, WHYY-HD2 and WRTI-HD2."

But in fact, I listen almost exclusively to public radio. So though I don't own an "HD" radio (and don't plan to buy one), I've been hearing WHYY tout HD2 (via analog) for months. But I haven't heard any such announcements on WRTI, and its Web site says nothing about HD2.

And for that matter, iBiquity's Web site lists WRTI's HD2 format as T.B.D. Is that "To be decided"? Could Mr. Schober be wrong about WRTI?

He certainly was wrong about adjacent-channel interference:

"Some of [Hannold's] other points about interference are well taken," he says. "Wilmington's WSTW(FM) is a grandfathered short-spaced station which caused a lot of interference to [Philadelphia's] WMMR(FM) and WYSP(FM) in the days of tube FM radios that had poor adjacent-channel selectivity. The interference from WSTW's IBOC is certainly no worse than the main channel signal caused in 1958." [Emphasis added.]

I beg to differ. In 1961, my first FM tuner was a three-tube Granco with a single IF stage. While the Granco sounded good on strong local stations, its selectivity was terrible, even by 1961 standards. Nevertheless, I could always get either WIP(FM) or WDEL(FM) without any problem, because their field strength was nearly equal at my house, and the Granco's capture ratio was fair.

WIBG(FM), operating well below full Class B power in those days in order to protect a co-channel station in Sunbury, Pa., was another matter; but I could get WIBG on the Granco after WDEL signed off at midnight - or any time on a radio with at least two IFs.

So it's disingenuous of my "neighbor" Ted to suggest that the nominal 250 kHz gap between two second-adjacent analog stations presented the same kind of difficulty for a tube receiver - even a cheap one - that the 2 kHz gap between the IBOC signals of the same two stations poses to a digital receiver.

Interested parties

I don't know John Byrns' profession, but Ted Schober is a consulting engineer who installs IBOC, and Hal Kneller is manager of public radio initiatives for Harris. They're hardly disinterested parties. They really believe IBOC is the next big thing, and they want to be in on it.

And of course, there'll be big money in patent royalties, not only from broadcasters but also from consumer receivers, if IBOC becomes dominant. So the gloves are off.

But this isn't the first time so many industry people have uncritically embraced a questionable new technology. Many broadcasters - and many in consumer electronics, too - were eager to adopt CBS Labs' FMX system back in the 1980s.

FMX used a second DSB subcarrier with a highly compressed difference signal. The DSBs were in quadrature, like the I and Q chroma signals in color television. And like those color TV subcarriers, the FMX signals suffered unacceptable levels of crosstalk whenever the main carrier was subject to severe multipath conditions.

(While the L+R audio on FM is still effectively wideband FM at half the mono modulation level - a deviation ratio of 2.25 - the L-R subcarrier is effectively narrowband FM, with an effective deviation ratio of just 0.637, making the difference signal the weak link.) And unlike TV, radio finds much of its audience in cars, where multipath is a major problem.
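The 2.25 figure is easy to reconstruct if we assume roughly 10 percent of the 75 kHz peak deviation is reserved for the 19 kHz pilot, leaving L+R at most 45 percent on fully separated material (a back-of-the-envelope sketch; the pilot allocation is my assumption, and I simply take the 0.637 figure for L-R from the text):

```python
PEAK_DEVIATION_HZ = 75e3  # full broadcast FM peak deviation
MAX_AUDIO_HZ = 15e3       # top of the broadcast audio band

mono_ratio = PEAK_DEVIATION_HZ / MAX_AUDIO_HZ  # 75/15 = 5.0

# Assumption: ~10% of deviation goes to the 19 kHz pilot; L+R and L-R
# split the remaining 90%, so L+R peaks at 45% on full-separation material.
stereo_lpr_ratio = 0.45 * mono_ratio           # 0.45 * 5 = 2.25
print(mono_ratio, stereo_lpr_ratio)
```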

I thought I was alone in seeing potential problems until Amar Bose published a study documenting the problems with FMX, and both the broadcast and electronics industries rapidly lost interest. Dr. Bose is now 77 and retired from MIT, and perhaps that's why he hasn't looked into IBOC. (Or perhaps he has, as there are still no IBOC receivers on the Bose Web site.)

But maybe a Bose study isn't necessary. With the system in wide use, its shortcomings are becoming apparent to more and more broadcasters, and not just small-market people who have been criticizing it all along.

Consider the letter to Radio World from Robert Conrad (Reader's Forum, July 19), the president of Cleveland's classical WCLV(FM) and Seaway Productions, who complained that IBOC audio quality is not what was promised, and that signal coverage is terrible. I admire his courage. Most people in his position would be embarrassed to say that after having put so much of their money - and, in some cases, their prestige - behind this junk technology.

As Mr. Conrad said, broadcasters' efforts to promote IBOC "will only disappoint, and perhaps antagonize, a significant segment of the audience who find that the system doesn't deliver."

Let's hope the industry as a whole recognizes that IBOC has been a mistake - and does so soon enough that only the larger broadcasters (and, I'm afraid, all too many financially strapped public broadcasters) will have invested prematurely, and unwisely, in this ill-conceived technology.
