What’s Behind AM Receiver Performance?

While engineering is my first love, I spent most of my years in the radio business earning a paycheck as a pretty good investigative street reporter. But even those skills have not helped me answer a basic question concerning AM radio.

We have all read the stories about bandwidth. Purists tell us that AM can sound as good as FM if the bandwidth is allowed to be 15 kHz. Well, we all know that will never happen again, allocations being what they are.

But some serious work was done on the matter. In the 1980s, the NRSC developed the now-mandatory 10 kHz cutoff, which in practice is actually 9.6 kHz.

The rationale was to set a transmission standard that receiver makers could design to, allowing the widest possible audio response with the best compromise on adjacent-channel interference reduction. The standard was voluntary at first but later became mandatory.
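As an aside, here is a minimal sketch of what that kind of sharp cutoff looks like as a digital filter, written in Python with SciPy. The sample rate and filter length are my own illustrative assumptions, not part of any NRSC specification:

    # Illustrative only: a steep FIR low-pass in the spirit of the
    # ~9.6 kHz NRSC audio cutoff. Sample rate and tap count are assumed.
    import numpy as np
    from scipy import signal

    fs = 48_000      # audio sample rate in Hz (assumption)
    cutoff = 9_600   # the ~9.6 kHz cutoff used in practice

    # A long FIR filter approximates the steep "brick wall" rolloff.
    taps = signal.firwin(numtaps=511, cutoff=cutoff, fs=fs)

    # Check the response just inside and just outside the passband.
    freqs = np.array([5_000.0, 9_000.0, 10_500.0, 12_000.0])
    _, h = signal.freqz(taps, worN=freqs, fs=fs)
    for f, mag in zip(freqs, np.abs(h)):
        print(f"{f/1e3:4.1f} kHz: {20*np.log10(max(mag, 1e-12)):+6.1f} dB")

Run as-is, it shows a passband essentially flat at 9 kHz and a stopband tens of decibels down by 10.5 kHz, which is the whole point of a well-defined cutoff.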

Every AM station in the United States abides by it, thus ending the days when AM transmission audio response was unpredictable. That was more than a decade ago. Yet if you go into any Best Buy or Wal-Mart or car dealership and sample the radios, you will find that just about every single one still has the same narrow front-end performance it had before the NRSC standard was even being talked about.

The big question is why?

AM bandwidth issues

I have read for years that AM bandwidth has been held down because of complaints about adjacent-channel interference. Let’s nip that in the bud. The NRSC mask virtually eliminated that as an issue.

The 10 kHz restriction allows just about any radio to separate stations 20 kHz apart in normal listening situations with no splatter heard. If NRSC-grade radios had been built and had failed to stop the adjacent-channel noise, I would understand the industry going back to narrowband.
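A quick back-of-the-envelope check, my own arithmetic rather than anything from the NRSC documents, shows why the 20 kHz case works out:

    # With a 9.6 kHz audio cutoff, an AM signal occupies roughly
    # carrier +/- 9.6 kHz. A station two channels away on the US
    # 10 kHz grid sits 20 kHz from the desired carrier.
    audio_cutoff_hz = 9_600
    channel_offset_hz = 20_000

    upper_edge_desired = audio_cutoff_hz                         # +9.6 kHz
    lower_edge_interferer = channel_offset_hz - audio_cutoff_hz  # +10.4 kHz

    guard_hz = lower_edge_interferer - upper_edge_desired
    print(f"Guard band between sidebands: {guard_hz} Hz")        # 800 Hz

The occupied sidebands of the two signals never overlap; roughly 800 Hz of guard band separates them, so a receiver with honest skirts can run wide open against a second-adjacent signal.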

But the radios were never built. The few wideband radios out there do perform very well and show good selectivity even in wideband mode. Has the FCC since been bombarded with citizen complaints about adjacent-channel noise?

Does anyone know anyone who has ever written to Sony or Delco or Panasonic in the last 10 years to complain that the bandwidth was too wide or the interference excessive, impulse noise aside?

Where are all these complaints? Why were NRSC-standard radios never mass-produced? Where is the rationale for not making them based on “complaints”?

Complaints or no complaints, why did manufacturers not implement the NRSC response or even anything close to it? Who designs these radios anyway?

I am willing to bet that most AM radios are based on no more than one or two chipsets that have become the standard for Asian manufacture of radios for America. It seems that only one or two suppliers of these building blocks would have to improve their designs for better performance to show up in most of the AM radios made for this country.

Some contributors to industry trade publications have done response studies on several AM radios, and yes, bandwidth is pretty sorry, with most being several dB down at even 3 kHz. Even so-called professional-grade monitor radios have sick analog response that barely beats a telephone on the high end.

Some engineers on the broadcast side advocate further reduction of transmitted bandwidth to better match the performance of the “typical” radio. While this sounds like the tail wagging the dog, it would be acceptable if the master planners came up with a better chipset that is actually flat to 5 kHz and then cuts off sharply.

This would sound much better and still leave room for IBOC. But the receiver would have to really be flat in its bandpass, not rolled off starting back at 2 kHz!
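To put rough numbers on that difference, here is a hedged sketch in Python comparing the two response shapes; the “typical” radio is modeled, purely as an assumption, by a single-pole rolloff starting near 2 kHz:

    # Hypothetical comparison: a passband flat to ~5 kHz with a sharp
    # FIR cutoff, versus a typical radio modeled (by assumption) as a
    # first-order rolloff beginning around 2 kHz.
    import numpy as np
    from scipy import signal

    fs = 48_000  # audio sample rate in Hz (assumption)

    proposed = signal.firwin(401, 5_000, fs=fs)  # flat to ~5 kHz, then steep
    b, a = signal.butter(1, 2_000, fs=fs)        # gentle 2 kHz rolloff

    for freq in (1_000, 2_000, 3_000, 4_000, 5_000):
        _, h1 = signal.freqz(proposed, worN=[freq], fs=fs)
        _, h2 = signal.freqz(b, a, worN=[freq], fs=fs)
        print(f"{freq/1e3:.0f} kHz: proposed {20*np.log10(abs(h1[0])):+6.1f} dB, "
              f"typical {20*np.log10(abs(h2[0])):+6.1f} dB")

By 4 kHz the one-pole model is already down about 7 dB while the flat passband is still at 0 dB; that gap is most of what the ear hears as “muffled.”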

First, let’s be clear: this has nothing to do with IBOC hash interference, since that problem occurs on the main carrier frequency of a station whose adjacent channel is running IBOC. Narrowing bandwidth on the receiver will not stop that kind of interference from being heard.

But we do understand that receive bandwidth must be limited to keep a station’s own IBOC signals from being heard on analog radios.

Okay, I can already hear some readers crying about cutting bandwidth to 5 kHz and what a terrible thing that would be. If we actually had a flat bandpass of 5 kHz on the receive end, you would be most impressed with the sound, even with a 5 kHz transmit standard.

The fact is that most radios will produce something past 5 kHz, even 7 kHz, but it’s so low that it’s not worth talking about. These same radios generally start rolling off at about 2 kHz.

If we could hold the response flat from 1 kHz to 5 kHz it would make a major difference. Motorola had that idea a few years ago.

What happened to Symphony?

Back in 2003, Motorola sent out a lot of paper about a brand-new AM/FM chipset that was destined to turn the receiver world on its ear. It was called Symphony, and it promised to bring DSP processing to the average receiver.

FM range was predicted to increase dramatically, and AM was to be reborn with variable bandwidth, flat response and digital filtering. We were supposed to see these radios in automobiles by 2005. Well?

For reasons I cannot explain, Symphony was not adopted. It was supposed to be virtually as cheap as the systems already in place but with so much more to offer. What happened? It looks like the chipset never made it to market, although Motorola was bombarded with requests for samples.

The Big Lie

As I mentioned, we have been told for years that AM radios remain so narrow because of all the complaints about noise. Well, I say hogwash.

I own a 2001 Ford Crown Vic with a stock radio that has one of the best-sounding AM sections I have ever heard. I measured its audio, and it’s good to about 4.8 kHz, where it is down 5 dB. It takes a nosedive after that but sounds great, all while having a very tight front end.

I can sit 20 miles from 50 kW WLS at 890 kHz and hear my 1 kW WGTO(AM) at 910 kHz, reaching me at under 0.5 millivolt from more than 80 miles away, with no interference from WLS! That to me is more than acceptable performance, and it sounds good.

However, my other car, a 2004 Ford Crown Vic, also equipped with a stock radio, sounds like crap, with a measured audio response of about 2.3 kHz.

Why did Ford go backwards on performance? I dare them to show me any complaint letters about the radios in the 2001 models that forced them to narrow the 2004 so much. Yes, it’s a big mystery to me.

You would think a simple nudge from the FCC would help. Years ago I co-wrote a petition asking the FCC to implement a minimum standard for AM performance; see www.geocities.com/amstereo2001/fccpetition.htm.

The petition went nowhere. Where was the NAB on that issue? Until we get serious about the last link in the AM system, all the talk about cutoffs, pre-emphasis, bandwidth narrowing and the like will be just that: talk.

The people who make radios need to get their heads out of the sand and maximize performance to even half of the transmission standards adopted many years ago. The NRSC had manufacturing representatives at the table when the 10 kHz standard was adopted, but not one of them changed their radios to take advantage of the new cutoff that eliminated first-adjacent monkey chatter.

If the people at the table would not do it, how can we expect others to follow? All the improvement on the transmitter side means nothing if we cannot mandate even a minimum standard of performance for mass-produced receivers, be it 5 kHz or not.

RW welcomes other points of view to radioworld@imaspub.com.
