I was introduced to the international world of broadcasting when I worked at RCA in the 1970s. The experience brought to my attention how many second-tier nations use the U.S. FCC’s technical regulations as a foundation for their communications regulatory systems.
The lag between rule making here and acceptance there can span many years – as can the removal of regulations from the books.
Several fellow international engineers recently told me that their regulations still require a complete annual “proof of performance.”
During the tube’s reign as the ubiquitous active device, the challenge for broadcasters here in the States was to control and minimize the noise and gain variation contributed by each audio and RF stage. We hoped the total system could meet at least a minimum performance specification and be viewed as an uncolored, linear reproducer.
Not only did your radio station have to meet this spec, practically from the bolts on the toilet to the beacon on the tower, but you had to prove it with an annual measurement: the FCC proof of performance.
So every year, at about the same time, the station engineering staff would schedule “the proof.” Part troubleshooting, part measurement, part adjustment, it usually took most of the experimental period to accomplish, depending on the complexity of the station. The experimental period is that time between midnight and 0600 local when you are allowed to work on your station with both day and night powers and all other stations must accept any interference this might cause.
When done, you had to carefully and neatly graph distortion and response as a function of frequency at several modulation percentages, and record the overall signal-to-noise ratio. The complete document, including a description of the procedure, the test equipment used and the qualifications of the signatory, was made a part of the station’s FCC records for years to follow.
The prescribed evaluation path was from the lowest level point – normally the control room microphone input – to the sample audio output of the modulation monitor. Everything between would be considered, including all the circuitry in the studio, the telephone or radio link to the transmitter, linear audio processing stages, the transmitter and even the influence of the antenna.
Except for modulation extremes, the standard for AM was −45 dB of noise and ±2 dB from reference response overall.
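For a sense of scale, that −45 dB noise figure can be converted to a linear amplitude ratio; the short sketch below does the arithmetic (the helper name is mine, purely illustrative):

```python
# Convert the FCC AM noise spec (-45 dB below reference) to a linear
# amplitude ratio, to show what the "magic number" meant in practice.

def db_to_amplitude_ratio(db: float) -> float:
    """Convert a decibel figure to a linear voltage/amplitude ratio."""
    return 10 ** (db / 20)

noise_ratio = db_to_amplitude_ratio(-45.0)
print(f"-45 dB is an amplitude ratio of {noise_ratio:.4f}")
# i.e., residual noise had to stay under roughly 0.56% of the
# reference modulation level
```

In other words, every hum loop, tube hiss and telco line between the microphone and the modulation monitor together had to contribute less than about half a percent of the reference level.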
In the all-copper world of the wireline companies, telco STL circuits were notorious for the noise and loss of high-end response. “Making proof” often meant hours measuring and swapping telco lines for the lowest noise and flattest response. If you couldn’t achieve it via this optimization, often you had to hand-select the tubes in each and every station circuit until the needle on your distortion/response analyzer fell below that magic number.
Modern circuitry has no great difficulty achieving these numbers, and, as a cost-saving measure, the proof requirement has been dropped. Now we use higher-tech tools to proof stations: our golden ears and the ratings book.
Share with us your proof of performance recollections. E-mail [email protected].
The toughest proof the author had to make included two discrete STL hops and the selection, by hand, from a box full of General Electric Phasitrons to get under the FM noise specifications.