Test & measurement

Nov 1, 2002 12:00 PM, By Chriss Scherer, editor

Most facilities today are a mix of analog and digital source equipment. The ongoing push is toward keeping more of the signal path in the digital domain. Regardless of the complement of analog and digital equipment, every station’s goal is to maintain a quality audio signal through the entire chain.

Several basic audio quality parameters are common to both forms of audio transmission. The tools for most of the analog methods are familiar to stations. However, the tools required for digital evaluation may be less familiar, and the elements to be measured may also introduce new concepts and methods.

Computer-based systems provide accurate test routines that can be repeated and automated.

Reliable and proven

The analog audio test set has faithfully served radio for many years. Basic measurements such as frequency response, distortion and noise are simple elements to evaluate. The signal generator and audio analyzer are necessary tools for these tasks.

Analog signals are affected by the medium through which they travel, and measuring these basic parameters can often reveal what is degrading the signal.

Bench-top test sets can provide highly accurate measurements of these parameters. These same bench-top systems have evolved into portable forms as well. Creating a test scenario with large boxes, once an awkward task, has been reduced to plugging in a device that can be as small as a PDA. The cables and connectors now occupy more space than the test equipment itself.

Analog testing is still an important task. Even in a completely digital system, the final output of the digital-to-analog converter is an analog signal and needs to be evaluated from an analog point of view.

When digital test sets were first introduced, one of their obvious attributes was how much time a user could save by automating many of the functions in routine tests. The tedious task of nulling the test set between each frequency measurement is now done automatically. The entire discrete frequency series is done in one continuous sweep or with a broadband noise source.

Digitally based systems can run automatically while the engineer attends to other duties. For intermittent problems, the test set monitors several parameters at once until the failure occurs, providing more information to the person performing the troubleshooting. At the same time, that person does not have to sit and wait for an intermittent problem to occur before diagnosing it.
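The automated sweep described above can be sketched in a few lines. This is a minimal illustration, not a real instrument driver: the half-gain lambda is a made-up stand-in for the device under test, and the script simply logs the output level at each test frequency relative to the input level.

```python
import math

def rms(samples):
    """Root-mean-square level of a block of samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def sweep_response(device, freqs_hz, sample_rate=48_000, duration_s=0.05):
    """Run a discrete-frequency sweep: generate a tone at each frequency,
    pass it through the device under test and record the output level
    in dB relative to the input level."""
    results = {}
    n = int(sample_rate * duration_s)
    for f in freqs_hz:
        tone = [math.sin(2 * math.pi * f * i / sample_rate) for i in range(n)]
        out = [device(s) for s in tone]
        results[f] = 20 * math.log10(rms(out) / rms(tone))
    return results

# Hypothetical "device": a pass-through with 6dB of attenuation
half_gain = lambda s: 0.5 * s
for f, level in sweep_response(half_gain, [100, 1_000, 10_000]).items():
    print(f"{f} Hz: {level:+.1f} dB")  # each frequency reads about -6.0 dB
```

In a real test set the loop body would drive the generator output and read back the analyzer, but the structure is the same: the sweep runs unattended and the log is reviewed afterward.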

Smaller, portable test units pack lots of features in a small case.

Not always perfect

Digital systems were designed to eliminate the problems and shortcomings of the transmission medium and processing path. Each amplifier in an analog system introduces a small variation to a signal, whether it is noise, distortion or frequency response. The theory behind digital is that what comes out is exactly the same as what goes in. This is not always the case. While digital systems can be more robust than their analog counterparts, they can also be more fragile.

In most cases, analog signals fail through slow decay. Regularly checking and recording operating parameters can show that a problem is developing before it becomes serious. In a digital system, the failure may go unnoticed until it is too late. Occasionally, chirps or clicks in the audio can be a sign that something is not right, but when failure occurs, it occurs abruptly.

Which to choose?

Like any other equipment decision, deciding on a system first requires that you determine your needs. Most of the available audio test instruments can be categorized by several functions.

Analog or digital. This is the most basic consideration. Choose a complement of analog and digital inputs and outputs. The digital connections may include more than one format (AES3 and S/PDIF, for example) and variable sampling rates.

Generator or analyzer. Some units feature one function or the other, and some offer both. For stand-alone equipment tests, a combined generator and analyzer works well. For system checks, where the input and output are separated by some distance, separate components work better.

Basic functions or feature-rich. This is the most complex area of concern. Each available product fills its own special niche. The ability to read bit errors and the amount of jitter may be important for one application, while simply knowing the sampling rate with an overall digital quality indication may be enough for another.

The range of test equipment is broad enough to provide a workable system for any need. Even with the latest digital equipment, where a field repair at the component level may not be possible, the right test equipment can help to diagnose the complexity of a problem and aid the decision on how best to correct the situation.

New terms, new tools

Testing and measuring analog audio has its own set of traditional measurements that are used to verify a signal’s quality. Some of these measurements apply to digital audio, but simply knowing that there is a problem is not enough. Sometimes, distortion and other problems can be traced to a problem in the digital bitstream. Here are some of the common digital terms and their meanings.

bit-error rate or bit-error ratio (BER) A measurement of errors in a bit stream. The ratio is calculated by dividing the number of erroneous bits by the total number of bits transmitted, received or processed over a specified period. A BER can also be expressed as the number of bits processed before an erroneous bit is found.
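The ratio defined above can be computed with a few lines of code. The bit strings here are invented for the example; a real measurement would compare a known transmitted pattern against what the receiver actually decoded.

```python
def bit_error_rate(sent: str, received: str) -> float:
    """Ratio of erroneous bits to total bits compared."""
    if len(sent) != len(received):
        raise ValueError("bit streams must be the same length")
    errors = sum(1 for a, b in zip(sent, received) if a != b)
    return errors / len(sent)

# 2 erroneous bits out of 16 transmitted -> BER of 0.125
print(bit_error_rate("1010110010110100", "1010010010110110"))  # 0.125
```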

bit rate The rate of transfer of a digital signal. A signal’s bit rate is determined by multiplying the sampling rate by the number of bits in the sample by the number of channels. For example, an audio CD is sampled at 44.1kHz with a 16-bit resolution and has two audio channels (stereo). The bit rate of a CD is 44,100 × 16 × 2 = 1,411,200 b/s, or about 1.41Mb/s.
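The same arithmetic as a short function, using the CD figures from the definition:

```python
def bit_rate(sampling_rate_hz: int, bits_per_sample: int, channels: int) -> int:
    """Bit rate in bits per second: sampling rate x word length x channel count."""
    return sampling_rate_hz * bits_per_sample * channels

cd = bit_rate(44_100, 16, 2)
print(f"{cd} b/s = {cd / 1e6:.2f} Mb/s")  # 1411200 b/s = 1.41 Mb/s
```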

jitter An error in accuracy of a digitally clocked signal, which appears as an unexpected phase shift in the signal. Jitter happens when a data transition does not occur at exactly the anticipated point of a clock pulse.
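One simple way to quantify the deviation described above, assuming the timestamps of clock edges can be captured, is to compare each observed edge against its ideal position on a perfect clock grid. The timestamp values here are invented for illustration.

```python
def peak_jitter(edge_times, clock_period):
    """Peak timing deviation of observed clock edges from their ideal
    positions (n * clock_period) on a perfect clock grid."""
    deviations = [t - i * clock_period for i, t in enumerate(edge_times)]
    return max(abs(d) for d in deviations)

# Edges of a nominal 1 microsecond clock, with the third edge arriving 40ns early
edges = [0.0, 1.0e-6, 1.96e-6, 3.0e-6]
print(f"peak jitter: {peak_jitter(edges, 1.0e-6) * 1e9:.0f} ns")
```

Real test sets report jitter in several forms (peak, peak-to-peak, RMS); this sketch shows only the peak deviation.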

latency The inherent delay in signal and software processing. Latency is determined by the time it takes for a system or device to respond to an instruction, or the time it takes for a signal to pass through a device. Most people do not perceive a system latency of less than 35ms as an actual delay.
