The author is vice president of engineering at Wheatstone.
How many times have you lost your car keys or wallet, only to find them in your
pocket all along? It seems our search for studio interoperability is no different.
As engineers, we’re always looking for new standards that can solve problems and
make life easier. That’s why I am a member of the Audio Engineering Society
Standards Committee X192 task group. I have no doubt that the work we are doing
there will result in interoperability standards for audio, some of which are
already tried and tested in our WheatNet-IP AoIP system.
For example, a lot of our automatic discovery technology for
polling the AoIP network and adding new devices as they come up is also being
pursued by the X192 task group.
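The bookkeeping behind that kind of automatic discovery can be sketched in miniature: devices announce themselves on the network, and each node keeps a registry of peers it has heard from. This is a hypothetical illustration only; the message format and field names below are invented for the example and are not the WheatNet-IP or AES-X192 wire format.

```python
import json

def parse_announcement(datagram: bytes) -> dict:
    """Decode one announcement datagram (JSON here, purely for readability)."""
    return json.loads(datagram.decode("utf-8"))

class DeviceRegistry:
    """Tracks devices seen on the network, keyed by a unique device ID."""

    def __init__(self):
        self.devices = {}

    def handle(self, datagram: bytes) -> bool:
        """Record a device; return True if it is new to the registry."""
        ann = parse_announcement(datagram)
        is_new = ann["id"] not in self.devices
        self.devices[ann["id"]] = ann  # refresh details on every announcement
        return is_new

registry = DeviceRegistry()
msg = json.dumps({"id": "blade-3", "type": "mixer", "ip": "192.168.1.30"}).encode()
print(registry.handle(msg))   # → True: first time we've heard from this device
print(registry.handle(msg))   # → False: already known, details just refreshed
```

In a real system the datagrams would arrive periodically over multicast, and entries that stop announcing would age out of the registry.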
What is missing in the open standards discussion is the fact that for us
broadcasters, studio interoperability already exists. In terms of being able to
pot up a fader from two rooms away or turn mics on and off, raise or lower
levels on processing, and select mixers or codec gear from anywhere in the AoIP
network, we’re already there, and have been for quite some time.
An infrastructure already exists whereby broadcasters can
control the entire studio, from the automation on down to the microphone level.
And in case you’re wondering, we can do all this and control levels, set gain
and do so much more without having to run a slew of individual apps from different
manufacturers. I know of a thousand or more stations that are doing exactly that, all day, every
day. The reason we are so far ahead in this regard is open standards.
IETF standards like TCP/IP, RTP and IGMP, among so many others, are what make
interconnectivity feasible and interoperability between sources, devices,
consoles and studios possible. Without these, we wouldn’t have been able to
design an AoIP system with true IP and Ethernet connectivity, which in turn
makes it possible to control consoles, devices and studios through a common
interface: the intelligent network.
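To make the role of one of those standards concrete, here is a minimal sketch of subscribing to a multicast stream from Python; when the application joins the group, the operating system emits the IGMP membership report on its behalf. The group address and port below are arbitrary examples chosen for illustration, not the values used by any particular AoIP product.

```python
import socket
import struct

GROUP = "239.192.0.1"   # an address in the administratively scoped multicast range
PORT = 5004             # a conventional RTP port number

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))

# IP_ADD_MEMBERSHIP takes the group address plus the local interface
# (0.0.0.0 lets the OS choose one).
mreq = struct.pack("4s4s", socket.inet_aton(GROUP), socket.inet_aton("0.0.0.0"))
try:
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    # sock.recvfrom(2048) would now yield datagrams sent to the group.
except OSError:
    # Some sandboxed hosts disallow multicast joins; a studio LAN would not.
    pass
sock.close()
```

The point is that none of this is proprietary: any device that speaks these protocols can subscribe to the same streams.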
Every single device in our WheatNet-IP system, from the largest console to the
smallest button panel, connects over IP and communicates using these standard
protocols, giving every device visibility and control of every other device on
the network. This is the ultimate goal of the standards groups, and Wheatstone
and its partners are already there.
Even those earlier methods that are used in lieu of true IP, such as the CAN bus,
RS232, RS485, etc. serial bus protocols used in other AoIP systems to connect
consoles and IP engines, are standards-based. As limited as these are in their
ability to pass on the full control and logic necessary for true interoperability
due to their point-to-point connectivity, they are stepping stones along the way.
The same goes for early audio standards like MADI, which is still going strong, as
evidenced by the racks full of MADI I/O boxes and intercoms out there. It’s the
reason we added a MADI port to our Blade IP access unit; we wanted studios to
interconnect to all that gear that existed throughout the facility, and we
recognized that a MADI interface could create that bridge for broadcasters.
Interoperability exists on so many levels already. All the metering packets, command
controls — starts, stops, audio processing — all the intelligence that needs to
exist to oversee a large networked audio system, all of it is taking place in
broadcast AoIP networks today.
So what new open standards are we really talking about here? Specifically, we’re
all examining the protocols and standards needed to solve latency and clocking
issues when dealing with multiple streams of audio in the context of the
economic advantages of Cat-5 cables, issues that are of critical importance to
pro audio more so than the everyday radio operator.
After all, it’s the pro audio guys who have to deal with QoS,
or lack thereof, inherent in IP when covering live, staged events that require
sending multiple audio streams to large speaker clusters in a stadium, for
example. And the pro audio market, being many times larger than the broadcast
market, has many more device types and manufacturers to deal with. That’s why
the AES is trying to develop these standards.
We’re all interested in these standards as an industry because AoIP manufacturers also deal
with clocking and latency issues to varying degrees, and Wheatstone is no
different. Lacking a sufficient standard for this, we all have developed our
own clocking schemes for striking that balance between the amount of buffering
needed to align audio packets in time and the delay that buffering itself adds: latency.
Our approach is somewhat different from that of other manufacturers because our system
happens to run on a later, and therefore much faster, generation of IP
connectivity protocols. One gigabit-per-second throughput, compared to others’
100 Mbps network speed, gives us a more acceptable latency-versus-buffering
tradeoff. You don’t have to specifically choose to make a stream low-latency;
they all are.
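A quick back-of-the-envelope calculation shows why link speed shifts that tradeoff. The figures below (48 kHz sample rate, 24-bit audio, 64-sample packets) are illustrative assumptions, not the parameters of any specific system.

```python
# Rough numbers behind the buffering/latency tradeoff in an AoIP link.
SAMPLE_RATE = 48_000          # samples per second, a typical AoIP rate
BYTES_PER_SAMPLE = 3          # 24-bit PCM

def packet_audio_ms(samples_per_packet: int) -> float:
    """Audio duration carried by one packet, i.e. the minimum buffering."""
    return samples_per_packet / SAMPLE_RATE * 1000

def wire_time_us(payload_bytes: int, link_bps: float) -> float:
    """Time to serialize one packet's payload onto the link, in microseconds."""
    return payload_bytes * 8 / link_bps * 1e6

payload = 64 * BYTES_PER_SAMPLE                # 192 bytes of PCM per packet
print(round(packet_audio_ms(64), 2))           # → 1.33 (ms of audio per packet)
print(round(wire_time_us(payload, 1e9), 1))    # → 1.5 (µs on a gigabit link)
print(round(wire_time_us(payload, 100e6), 1))  # → 15.4 (µs on a 100 Mbit link)
```

The serialization time itself is tiny either way; the practical difference is that a faster link leaves far more headroom per packet interval, so streams can run with smaller buffers before jitter becomes a problem.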
It’s understandable, then, that AoIP manufacturers lacking
the network speed, especially those focused on the pro audio market, would look
to new standards to help solve some of those problems and give them the ability
to connect their devices together and pass audio between them.
One of our goals for the AES-X192 task group is to create this standard so that one
day you’ll be able to stream your console monitor output directly to your X192-enabled
power amplifier or speakers. Protocols like Ravenna are a good first step.
In the grand scheme of things, we all benefit from open standards. But while we’re
searching for the next open standard to solve these and other issues for the
entire audio industry, it helps to remember that the Holy Grail of true
interoperability can be found in a good many broadcast studios today. All we
have to do is look.