Off the Analog Copper, Onto Ethernet

Comerica Park's Gigabit Network Solves a Problem at the All-Star Game

Here’s the challenge: Your international broadcast clients are booked to cover the 2005 MLB All-Star Game in Detroit, but there are no audio circuits available at their commentary locations, nor can cabling be run economically. There is, however, a robust Ethernet network running throughout the venue.

If this smells like an opportunity to innovate, do read on.

While broadcasting and information technology have been converging at a rapid pace, there’s still plenty of uncharted territory. Bill Durham, president of Commentary Systems International, along with Major League Baseball International, viewed this cabling challenge as an opportunity.

According to Durham, “The commentary positions along the first base line were temporary booths constructed for the event with zero broadcast infrastructure facilities.” Moreover, the positions behind home plate had no circuits available. They needed another option: the venue’s gigabit Ethernet network.

Questions

After determining that a local Commentary Control Room, or CCR, and the necessary cable paths were not feasible, Durham and MLBI approached Comerica Park’s helpful director of technical services, James Darrow, with the idea.

Durham found Comerica Park’s in-house Point-of-Sale Ethernet data network and routers could indeed carry his systems’ digitized audio throughout the venue, including the broadcast compound.

While some fiber would be dedicated solely to CSI’s data, traffic on the gigabit Ethernet backbone would be commingled with non-broadcast data.

Would heavy photographer and sports writer Internet traffic saturate the available bandwidth on game day? Would the sales of souvenirs, food and beverage be brought to their knees by higher-priority broadcast traffic? How reliable is the LAN/WAN, anyway?

CSI’s commentary system is designed to run over Ethernet with CobraNet on a closed network. Normally, CSI runs Cat-5E cables from each position to its CCR, and DT-12 to link the CCR and the compound. Here, there was trepidation about using a venue’s existing Ethernet network for a mission-critical live broadcast. But it was also essential.

Durham got on the phone with Bill Lance of Lance Design, who built the Digital Commentary Units (DCUs) and their Digital Signal Processing (DSP) engine, and Steve Gray of Cirrus Logic (formerly Peak Audio), whose CobraNet interface provides real-time transport of uncompressed audio and control data over fully compliant Layer 2 Ethernet.

After they coordinated their input with Darrow, the technical realities began to gel.

“I told him, ‘As long as you account for the bandwidth used by other traffic on your trunk links, everything should work well; CobraNet co-exists quite nicely with data traffic,’” Gray stated. “If you want to add extra insurance, you can always set up your switches so the CobraNet devices are on their own VLAN.”
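To make Gray’s advice concrete, here is a minimal back-of-the-envelope trunk-budget check in Python. The per-bundle bandwidth, DCU count and competing-traffic figures are illustrative assumptions, not measurements from Comerica Park.

```python
# Back-of-the-envelope trunk budget: does CobraNet fit alongside other traffic?
# All figures below are illustrative assumptions, not measured values.

GIGABIT_TRUNK_MBPS = 1000      # gigabit fiber backbone
COBRANET_BUNDLE_MBPS = 8       # assumed cost of one 8-channel, 48 kHz/20-bit bundle
DCUS = 20                      # hypothetical number of commentary units on the network
BUNDLES_PER_DCU = 2            # one send bundle, one return bundle
OTHER_TRAFFIC_MBPS = 300       # assumed worst-case POS / press Internet load

cobranet_load = DCUS * BUNDLES_PER_DCU * COBRANET_BUNDLE_MBPS
total_load = cobranet_load + OTHER_TRAFFIC_MBPS
headroom = GIGABIT_TRUNK_MBPS - total_load

print(f"CobraNet load: {cobranet_load} Mbps")
print(f"Total trunk load: {total_load} Mbps "
      f"({100 * total_load / GIGABIT_TRUNK_MBPS:.0f}% of the gigabit trunk)")
print(f"Headroom: {headroom} Mbps")
```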

The venue’s telecom team ran Cat-5E cable to each commentary position from a nearby telecom closet, terminating each run at a Cisco 3548 switch via a simple RJ-45 connector.

From that Cisco switch, each DCU’s dedicated 100 Mbps virtual LAN (VLAN) “channel” rode the venue’s gigabit fiber backbone to a Cisco 6509 master switch. From there it traveled over a dedicated, private (and existing) mil-spec, multi-mode fiber line to the compound patch panel, with an additional run of mil-spec, multi-mode glass continuing to a Cisco 3548 switch in CSI’s Commentary Control Room in the compound.
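For readers who think in code, the signal path described above can be summarized as a simple data structure. The hop descriptions and VLAN IDs below are hypothetical placeholders, not the venue’s actual configuration; this is a Python sketch, not Cisco configuration.

```python
# A toy model of the path from commentary position to CCR.
# Names and VLAN IDs are invented for illustration only.
path = [
    ("commentary position DCU", "Cat-5E to nearby telecom closet"),
    ("Cisco 3548 edge switch", "100 Mbps access port, per-DCU VLAN"),
    ("Cisco 6509 master switch", "gigabit fiber backbone trunk"),
    ("compound patch panel", "dedicated mil-spec multi-mode fiber"),
    ("Cisco 3548 in CSI's CCR", "terminates the commentary VLANs"),
]

# Hypothetical VLAN assignments, one per commentary position.
vlans = {f"DCU-{n:02d}": 100 + n for n in range(1, 9)}

for hop, link in path:
    print(f"{hop:30s} <- {link}")
print("VLANs:", vlans)
```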

As Darrow likes to say, “We have more fiber here at Comerica Park than you’ll find in bran flakes.”

Send and return

This isn’t streaming audio or even “CD quality” audio; it’s better. What’s unique is that each of CSI’s DCUs carries eight channels each of audio send and return, all with 20 Hz to 20 kHz bandwidth: 20-bit word depth, 48 kHz-sampled, uncompressed digital audio, carried in part over inexpensive Cat-5E UTP wiring.

When Ethernet switches are uplinked via a gigabit span, CobraNet can accommodate about 320 audio signals in each direction. Potentially, via standard Ethernet networking methodologies, this digital audio can be distributed worldwide without signal degradation.
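The raw numbers behind those figures are easy to check. The Python sketch below works only with the payload rates quoted above; CobraNet and Ethernet framing overhead are ignored, so treat the results as lower bounds.

```python
# Rough payload math for the audio format described above (no framing overhead).
SAMPLE_RATE_HZ = 48_000
BITS_PER_SAMPLE = 20
CHANNELS_PER_DIRECTION = 8          # per DCU, send or return

raw_channel_kbps = SAMPLE_RATE_HZ * BITS_PER_SAMPLE / 1_000          # 960 kbps
raw_dcu_mbps = raw_channel_kbps * CHANNELS_PER_DIRECTION / 1_000     # ~7.7 Mbps per direction

# The ~320-signals-per-direction figure quoted for a gigabit span implies a raw
# payload well under 1000 Mbps, leaving room for framing and other traffic.
gigabit_signals = 320
aggregate_raw_mbps = gigabit_signals * raw_channel_kbps / 1_000

print(f"Raw audio per channel: {raw_channel_kbps:.0f} kbps")
print(f"Raw audio per DCU (one direction): {raw_dcu_mbps:.2f} Mbps")
print(f"Raw payload for {gigabit_signals} signals: {aggregate_raw_mbps:.0f} Mbps of 1000 Mbps")
```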

Power for the DCUs in Detroit was provided by laptop-style switching power supplies delivering 48 VDC at 625 mA, connected to each DCU via an RJ-45 jack. This was done to accommodate Comerica Park’s Cisco switches; CSI normally inserts power on unused pairs in the Cat-5E cable between the local Ethernet switch and the DCUs via proprietary interfaces and its own switches.
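That supply rating implies a modest per-unit power budget; a quick check in Python (the unit count is a hypothetical figure, not the Detroit deployment):

```python
# Per-DCU power budget implied by the 48 VDC, 625 mA supplies described above.
VOLTS = 48.0
AMPS = 0.625
watts_per_dcu = VOLTS * AMPS        # 30 W ceiling per unit
dcu_count = 20                      # hypothetical count for a whole venue
print(f"{watts_per_dcu:.0f} W per DCU, about {watts_per_dcu * dcu_count:.0f} W for {dcu_count} units")
```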

Each DCU supports four discrete program outputs (three analog, one AES-3 digital), three return lines (feedback/mix-minus, studio off-air coordination, and local technician), and two ISO transmits (studio off-air coordination and technician). Using CobraNet and Ethernet also enables a technician to remotely adjust all DCU parameters including routing, mixing and control, and listen to exactly what the commentator hears, aiding setup and troubleshooting. With the split commentary positions in Detroit, remote administration was crucial.
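As an illustration of what such remote administration might touch, here is a hypothetical Python sketch of per-DCU state. The field names, gain values and structure are invented for this example; this is not Lance Design’s actual control protocol.

```python
# Hypothetical per-DCU state a remote technician might read and adjust over Ethernet.
from dataclasses import dataclass, field

@dataclass
class DCUState:
    position: str                                   # e.g. "First base, booth 3"
    program_outs: dict = field(default_factory=lambda: {
        "analog_1": True, "analog_2": True, "analog_3": True, "aes3": True})
    return_lines: dict = field(default_factory=lambda: {
        "mix_minus": -6.0, "studio_coord": -10.0, "local_tech": -10.0})   # gains in dB
    iso_transmits: tuple = ("studio_coord", "technician")
    monitor_source: str = "commentator_headphones"  # what the remote tech listens to

dcu = DCUState(position="Home plate, booth 1")
dcu.return_lines["mix_minus"] = -3.0                # remote gain tweak during setup
print(dcu)
```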

In the CCR, a frame houses terminal cards corresponding to each DCU. Each card has a jumper cable linking it to a port on the Cisco 3548 Ethernet switch, and Amphenol 25-pair connectors organized by signal type on the backplane for analog audio breakout. Additionally, each DCU card can be put into standby mode, sending out tone and/or voice ID loops for the program and coordination circuits (1 kHz tone for program, 700 Hz for coordination).
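Those ID tones are simple to generate digitally at the system’s 48 kHz sample rate. The following is a generic Python sketch of such tone generation, not CSI’s actual firmware.

```python
# Generate the standby ID tones mentioned above (1 kHz program, 700 Hz coordination)
# as 48 kHz sample buffers. Generic illustration only.
import math

SAMPLE_RATE = 48_000

def tone(freq_hz, seconds=1.0, amplitude=0.5):
    """Return `seconds` of a sine wave at `freq_hz` as a list of float samples."""
    n = int(SAMPLE_RATE * seconds)
    return [amplitude * math.sin(2 * math.pi * freq_hz * i / SAMPLE_RATE) for i in range(n)]

program_id = tone(1_000)   # 1 kHz program-circuit identifier
coord_id = tone(700)       # 700 Hz coordination-circuit identifier
print(len(program_id), len(coord_id))   # one second of samples each
```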

As for the evolution of commentary-over-Ethernet, Lance states, “Five years ago this wouldn’t have been as easy, but recent advancements in audio codec designs and embedded control systems make all this much more practical via Ethernet.” Indeed, with changes in firmware an Ethernet-connected remote controller could even adjust EQ and compression curves at each DCU’s DSP.

Regarding Ethernet, when asked about this unique application, Bob Metcalfe, who co-invented Ethernet in a 1973 memo while at Xerox Palo Alto Research Center, admits, “It’s tempting to say so, but when Dave Boggs and I were building the first Ethernets, we had no idea they would someday carry MLB commentary.” Indeed, who knew?

Where are we headed? The recent growth of Voice over Internet Protocol and of podcasting alone proves that audio distribution over IP-enabled networks such as Ethernet will grow beyond streaming radio. As more venues install Ethernet networks to support POS and Internet access, as Comerica Park has done, more broadcasters should see the potential savings and growth in capability attained by moving off analog copper and onto Ethernet – and not just at the venue, but to their MCRs via value-added networks like T1s, or some form of “assured bandwidth” Internet.

And not just audio, but instant messaging for call-in shows, sports stats, show rundowns and support for digital camera images for Web publishing from commentators at the venues, and even video.

As MIT Media Laboratory founding chairman Nicholas Negroponte put it in his book “Being Digital,” bits are bits. How these bits are created and exploited most effectively rests with the creative efforts of broadcast and IT engineers, technicians, talent, writers, producers, management and the venues to invent this converged future of ours. Let’s play ball!

Reach the author at bill@bennett-ross.com.

Got a first-person or case study story to tell the industry? Write to radioworld@imaspub.com.
