This story is excerpted from “The Cloud Shines for Radio,” a free ebook. It explores trends in how radio stations are using cloud-based technologies.
Chris “Doc” Tarr is group director of engineering for Wisconsin-based Magnum Media, which has 22 stations airing 31 signals.

“Cloud-based content generation and delivery really took off during the Covid era,” Tarr said.
“By necessity, many of us moved to a distributed workforce, which really demanded the use of off-premises services. In our group, we moved to Google for mail, document sharing and meetings. We still hold our weekly sales meetings via Google Meet and share production orders and copy over Google Docs. That has had the biggest effect on how we operate since it has allowed us to spread our workforce and not worry about data and server management. We’ve also moved to Marketron for traffic, which is cloud-based.”
He said audio file transfer for voice tracking is web-based as well. “People in home studios never have to connect their computers to our network, which would expose us to risks.”
Magnum is using Amazon Web Services to hold backups of its automation systems, allowing for a quick rollout in the event of a failure or other technical issue.
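As a rough illustration only (the article does not detail Magnum’s actual setup), pushing a nightly automation-system backup into cloud storage can be a very short script. The bucket name and paths below are placeholders; the upload call is the standard boto3 client API.

```python
# Illustrative sketch: copy the automation system's backup files to S3.
# Bucket name, prefix and local path are placeholders, not Magnum's real setup.
from pathlib import Path
import boto3

BACKUP_DIR = Path("/var/backups/automation")   # where the playout system drops its backups
BUCKET = "station-automation-backups"          # placeholder bucket name

s3 = boto3.client("s3")

for backup_file in sorted(BACKUP_DIR.glob("*.zip")):
    key = f"automation/{backup_file.name}"
    s3.upload_file(str(backup_file), BUCKET, key)   # standard boto3 upload call
    print(f"Uploaded {backup_file.name} to s3://{BUCKET}/{key}")
```

Restoring onto a replacement machine is the mirror image, a download per object, which is what makes the kind of quick rollout Tarr describes possible.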
“We’re now looking at local and cloud-based virtual machines for hosting our metadata systems. Our current system already runs on a server at our data center, but there’s no reason we must have it on-site. Shifting management to pure-play operators puts that burden on them, which allows us to focus on day-to-day broadcast operations.”
While a station could choose to fully build out a cloud-based facility and pipe it directly to the transmitter site, Tarr said there are still benefits to keeping some things on-premises.
“Even though latency has gotten better, cloud-based applications still aren’t the gold standard for ‘real-time’ operation in a fast-paced on-air environment. On the other hand, things like metadata, backups and business operations are very much better off in the cloud.”
But in all broadcast operations, he preaches the importance of redundancy.
“We’ve recently moved to fiber delivery for our STLs. However, we are aware that technical issues happen, so we’ve put in a lot of redundancy. We have two diverse network providers into our buildings, high-availability routing and backup playout systems at the transmitter sites,” Tarr said.
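To make the failover idea concrete, here is a minimal watchdog sketch, assuming a hypothetical STL endpoint and switch hook rather than Magnum’s actual gear: if the studio feed stops answering for long enough, the transmitter site falls back to its local backup playout.

```python
# Minimal watchdog sketch (illustrative only): if the studio STL feed at the
# transmitter site stops responding, fall back to local backup playout.
# The host name and switch_to_backup() hook are hypothetical placeholders.
import socket
import time

STL_HOST, STL_PORT = "stl-primary.example.net", 8000   # placeholder endpoint
CHECK_INTERVAL_S = 5
FAILURES_BEFORE_SWITCH = 3

def stl_reachable() -> bool:
    """Return True if a TCP connection to the STL endpoint succeeds."""
    try:
        with socket.create_connection((STL_HOST, STL_PORT), timeout=2):
            return True
    except OSError:
        return False

def switch_to_backup() -> None:
    # In a real plant this would trigger an audio switcher or silence sensor;
    # here it only logs the decision.
    print("STL down -- switching to local backup playout")

failures = 0
while True:                      # runs continuously as a watchdog
    if stl_reachable():
        failures = 0
    else:
        failures += 1
        if failures >= FAILURES_BEFORE_SWITCH:
            switch_to_backup()
            failures = 0
    time.sleep(CHECK_INTERVAL_S)
```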
“It’s all a matter of what I like to call ‘Tolerance to Pain.’ Can you deal with a 1% outage rate? Then you don’t need to worry too much about high redundancy. We have a pretty low pain threshold, so we’ve added quite a bit of it.”
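For context (this math is ours, not Tarr’s), translating outage rates into hours off the air per year makes the “tolerance to pain” trade-off easier to weigh:

```python
# Back-of-the-envelope: what a given availability figure means over a year on air.
HOURS_PER_YEAR = 24 * 365

for availability in (0.99, 0.999, 0.9999):
    downtime_hours = HOURS_PER_YEAR * (1 - availability)
    print(f"{availability:.2%} availability -> about {downtime_hours:.1f} hours of downtime per year")
```

A 1% outage rate works out to roughly 87 hours, about three and a half days of dead air per year, which is why a low pain threshold pushes a group toward diverse network paths and backup playout.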
And what advice would he give to a station or network considering migrating some or all of its operations to the cloud?
“When you move functions to the cloud, you are giving up control of the infrastructure. The upside is that cap-ex and maintenance costs are eliminated, but that also means you no longer interact with it,” he said.
“Security is also a key factor. We all know about data breaches, ransomware and other issues. You need to be sure you understand the type of security the provider has in place and how vulnerable your data might be. You also must be sure that whatever you’re using is PCI-compliant if necessary. Be sure to inquire about uptime guarantees, talk to other users, and think about what you would need in the event of a provider failure.”
Tarr hopes that engineers keep an open mind about the cloud. “We tend to be guilty of being super critical of things we don’t quite understand or didn’t create. It’s hard to give up some control of your systems to a third party, but aside from being inevitable, it can help you make your systems more reliable.”