The author is a spokesperson for Wheatstone Corp.
Who could have predicted that people would be running around in parks looking for cartoon characters using their smartphones?
Pokémon GO has put augmented reality front and center in the public consciousness. The phenomenon just goes to show how much we’ve underestimated the importance of reality. This simple smartphone app has managed to get people out of their houses and into the streets in droves, and for broadcasters in particular, it has injected a healthy dose of reality into our quest for “virtual-dom” in the studio.
Virtualization is good — it’s great, actually — for reducing costs and streamlining operations. But there’s something to be said for augmented reality, which for our purposes could be described as a mashup of computers, IP audio, broadcast gear and reality.
Yes, IT and IP technology have made it possible to mix, process and route audio in the digital domain. And, yes, we’ve made studios smaller and budgets even smaller still. We have moved little black boxes out of the chain and into the software realm, where we can scale up or down, and store, recall or mix audio anywhere — whether it’s in the PC, on the console or as an IP audio stream anywhere in the network.
I think of those virtual mixers we designed into our WheatNet-IP I/O BLADEs about eight years ago now. And the compressors, limiters and EQ we put in our next-generation I/O BLADE-3s a while back. Wheatstone has augmented a few realities with that technology, namely eliminating the need for physical DAs and reducing, sometimes eliminating, outboard processing gear.
In fact, the tipping point where we could technically run an entire radio station off a laptop and a few servers has already arrived, and we are certainly prepared for that eventuality. We have virtualized just about everything, from how we push around audio, to how, when and where we can control it. Even the IP control surface — which doesn’t actually bring up live audio — is itself another interface to this highly virtualized world.
But at the end of the day, the studio-in-a-PC and the all-virtual on-screen console haven’t really taken off in radio, and it’s not for lack of trying.
FROM VIRTUAL TO AUGMENTED MIXING
In fact, Wheatstone introduced the Glass E virtual mixer as an adjunct to our E series control surfaces some years ago now. If we had unrealistic expectations that this GUI screen version of the console would supplant the console on the desk, broadcasters quickly reminded us of another reality. We later introduced ScreenBuilder, our custom on-screen interface creator for the IP network, not with the expectation of doing away with tactile control, but as a way to augment it.
ScreenBuilder is now widely used as a way to program events or functions by assigning faders, knobs and other widgets to various elements in the network — one of many in a string of studio augmentations. With ScreenBuilder, we can accomplish things that we have no way, or no easy way, of doing in the physical world, such as monitoring and controlling levels and other vital settings at stations scattered around an entire country.
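ScreenBuilder’s actual scripting environment is Wheatstone’s own, so as a purely hypothetical illustration, the sketch below shows the general idea in Python: binding an on-screen widget to an audio element elsewhere on the IP network, so that moving the widget pushes a setting to a station that could be across the country. All class and field names here are invented for the example.

```python
# Hypothetical sketch only — ScreenBuilder's real scripting is proprietary.
# This models the concept of widget-to-network-element binding in plain Python.
from dataclasses import dataclass


@dataclass
class NetworkElement:
    """A mixer channel or processor somewhere on the IP audio network."""
    station: str
    channel: str
    level_db: float = 0.0


@dataclass
class FaderWidget:
    """An on-screen fader bound to one remote network element."""
    element: NetworkElement

    def move(self, level_db: float) -> None:
        # Moving the widget pushes the new level to the remote element.
        self.element.level_db = level_db

    def read(self) -> float:
        # Reading the widget polls the remote element's current level.
        return self.element.level_db


# One on-screen fader per station, wherever each station physically is.
stations = [NetworkElement("KAAA", "mic-1"), NetworkElement("KBBB", "mic-1")]
screen = {e.station: FaderWidget(e) for e in stations}

screen["KAAA"].move(-6.0)      # adjust a level at a distant station
print(screen["KAAA"].read())   # -6.0
```

The point of the mapping is that the screen layer holds no audio itself; it only references elements that live out on the network, which is what makes country-wide monitoring from one screen practical.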
It’s becoming increasingly clear that as an industry, we’re blurring the lines between what’s physical and what’s virtual, but wisely — to the extent that one augments or complements the other.
Instead of putting the console in the computer, for example, we’ve learned to keep it real by giving control of the computer to the console. After all, we still need an interface to the broadcast apps and functions that run radio, and what could be better than the trusted console surface that has evolved to fit this very purpose? It’s the best of both worlds, really. This arrangement gives us a whole new reality when it comes to personalizing the console for anyone or any purpose, without physically removing it.
Virtualization has happened, but it has largely happened across the entire audio ecosystem known as the studio operation, as we see with Wheatstone I/O units that put both processing and mixing at every access point in the network.
Whether the information resides on a server, a PC or in the I/O interfaces is really a moot point as long as it can be accessed and updated quickly and efficiently. We still need hardware to interface microphones, headphones, speakers, Ethernet connectors, etc. Why add more PC hardware when we can use these local devices, which need to be there anyway, to store, control and manage this virtual DNA?
Broadcasters have yet to fully embrace this idea of virtualization as a PC in the center of the studio. In fact, in many newer studios, broadcasters are doing away with the PC altogether by “virtualizing” automation to the rack room.
Instead, the next generation of IP consoles seems to be taking us deeper into this mashup of computers, IP audio, broadcast gear and reality.
For example, you can script every knob, every button and every switch on our new LXE IP audio console, but those are still physical knobs and buttons and switches that you’re programming. The difference now is that instead of one button being tied to one function, it can now be programmed for talkback or cue or start/stop or for toggling between functions, at any time. And instead of having fixed functions at fixed locations, you can augment them — even split them up into separate fader banks networked through Ethernet across the room or down the hall so that you can share mutes, tallies, speakers and other resources with others.
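The LXE’s scripting language is likewise Wheatstone’s own, but the soft-assignment idea described above can be sketched generically. In this hypothetical Python example (all names invented), one physical button object can be reprogrammed at any time with one function or with several, in which case successive presses toggle between them.

```python
# Hypothetical sketch only — the LXE's real scripting is proprietary.
# Models a physical button whose function is soft-assigned at runtime.
class SoftButton:
    def __init__(self) -> None:
        self._actions = []   # the functions this button cycles through
        self._index = 0

    def assign(self, *actions) -> None:
        """Reprogram the button, at any time, with one or more functions."""
        self._actions = list(actions)
        self._index = 0

    def press(self) -> str:
        """Fire the current function; with several assigned, toggle."""
        action = self._actions[self._index]
        self._index = (self._index + 1) % len(self._actions)
        return action()


def talkback() -> str:
    return "talkback open"


def cue() -> str:
    return "channel cued"


btn = SoftButton()
btn.assign(talkback)        # one button tied to one function...
print(btn.press())          # talkback open
btn.assign(talkback, cue)   # ...or toggling between functions
print(btn.press())          # talkback open
print(btn.press())          # channel cued
```

Because the assignment lives in software while the switch stays physical, the same hardware can serve any operator or format without rewiring — the mashup of computer and broadcast gear the column describes.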
When and if that virtual station in a laptop or tablet happens, we’ll be ready. But until then, we will continue to augment a few realities.
Comment on this or any story. Email firstname.lastname@example.org.