This article continues a series on the evolution of streaming technology as it applies to broadcasting. Previous articles on the same topic appeared in the August and October 2015 issues.
The British Broadcasting Corporation is one of the first media organizations to undergo a major overhaul of its streaming platform, and it is using many of the technologies addressed previously, including some key products from Unified Streaming and Telos.
For this article I spoke with Jim Simmons, senior product manager for Audio Services at the BBC, and I posed the following questions regarding the new way of streaming at the BBC.
FvN: What is the main reason for moving to adaptive streaming?
JS: The BBC is a large organization spread across a number of sites and divisions serving the different nations and regions. All these stations had developed their Internet streaming differently over the years. This had led to strange discrepancies, such as programs from some stations not being available on some devices due to the formats they were encoded in. The hardware we were using was expensive to run and had reached end of life.
The BBC also had a project to refresh its video streaming infrastructure, and it was clear that many components could be shared if the new chunked HTTP streaming formats were adopted. With chunked HTTP streams, the chunks of audio are distributed using the common HTTP protocol that is used for delivering most Internet content. This means we don't need specialized serving infrastructure, and so we don't have to rely on specific hardware, servers or content distribution networks to get our content to our audiences. This greatly increases our choices of partners.
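The chunk addressing that makes this possible can be sketched in a few lines. The segment naming scheme and URLs below are hypothetical, but they illustrate why ordinary HTTP servers and CDNs suffice: every chunk is just another HTTP resource.

```python
# Minimal sketch of a chunked HTTP audio client. The chunk naming
# scheme and base URL are assumptions for illustration; each chunk is
# an ordinary HTTP resource, so any standard web server or CDN can
# deliver it without streaming-specific infrastructure.
import urllib.request

def chunk_urls(base_url, start, count):
    """Build the URLs of `count` consecutive media chunks."""
    return [f"{base_url}/chunk-{i}.m4s" for i in range(start, start + count)]

def fetch_chunk(url):
    """Fetch one chunk with a plain HTTP GET (no special server needed)."""
    with urllib.request.urlopen(url) as resp:
        return resp.read()

# A client simply walks the chunk sequence in order:
urls = chunk_urls("https://cdn.example.com/radio1/128k", 100, 3)
```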
BBC uses Telos xNodes to move program content to the cloud. We also went for a cloud-based solution, which means we effectively rent our computing power from a cloud provider such as Amazon AWS or Microsoft Azure. We send our digitized audio to remote data centres, where our encoding and packaging software runs on rented computing power. Not relying on hardware on our own premises allows us to scale our solution depending on demand: we can add capacity for new stations and remove it as required. This is very useful, as we do a lot of event-based Web streams.
We are now able to offer a range of bit rates best suited to the audience and its location and device at any time: from 320 kb/s for audiophiles on high-end equipment down to 48 kb/s for mobile devices.
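The client-side logic that exploits such a bit-rate ladder is simple to sketch. The 48 and 320 kb/s endpoints come from the interview; the intermediate rungs and the safety headroom factor are assumptions for illustration.

```python
# Sketch of adaptive rung selection: pick the highest bit rate the
# measured network throughput can sustain. Only the 48 and 320 kb/s
# endpoints come from the article; the intermediate rungs and the
# 0.8 headroom factor are illustrative assumptions.

LADDER_KBPS = [48, 96, 128, 320]

def pick_bitrate(throughput_kbps, ladder=LADDER_KBPS, headroom=0.8):
    """Return the best rung that fits within `headroom` of the throughput."""
    affordable = [r for r in ladder if r <= throughput_kbps * headroom]
    return affordable[-1] if affordable else ladder[0]

pick_bitrate(5000)  # fast fixed line -> top rung (320)
pick_bitrate(100)   # constrained mobile link -> bottom rung (48)
```

A real player re-measures throughput after every chunk and may switch rungs mid-stream, which is exactly what makes the delivery "adaptive".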
We can also offer the best delivery method for each platform by re-packaging the streams that we create. This makes it simple to support a wide range of devices well.
FvN: Did you build and develop from scratch, or was there a ready-to-use solution?
JS: We used a combination of third-party products and our own software where there is no good commercial alternative.
Our audio is captured in multiple locations for resilience and converted to IP by Axia xNodes. This is then transferred to the cloud platform using our own software.
Once in the cloud, we use the Omnia Z/IPStream software product for encoding. The Z/IPStream software, which runs on generic cloud compute nodes, produces a chunked HTTP stream at four bit rates in the Microsoft Smooth format. The Smooth format is designed for multi-bitrate video, where an encoder generates multiple bit rates that are pushed as a single stream. We publish only multi-bitrate audio without any video; in some cases this does not work out of the box with some encoders and origin servers.
With help from Omnia on the Z/IPStream encoder side and Unified Streaming Platform on the server packaging side, we were able to use their software and adapt it to our specific needs. The multi-bitrate Smooth stream is re-packaged into the different delivery formats for all the different clients by the Unified Streaming Platform packagers. This means the four bit rates in the Smooth stream are made available re-packaged as HTTP Dynamic Streaming, HTTP Live Streaming and MPEG-DASH, all from the single Smooth stream generated by the Omnia Z/IPStream encoders. We use various traffic management and cache layers to get in and out of the cloud to ensure resilience.
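The fan-out just described, one packaged source stream re-published in several delivery formats, can be sketched as a mapping from format to manifest URL. The format names come from the interview; the URL layout and manifest file names below follow common conventions for each format but are assumptions, not the BBC's actual scheme.

```python
# Sketch of one source stream fanned out into several delivery
# formats. Format names are from the article; the manifest file names
# follow each format's common convention, and the URL layout is an
# assumption for illustration.

FORMATS = {
    "hls":  "master.m3u8",   # Apple HTTP Live Streaming
    "hds":  "manifest.f4m",  # Adobe HTTP Dynamic Streaming
    "dash": "manifest.mpd",  # MPEG-DASH
}

def delivery_urls(origin, stream):
    """Map one packaged stream to a manifest URL per delivery format."""
    return {fmt: f"{origin}/{stream}/{fname}" for fmt, fname in FORMATS.items()}

delivery_urls("https://origin.example.com", "radio4")
```

The key point is that the four-bit-rate ladder is encoded once; only the thin manifest and container layer differs per client platform.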
FvN: What kind of issues did you run into during your development and deployment?
JS: There was quite a long period of development ensuring that all versions of software interacted correctly and were configured correctly. This is still ongoing, as different updates from different vendors keep us on our toes.
Also, the scale of our operation means that getting a system that supports the number of channels and amount of material we produce can be challenging. We currently run 11 national networks, seven networks for the nations of the UK, 41 services for the local regions, 31 networks for the BBC World Service and 24 webcast streams for sporting events. The majority of these services, not including the webcast streams, stream 24 hours a day. We also run occasional specialist pop-up services, for example for Eurovision and the London Jazz Festival.
FvN: You say you went for cloud-based solutions for flexibility reasons. How do costs compare?
JS: Yes, we deploy as much as possible in the cloud. This allows us to scale up and down and make changes very quickly. We have often changed the size of our "compute instances" up or down to provide the most cost-effective result for the quality and speed we want to achieve.
We can do this several times in a week, if we want to, without having to buy any metal boxes. We do have to make allowances for the cloud, though, in terms of how we build resilience, and it is not always cheaper. The flexibility is incredible though. We also use a continuous delivery model so that we can have many software releases in a week which can all be deployed very quickly.
FvN: You mentioned a couple of packaging formats. What is the reach in terms of platforms (iOS/Android/desktop browsers, legacy Internet radios)?
JS: We aim to support all the major device types and browsers. We have Apple HTTP Live Streaming for iOS, which is also now used by some Internet radio manufacturers, and Adobe HTTP Dynamic Streaming on the desktop for Flash players and for some Android implementations.
We are just rolling out MPEG-DASH, a new, open, adaptive and chunked streaming standard. It is vendor independent and offers cross-platform support for both video and audio. It also introduces DRM in the form of Common Encryption, which allows content to be encrypted once while supporting multiple decryption schemes depending on client capabilities. MPEG-DASH will be used by some Internet radio manufacturers and on the Android platform.
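As a small illustration of what a DASH client sees, here is a toy MPD (the DASH manifest) with two audio representations matching the ladder endpoints mentioned earlier, parsed with Python's standard library. Real manifests carry far more detail (segment templates, timelines, codecs); this is a sketch, not a BBC manifest.

```python
# Toy MPEG-DASH manifest (MPD) with two audio representations,
# parsed with the standard library. The representation ids and the
# manifest itself are illustrative, not a real BBC manifest.
import xml.etree.ElementTree as ET

MPD = """<?xml version="1.0"?>
<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" type="dynamic">
  <Period>
    <AdaptationSet mimeType="audio/mp4">
      <Representation id="a48" bandwidth="48000"/>
      <Representation id="a320" bandwidth="320000"/>
    </AdaptationSet>
  </Period>
</MPD>"""

NS = {"mpd": "urn:mpeg:dash:schema:mpd:2011"}

def list_bitrates(mpd_xml):
    """Return {representation id: bandwidth in bits/s} from an MPD."""
    root = ET.fromstring(mpd_xml)
    reps = root.findall(".//mpd:Representation", NS)
    return {r.get("id"): int(r.get("bandwidth")) for r in reps}

list_bitrates(MPD)
```

A player reads this manifest once, then requests chunks from whichever representation its bandwidth estimate favours.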
We also currently provide a Shoutcast stream of each station for older devices.
FvN: Do you use Digital Rights Management for content protection?
JS: We don't currently use a proprietary DRM for streams, but we do for our download offer, where rights require it.
FvN: How do performance and costs compare between the new adaptive streaming and the older Shoutcast streaming?
JS: We struggle to produce good-quality Shoutcast streams compared to our chunked protocols. Shoutcast has a very high cost in terms of support time and effort, not in terms of the protocol itself. This is a very real cost, as it is the cost of people being diverted from other work.
FvN: How does adaptive streaming compare to Shoutcast in terms of listener satisfaction (QoS)?
JS: As I said before, we struggle to produce good-quality Shoutcast streams. Nearly all our negative customer feedback is around Shoutcast, and was even before we began Audio Factory.
FvN: What is your expectation for the future?
JS: Shoutcast will last for a few more years. There are many devices in the market, and people do not want to replace their Internet streaming devices as quickly as their mobile phones, even though the devices are cheaper and do effectively the same thing. We will see more innovation, but it will be based around the MPEG-DASH delivery method; for example, more multi-channel audio for our classical station Radio 3 over MPEG-DASH. I think more stations will move to adaptive streaming and more will move into the cloud. Then something will happen that none of us have thought of yet.
FvN: Will adaptive streaming (IP-based services in general) replace traditional one-to-many broadcasts?
JS: Yes, I think IP will reach parity with broadcast in the next five to ten years, but broadcast will be around as a means of delivery for a long time.
There are massive economies of scale for broadcast: each extra listener doesn't cost you more. But IP gives you incredible targeting for things like ad sales and for niche genres.
So maybe when Google wants to get all its self-driving cars communicating with each other, it will plant thousands of mini-transmitters along the highways and the cost of IP distribution will dwindle to nothing, or maybe an IP/broadcasting hybrid will be the next big thing. I think a two-way interaction between user devices and broadcasters will be much more commonplace. It is worth too much to be ignored.
Thanks to the overhaul of its streaming platform, the BBC has gained flexibility through cloud-based solutions and can build on existing infrastructure by using the new adaptive, chunked HTTP live streaming standards. This also greatly increases its choice of partners and enables cross-platform support, keeping the organization ready for the future.