Multicast ABR and low-latency streaming are the starting gun for migration to an all-HTTP video future

In an ideal world, all Pay TV operators would transition from legacy broadcast or classic IPTV to all-ABR delivery so they can unify their delivery infrastructure. That is what Jacques Le Mancq, CEO of Broadpeak, suggested in April when his company announced the latest version of its nanoCDN multicast ABR solution, which enables live streaming channels to be multicast across a broadband network and then converted to unicast sessions on the customer premises, so any multiscreen device can consume them without changes to its software. He believes there is a natural roadmap towards what he calls the mutualisation of operations around HTTP-ABR video, and he is not alone.

Le Mancq identified two major challenges blocking this transition. The first was a lack of scalability: the inability of streaming video to manage live TV audience peaks. The second was the latency inherent in live streaming, which means adaptive bit rate viewing runs anything between 30 and 60 seconds behind the live action (compared with a delay of a few seconds on broadcast and IPTV networks). Broadpeak dealt with the first challenge with its original nanoCDN solution, providing what is effectively a managed network for live streaming that keeps a lid on total bandwidth requirements, even as audiences build.

This year the company also introduced its zero-latency solution as part of nanoCDN multicast adaptive bit rate 2.0. This reduces streaming delays by around 90%, bringing them into line with broadcast/IPTV networks. Broadpeak thinks this is a game-changer, opening the way to what Le Mancq says is “a fully converged video delivery architecture that is the future of television.”

As Nivedita Nouvel, VP of Marketing at Broadpeak, explains, the problem with a 30-second streaming delay is that viewers of live sport find out what happened in a game before they see it on their multiscreen device: the neighbours cheer a goal, or a tweet or mobile news alert pre-empts the on-screen action. “When you are dealing with a 30 second, or sometimes even one-minute delay, the players could still be at the opposite end of the pitch when you learn they have just scored.”

The delay is caused by buffering, of course. Buffering is a deliberate strategy to ensure enough video is stored locally to feed the device player at a constant rate, even if the network delivers the video in unreliable bursts. This is what made streaming video viable in the first place over unmanaged broadband networks – and dynamic switching between different bitrate profiles (with ABR) is what took online video to the next level.

ABR (adaptive bit rate) streaming protocols are built around buffering. In the case of Apple’s HLS, the video stream is typically broken into 10-second chunks, three of which are stored in the buffer, giving a 30-second safety net before the video stalls if the broadband connection is interrupted. So stage one, if you want to eliminate live streaming delays, is to use a managed connection into the home, like nanoCDN, so the buffer size can be safely reduced. Stage two is to shrink buffers on this managed network while retaining standard buffers on unmanaged connections, and to do this without serving two versions of the live channel (one for in-home and one for out-of-home).
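The arithmetic behind those numbers can be sketched as follows. This is an illustrative calculation using the article’s classic-HLS figures (10-second segments, three held in the buffer); the helper name is not from any real player API.

```python
# Sketch: how ABR segment size and buffer depth translate into live delay.
# Figures follow the article's classic-HLS example; the function name and
# simplified accounting are illustrative, not Broadpeak's implementation.

def live_delay_seconds(segment_duration: float, buffered_segments: int) -> float:
    """Approximate player delay behind the live edge.

    The player accumulates buffered_segments * segment_duration of media
    before playback starts, so it sits roughly that far behind live.
    """
    return segment_duration * buffered_segments

# Classic unmanaged HLS: 10 s segments, 3 in the buffer -> ~30 s behind live.
classic = live_delay_seconds(10.0, 3)

# On a managed (multicast ABR) connection the buffer can be cut right down,
# e.g. to a couple of seconds, without risking stalls.
managed = live_delay_seconds(2.0, 1)

print(classic, managed)  # 30.0 2.0
```

Real players add further delay for encoding and segment publishing, but the buffer depth dominates, which is why cutting it yields the roughly 90% reduction the article cites.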

This means using the same sized chunks for the managed and unmanaged connections. nanoCDN enables client devices on the managed network (in the home) to start displaying content once just a few seconds of the first ten-second chunk are in the buffer, since the player can request content that is not aligned with chunk boundaries. Mobile devices on the out-of-home connection continue as before, holding approximately 30 seconds of content in their local store. All playout and display remains standards-compatible.
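The two-profile idea can be sketched like this: identical chunks are served to everyone, and only the client’s startup buffer target differs by network type. The class and field names below are illustrative assumptions, not Broadpeak’s actual client software.

```python
# Sketch of the article's scheme: one set of 10 s chunks for all viewers,
# but different startup buffer targets depending on the network. Names and
# thresholds are illustrative, not taken from any real nanoCDN API.

from dataclasses import dataclass

@dataclass
class PlayerProfile:
    network: str             # "managed" (in-home) or "unmanaged" (out-of-home)
    segment_duration: float  # same chunking for both, so one channel version
    startup_buffer: float    # seconds of media required before playback starts

    def can_start(self, buffered_seconds: float) -> bool:
        """Playback begins once enough media has been buffered."""
        return buffered_seconds >= self.startup_buffer

in_home = PlayerProfile("managed", 10.0, 3.0)        # start a few seconds in
out_of_home = PlayerProfile("unmanaged", 10.0, 30.0)  # classic 3-chunk cushion

assert in_home.can_start(4.0)          # managed client is already playing
assert not out_of_home.can_start(4.0)  # mobile client keeps filling its buffer
```

The key design point is that because both profiles consume the same chunks, the headend packages each live channel only once.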

There are already market pressures that will drive service providers towards this solution. First, more live content, including sports, is being watched as streamed video – consumed on portable devices (tablets, mobiles, laptops) around the home but also on connected TV sets via streaming boxes, connected STBs and casting devices. Moreover, 4K televisions are arriving and consumers are eager to try them out with 4K content, very often from apps (and again, this can be via a set-top box or streamer device to the television set).

All of this is multiscreen video. So as the technical demands increase (such as 4K bandwidth), so does our reliance on streaming, which means the quality of the streaming experience has a bigger impact on overall subscriber satisfaction.

Síminn, the Icelandic telco and IPTV provider, has been facing some of these challenges, notably a rise in demand for 4K streaming via apps to multiscreen devices, and wanted to limit the bandwidth dedicated to live streaming channels and get latency down to the levels of its standard IPTV service. The company has deployed nanoCDN to enable multicast ABR to multiscreen end-points, together with the new low-latency capability.

nanoCDN requires software on the customer premises, which could sit within a broadband router or a set-top box. At Síminn the client software runs in its 4K set-top boxes, which have effectively become active components within the content delivery infrastructure. Le Mancq emphasises that Síminn has cut out streaming latency completely. “We expect this multicast ABR deployment to be a blueprint architecture for the future of television,” he says.

With its new set-top boxes in the field, Síminn provides classic IPTV alongside a turbo-charged live streaming capability and the increasingly broadcast-like HTTP-ABR video (very high resolution, low latency). For consumers in Iceland, the line between ‘broadcast’ and online/multiscreen is starting to blur and, if the end result starts to look the same, why not converge the backend technology as well?

Nouvel sees the logic, with the convergence ultimately leading the TV industry to all-HTTP video. She believes the Síminn deployment is an example of what an IPTV implementation looks like at the start of such a transition. “This is what we are hearing, in terms of operator requirements,” she reveals. “We are seeing more operators who have traditional IPTV, launch multiscreen and are thinking about how they can optimise their operations with a unified headend and workflow for all screens.”

By ‘all screens’, she includes the televisions fed by a classic set-top box that is decoding a cable QAM channel, a DVB-S satellite signal or standard IPTV streams. “The set-top box becomes another multiscreen end-point,” she argues. “The [classic] STB is just another ‘screen’, where you watch your HD content, and it [only] receives HTTP, ABR video.”

The benefits of this operational ‘mutualisation’ start with reduced operational costs, since there is only one set of infrastructure and one workflow to manage, and only one system to monitor. Nouvel reckons you can deploy lower-cost set-top boxes as well, like Android-based STBs. You can read more about the benefits of a unified headend, which converges broadcast and multiscreen workflows, in the related content below.

With nanoCDN deployed, on-demand content is unicast and service providers need to decide which live channels are multicast and which ones are unicast. This could be as simple as making the most popular ten channels available in multicast ABR. Alternatively you can dynamically switch between unicast and multicast, depending on how big the audience is.

In the early version of nanoCDN, the first viewers on a channel, in this dynamic scenario, would take a unicast stream. As the channel became popular, they would remain on their unicast streams, but new viewers would be directed to a multicast. This has since been improved: now even the original unicast viewers can be switched to multicast as the audience builds.

The software agent inside the STB (or broadband gateway) periodically checks whether the channel is available in multicast or unicast and connects to the right input. If the audience falls to the point where unicasting is more cost-effective, everyone on the multicast can be switched back to unicast. This process uses some of the buffering time to set up the new session, so it is invisible to the viewer, and it works with low-latency streaming.
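The switching decision described above can be sketched as a simple threshold rule with hysteresis (separate thresholds for joining and leaving multicast, so a channel hovering around one audience level does not flap between modes). The thresholds and function name are assumptions for illustration; the article does not specify how Broadpeak sizes them.

```python
# Sketch of the dynamic unicast/multicast decision the article describes.
# Thresholds are illustrative; real deployments would tune them to the
# point where multicast becomes cheaper than per-viewer unicast.

MULTICAST_THRESHOLD = 50  # viewers at which the channel flips to multicast
UNICAST_THRESHOLD = 20    # fall back below this (lower, to avoid flapping)

def preferred_delivery(current: str, audience: int) -> str:
    """Return 'multicast' or 'unicast' for a channel given its audience size."""
    if current == "unicast" and audience >= MULTICAST_THRESHOLD:
        return "multicast"
    if current == "multicast" and audience < UNICAST_THRESHOLD:
        return "unicast"
    return current  # hysteresis: stay put between the two thresholds

# A channel's audience rises, dips, then collapses:
mode = "unicast"
for audience in (5, 60, 35, 10):
    mode = preferred_delivery(mode, audience)
print(mode)  # ends back on unicast after the audience falls away
```

The client agent would re-evaluate this on each check and rejoin the appropriate stream during the buffered interval, which is what keeps the switch invisible to the viewer.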

The latest version of nanoCDN will be highlighted at IBC in September. As well as low latency, it includes support for live HTTP delivery via satellite and the ability to pre-cache VOD content on customer premises equipment.

Photo: IPTV at Síminn

Related content:

Major cable operator converges multiscreen and broadcast in core network, in vision of future

The benefits of a unified headend, and how you implement one
