
CMAF can reduce live streaming latency to two seconds, but sub-second delays require WebRTC

Photo: iStock/kynny

Limelight Networks reported this week that it is possible to get live streaming latency down to six seconds using standard HLS or DASH formats with two-second chunks, while CMAF (Common Media Application Format) can reduce latency to as little as two seconds. Steve Miller Jones, VP Product Strategy at the global CDN provider, told a Videonet webcast that his company has successfully tested CMAF at these low latency levels. This compares very favourably with the typical 30-second delay associated with HLS streaming that has not been optimised.
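As a rough illustration of where those numbers come from, glass-to-glass latency in segmented streaming is dominated by the segment duration multiplied by the number of segments the player buffers before playback starts, plus encode and network overheads. The sketch below is a back-of-the-envelope estimate only, not Limelight's measurement methodology; the three-segment buffer and the overhead figure are assumptions for illustration.

```typescript
// Back-of-the-envelope latency estimate for segmented streaming.
// Assumptions (not from the article): the player buffers three segments
// before starting playback, and encode/CDN/network overhead adds ~0.5s.

interface StreamConfig {
  segmentDurationSec: number; // duration of each HLS/DASH segment
  bufferedSegments: number;   // segments held by the player before playback starts
  overheadSec: number;        // assumed encode + CDN + network overhead
}

function estimateLatency(cfg: StreamConfig): number {
  return cfg.segmentDurationSec * cfg.bufferedSegments + cfg.overheadSec;
}

// Classic HLS: 10-second segments, three buffered -> roughly 30s behind live.
console.log(estimateLatency({ segmentDurationSec: 10, bufferedSegments: 3, overheadSec: 0.5 }));

// Tuned HLS/DASH: 2-second segments, three buffered -> roughly 6s behind live.
console.log(estimateLatency({ segmentDurationSec: 2, bufferedSegments: 3, overheadSec: 0.5 }));

// CMAF low-latency mode delivers sub-second chunks of each segment as they
// are encoded, which is how the figure can approach the ~2s the article cites.
```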

But even being two seconds behind the real action in sports, for example, is not good enough for all streaming providers, some of whom are demanding sub-second latency in order to achieve the feeling of ‘true live’ or to enable a range of new engagement or monetisation opportunities. Sub-second latency makes it possible to perfectly synchronise companion screen feeds and deliver statistics that update instantly in a sports game, for example. For in-game betting, sub-second latency means viewers can place bets very close to an event that is about to happen (like a penalty being taken).

The biggest opportunity with sub-second latency, however, is to rule out delays behind broadcast and eliminate the possibility that you hear neighbours cheering, or receive social alerts, before seeing the action yourself when watching sports. Delays behind the live feeds over satellite, terrestrial or cable TV also have a psychological effect, according to Jason Thibeault, Executive Director at the Streaming Video Alliance, the collaboration forum that tries to drive interoperability in the streaming ecosystem. For him, the big win with sub-second latency is making streaming the true equal of broadcast, or even better than broadcast.

“People are coming to live streaming with the expectation that it will be the same as on television, and they are reticent about committing a whole lot of their time to the format when there are delays. For sure, they will ‘tune-in’ online if they cannot get home to the TV screen, but it still creates some psychological angst knowing that you are not truly live – knowing at the back of your mind that something has already happened.

“This is what the industry is trying to solve right now, removing the discrepancy between the broadcast feed and streaming during a live event. Everyone is moving rapidly to fix this.”

Research by Limelight Networks, revealed in its ‘State of Online Video 2018’ report, showed that 65% of people would stream more sports if there were no delays behind live. Miller Jones agreed that there is a strong incentive to ensure streaming viewers are “at the point of action” at the same time as broadcast viewers.

On the webcast, which you can listen to on-demand, Miller Jones spoke about the two-second latency recorded with CMAF during tests on the Limelight Networks CDN, using standard HTTP 1.1 delivery services and standard playback buffers. But he emphasised that one-second latency is currently a step too far for this approach.

“We have seen CMAF work really well at two-second latency but as frames get smaller and smaller, and you try to break the one-second boundary, there are lots of request-and-response flows that result in ‘file not found’ errors and it creates lots of chatter between the client and server. It becomes difficult to manage the transmission of the content, and the Quality of Experience for the user begins to break down.”
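To illustrate the kind of request-and-response chatter Miller Jones describes, the hypothetical sketch below polls a CDN for the next CMAF segment over plain HTTP; when the segment has not yet been published, the server answers 404 (‘file not found’) and the client must retry. The endpoint, segment naming scheme and retry interval are invented for illustration and are not Limelight's implementation.

```typescript
// Hypothetical client loop fetching CMAF segments over plain HTTP 1.1.
// The CDN base URL and segment naming are assumptions for illustration.
const CDN_BASE = "https://cdn.example.com/live/stream"; // placeholder endpoint

async function fetchSegment(sequence: number): Promise<ArrayBuffer | null> {
  const res = await fetch(`${CDN_BASE}/segment_${sequence}.cmfv`);
  if (res.status === 404) {
    // Segment not published yet: this is the "file not found" chatter that
    // multiplies as segments and chunks get shorter than one second.
    return null;
  }
  if (!res.ok) throw new Error(`Unexpected response: ${res.status}`);
  return res.arrayBuffer();
}

async function pollLiveEdge(startSequence: number): Promise<void> {
  let seq = startSequence;
  while (true) {
    const data = await fetchSegment(seq);
    if (data === null) {
      // Back off briefly and ask again; with sub-second chunks these retries
      // dominate the traffic and the Quality of Experience starts to suffer.
      await new Promise((resolve) => setTimeout(resolve, 100));
      continue;
    }
    // A real player would append this buffer to a MediaSource here.
    console.log(`Segment ${seq}: ${data.byteLength} bytes`);
    seq += 1;
  }
}

pollLiveEdge(0);
```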

Limelight Networks was determined to support the full spectrum of latency requirements, so it investigated a standards-based solution that could ensure anywhere-to-anywhere distribution with sub-second latency. The WebRTC standard, which supports real-time communications with browsers and mobile applications via simple APIs, was the answer.

WebRTC works with UDP transport and removes the need to segment streaming video into chunks – a process that introduces much of the latency associated with HLS and DASH streaming. You can still provide multiple bitrate options for WebRTC-based streaming, and so keep the benefits of adaptive bitrate (ABR) delivery. Without the error checking and packet retransmission associated with chunked streaming (which runs over TCP), you need to ensure a great connection – and Limelight Networks has the network optimisations to do this.
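For context on what WebRTC playback looks like on the client, the sketch below uses the browser's standard RTCPeerConnection API to receive a video track and attach it to a video element. The signalling step (exchanging the SDP offer and answer with the streaming service) is provider-specific and only stubbed here; the signalling URL is a placeholder, not part of any Limelight API, and ICE candidate handling is omitted for brevity.

```typescript
// Minimal receive-only WebRTC client sketch (browser). The signalling
// exchange is provider-specific; sendOfferToSignallingServer is a stub and
// https://signalling.example.com is a placeholder, not a real endpoint.

async function sendOfferToSignallingServer(
  offer: RTCSessionDescriptionInit
): Promise<RTCSessionDescriptionInit> {
  const res = await fetch("https://signalling.example.com/offer", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(offer),
  });
  return res.json(); // the server's SDP answer
}

async function playLiveStream(videoElement: HTMLVideoElement): Promise<void> {
  const pc = new RTCPeerConnection();

  // Receive-only: we expect the service to push audio and video to us.
  pc.addTransceiver("video", { direction: "recvonly" });
  pc.addTransceiver("audio", { direction: "recvonly" });

  // Media arrives over UDP (SRTP) without the segmenting and buffering that
  // HLS/DASH/CMAF rely on, which is where the latency saving comes from.
  pc.ontrack = (event) => {
    videoElement.srcObject = event.streams[0];
    videoElement.play();
  };

  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  const answer = await sendOfferToSignallingServer(offer);
  await pc.setRemoteDescription(answer);
  // Note: trickle ICE candidate exchange is omitted in this sketch.
}

playLiveStream(document.querySelector("video") as HTMLVideoElement);
```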

Applications for sub-second latency streaming include online casinos. Miller Jones explained how a casino game cannot proceed until every player has turned their card and then placed their bet, which is a problem when each remote viewer normally sits a different distance behind the live action. Jason Thibeault emphasised the opportunity for multiple camera angle choices, either for first-screen or companion screen viewing. For him, a key benefit of sub-second latency is that you can skip between the different camera options without delays, with what is effectively super-fast channel change. If you have to wait 20 seconds before you can see the next camera option, it becomes a bad user experience.

The webcast, called ‘The Power of Now: Enriching TV with Sub-Second Latency for Live Streaming’, is available on-demand, and free, here. You will learn about:

  • The need for sub-second latency – those applications that cannot be supported even with the very best CMAF-based streaming.
  • What WebRTC is, and how WebRTC-based streaming works, including differences to HLS, DASH and CMAF.
  • The addressable market for this approach, in terms of device/browser support and global reach.
  • The impact on the media workflow before content is ingested by the CDN.

Massimo Bertolotti, Head of Engineering and Innovation at SKY Italia, will be speaking about low-latency streaming at Connected TV World Summit in March.

