Imagine watching a live football match via OTT and hearing your neighbour, who’s watching the same game through traditional broadcast, cheering up to 60 seconds before you know why. The reason is latency, and it’s an issue that anyone who’s serving video via the Internet is having to face up to.
The video streaming environment has changed drastically since content delivery architectures were first created. Originally designed to serve images on websites over dial-up and early-stage broadband infrastructures, traditional content delivery networks (CDNs) are struggling to manage the relentless flow of high bandwidth video content, putting huge pressure on the Internet and frustrating users who want to watch their favourite content.
For example, Netflix users now stream 164.8 million hours of video per day, representing 15% of the world’s total bandwidth. YouTube is even bigger with over a billion hours watched every day, 70% of which comes from mobile devices. These two platforms alone put an intense amount of pressure on the Internet daily. And, with the emergence of 5G technology and as platforms like Disney+ launch, it will only increase.
By using the open Internet to serve content, operators can reduce their broadcast infrastructure costs – but they need to invest in new network infrastructures to overcome latency and keep their customers.
The issue that broadcasters and OTT providers face is that consumers don’t just expect to be able to watch this content when, where and how they want; they also don’t appreciate buffering, even during the ad breaks. According to Conviva’s 2019 study, having to wait five seconds for an advert to play in the stream will result in 13.6% of the audience abandoning the stream, a clear sign that delays cannot be tolerated.
And, while innovations like 5G will help create and manage bandwidth, they will also make consumers more demanding, ultimately increasing the pressure. High bandwidth requirements and buffering therefore have a very real cost that cannot be ignored. That's why it's important to build latency management into any streaming architecture.
But it's not simply a case of buying into one level of latency management – the level should depend on the requirements of the business and adapt as content delivery needs change. For example, providers of on-demand streaming services or non-news linear programming will be able to provide a high quality of service to their customers using solutions with latency between 10 and 60 seconds.
However, if your customers are sports fans who want to watch their team live, that level of latency could lead to buffering and unhappy users, so it's important to keep latency within the network under ten seconds.
It’s even more important in online gaming, where split second actions could mean the difference between victory and defeat, and anything that gets in the way is a significant problem. That includes buffering and delays serving the data stream.
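The tiers above can be sketched as a simple lookup. This is an illustrative sketch only: the tier names, the function, and the sub-second gaming target are assumptions for the example, not figures from any particular platform.

```python
# Illustrative latency targets per content type, in seconds, following the
# tiers discussed above. The gaming target is an assumed sub-second figure.
LATENCY_TARGETS_S = {
    "on_demand": 60.0,   # on-demand / non-news linear: up to ~60 s is acceptable
    "live_sport": 10.0,  # live sport: keep end-to-end latency under 10 s
    "gaming": 1.0,       # online gaming: assumed sub-second requirement
}

def meets_target(content_type: str, measured_latency_s: float) -> bool:
    """Return True if the measured end-to-end latency satisfies the
    target for this content type."""
    return measured_latency_s <= LATENCY_TARGETS_S[content_type]
```

The point of modelling it this way is that the acceptable delay is a property of the content, not of the network: the same infrastructure may pass for video-on-demand and fail badly for live sport.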
With revenues in video streaming expected to rise 3.2% and penetration reaching 16.2% by 2023, there’s a real opportunity for broadcasters and online video providers, but they must ensure the networks they use take latency seriously, otherwise they risk losing customers because of something they could easily manage by investing in the right infrastructure.
Investing in managing latency isn’t just something to consider as a means to stop customers leaving. It can be a great way to attract new users. If a broadcaster can guarantee that the user will be served the content without buffering or delays, it becomes a powerful marketing tool. Consumers will be attracted to those providers who can guarantee quality of service alongside premium quality content, both live and on-demand – especially if it means they can celebrate at the same time as the neighbours, or live stream a multiplayer game instantly!
But with so much traffic, and so many more people consuming video online, how can low-latency, real-time delivery be kept consistent without straining the network? The answer lies in redesigning the network itself and how we request content.
Many existing CDNs operate on the basis that origin servers send content to acquirer servers upon request, which in turn send the content out to caches that individual users' devices can access. This system works well for static images, as each request creates copies of relatively small files, but it isn't suited to live video: with the number of connection requests being made, it creates congestion, putting the origin at risk of failure. Instead, by removing the acquirer server and creating a direct, consistent connection between the origin and the requesting server, crucial delivery times and congestion can be reduced.
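A toy model makes the difference concrete. This is a minimal sketch, not a real CDN: all class names and the hop counter are hypothetical, and it only contrasts the two request paths described above – a tiered fetch through an acquirer versus a direct origin connection.

```python
class Origin:
    """Origin server holding the master copy of each video segment."""
    def fetch(self, segment_id: str) -> bytes:
        return f"video-segment:{segment_id}".encode()

class Acquirer:
    """Intermediate tier: every cache miss costs two extra network hops."""
    def __init__(self, origin: Origin):
        self.origin = origin
        self.hops = 0
    def fetch(self, segment_id: str) -> bytes:
        self.hops += 2  # edge -> acquirer, then acquirer -> origin
        return self.origin.fetch(segment_id)

class EdgeCache:
    """Edge cache that can be wired to either an acquirer or the origin."""
    def __init__(self, upstream):
        self.upstream = upstream
        self.store: dict[str, bytes] = {}
    def get(self, segment_id: str) -> bytes:
        if segment_id not in self.store:  # cache miss: go upstream
            self.store[segment_id] = self.upstream.fetch(segment_id)
        return self.store[segment_id]

# Tiered path: origin <- acquirer <- edge cache
tiered = EdgeCache(Acquirer(Origin()))
# Direct path: origin <- edge cache, one consistent connection, fewer hops
direct = EdgeCache(Origin())
```

For static images a cache miss is rare and the extra tier is cheap; for live video, where every viewer requests fresh segments continuously, each removed hop comes straight off the delivery time.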
However, to successfully reduce latency, the workflow also relies on harnessing the latest AI solutions that keep track of the network, understand where requests are coming from and can seamlessly identify the best delivery path for video. When high traffic volumes hit the network, it also needs the ability to predict whether there will be congestion on the route it is using to deliver content, and intelligently switch onto a different route with more available bandwidth if trouble is detected, preventing delay.
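Stripped of the machine learning that would produce the predictions, the switching logic itself is simple. The sketch below is a hedged illustration of that idea only: the route names, utilisation figures and threshold are made up, and the predicted-utilisation inputs stand in for whatever a real prediction model would supply.

```python
def pick_route(routes: dict[str, float], current: str,
               congestion_threshold: float = 0.8) -> str:
    """routes maps route name -> predicted utilisation (0.0 .. 1.0).

    Stay on the current route unless it is predicted to congest;
    otherwise switch to the route with the most available bandwidth,
    i.e. the lowest predicted utilisation."""
    if routes[current] < congestion_threshold:
        return current                  # no trouble predicted: stay put
    return min(routes, key=routes.get)  # switch to the emptiest path

# Illustrative predictions for three delivery paths
routes = {"path_a": 0.95, "path_b": 0.40, "path_c": 0.70}
```

Keeping a route until congestion is actually forecast, rather than always chasing the emptiest path, avoids needless switching while still moving traffic away before a delay occurs.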
Working in tandem with ISPs makes it possible to keep the stream constant, as it only has to travel between two points, with fewer requests. That keeps the bandwidth requirements as low as possible. By rethinking the network architecture and harnessing the latest computational technologies, video providers can offer a guaranteed quality of experience to their customers.
Audiences now expect and demand the highest quality content, and any latency issues hamper that. It's therefore vital that broadcasters and OTT players invest in the right infrastructures to guarantee quality of experience and gain a competitive edge over the providers who don't. The strategy is to create an infrastructure built for video, with low latency guaranteed at its core, which will lead to happier customers who are more likely to stick with the service.