According to a comScore study from December 2016, more than 49 million American homes now use at least one OTT video service – and view OTT content on an average of 19 days per month, for an average of 2.2 hours on each of those days. And while three out of every four of these OTT homes use Netflix, a surprising 25 per cent of households don’t – they only watch competing services.
This data illustrates just how much there is to play for in the OTT space. Brilliant content, whether home-grown or bought-in, is essential. But with competition intensifying as more players join the fray, getting the right content mix is only part of the story. Data is the other.
Today we are entering a phase where data points from every layer of an end-to-end OTT service can be used to predict future performance and inform decision making. When mined, blended and analyzed effectively – while taking into account important data privacy/security requirements – data can be used to enhance almost every aspect of the service, some of it visible to the consumer, some not. From backend cost savings through to an enhanced customer proposition, better user acquisition strategies, and the refining of future products and services, getting your data to work harder for you is essential.
Here are my layer-by-layer tips on how to boost the value of your data and make sure your OTT service stands out from the crowd.
Data from the video infrastructure layer – the first layer of any OTT system – is typically used for key operational metrics such as Quality of Service and Quality of Experience. Metrics on buffering ratios, join times, and bandwidth consumption are funnelled into service dashboards at network operations centres (NOCs) so that service levels can be monitored in real time.
But now it is possible to go one step further. Taking these metrics and looping them back into the relevant systems makes it possible to extract more value from the data.
For example, it is possible to analyse which bit rates are used where, in order to determine which content renditions can be discarded. This can lead to considerable storage cost savings.
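As a rough illustration of the idea, here is a minimal Python sketch – the bitrates, log format and 1 per cent usage threshold are all hypothetical – that flags rarely requested renditions as candidates for removal:

```python
from collections import Counter

def prune_candidates(play_log, min_share=0.01):
    """Flag content renditions whose share of total plays falls below
    min_share -- candidates for removal from storage and the CDN origin."""
    counts = Counter(play_log)
    total = sum(counts.values())
    return [bitrate for bitrate, n in counts.items() if n / total < min_share]

# Hypothetical log of requested rendition bitrates (kbps)
log = [3500] * 800 + [1800] * 150 + [800] * 45 + [400] * 5
print(prune_candidates(log))  # → [400] (only 0.5% of plays)
```

In practice the same analysis would be run per region and per device class, since a rendition that is rarely used globally may still matter on constrained mobile networks.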
Analysing the bandwidth metrics lays the foundations for real-time CDN switching – to ensure consumers always get the best experience available. This approach also helps to reduce the operational costs associated with having a sole CDN provider.
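A simple scoring rule can drive that kind of switching decision. The sketch below is illustrative only – the metric names and scoring formula are assumptions, and a production system would use rolling windows per region and device rather than a single snapshot:

```python
def pick_cdn(metrics):
    """Choose the delivery CDN from rolling QoE metrics.
    Score = measured throughput penalised by the recent error rate."""
    def score(m):
        return m["throughput_mbps"] * (1.0 - m["error_rate"])
    return max(metrics, key=lambda name: score(metrics[name]))

# Hypothetical per-CDN measurements for one region
metrics = {
    "cdn_a": {"throughput_mbps": 42.0, "error_rate": 0.08},
    "cdn_b": {"throughput_mbps": 38.0, "error_rate": 0.01},
}
print(pick_cdn(metrics))  # → cdn_a
```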
At an operational level, automating the monitoring of service-level KPIs and SLAs is vital for an efficient, streamlined service that is ready to tackle any issues that arise on the platform. Automatic alerts and notifications can be sent to the relevant stakeholders when a threshold is crossed, or when an error is predicted in the near future. This allows the team to step in quickly and resolve actual or potential issues.
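In its simplest form, such threshold-based alerting can be sketched as follows – the KPI names and SLA limits are hypothetical, and a real NOC would wire the resulting alerts into its notification and on-call systems:

```python
def check_slas(samples, thresholds):
    """Return alert messages for any KPI whose latest value crosses
    its SLA threshold."""
    alerts = []
    for kpi, limit in thresholds.items():
        value = samples[kpi]
        if value > limit:
            alerts.append(f"ALERT {kpi}: {value} exceeds SLA limit {limit}")
    return alerts

# Hypothetical SLA limits and latest measurements
thresholds = {"buffering_ratio": 0.02, "join_time_s": 3.0}
samples = {"buffering_ratio": 0.035, "join_time_s": 1.8}
for alert in check_slas(samples, thresholds):
    print(alert)  # in production: push to the NOC dashboard / paging channel
```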
Content and Catalogue
Adding content and metadata management to a data-driven strategy gives OTT players insights into how the service is performing, and also into which content is driving revenues, and which is not. It also provides data on which content consumers keep coming back to, and the content trends bringing them back to the platform after a period away.
The next step is to use this data to predict the next trend: the content that will drive the most engagement, and the content that consumers will be most willing to pay for. Acting on these predictions with real-time recommendations to the user base makes for a nimble, flexible platform that reflects real-time consumer behaviour and makes your service more appealing. Building a direct relationship with users in this way certainly helps with the conversion metrics.
Another important aspect is connecting the catalogue to consumption-habit data to determine the level of engagement at a granular level. The ‘play vs play time’ data, for example, can be analysed to assess how long users watch a specific piece of content, how often they rewind, and whether they come back to watch it again at a later date. All of this data can be used to inform decisions that will help to build a stickier, more engaging catalogue and encourage users to spend more time on the platform.
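To make this concrete, a minimal sketch of ‘play vs play time’ analysis might look like the following – the event format and title durations are invented for illustration:

```python
def engagement(events, durations):
    """Per-title engagement from play events.
    events: list of (title, watched_seconds)
    durations: title -> content length in seconds"""
    stats = {}
    for title, watched in events:
        s = stats.setdefault(title, {"plays": 0, "watched": 0.0})
        s["plays"] += 1
        s["watched"] += watched
    return {
        title: {
            "plays": s["plays"],
            # average fraction of the title watched per play
            "avg_completion": s["watched"] / (s["plays"] * durations[title]),
        }
        for title, s in stats.items()
    }

# Hypothetical play events for two 45-minute episodes
events = [("ep1", 2400), ("ep1", 2600), ("ep2", 600)]
durations = {"ep1": 2700, "ep2": 2700}
print(engagement(events, durations))
```

A high play count with a low completion ratio is itself a signal: the content attracts clicks but fails to hold viewers.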
Outside of the TV platform, OTT players can also mine social data to help inform content acquisition planning: for example, to discover TV series that are trending well on social channels among their target audiences that might be worth adding to their catalogue.
Most OTT services rely on SVOD, TVOD or ad-funded models for monetization – or a mix of the three. Measuring consumption by monetization model and being able to predict which business model will optimise revenues over time is increasingly important as users can be fickle.
Of course in the OTT ad space, data is already king. Effective user targeting, real-time bidding strategies, and programmatic buying based on real-time data are already used to deliver highly personalised ads that meet the requirements of both the advertiser and the consumer.
And as more data points become available, the ability to act on data to drive even more successful monetization strategies will grow. This may manifest itself in user-specific campaigns – targeted discounts and coupons, for example – to drive conversion. And in the setting of different price tiers for different user segments – for instance, yearly discounted pricing vs monthly recurring pricing, based on user conversion data. This is true for any business model: SVOD (as in the examples above), TVOD (pushing one-off purchases based on user behaviour to increase consumption), and AVOD (by correctly targeting users to drive higher CPMs).
It’s important to remember that seamless, targeted A/B testing of the various business models is key to predicting which strategies work best, and can be used to develop future business strategies and specific business campaigns (e.g. discounts and coupons), helping to increase retention rates and ultimately revenues.
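One common building block for this kind of testing is deterministic bucketing, so that a given user always sees the same variant of a given experiment. A hypothetical sketch (the experiment and variant names are invented):

```python
import hashlib

def ab_bucket(user_id, experiment, variants=("control", "yearly_discount")):
    """Deterministic A/B assignment: hash user + experiment into a variant,
    so a user always sees the same pricing offer within one experiment."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

print(ab_bucket("user-42", "pricing-2017q2"))
```

Because assignment depends only on the user and experiment IDs, no assignment state needs to be stored, and conversion can later be compared across the two buckets.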
CRM systems analyse consumption data (e.g. consumption habits, drop off rates) to predict future user behaviour and those who may be about to churn. This information can be used to proactively target certain users or groups of users: for example, anyone likely to churn can be sent discounts or coupons to entice them back, via a push notification system.
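A very simple heuristic version of such churn scoring – not a real predictive model, and with invented fields and thresholds – might look like this:

```python
from datetime import date

def churn_risks(users, today, idle_days=14, min_ratio=0.5):
    """Flag users whose last play is old, or whose viewing last week
    dropped sharply versus their historical weekly average."""
    at_risk = []
    for u in users:
        idle = (today - u["last_play"]).days
        sharp_drop = u["mins_last_week"] < min_ratio * u["avg_weekly_mins"]
        if idle >= idle_days or sharp_drop:
            at_risk.append(u["id"])
    return at_risk

users = [
    {"id": "u1", "last_play": date(2017, 5, 1),  "mins_last_week": 0,   "avg_weekly_mins": 300},
    {"id": "u2", "last_play": date(2017, 5, 28), "mins_last_week": 280, "avg_weekly_mins": 310},
]
print(churn_risks(users, today=date(2017, 5, 30)))  # → ['u1']
```

The flagged IDs would then feed the push-notification system with the appropriate win-back offer.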
Similarly, returning users can be rewarded with special deals to keep them active on the platform, and can be contacted just ahead of a new series of their favourite show with a special deal. This creates an engaging experience, driving stickiness and loyalty.
Using an end-to-end TV platform also gives OTT players the opportunity to marry the back-end and front-end data and gain a 360-degree view.
Tying all of these layers of data together is the user experience. The user application – and specifically the experience consumers have while navigating the platform – is what will keep users coming back for more. Getting to know how consumers interact with the system, and their typical journeys, is an important aspect in creating the ultimate experience.
The placement of ads in an ad-based system, and the stage in the user journey at which premium content is introduced (behind a transactional or subscription-based paywall), directly affect conversion rates and revenue. Using A/B testing to experiment with the user experience – without the need for trial and error techniques or new releases – is also key to understanding what works now and how this can be tweaked to maintain a fantastic user experience going forward.
Adaptive layout management is one example of how data can be used to improve the user experience. Based on user behaviour and segmentation, a data-driven platform can dynamically adapt the layout and user experience for a specific application to take into account the navigation and consumption habits of a certain user segment. Users who are more inclined to navigate linear content will be presented with an EPG layout upon entering the app, whereas those who are more VOD driven will be directed to the VOD navigation areas. The same approach can be applied to rails placement: users who watch mainly newly released content will be shown a “newly released” rail first.
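The segmentation logic itself can start out very simple. The following sketch – with invented field names and a crude 50 per cent linear-share rule – chooses a landing layout and first rail from a user's consumption mix:

```python
def landing_layout(profile):
    """Pick a landing layout from a user's consumption mix.
    Linear-heavy viewers get the EPG; VOD-heavy viewers get the VOD home."""
    total = max(1, profile["linear_mins"] + profile["vod_mins"])
    linear_share = profile["linear_mins"] / total
    layout = "epg" if linear_share >= 0.5 else "vod_home"
    # Viewers of mainly new releases get the "newly released" rail first
    rails = ["newly_released"] if profile.get("prefers_new") else ["continue_watching"]
    return {"layout": layout, "rails": rails}

print(landing_layout({"linear_mins": 900, "vod_mins": 300, "prefers_new": True}))
# → {'layout': 'epg', 'rails': ['newly_released']}
```

In a real platform these rules would be learned from the behavioural data rather than hard-coded, but the principle – layout as a function of segment – is the same.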
In today’s OTT world, going from data visualisation to actions, from manual operations to automation, and from trial and error to accurate predictability – is key, at all levels. With the latest end-to-end TV platforms, OTT players can push the boundaries and move from a static data ecosystem to a more automated model where data is used to predict future performance and behaviour, and can be acted upon immediately. Putting data at the heart of your OTT service like this means that you can have your OTT data cake and eat it too.