Every film or TV programme delivered over a digital TV system – broadcast or online – is associated with a set of ‘metadata’ of one sort or another: descriptive information about the content, such as title, story-line, cast, genre, release date, running time and so on.
It is this data that drives Electronic Programme Guides (EPGs) and search and recommendation engines – and the more detailed it is, the greater the value it delivers. Peter Docherty, Founder and CTO of ThinkAnalytics, whose content and recommendations engine services more than 130 million subscribers worldwide via its various Pay TV customers, claims that one of the reasons his company’s technology results in viewers watching more channels and VOD is that it both re-analyses existing programme metadata (which he says is often incomplete) to extract more meaning from it and adds extra information to it from ThinkAnalytics’ own libraries (ThinkMovies and ThinkTV). Between them these contain detailed metadata for over a million titles. “It is the combination of that data, that we’ve created, with the viewing data, that lets us really understand the consumer,” he maintains.
Recently, another layer of programme metadata has become available which potentially offers even greater personalisation benefits, by subjecting the video content itself to computer analysis, producing data similar to that created through audio description or closed captioning.
Alex Phillips, Senior Managing Consultant in the Communication Practice at IBM’s Global Business Services Division, notes that IBM has developed its own technology for doing this, whereby “analysing the images in each video frame and looking at the features there, you are able to automatically produce metadata – and tag the content automatically, too. You might be able to detect that there’s a sports scene, that there’s blue sky, there’s a crowd, there’s football being played, there are footballers running around the pitch.”
Andy Aftelak, Vice President and Director in Advanced Technology at ARRIS, whose Pay TV and broadband solutions cover the network, backoffice, headend and home, dubs this ‘temporal’ metadata, and notes that ARRIS has been running such an analytics engine, which it calls its Media Analysis Framework (or MAF), for a number of years. “The whole idea here is to extract some sort of meaning from the content, so that you can create new experiences or use your understanding of what’s going on in the context to be a trigger for something else,” he explains.
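To make the idea concrete, here is a minimal sketch of what such time-coded ‘temporal’ metadata might look like and how a player could query it. All field names, values and the `tags_at` helper are invented for illustration; they are not the actual output format of ARRIS’s Media Analysis Framework or IBM’s system.

```python
# Hypothetical temporal metadata: frame-level analysis rolled up into
# time-coded segments, each carrying the tags detected in that span.
temporal_metadata = [
    {"start": 0.0,  "end": 12.5, "tags": ["sports scene", "blue sky", "crowd"]},
    {"start": 12.5, "end": 48.0, "tags": ["football", "players running", "pitch"]},
    {"start": 48.0, "end": 55.2, "tags": ["crowd", "celebration"]},
]

def tags_at(metadata, t):
    """Return the tags active at playback time t (in seconds)."""
    for segment in metadata:
        if segment["start"] <= t < segment["end"]:
            return segment["tags"]
    return []

# A client could poll this during playback to drive context-aware features.
print(tags_at(temporal_metadata, 30.0))  # ['football', 'players running', 'pitch']
```

Because each tag is anchored to a time range rather than to the programme as a whole, the same data can drive search, chapterisation or playback triggers.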
Potential use-cases for temporal metadata abound. Phillips cites one obvious application as “going back over news archives and automatically analysing them and tagging them and making them searchable.” Indeed, he suggests, one could in principle go back over any archive to re-process it if standard metadata were absent, rendering it potentially monetizable for the first time.
Aftelak suggests that once such systems can detect structure in programmes, these can automatically be “chapterised”: thus a sports event could be split up “into highlights and normal play, so that you can create personalised highlights for a specific demographic.”
Eric Abbruzzese, Research Analyst at ABI Research (the technology market intelligence provider), notes that with traditional metadata “you can search for an actor and see what he is in, but maybe with temporal metadata you can figure out exactly what scene – so if you want to see where Tom Cruise first appears in his latest movie, you can skip directly to it when that happens.”
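The scene-level search Abbruzzese describes could be sketched as a simple lookup over time-coded face-recognition tags. The segment structure, field names and `first_appearance` helper below are assumptions for illustration, not any vendor’s actual API:

```python
# Hypothetical time-coded person tags for a movie, e.g. from face recognition.
segments = [
    {"start": 0.0,  "people": []},
    {"start": 45.0, "people": ["Supporting Actor"]},
    {"start": 92.0, "people": ["Lead Actor", "Supporting Actor"]},
]

def first_appearance(segments, person):
    """Return the start time (seconds) of the first segment tagged with
    the given person, or None if they never appear."""
    for seg in segments:
        if person in seg["people"]:
            return seg["start"]
    return None

# A player could seek straight to this timestamp ("skip to Tom Cruise").
print(first_appearance(segments, "Lead Actor"))  # 92.0
```

The same index, queried in reverse, would let a platform list every scene an actor appears in.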
Suranjan Som, Media Insights Evangelist at data analysis specialists Information Management Group Ltd (IMGROUP), believes this new type of metadata “does add a lot of value on the content side, especially if that content is meant for non-linear consumption – video downloads and things like that. This metadata can then be harnessed to improve searchability. As soon as you do that, the ranking of the video goes higher, and therefore you have a better way of monetizing that video online.”
Having more meaningful metadata at your disposal improves the prospects for television-related advertising as well. The ability of systems using temporal metadata to recognize onscreen objects – particularly faces – is regarded as potentially valuable. “If you can, for example, detect when a specific actor or actress is on the screen, you can use that as a trigger to do or not do things. You might want to trigger an overlay which is a promotion for another show that the actor is in,” suggests Aftelak. “Or you could use that to trigger an ad, because that actor or actress sponsors a particular product.”
This is an excerpt from Videonet’s latest report, ‘Boosting television prospects with enriched data analytics’, which goes into more detail about the potential for temporal metadata to enhance advertising.
The report investigates how media companies can use a combination of viewing data, better programme metadata and network performance statistics to super-charge their content marketing and monetization, and strengthen advertising. It investigates what a holistic, enterprise-wide data analytics strategy looks like, with insights from Channel 4, Sky Media, ABI Research, ARRIS, ThinkAnalytics, IBM Global Business Services Division, Clearleap and IMGROUP, among others.
You can download the report (free) here.