
Object-based broadcasting is coming, with wide implications for the media business

With object-based media, content is reduced to its component parts for reassembly. Pic iStock/Vertigo3d

Of all the technology initiatives that broadcasters are exploring, the one with arguably the most profound impact is not UHD-HDR or virtual reality or even OTT streaming. It is the ability to slice and dice content into a personalised feed delivered just to you on-demand, with customised editorial, length and quality of experience that fits the device you are using and the environment where you watch. This is all underpinned by object-based delivery over an end-to-end IP acquisition-to-distribution chain.

BT Sport has been exploring the concept for at least two years with its data-centric sports property MotoGP, trying to get fans more immersed in the action. Last month, BT Sport chief Jamie Hindhaugh called object-based delivery "the next major initiative".

By breaking down a piece of media (a frame, a piece of audio, an object in the frame) into separate ‘objects’, attaching meaning to them and describing how they can be rearranged, a programme can change to reflect the context of an individual viewer. The individual would, in effect, be allowed to curate their own programme.
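The assembly step described above can be sketched in code. This is a minimal, invented illustration, not any broadcaster's actual system: each media object carries metadata tags, and a programme is built by selecting the objects that match a viewer's interests within a time budget. All class, tag and function names here are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class MediaObject:
    """A self-describing piece of media: an ID, a duration and metadata tags."""
    obj_id: str
    duration_s: int
    tags: set = field(default_factory=set)

def assemble_programme(objects, viewer_tags, max_duration_s):
    """Walk the objects in editorial order, keeping those whose tags match
    the viewer's interests until the viewer's time budget is spent."""
    timeline = []
    remaining = max_duration_s
    for obj in objects:
        if obj.tags & viewer_tags and obj.duration_s <= remaining:
            timeline.append(obj.obj_id)
            remaining -= obj.duration_s
    return timeline
```

With match highlights tagged this way, a viewer interested only in goals would receive a much shorter cut than one who asked for full analysis, assembled from the same pool of objects.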

Live sports programmes are already at the forefront of just-in-time content assembly, as small segments are created from the live event and used quickly. Catch-up and on-demand follow the linear programme with ever shorter delays. A tennis match can be available on-demand in a matter of minutes after a game has ended.

Another UK broadcaster, the BBC, has been pioneering research into object-based broadcasting. Its progress update last week imagined how audiences in 2022 might create their own personalised streams for 'Match of the Day' (its flagship live and highlights football show), the weather forecast or even the popular soap opera 'EastEnders'.

The BBC goes further and imagines the production roles that could emerge. We could see a ‘live reversioner’ who edits news programmes on-the-fly. There could be interactive drama producers who use automatically marked-up rushes of actors to offer bespoke packages, and who have access to all camera streams (from the cloud), with rushes classified automatically from AI-powered transcription.

The BBC thinks this technology has the potential to transform the way content is created and consumed. It anticipates efficiencies and creative flexibility for production teams, enabling them to deliver a personalised feed that understands the individual viewing habits of every member of its audience. “It’s about moving the whole industry away from thinking of video and audio as being hermetically sealed, and towards a place where we are no longer broadcasters but datacasters,” explains the BBC’s CTO, Matthew Postgate.

The audio side of object-based broadcasting has been developed in parallel, and in many ways is more advanced. Dolby leads the way here. It has reworked Atmos, its cinema audio mixing and playback technology, for use with TV. Sky Sports has introduced Dolby Atmos for subscribers using its Sky Q set-top box. BT Sport offers similarly enhanced viewing.

App developer Axonista has built an online experience for the shopping channel QVC using what it describes as an object-based workflow. This is able to extract graphics from the live signal so the ‘Buy now’ button on the QVC app becomes a touchscreen option on a smartphone.

The next step for object-based media pioneers is to make the concept scale by making it repeatable and standardised. BBC R&D is partnering with Germany's Magix Software and the French researchers BCOM in an EU-funded project called Orpheus, which is working to build an end-to-end object-based audio broadcast system. The initiative is based on the BBC's IP production studio.

The BBC has devised a media composition protocol to help drive scale and standardisation. The result is UMCP (Universal Media Composition Protocol – only a working title) which enables descriptions of media sequence timelines, processing pipelines and control parameters. “The crux of the problem, as with any standard, is finding the sweet-spot between being well-defined enough to be useful, but free enough to allow for creative innovation,” BBC R&D says in a blog.
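A composition protocol of this kind essentially describes a timeline of media segments plus the processing to apply to each. The sketch below is loosely in the spirit of that idea; the field names are invented for illustration and are not taken from UMCP itself.

```python
# A hypothetical composition description: a timeline of segments, each with a
# source, in/out points in seconds, and processing control parameters.
composition = {
    "timeline": [
        {"source": "cam1", "in_s": 0.0, "out_s": 12.5,
         "process": {"loudness_lufs": -23}},
        {"source": "cam2", "in_s": 12.5, "out_s": 30.0,
         "process": {"crop": "9:16"}},
    ]
}

def timeline_duration(comp):
    """Sum the durations of the segments on a composition's timeline."""
    return sum(seg["out_s"] - seg["in_s"] for seg in comp["timeline"])
```

Because the composition is data rather than baked-in video, the same description can be re-rendered with different segments or parameters for each viewer, which is where the "well-defined but free" tension the BBC describes comes in.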

There is a maze of other complexities to solve. For instance, an object-based workflow will need to manage rights for new versions of content that are assembled from many existing content parts. Then there is the IP infrastructure needed to efficiently narrowcast different versions of, say, ‘Match of the Day’ to millions of viewers at a time.

Despite the challenges, this is the way forward: content tailored to each individual. The more sophisticated the technology becomes, the more personal the service will be, as even the user interface will differ from one viewer to the next.

This trend impacts every area of the media business, with the structure of intellectual property rights just one example. It will influence how media is scheduled, and how advertising packages are put together and sold. Media asset management and broadcast business software systems will need even tighter integration.

Metadata becomes all-important: enough relevant tags must be created to define preferences in ever greater detail. AI and machine learning (buzzwords at IBC2018) will play a key role, processing large amounts of data meaningfully at the level of the individual user.
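At its simplest, building a per-user preference model from tagged viewing history is a counting exercise. The toy sketch below (all function names and tags invented) shows the idea: aggregate the tags of what a viewer has watched, then score candidate content by tag overlap.

```python
from collections import Counter

def build_profile(watched_tag_lists):
    """Aggregate tag counts from a viewer's history into a preference profile."""
    profile = Counter()
    for tags in watched_tag_lists:
        profile.update(tags)
    return profile

def score(item_tags, profile):
    """Score a candidate item by how often its tags appear in the profile."""
    return sum(profile[t] for t in item_tags)
```

Real systems would of course use far richer models, but even this shows why fine-grained, consistent tagging is the prerequisite: with coarse or missing tags there is nothing to count.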
