At IBC, TV solutions provider Viaccess-Orca (VO) was demonstrating a completely re-designed version of the 360-degree VR application it has been working on for several years.
The solution – a collaboration with audio specialists DTS and low-latency video delivery experts Tiledmedia – is claimed to be the most reliable system available today for delivering best-in-class VR360 audio and video for VR and AR streaming to mobile devices.
VO says the system is tailored for live events such as soccer games and concerts, and – using its own ‘VO Player’ – can deliver high-resolution VR experiences for less than a quarter of the bit-rate of legacy VR streaming platforms.
According to Kevin Le Jannic, Emerging Business Leader at VO, the increased delivery efficiency arises in part from the company’s partnership with Tiledmedia. “We’ve been working with them for a year and a half, and the concept is that instead of distributing the whole 360-degree sphere while you are watching just part of the content, they will distribute only what the user is currently seeing in very high resolution.” The rest of the sphere remains in lo-res mode until the user turns their attention to another part of the sphere. “It is just pointless to distribute the whole sphere while you are just watching one part of the content,” argues Le Jannic.
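The viewport-adaptive idea Le Jannic describes can be sketched in a few lines. This is a hedged illustration only – the equirectangular 8×4 tile grid, the function names and the boundary-sampling shortcut are assumptions for the sake of the example, not Tiledmedia's actual tiling scheme:

```python
def visible_tiles(yaw_deg, pitch_deg, fov_deg=90, tile_cols=8, tile_rows=4):
    """Return (col, row) indices of equirectangular tiles intersecting the viewport.

    The 360-degree sphere is split into a tile_cols x tile_rows grid; only the
    tiles overlapping the user's field of view need high-resolution delivery.
    """
    tile_w = 360 / tile_cols          # degrees of yaw per tile column
    tile_h = 180 / tile_rows          # degrees of pitch per tile row
    half = fov_deg / 2
    tiles = set()
    # Sample the centre and corners of the viewport, map each sample to a tile.
    for dy in (-half, 0, half):
        for dp in (-half, 0, half):
            yaw = (yaw_deg + dy) % 360
            pitch = max(-89.9, min(89.9, pitch_deg + dp))
            col = int(yaw // tile_w) % tile_cols
            row = int((pitch + 90) // tile_h)
            tiles.add((col, row))
    return tiles

def request_plan(yaw_deg, pitch_deg, tile_cols=8, tile_rows=4):
    """Split the grid into hi-res requests (visible) and lo-res fallback (the rest)."""
    hi = visible_tiles(yaw_deg, pitch_deg, tile_cols=tile_cols, tile_rows=tile_rows)
    lo = {(c, r) for c in range(tile_cols) for r in range(tile_rows)} - hi
    return hi, lo
```

With a 90-degree field of view, only around a quarter of the tiles are fetched in high resolution at any moment – which is consistent with the bandwidth saving VO quotes.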
Moving the head is accompanied by a rotation of the 3D audio ‘sphere’, so that the sound matches the changing video – a system provided by DTS. “You have spatial audio, meaning that you know where the sound comes from, and when you turn your head, the sound will adapt accordingly to the position of your head: it changes.”
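A production renderer such as DTS's rotates a full 3D ambisonic sound field, but the core principle – counter-rotating source positions against head movement – can be shown with a minimal, assumed-for-illustration sketch (yaw only, crude stereo panning):

```python
import math

def relative_azimuth(source_azimuth_deg, head_yaw_deg):
    """Azimuth of a fixed sound source relative to the listener after a head turn.

    Turning the head towards a source reduces its relative angle: the world-space
    position is counter-rotated by the head yaw.
    """
    return (source_azimuth_deg - head_yaw_deg) % 360

def stereo_pan(relative_azimuth_deg):
    """Very rough (left, right) gain pair from the relative azimuth."""
    right = (1 + math.sin(math.radians(relative_azimuth_deg))) / 2
    return (1 - right, right)
```

Turning the head 90 degrees towards a source at 90 degrees brings it dead ahead, and the pan gains equalise accordingly.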
One challenge of such a system is that it has to make split-second decisions, in real time, about which section of the sphere to boost to hi-res mode or to play sound ‘from’. Le Jannic reveals that Tiledmedia is working closely with Akamai and other CDN providers to reduce the latency of the user experience, so that users are not aware a hi-res ‘tile’ on the sphere is being generated on-the-fly.
One way to do this is to include a predictive element which pre-caches tiles according to various cues. These could involve interpreting the ‘movement’ of the spatial sound track – as a user moves their head to respond to a sound heard in a different direction, for instance – or, indeed, interpreting the direction of the head movement itself.
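The simplest form of such prediction is to extrapolate the head's angular velocity and pre-fetch the tiles the gaze is heading towards. The sketch below assumes plain linear extrapolation; a real system would use richer cues (including, as Le Jannic notes, the audio track):

```python
def predict_yaw(yaw_now_deg, yaw_prev_deg, dt_s, horizon_s):
    """Linearly extrapolate head yaw over a prediction horizon.

    dt_s is the time between the two samples; horizon_s is how far ahead
    to predict (roughly the tile fetch latency being hidden).
    """
    # Shortest signed angular difference, so a 359 -> 1 degree move counts as +2.
    delta = (yaw_now_deg - yaw_prev_deg + 180) % 360 - 180
    velocity = delta / dt_s          # degrees per second
    return (yaw_now_deg + velocity * horizon_s) % 360
```

Tiles at the predicted yaw are then requested in high resolution before the viewport actually reaches them, hiding the CDN round-trip.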
Previous VO demos have shown that it is possible to create ‘heat-maps’ of the VR ‘sphere’ by aggregating user behaviour – for example, eyeball-tracking to see which spots the visitor’s gaze most commonly comes to rest on (one way of deciding where to place ads in a VR environment). “This heat map can be also used for pre-caching,” points out Le Jannic, “because you know that most of the people will look at this specific position, so you just pre-cache that specific tile, because there is a big chance that people will read it or get it.”
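Aggregating gaze data into a tile-level heat map, as Le Jannic describes, is straightforward to sketch. The grid size and function names here are illustrative assumptions, not VO's implementation:

```python
from collections import Counter

def heatmap(gaze_samples, tile_cols=8, tile_rows=4):
    """Count gaze samples (yaw, pitch in degrees) falling into each tile."""
    counts = Counter()
    for yaw, pitch in gaze_samples:
        col = int((yaw % 360) / (360 / tile_cols)) % tile_cols
        row = int(min(pitch + 90, 179.9) / (180 / tile_rows))
        counts[(col, row)] += 1
    return counts

def tiles_to_precache(gaze_samples, n=3):
    """The n most-watched tiles: candidates for pre-caching (or ad placement)."""
    return [tile for tile, _ in heatmap(gaze_samples).most_common(n)]
```

The same ranking serves both purposes the article mentions: pre-caching the tiles most viewers will look at, and deciding where in the sphere an advert is most likely to be seen.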
The look-and-feel of the VR environment itself has been much improved. Certainly, when Videonet experienced the ‘Virtual Arena’ application (in this case, delivering ‘presence’ at a soccer match) through a headset on the VO stand, the experience offered noticeably better resolution levels than previously.
It was also much easier to navigate: the direction of the user’s gaze controls a red dot inside the 360-degree environment which can be used to ‘pick’ an option simply by staring at it for a few seconds. This means that manual adjustments only need to be made to the headset when initially putting it on, in order to control fine focus.
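This dwell-to-select interaction boils down to a timer that accumulates while the gaze dot rests on an option and resets when it moves away. A minimal sketch, with a hypothetical `GazeSelector` class and a dwell time assumed at two seconds:

```python
class GazeSelector:
    """Selects a UI option once the gaze dot rests on it long enough."""

    def __init__(self, dwell_seconds=2.0):
        self.dwell_seconds = dwell_seconds
        self._target = None   # option currently under the gaze dot
        self._held = 0.0      # seconds the gaze has stayed on it

    def update(self, target, dt):
        """Feed the currently gazed-at option (or None) once per frame.

        Returns the option once the gaze has dwelt on it for dwell_seconds;
        otherwise returns None. Looking away resets the timer.
        """
        if target != self._target:
            self._target = target
            self._held = 0.0
            return None
        self._held += dt
        if target is not None and self._held >= self.dwell_seconds:
            self._held = 0.0  # avoid re-triggering every subsequent frame
            return target
        return None
```

Because selection needs no controller, the headset only has to be handled once, at fitting time.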
Le Jannic says the system’s improvements have enabled VO to experiment with less clumsy ways of introducing advertising and monetisation, and with how to ‘push’ products inside the VR experience. “We let the user personalise their environment by choosing which team they prefer, and then we will push products that are related to these specific teams – like you choose Munich, and then we’ll have the shirt of the Munich team – that, of course, they can then buy directly from within the application.”
VO describes this as ‘gamification’ of the advertising: “Here, the advertising is really part of the experience. It’s not as annoying as a pre-roll video that you would have on TV. It’s interactive, it’s not annoying anymore. […] Of course, we also have insights that are really valuable for advertisers, because you know what the user is looking at. […] We know if the user watched the advertising, how long, how many times, etc., so this is new data that you have.”
Interestingly, Le Jannic suggested that using ‘eye-ball tracking’ for measurement and monetisation, which was all the rage in VR circles two years ago, might not represent the future after all. “Regarding eye tracking, I think that you can already do some interesting stuff just with the tracking of the head. […] If you just track the head, in the end, you are accurate enough, and you know exactly what the user is looking at without tracking the eye.”
Photo: A Viaccess-Orca exec demonstrates the company’s VR application at IBC.