By Barry Flynn, Contributing Editor
UHD TV services using High Dynamic Range (HDR) could become available in 2017, following an agreement by members of the European digital TV standards body DVB on the draft commercial requirements for ‘UHD-1 Phase 2’.
This is the successor to the current UHD-1 Phase 1 standard, which provides the means to deliver a 2160p-resolution image at up to 60 frames per second (fps), and is in effect the norm used by the UHD or ‘4K’ TV sets available at retail today.
Phase 2 will, over time, implement a number of different features designed to enhance the ‘immersiveness’ of UHD, with HDR likely to be the first of these, the DVB said.
It is also probable that one of the commercial requirements will be to provide for the creation of an HD-resolution UHD-1 profile, to allow for HDR to be combined with 1080p images as an interim step (see previous story).
However, the standards body would not be drawn on the detailed contents of the draft, saying only that the discussions that resulted in an agreement had been “lengthy and difficult”, and that the commercial requirements would pass through the formal DVB process in the coming weeks before being finalised.
DVB expects it will take a further 12 months to translate these into agreed technical specifications for UHD-1 Phase 2, hence its suggested 2017 timeline.
HDR, in combination with two other new features, Wide Colour Gamut (WCG) and 10-bit sample depth, is arguably the feature that will do more to enthuse consumers about UHD than anything else, according to Matthew Goldman, SVP technology for TV compression at Ericsson.
Goldman refers to this combination as ‘HDR+’. Whereas quadrupling the number of pixels on an HDTV set to offer 2160p/4K requires viewers to halve their viewing distance to benefit from the ‘ultra-high-resolution’ effect, “the amazing thing about HDR+ is, it’s not impacted by viewing distance. It probably is going to turn out to be the most significant change in the television viewing experience since going from black-and-white to colour.”
Goldman explains that HDR achieves its effect by increasing the contrast of the image to make it look much more realistic. WCG, meanwhile, adds to the realism by offering the viewer ‘truer’ colours, vastly expanding the colour palette available to a flat-panel display. Finally, 10-bit sampling is required because the current 8-bit standard is unable to do justice to the range of colours and contrasts WCG and HDR can convey.
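The arithmetic behind that last point is straightforward, and can be sketched in a few lines (an illustration of bit-depth quantisation in general, not of any specific DVB signal format):

```python
def levels(bit_depth: int) -> int:
    """Number of quantisation levels available per colour channel."""
    return 2 ** bit_depth

# The current 8-bit standard gives 256 shades per channel;
# 10-bit sampling gives 1024, i.e. four times finer gradation,
# which is what lets a display span HDR's wider brightness range
# without visible banding between adjacent shades.
print(levels(8))                 # 256
print(levels(10))                # 1024
print(levels(10) // levels(8))   # 4
```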
Of the three features, Goldman concedes that the impact of HDR is probably the most difficult to convey: “It is really hard to explain what it is, but when you see it, you know it. What it is, is the picture just pops. In other words, because the contrast level is so much deeper, it looks like you’re looking at something real versus something that’s sort of realistic on a display: the colours look more realistic, but mostly it’s about the levels of brightness, the difference between how white the whites are versus how deep the blacks are, that separation. It looks much more natural.”
One consequence of this is to banish a well-known issue that crops up when broadcasting large sporting events from a stadium on a sunny day: either the sunlit area is washed out so that detail in the shadowed area remains visible, or the shadows are crushed to preserve the sunlit detail. The use of HDR allows detail and contrast on either side of the ‘shadow line’ to be properly displayed, as the human eye would experience it.
Goldman argues that the visual perception system is tricked, when watching TV today, into accepting that the image on the screen is somehow ‘natural’ and ‘realistic’. If it were possible to place a TV display showing a field of flowers on a sunny day out in the same sunlit meadow being televised, and ask the viewer to compare reality and the televised version side by side, the difference would be immediately obvious, he suggests. In practice, though, such an experiment could never be set up, because the TV picture would be impossible to see in direct sunlight.
Goldman admits that adding HDR+ to the UHD mix implies that “the ecosystem’s going to have to change, to support all this.” However, the impact, in terms of bit-rates at least, is not as severe as it might seem.
Neither WCG nor 10-bit sampling implies a bigger UHD payload, he maintains. As for HDR itself, “you will add some bits, depending on the system that’s being used.” While no decision has been made about that yet, Goldman suggests the impact will be between zero and 20 per cent extra.
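Taking Goldman's zero-to-20-per-cent range at face value, the extra capacity is easy to bound. The 15 Mbit/s baseline below is a purely illustrative figure for an HEVC-encoded UHD-1 stream, not a number from the article:

```python
def hdr_bitrate(base_mbps: float, overhead: float) -> float:
    """Bit-rate after applying a fractional HDR overhead (0.0 to 0.2)."""
    return base_mbps * (1 + overhead)

base = 15.0  # hypothetical UHD-1 Phase 1 stream, Mbit/s

# Best case: HDR metadata adds no extra bits at all.
print(hdr_bitrate(base, 0.0))
# Worst case under Goldman's 20 per cent figure.
print(hdr_bitrate(base, 0.2))
```

Even at the top of that range, the overhead is small next to the roughly four-fold payload increase that the move from 1080p to 2160p resolution itself entails.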
“The key thing is everything along the entire production chain, everything along the delivery chain, is going to need to change, to support this. We have to focus on that [first] and then the ecosystems.”
Goldman notes that MPEG (the Moving Picture Experts Group) and VCEG (the Video Coding Experts Group) have created a joint working group to look at the question ‘how can we more efficiently encode HDR+?’ “They’re working on that right now. They have it in a fast track, and they’re hoping that’s going to be done by October of 2016. We need to work on the interfaces as well, and we have all the different standards bodies – there’s over a dozen of them working on this – and we’re all shooting to have all this done by the end of 2016.”
That would mean that the first UHD-1 Phase 2 services containing HDR+ could be launched “in the 2017, 2018 timeframe,” Goldman estimates.
This is broadly in line with DVB’s expectations.