Is HDR the next HD – or the next 3D?
Is HDR the next HD – or the next 3D? Ian McMurray picks his way through the minefield.
There is a cynical point of view – and thank goodness I’m no cynic (actually, that may not be true…) – that says that every technology advance made by consumer TV manufacturers is designed purely and simply to get us to buy more TVs. I’m far from implying that’s a bad thing, by the way.
Once upon a time, a TV was for life (almost) – but that’s no great business model for a manufacturer. If we all had a washing machine that would last us 50 years or more – well, washing machine manufacturers would rapidly go out of business, having saturated the market.
And so it was that we went from CRT TVs to plasma TVs to LCD TVs to LED-illuminated LCD TVs – and soon, if manufacturers have their way, we’ll be moving on to OLED TVs. We had standard definition, then high definition, then ultra high definition – and, doubtless, soon we’ll have – what? Super ultra high definition?
Then there was the fiasco of 3D TV. The glasses for mine are still in the box in which they came several years ago. “Well, that didn’t work – let’s see if the public will buy curved TVs.” By and large, we didn’t.
A good thing
However: just as we’ve all pretty much concluded that 4K resolution is a good thing – even despite the relative lack of availability of 4K programming – the manufacturers have gone and confused us again. We have a new question to ask: do we need HDR?
High Dynamic Range, to give it its full name, does pretty much what it says on the tin. Where 4K gives us more pixels, HDR is designed to give us better pixels. How? First, by increasing the ‘distance’ between the blackest blacks and the whitest whites. Second, by using 10-bit processing to deliver over a billion colours, rather than the roughly 16.7 million of today’s 8-bit processing.
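The jump from 16 million to a billion falls straight out of the arithmetic: three colour channels at n bits each give 2 to the power 3n possible colours. A back-of-the-envelope sketch (illustrative only – the function name is mine, not from any HDR specification):

```python
# Colours available at a given bit depth per channel (R, G and B).
# Illustrative sketch only; 'colours' is not from any HDR spec.
def colours(bits_per_channel: int) -> int:
    return (2 ** bits_per_channel) ** 3

print(colours(8))   # 16,777,216 - the familiar '16 million' of 8-bit
print(colours(10))  # 1,073,741,824 - the 'billion colours' of 10-bit
```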
(For the technically minded: HDR is specified to use the Rec.2020 colour space, rather than today’s Rec.709. For LED-illuminated LCD TVs, the HDR specification calls for more than 1,000 nits peak brightness and less than 0.05 nits black level. An alternative standard has been set for OLED TVs, taking account of their lower maximum brightness: more than 540 nits peak brightness and less than 0.0005 nits black level. For comparison, the typical TV of either type today delivers less than 500 nits.)
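Those figures imply very different static contrast ratios, which is why OLED gets its own tier despite the lower peak brightness. A quick illustrative calculation (the function is hypothetical, not part of any standard):

```python
# Static contrast ratio implied by the figures above:
# peak brightness divided by black level, both in nits.
# Hypothetical helper for illustration, not from any spec.
def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    return peak_nits / black_nits

print(f"LCD tier:  {contrast_ratio(1000, 0.05):,.0f}:1")   # roughly 20,000:1
print(f"OLED tier: {contrast_ratio(540, 0.0005):,.0f}:1")  # roughly 1,080,000:1
```

OLED’s near-perfect blacks, in other words, buy it a contrast ratio some 50 times higher even at roughly half the peak brightness.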
So: more colours and a wider contrast range, meaning that you can, for example, discern levels of blackness in an image that, today, would appear to be all the same colour. So far so good. What’s not to like?
Well: this being the consumer electronics industry, HDR isn’t quite that straightforward… First, you’ll be absolutely amazed to hear that HDR isn’t just a single standard. If there is a default, it’s HDR10, a licence-free technology. But wait – what’s this? Dolby thinks it can offer more with its 12-bit Dolby Vision: LG and Sony have been among the first to support it (and, by the way, it’s good to know that HDR10 and Dolby Vision are not mutually exclusive: Sony, for example, supports both).
Remember the infamous format wars when high-definition optical discs first became a thing, with HD DVD battling it out against Blu-ray? We could be about to see something similar with Dolby Vision and HDR10.
Samsung, being Samsung, has come up with HDR10+, which, among other things, ‘borrows’ one of Dolby Vision’s advantages by using dynamic metadata instead of static metadata. (HDR works by embedding metadata in the transmitted image that tells the TV how to display it: dynamic metadata can change scene by scene, where static metadata applies a single setting to the whole programme.) Samsung is making HDR10+ freely available, and Philips and Panasonic have been among the early adopters.
And there’s more. There’s Advanced HDR by Technicolor, for example, which, at the time of writing, has the support of only LG. To confuse the issue even further, the BBC and Japan’s NHK have come up with HLG – Hybrid Log Gamma. It combines SDR and HDR into a single feed, which has practical advantages for broadcasters: an HLG-compatible 4K TV can decode and show the HDR images in all their glory, while an ordinary SDR set simply displays the standard picture. It seems likely that all new TVs will support HLG as well as one or more of the other HDR standards.
Just to add to the confusion: remember ‘HD Ready’? Which, as it turned out, was borderline misleading? Well, look out for ‘HDR+’ and ‘HDR Effect’. Nope, not HDR TVs at all: rather, they’re TVs that claim to be able to deliver ‘HDR-like’ images from non-HDR material. Hmmm… The thing to look out for on the box is ‘Ultra HD Premium’, a stamp of approval from the UHD Alliance: it means you’re getting a TV that delivers both 4K and HDR10. Anything that doesn’t say ‘Premium’ is likely to fall below the brightness required by the HDR specification (and even that, unfortunately, is a moot point because it’s the manufacturer who determines whether a TV meets, for example, the minimum brightness spec for HDR10).
The big issue, in theory, when it comes to HDR, is that of content. Just as free-to-air broadcasters are not yet geared up to transmit 4K, neither are they yet ready for HDR. Once again, Netflix and Amazon Prime Video are your friends. A relative lack of native 4K content hasn’t slowed down 4K TV sales, though, and it’s unlikely to impact HDR-capable TV sales either. There is, of course, also Blu-ray.
And, as with 4K: HDR is an end-to-end process. Just as content needs to be created in 4K in order for you to see it as it was intended on your 4K TV, so too HDR has to be encoded into the content at source. While a 2K TV will downscale 4K to deliver a (mostly) highly acceptable image, unless a TV is equipped with HDR decoding, you won’t see any difference.
Then, there’s content delivery. Some say that you’ll need download speeds of at least 25 Mbits/second for 4K/HDR programming. Netflix, however, claims that you only need 12 Mbits/second for 4K, plus a further 2.5 Mbits/second for HDR.
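Taking Netflix’s figures at face value, the sums are simple enough (an illustrative sketch; the variable names are mine):

```python
# Netflix's quoted figures, in Mbits/second.
NETFLIX_4K = 12.0         # 4K stream on its own
NETFLIX_HDR_EXTRA = 2.5   # additional bandwidth for HDR
RULE_OF_THUMB = 25.0      # the more conservative figure some quote

needed = NETFLIX_4K + NETFLIX_HDR_EXTRA
print(needed)                 # 14.5 Mbits/second for 4K plus HDR
print(RULE_OF_THUMB - needed) # 10.5 Mbits/second of headroom vs the 25 figure
```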
So: what does all this mean for anyone’s next TV purchase? It’s probably fair to say you’d be mad to buy anything other than a TV that’s clearly marked as Ultra HD Premium. If you’re primarily a free-to-air viewer, you’re unlikely to see any benefit in the near term. The issue, as ever, is one of future-proofing – and, anecdotally, HDR is finding more favour among the creatives than 4K ever did, so content will likely start to flow more rapidly. It’s also said to be easier to produce than 4K. If, on the other hand, you’re becoming increasingly seduced by binge-watching box sets and are a customer of Amazon or Netflix or both, an Ultra HD Premium TV is pretty much a no-brainer.
There is an extremely valid point of view that says that 2K/HD resolution with HDR will give you a more satisfying picture than 4K without it – but that combination is unlikely to be an option as the industry transitions to 4K exclusively. In fact, thinking about it, you may struggle to find a new TV without HDR, assuming you stick with the tier one brands.
But which HDR? Some analysts recommend choosing Dolby Vision, not so much because it’s theoretically better, but because it inherently supports HDR10 – so you’re unlikely to get stuck up a technology blind alley, whichever prevails. HDR10 TVs, however, don’t support Dolby Vision.
As we’ve all learned to understand, the number of on-screen pixels isn’t the only thing we should care about. HDR is potentially significantly more transformative, in terms of image quality, than 4K ever was: it just brings a new realism and dynamism to images. To all intents and purposes, HDR delivers the same quality of image you can expect from a movie in a cinema.
And: it’s ‘effective’ on smaller screens. The difference between 2K and 4K resolution is scarcely noticeable on screens below 40” – and perhaps on larger ones too. The magic of HDR, by contrast, can clearly be seen on pretty much any TV.
So, yes, it looks inevitable. The TV industry may have mis-stepped with 3D, and miscalculated the desirability of curved screens – but HDR delivers truly worthwhile benefits. Time to go and buy another TV, then.