
According to Netflix, High Dynamic Range is more important for digital video quality than 4K resolution


Stephan Jukic – January 12, 2015

Netflix, something of an expert on creating the best possible visual experience in home entertainment, claims that while 4K is definitely valuable, the much bigger impact will come from the use of High Dynamic Range (HDR) technology.

Last week’s CES 2015 gave us a ton of news about all sorts of important 4K-related technologies. These included quantum dots, OLED (already made famous last year) and of course new Ultra HD screens with resolutions even higher than full 4K.

Now, however, there’s yet another term to add to the above list for 4K TVs: High Dynamic Range. Many photographers are already familiar with HDR technology, but its importance to the visual display quality of 4K TVs is now also being underscored, especially by companies like Netflix.

HDR is basically the term for display technology that widens the gap between the bright parts and the dark parts of an image. Its importance lies in the fact that a higher dynamic range (more contrast) means much more realism in a given image.

As far as digital video is concerned, HDR technology is all about making the bright pixels on a TV as bright and vibrant as possible while making the dark pixels as dark as possible. This is especially valuable for outdoor scenery and sharp contrast movie scenes.

Furthermore, according to executives at Netflix, HDR will create a more immediately noticeable difference in quality for viewers than 4K resolution alone could.

More specifically, as Neil Hunt, chief product officer at Netflix, explains, 4K already puts enough pixels on a screen that the human eye can’t really perceive any greater detail. Because of this, we as viewers start paying attention to other qualities of the display in our quest for the ideal image, which in turn pushes manufacturers to look at ways of putting better pixels on the screen instead of just adding more of them.

This means that getting the brightness, contrast, detail and color realism to improve is more important than upping resolution even further than 4K levels.

For example, the luminance of TV displays is often measured in what are called nits. And while most TVs sold today have a brightness level of about 100 nits, the peak brightness of a high quality screen with optimal HDR technology is closer to 1,000 nits.

Given that a real-life event like sunlight reflecting into a viewer’s eyes from a bright white wall reaches brightness levels of 10,000 nits or more, getting TVs to go way beyond their standard 100 nits of luminance is extremely important for the sake of creating truly realistic visuals.
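To put those nit figures in perspective, display engineers often express dynamic range in photographic stops, where each stop doubles the luminance. A quick sketch of the comparison (the black-level numbers here are illustrative assumptions, not measurements of any particular TV):

```python
import math

def contrast_stops(peak_nits, black_nits):
    """Dynamic range in photographic stops; each stop doubles luminance."""
    return math.log2(peak_nits / black_nits)

# Assumed black levels: 0.1 nits for a typical panel, 0.01 for a good HDR one
sdr_range = contrast_stops(100, 0.1)     # ~10 stops
hdr_range = contrast_stops(1000, 0.01)   # ~16.6 stops
print(round(sdr_range, 1), round(hdr_range, 1))
```

The gap of roughly six and a half stops is what makes highlights like that sunlit wall read as genuinely bright rather than merely white.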

With HDR, the point isn’t just raw brightness but making even subtle distinctions of luminance stand out as much as possible. Thus, on a TV with strong HDR technology built into it, white clouds shown on the screen wouldn’t just be a patch of washed-out white; they’d be brilliantly bright and dynamic, with touches of darker texture.

Furthermore, things like reflections of light off surfaces and the contrasts between dark objects and a bright outdoor sky would all have much more realistic weight to them on HDR TV sets.

The overall effect of this is a much greater degree of realism than what we’ve seen in TVs so far.

However, one problem that comes with adding HDR, and all those extra nits, to a 4K TV is that the additional dynamic range information needs more bandwidth to reach viewers’ TVs. Thus, while 4K resolution alone can be delivered with a minimum of 15Mbps, the associated HDR that would really make that 4K look good needs a further 18Mbps.
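The back-of-the-envelope arithmetic on those figures is straightforward. A rough sketch of what they mean for an hour of streaming (treat this purely as illustrative math built on the article’s numbers, not Netflix’s actual encoding ladder):

```python
def gb_per_hour(mbps):
    # megabits/second -> gigabytes/hour: x3600 seconds, /8 bits per byte, /1000 MB per GB
    return mbps * 3600 / 8 / 1000

base_4k = 15      # Mbps, minimum for 4K alone, per the article
hdr_extra = 18    # Mbps, additional for HDR, per the article
total = base_4k + hdr_extra
print(total, gb_per_hour(total))  # 33 Mbps, roughly 14.85 GB per hour
```

In other words, HDR on top of 4K more than doubles the required bitrate, which is why home connection speeds matter so much here.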

Given that many current internet connections can just barely handle the 4K resolution of streams from Netflix and others, it will be a while yet before most home web connections are also capable of transmitting HDR effectively.

HDR will bring visual quality in 4K content much closer to how a studio filmed the video

A number of TVs with excellent HDR capabilities were shown at this year’s CES in Las Vegas, mainly from Panasonic, Samsung, Sony and LG, of course. All of them were prototypes, but it’s very likely that the first of these showcase models will start selling in stores by mid-2015.

Netflix itself is hoping to make the technology go mainstream within no more than a couple of years and is currently working with the newly assembled UHD Alliance to make sure that HDR becomes part of the overall Ultra HD standard that gets applied to all name brand 4K TVs made by the companies that form the Alliance, such as Samsung, LG, Sony and Panasonic.

However, the road toward adopting HDR will be quite technical, given that broadcasters who want to transmit their studio-produced HDR-quality footage will have to invest in decoders, new transmission channels, and additional video compression technologies. This means that until these broadcasters see enough HDR-capable TVs in households, they won’t even bother with mass distribution of HDR video.



  • Tim
    January 13, 2015 at 5:32 pm

    It’s really an increase in luminance resolution that HDR delivers, not increased contrast. The brightest and darkest pixels on a standard and HDR display may well be equal, but the increased “steps” of brightness with HDR deliver an image free of the contouring artefacts that plague 8-bit video.

    It’s quite telling that cinemas currently use 2k DCP sources with 4k projectors. At typical viewing distances few can perceive much improvement with 4k source material, but even my grannie can see 8-bit artefacts in a sunset on broadcast TV.
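The commenter’s point about luminance “steps” can be sketched with a simple quantization example: across a dim stretch of a sunset gradient, 8-bit video has only a handful of distinct code values to work with, which is exactly what shows up on screen as contouring. (The bit depths are standard; the 2% gradient range is an arbitrary illustration.)

```python
def quantize(x, bits):
    """Snap a normalized luminance value (0..1) to the nearest code value."""
    levels = 2 ** bits - 1
    return round(x * levels) / levels

# A smooth, dim gradient spanning just 2% of the full luminance range
samples = [i / 999 * 0.02 for i in range(1000)]
steps_8bit = len({quantize(s, 8) for s in samples})
steps_10bit = len({quantize(s, 10) for s in samples})
print(steps_8bit, steps_10bit)  # far fewer distinct steps at 8 bits
```

With only about six distinct levels available across that span at 8 bits versus roughly twenty at 10 bits, the smooth gradient collapses into visible bands, regardless of how bright or dark the display’s extremes are.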

