
4K-augmenting HDR standards are going to be here by the end of 2015, and they might just be awesome

Stephan Jukic – March 27, 2015

In the drive to revolutionize TV display vibrancy and make it dramatically better than anything that came before, color and brightness are two of the key battlefields that need to be conquered, in addition to ultra HD resolution itself.

And the good news for both color and brightness is that, according to the BBC, a solid high dynamic range standard could emerge as early as this summer. If so, the HDR technology that many TV makers and broadcasters have been waiting for could finally start moving forward in earnest.

HDR technology greatly expands the range between the darkest and brightest parts of a TV image, and while the technology itself has already been developed, it is not yet standardized across the board for all TVs and content transmission systems (which will carry HDR-optimized content to HDR-equipped TVs once everything falls into place).

There is now a serious push to get that kind of across-the-board standardization moving, and recent meetings, such as those held by the ITU just over two weeks ago, have put the four main proposals for High Dynamic Range standards under consideration so that one can be selected by the end of 2015, if not sooner.

According to Richard Salmon, head of innovations and standards at BBC R&D, “Hopefully by middle of this year we will have standards set in the ITU for HDR”. This was one of the comments he delivered at a recent SES-organized event on Ultra HD technology in London.

The four main HDR proposals currently on the table come to us courtesy of Dolby, Japan’s NHK, Philips and the BBC itself. So far, Hollywood studios are showing a preference for the U.S.-based Dolby proposal, but one of its essential characteristics is a requirement for end-to-end metadata in order to define brightness accurately and precisely.

Phase 2 4K TVs will have markedly better picture than their earlier counterparts which we’re seeing on the market now

The Japanese state broadcaster NHK and the UK’s BBC, on the other hand, are taking a similar end-to-end approach based on a gamma curve, while Philips’ proposal is aimed at constant luminance. The Philips approach is not backward compatible, though it was described as a very interesting and elegant proposal.

Another possibility for the middle or end of the year is that two widespread standards will come to dominate: one for cinema content and the other for television content. This could happen simply because, while the two display mediums are not technically incompatible, their core needs differ in important ways.

HDR as a whole ties into a larger debate about what the essentials of a truly profound ultra HD experience are. Aside from the much higher resolution offered by 3840 x 2160 pixels, there is a lot more that can be invested in the TV and cinema displays of the coming year: HDR, wider color gamuts, superior UHD technology (“better pixels”) and much faster frame rates for smoother motion. These technologies are what will make the major difference between the phase 1 grade of 4K we’re mostly seeing now and the phase 2 4K technology of 6 to 12 months from now.

HDR itself is considered crucial because, while the benefits of UHD resolution are easy to notice only at certain distances and screen sizes, the higher contrast of HDR is immediately visible to any viewer on any screen size from any distance. In basic terms, a lot of broadcasters and content studios are of the opinion that a metaphorical ounce of HDR contrast augmentation is worth a whole pile of pixels.

As for wider color gamut, one of the other pillars of phase 2 4K displays, it too is extremely important, particularly when combined with the sharper contrast of HDR. However, expanding color is challenging: moving TVs from their currently common 8-bit color to the 10- and 12-bit color depths that comply with Rec. 2020 is something only a couple of TV manufacturers have done so far. The jump from 8 to 10 or 12 bits is exponential; while 8-bit color provides 256 potential shades of a given primary, 10-bit color expands that to 1,024.
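To put those numbers in perspective, here is the simple arithmetic behind the per-channel shade counts (a quick illustrative Python snippet; the figures follow directly from the bit depths themselves, not from any particular proposal):

    # Shades per channel double with every added bit of color depth.
    for bits in (8, 10, 12):
        shades_per_channel = 2 ** bits            # 256, 1,024, 4,096
        total_colors = shades_per_channel ** 3    # every R/G/B combination
        print(f"{bits}-bit: {shades_per_channel:,} shades per channel, "
              f"{total_colors:,} total colors")

    # 8-bit:  256 shades per channel,   16,777,216 total colors
    # 10-bit: 1,024 shades per channel, 1,073,741,824 total colors
    # 12-bit: 4,096 shades per channel, 68,719,476,736 total colors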

Finally, another benefit of HDR worth mentioning is its lack of a bandwidth penalty. While higher resolution and higher frame rates mean bigger bandwidth costs, HDR can be implemented without requiring significantly more data transfer capacity. This means that, standardization and technical refinement aside, implementing HDR will be much easier than implementing 4K resolution. HDR could even be added to regular HD content if HDTVs were built to display it.

Story by 4k.com

3 comments
 

 
  • Eric
    March 30, 2015 at 10:35 am

    >>HDR technology heavily expands the range of darkness and light between the two extremes<<

    That's actually kind of backwards. HDR really tries to compress the range.

    There are several forms of HDR, but at the end of the day there's only dark (black) and light (white). A photograph can use a range of colors between those extremes; HOW they are used and displayed is where HDR comes in. A typical photo has traditionally been skewed towards whatever the photographer was focused on. If focused on something bright (the sun), the camera would lower the amount of light coming into the lens and the user would see a rather dark but high-contrast image. The sun being bright, everything else was dark. If the photographer focused on something dark, or on dark shadows, the camera tried to increase the amount of light coming in, making what was dark lighter, and what was light even lighter, to the point where it became white.

    So one ended up with either a very dark image with one bright focus area, or one very well lit area with everything else too light, basically white or overexposed.

    Where HDR comes in is to attempt to average this out. Is it realistic? Not really. Our eyes in real life don't work in HDR. At night our irises open to take in more light, and if you shine a flashlight in our eyes it blinds us (like a camera turning white). During the day, looking into the sun or just out on a sunny day, we can see the things that are in the light well, but shadows are hard to make out unless we focus on them for a few moments.

    What HDR then does is basically bring the two extremes together. Increase the darks to make them visible, and lower (just a touch) the brights to the point where they have color (and are not white).

    The effect is nice under some circumstances but arguably ugly under others, where things that SHOULD be dark are unnaturally bright and things that should be bright are unnaturally dark, and when combined you have a very artificial image.

    Under good circumstances, it can enhance some images.
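    Just to make that "bring the two extremes together" idea concrete, here's a rough Python sketch using the well-known Reinhard global tone-mapping curve (only an illustration of the general idea; real HDR photo pipelines are more involved than this):

        import numpy as np

        def reinhard_tonemap(luminance):
            """Squeeze scene luminance of any positive range into 0..1 for display."""
            l = np.asarray(luminance, dtype=np.float64)
            return l / (1.0 + l)   # darks pass nearly unchanged, brights get pulled down

        # A scene spanning four orders of magnitude of brightness:
        scene = np.array([0.01, 0.1, 1.0, 10.0, 100.0])
        print(reinhard_tonemap(scene))   # -> [0.0099 0.0909 0.5 0.9091 0.9901]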

    HDR can also refer to the simple fact that TVs, for the most part, display about 16 million colors: 256 shades of red, 256 shades of green and 256 of blue, 8 bits for each of R, G, and B. Any combination of those is considered a color, and there are roughly 16.8 million combinations. This is called 24-bit color.

    24-bit color is pretty good, but if you've ever worked with digital photos, the one channel that feels like it needs more shades is blue, and this is perhaps because our world is basically bathed in blue light from the atmosphere, so our brains are finely tuned to see blues. But 24-bit color doesn't have enough distinct blue shades: our eyes see banding if you try to create a gradient from dark blue to light blue on a computer screen.

    So HDR is also the technology to take this 24-bit color up to perhaps something like 16 bits per channel, or 48-bit color. And then there truly would be a wider range of colors to see.
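    Here's a rough way to see that banding point in numbers: quantize the same smooth blue ramp at 8 bits and at 16 bits per channel and count how many distinct levels survive (just a toy Python sketch; the exact step counts depend on the range you pick):

        import numpy as np

        def distinct_levels(start, end, bits, samples=100_000):
            """Quantize a smooth 0..1 ramp to the given bit depth and count unique steps."""
            max_code = 2 ** bits - 1
            ramp = np.linspace(start, end, samples)
            return len(np.unique(np.round(ramp * max_code)))

        # A narrow ramp on the blue channel, from 10% to 20% intensity
        print(distinct_levels(0.10, 0.20, bits=8))    # ~26 steps    -> visible banding
        print(distinct_levels(0.10, 0.20, bits=16))   # ~6,554 steps -> smooth gradient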

    Although, again, at the end of the day you can't make something brighter than white or darker than black.

    HDR attempts to equalize, not expand, what the eye sees. Bring up the darks, and lower the lights.

    In a lot of ways HDR, in this sense, is rather misleading. Expanding to 48-bit color would truly be more like HDR.

    Reply

    • John Ddd
      May 4, 2015 at 4:05 am

      You are confusing it with HDR from photography, where you shrink the range of an image so that it can fit into what a screen can display. Here, HDR actually refers to expanding the range of darkness/brightness that the screen itself can display.

      This can be compared to trying to put a big cake in a small box:
      1. Traditional photography – put only part of the cake in a box.
      2. HDR photo – compress the cake to put it completely into a box.
      3. HDR TV/monitor – use a larger box.

      Reply

      • Eric
        May 4, 2015 at 8:24 am

        >Here, HDR actually refers to expanding the range of darkness/brightness that the screen itself can display.<

        This is correct, but at the end of the day, you can't make something whiter than white or blacker than black. But you can make them cleaner.

        You can also make the screen show a wider range of blues, greens and reds, where current 24-bit (8 bits per channel) monitors and TVs show banding. But I don't get the sense this is the direction TVs are taking. Or maybe they are; to be honest, it's impossible to find real-world specs that one can relate to.

        Jumping from 8 bits per channel to 16 bits would take each color from 256 levels to 65,536, so yeah, to me, that would be what I would consider HDR.

        But the problem is, monitors like this don't exist. So it's not totally clear how a TV, aside from just offering blindingly brighter images, would achieve "HDR" the way we think of it. Bright is nice, but staring at a lightbulb is not the kind of experience most people are after.

        Reply
