These are the awesome new technologies of next-generation 4K TVs
Stephan Jukic – July 26, 2016
4K TV displays and content both used to be almost entirely about the resolution itself. Those rich, dense 3840 x 2160 pixels of fine-grained detail were what most distinguished the new technology from the SD, 720p and even Full HD TVs that came before it. However, with pixels as the only attraction of 4K, many potential buyers could be left underwhelmed by the overall value proposition, especially consumers interested in smaller televisions, who had a hard time noticing much difference between ultra HD and Full HD on a showroom TV screen at their local Best Buy.
This is where the innovations that followed in the wake of 4K resolution itself come into play. From the time the first UHD TVs emerged in early 2013 (or even as far back as late 2012), standardizing the basic display technology and the content compression for 4K ultra HD video sources was the main priority for most manufacturers. Since mid-2015 and into 2016, however, further innovations have emerged to make 4K TV and 4K content about a lot more than extra pixel detail.
From HDR to Wide Color Gamut to the new connectivity specs around these further enhancements on 4K resolution itself and a pile of other fascinating technologies, there is plenty to make the newest 4K TVs on the market worth considering very carefully.
High Dynamic Range
By far the hottest single technology to hit the world of 4K UHD content and displays is high dynamic range. The HDR of today’s best TVs first really started to emerge on the consumer market in 2015 on select premium models and in 2016 it has penetrated the market quite a bit further to hit a majority of mainstream models. We expect this trend of HDR inclusion to do nothing but continue for the simple reason that, unlike 4K resolution itself, high dynamic range makes a serious and positive visual impact in a TV of any display size at pretty much any normal viewing distance.
We cover the mechanics of HDR in a lot more detail in the guide to the technology which we’ve put up here, but to summarize briefly: HDR mostly revolves around a wider range between darker blacks and brighter whites, while also expanding the range of color a display can render.
Currently, multiple HDR standards exist for both TVs and content, as our linked-to guide above covers, and we’re still waiting to see the ecosystem stabilize into something more standardized. Among the HDR 4K TVs of 2016 from Samsung, LG, Sony and Vizio, however, the two existing standards are HDR10 and Dolby Vision, with HDR10 dominating. HDR10 also dominates among sources of 4K HDR content in streaming or UHD Blu-ray format, though some service providers like Netflix and Vudu also offer Dolby Vision-formatted HDR entertainment.
Connectivity & Codecs
Just as 4K content itself is evolving, the codecs and connectivity formats that sit between 4K TVs and their media sources have to evolve to keep up as well. While the first 4K televisions of 2013 and even early 2014 often came with HDMI 1.4 connections, which can only handle 4K video at 30 frames per second, or lacked any sort of 4K-specific video compression like H.265, things have now changed dramatically. For starters, smooth 4K video transmission requires a higher caliber of HDMI, and this is where HDMI 2.0 came into the picture; it’s now a universal feature of pretty much all 4K televisions from all major and minor brands. Then, as HDR UHD content arrived, HDMI 2.0 itself was no longer enough, and the even newer HDMI 2.0a came along with the ability to transmit HDR metadata alongside ultra HD video. This too is becoming a core standard for all newer 4K TVs, and something many older TVs have gained via firmware updates.
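To see why HDMI 1.4’s roughly 10.2 Gbps ceiling caps 4K at 30 frames per second while HDMI 2.0’s 18 Gbps does not, a back-of-the-envelope estimate is enough. This is only a sketch (the helper function is ours, and it ignores blanking intervals and link-encoding overhead, so real-world requirements are somewhat higher):

```python
# Rough uncompressed video bandwidth estimate.
# Ignores HDMI blanking intervals and encoding overhead, so the
# true on-wire requirement is higher than these figures.
def video_bandwidth_gbps(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel / 1e9

# 4K at 30 fps with 8-bit RGB (24 bits per pixel)
uhd_30 = video_bandwidth_gbps(3840, 2160, 30, 24)
# 4K at 60 fps -- already past HDMI 1.4's ~10.2 Gbps limit
uhd_60 = video_bandwidth_gbps(3840, 2160, 60, 24)

print(f"4K30: {uhd_30:.1f} Gbps, 4K60: {uhd_60:.1f} Gbps")
# → 4K30: 6.0 Gbps, 4K60: 11.9 Gbps
```

Even before overhead, 4K at 60 frames per second needs more raw throughput than HDMI 1.4 can carry, which is exactly the gap HDMI 2.0 was introduced to close.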
Along with HDMI connectivity come standardized 4K video compression and content copy protection standards. As we cover in our connectivity guide, these mainly consist of HEVC (H.265), which is designed specifically for efficient compression of 4K-resolution video for easier streaming and HDMI transmission. Along with HEVC, there is also Google’s VP9 compression format, less popular but increasingly common in newer 4K TVs. In addition, HDCP 2.2 has become a mainstream feature of all newer 4K TVs as a content protection mechanism against piracy. In fact, no modern 4K set-top box or Blu-ray player will work with a 4K TV that lacks both HEVC and HDCP 2.2 support.
Down the road, we’ll probably see even further 4K connectivity developments which involve new versions of HDMI, DisplayPort, USB-C and others.
Wide Color Gamut and Bit-Depth
Wide color gamut (WCG) is one of the real cutting-edge technologies to have developed around consumer 4K content and display devices in just the last year and a half or so. In part, this technology goes hand in hand with the broader color standards of 4K HDR described above: HDR essentially requires WCG, but WCG doesn’t have to come with HDR’s broader range between bright and dark. In essence, WCG means a broader, richer and more finely gradated color space that displays far more colors than was previously possible.
Currently, the general WCG standard for 4K HDR TVs requires coverage of more than 90% of the DCI-P3 color space, which is itself a subset of the larger Rec.2020 color space now being adopted for the next generation of digital content. Because DCI-P3 is only a subset of Rec.2020, most TV makers quote their HDR models’ DCI-P3 coverage: saying that a display covers 95% of that color space sounds a lot more impressive than stating only 72% Rec.2020 coverage. However, the long-range aim of wide color gamut is exactly that: full coverage of the entire Rec.2020 color space. This, however, is still at least a few years away.
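The relationship between those two percentages can be sanity-checked from the published primary coordinates of each color space. The sketch below compares the two gamut triangles in the CIE 1931 xy plane using the shoelace area formula; note this is only an approximation, since area ratios in xy don’t map exactly onto perceived gamut size:

```python
# Compare gamut triangle areas in the CIE 1931 xy chromaticity plane.
# Primary (x, y) coordinates are the published Rec.2020 and DCI-P3 values.
def triangle_area(p1, p2, p3):
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1*(y2 - y3) + x2*(y3 - y1) + x3*(y1 - y2)) / 2

# Red, green, blue primaries for each standard
rec2020 = triangle_area((0.708, 0.292), (0.170, 0.797), (0.131, 0.046))
dci_p3  = triangle_area((0.680, 0.320), (0.265, 0.690), (0.150, 0.060))

print(f"DCI-P3 covers ~{dci_p3 / rec2020:.0%} of Rec.2020 in xy")
# → DCI-P3 covers ~72% of Rec.2020 in xy
```

So a panel reaching most of DCI-P3 is still only covering roughly seven-tenths of Rec.2020, which is why the same hardware yields a much flashier DCI-P3 number.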
Closely related to wide color gamut is bit depth, the measure of color precision on a 4K TV. All HDTVs and the majority of older SDR (as opposed to HDR) 4K UHD TVs offer 8-bit color, while the newer wide color gamut 4K HDR TVs offer 10-bit color depth. The difference may seem small at a glance, but it scales exponentially. While an 8-bit TV offers 256 possible shades of each primary pixel color, a 10-bit TV offers 1024 shades of each. Blend those extra shades together and you get TVs capable of displaying over 1 billion colors instead of the 16.7 million that 8-bit TVs can deliver. How does this work? Quite simply: in an 8-bit TV, 256 red x 256 green x 256 blue amounts to 16.7 million combinations, while in a 10-bit TV, 1024 red x 1024 green x 1024 blue amounts to over 1.07 billion colors.
Just as full Rec.2020 color space coverage is coming, we’re also going to see an across-the-board upgrade to 10-bit color in all new 4K content and TVs, and from there the real aim is even more ambitious 12-bit color for future generations of 4K HDR TVs. This will mean 4096 shades per primary color.
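The arithmetic behind all three of these bit depths follows one simple rule: each primary gets 2 to the power of the bit depth shades, and the total palette is that number cubed. A quick sketch:

```python
# Shades per primary and total displayable colors for a given bit depth.
def color_counts(bits):
    shades = 2 ** bits          # levels per primary (red, green, blue)
    return shades, shades ** 3  # total R x G x B combinations

for bits in (8, 10, 12):
    shades, total = color_counts(bits)
    print(f"{bits}-bit: {shades} shades per primary, {total:,} colors")
# → 8-bit:  256 shades per primary, 16,777,216 colors  (~16.7 million)
# → 10-bit: 1024 shades per primary, 1,073,741,824 colors (~1.07 billion)
# → 12-bit: 4096 shades per primary, 68,719,476,736 colors (~68.7 billion)
```

Each 2-bit step thus multiplies the total palette by 64, which is why the jump from 8-bit to 10-bit looks so dramatic on paper.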
Next-Generation Audio
Along with the visual aspects of next-generation home entertainment, there is also the audio part of the equation, and it’s advancing wonderfully too. Currently, one of the best broadcast surround sound standards is 7.1-channel Dolby Digital Plus. Now, however, the same group behind it, Dolby, has taken things in a new direction.
Their solution is called Dolby Atmos, and while it was originally designed for cinematic audio on the big screen, we’re now starting to see it arrive in the home theater, albeit slowly for now. Dolby Atmos offers what is called object-based audio, and the resulting experience is sound delivered around the whole room in a crisper, much more realistic and immersive way. This is achieved through a wider range of supported speaker positions.
With Dolby Atmos, what you get is support for as many as nine floor-level speakers, four height channels and an LFE channel designed for a subwoofer. Dolby Atmos support is already here for a small but expanding selection of 4K content, and it is also being integrated into a growing range of 4K TVs and media devices.
Beyond 4K UHD Resolutions
Finally, we come back around to resolution and its exciting future prospects. Obviously enough, 4K ultra HD at 3840 x 2160 pixels isn’t where things are going to end. Just as Full HD 1920 x 1080 was replaced as THE premium resolution of high-end digital displays, so too will 4K resolution be replaced by the next level of ultra HD. Most directly, this will likely mean the arrival of 8K displays at a resolution of 7680 x 4320 or higher. While not formally “true” 8K resolution, this is the standard the industry generally accepts as 8K, and some sources even call it UHD-2 (with 4K being UHD-1, obviously enough).
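The scale of that jump is easy to underestimate: each step in this resolution ladder quadruples the pixel count, as a quick sketch shows:

```python
# Total pixel counts for each generation of display resolution.
resolutions = {
    "Full HD": (1920, 1080),
    "4K UHD":  (3840, 2160),
    "8K UHD":  (7680, 4320),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

print(pixels["8K UHD"] // pixels["4K UHD"])   # → 4 (8K is four 4K screens' worth of pixels)
print(pixels["8K UHD"] // pixels["Full HD"])  # → 16 (and sixteen Full HD screens' worth)
```

That 33-megapixel-per-frame payload is exactly why the transmission and compression hurdles discussed above loom so much larger for 8K than they did for 4K.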
Obviously, given the difficulties that even 4K UHD resolution still presents to display developers, content providers and, most of all, data transmission technologies, the much more data-intensive resolution of 8K is still quite a ways away from widespread commercial adoption. Inroads are already being made, however: the Japanese public broadcaster NHK is planning a wide rollout of commercial 8K broadcasts to public and private screens in time for the 2020 Tokyo Olympic Games.
Several major TV manufacturers have also revealed their own prototype 8K TVs recently, though none of these models are yet feasible as consumer products.
However, as the true near-future maximum in crystal-clear digital resolution, 8K is definitely coming, though you need not worry about getting rid of your 4K TV for a few years yet at least.
Story by 4k.com