The transition to 4K (or more specifically, media content and displays with a resolution of 3840×2160 pixels, a/k/a Ultra HD) is happening with incredible speed. Consider that the first 4K LCD display sold in the United States was an 84-inch monster, manufactured by LG Display and private-labeled by Sony, LG, JVC, and Toshiba, and it retailed for anywhere from $17,000 to $25,000.
Flash forward just four years, and you can buy a 60-inch Ultra HDTV with the latest “smart” features for less than $700. That’s just 3.5% of the average price for that first 84-inch set. In fact, there is so little profit in manufacturing LCD panels and TVs with 1920×1080 resolution that factories are winding down those production lines in favor of “4K” displays – the demand for which is increasing by double digits each year.
And what exactly does that have to do with interfacing? Along with this surge in interest about all things 4K, the marketing folks have gotten out ahead of the crowd once again, throwing the phrases “4K compatible” and “4K certified” around like confetti at a New Year’s Eve party.
Sounds good – but what does it really mean? Let’s zero in on the most misunderstood part of the 4K signal chain – the display interface. And going forward, let’s not focus so much on the image or display resolution as the calculated data rate through the physical interface. (In numbers, there is truth!)
For better or worse, our industry relies on HDMI for the vast majority of its signal management products. And the vast majority of those HDMI interfaces are versions 1.3 or 1.4, the difference being support for 3D imaging in the latter version. Both versions are capped at 10.2 gigabits per second (Gb/s), and are indeed capable of passing a signal with 3840×2160 pixels – but with limitations.
First off, if that Ultra HD signal is in the RGB (4:4:4) color format, it can’t have a refresh rate higher than 30 Hz or a color bit depth greater than 8 bits per color. Such a signal requires a data rate of 8.91 Gb/s with a pixel clock of 297 MHz. That data rate is just slightly higher than what a 1920×1080 signal with 16-bit RGB color requires (8.019 Gb/s) – which gives you an idea of just how much more data is carried in a “4K” video frame!
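To see where that number comes from, here is a minimal back-of-the-envelope sketch (Python, assuming HDMI’s three TMDS data channels and 8b/10b encoding, which sends every 8-bit symbol as 10 bits on the wire) that reproduces the 8.91 Gb/s figure:

```python
# Rough TMDS data-rate estimate for HDMI: pixel clock x 3 data channels x 10 bits.
def tmds_data_rate_gbps(pixel_clock_mhz, bits_per_color=8):
    clock_scale = bits_per_color / 8.0          # deep color scales the TMDS clock
    return pixel_clock_mhz * clock_scale * 3 * 10 / 1000.0

# 3840x2160 @ 30 Hz, 8-bit RGB (4:4:4) uses a 297 MHz pixel clock
print(tmds_data_rate_gbps(297))                 # 8.91 Gb/s -- under the 10.2 Gb/s cap
```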
But video content isn’t usually encoded in the RGB format – instead, it arrives as a YCbCr signal, with “Y” standing for luminance and “Cb” and “Cr” for the color difference signals. Indeed, the most common video coding format uses 8-bit color and 4:2:0 chroma subsampling. And that type of signal requires only half the bit rate of a comparable 4:4:4 signal.
There are no tricks here. If an interface is fast enough to pass 2160p/30 4:4:4, then it is also fast enough to pass 2160p/60 4:2:0. Hence you will see the designation “4K60” on some models of HDMI switchers, distribution amplifiers, and signal extenders, and more often than not, it means the product is still using HDMI version 1.3/1.4 interfaces.
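As a quick sanity check – a hypothetical sketch that counts only the active-pixel payload and ignores blanking intervals and TMDS encoding overhead – the two formats really do carry the same amount of picture data:

```python
# Average samples per pixel for each chroma sampling scheme:
# 4:4:4 keeps all three components per pixel; 4:2:0 averages out to 1.5.
SAMPLES_PER_PIXEL = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def active_payload_gbps(width, height, refresh_hz, bits_per_color, sampling):
    bits_per_pixel = bits_per_color * SAMPLES_PER_PIXEL[sampling]
    return width * height * refresh_hz * bits_per_pixel / 1e9

print(active_payload_gbps(3840, 2160, 30, 8, "4:4:4"))   # ~5.97 Gb/s
print(active_payload_gbps(3840, 2160, 60, 8, "4:2:0"))   # ~5.97 Gb/s -- identical
```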
Nothing has really changed – the specs have just been re-interpreted to show that a rudimentary “4K” signal can pass through the circuits. And presumably the signal management device will also pass EDID from an Ultra HD display back to the source.
But there’s something more to the story! A newer version of HDMI (2.0) was announced in September of 2013, although it has yet to be rolled out on a wide basis in our industry. HDMI 2.0 has a maximum pixel clock rate of 600 MHz, which means the interface can pass a 3840×2160 signal with a 60 Hz refresh rate and 8-bit 4:4:4 color. Or, it can pass 2160p/60 with 10-bit 4:2:0 color. The former signal has a data rate of about 17.82 Gb/s.
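Running the same back-of-the-envelope TMDS arithmetic as before (assuming the standard 594 MHz pixel clock for 2160p/60 8-bit 4:4:4 and the same 8b/10b encoding overhead) reproduces that figure:

```python
# 3840x2160 @ 60 Hz, 8-bit 4:4:4 uses a 594 MHz pixel clock
pixel_clock_mhz = 594
data_rate_gbps = pixel_clock_mhz * 3 * 10 / 1000.0   # 3 TMDS channels x 10 bits each
print(data_rate_gbps)   # 17.82 Gb/s -- far beyond HDMI 1.3/1.4's 10.2 Gb/s cap
```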
Trouble is, signal management products with this newer interface are also listed as “4K60” devices, even though they’re much faster than their siblings. You need to scrutinize the technical specifications carefully to determine which version of HDMI is in use and what the fastest color mode is – 4K60 4:2:0, or 4K60 4:4:4.
Another wrinkle is the new High-bandwidth Digital Content Protection (HDCP) version, 2.2. Unlike older versions, 2.2 polls for keys very quickly, and if they are not presented and exchanged within 20 milliseconds (ms), the content will shut down and you’ll have a blank screen. Or, the media player may down-rez the output to 1920×1080 instead – a fine kettle of fish if you’ve just picked up a new Ultra HDTV, only to discover you can’t actually watch movies from your UHD Blu-ray player in full 4K. (Yes, those UHD players are available, and yes, this scenario has actually occurred!)
Here’s another wrinkle – HDCP and HDMI are independent of each other. Adding to the confusion, you will find signal management products that support HDCP 2.2 but only use HDMI version 1.3/1.4 for the physical connection. That’s because the output of a UHD Blu-ray player can be 24 Hz or 60 Hz, and while the latter won’t pass through HDMI 1.3/1.4, the former most certainly will. (Although some display devices may not recognize that signal format, which throws another monkey wrench in the works.)
None of this 4K branding is necessarily deceptive or misleading, so long as the technical specifications for the product clearly spell out which version of HDMI is in use and/or list the maximum resolution and color bit depth that the product can switch, distribute, or pass. And we’re counting on manufacturers doing just that in the interest of clarity.
Summing up, the transition to Ultra HD can be both exciting and confusing as heck. But all you really need to remember is this: Do the math. Read the specs. And if you plan on transporting Ultra HD (or even cinema 4K, 4096×2160) signals through your system, make sure ALL signal management products in the chain are compatible with the refresh rate and color bit depth in use…not to mention HDCP 2.2 and EDID support.
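To put “do the math” into practice, a quick chain check might look something like this (the device names and figures below are purely hypothetical placeholders):

```python
# Every device in the chain must support at least the data rate the source format needs.
REQUIRED_GBPS = 17.82   # e.g., 2160p/60, 8-bit 4:4:4 over HDMI 2.0

chain = {
    "media player output":   18.0,   # HDMI 2.0-class port
    "matrix switcher":       10.2,   # HDMI 1.3/1.4-class port -- the weak link
    "twisted-pair extender": 18.0,
    "display input":         18.0,
}

for device, max_gbps in chain.items():
    verdict = "OK" if max_gbps >= REQUIRED_GBPS else "TOO SLOW"
    print(f"{device:22s} {max_gbps:5.1f} Gb/s  {verdict}")
```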
Caveat emptor!