What’s a 10-bit TV, and is it better?
Starting this year you’ll be able to buy 10-bit LCDs. Well, true 10-bit. Excited?
Would it help if we talked about what that means? Probably.
Here’s why 10-bit LCDs will be (and are) cool, and what they mean for the future of TVs.
Nearly all current TVs are “8-bit.” Some may advertise that they’re more, but they’re all inherently 8-bit, as that’s what our TV system is based on.
What this means is the TV is capable of 256 shades of each of the three primary colors: 256 shades of red, 256 shades of green, and 256 shades of blue. (Standard video actually uses slightly fewer than 256 levels, but let’s stick with that to make our math easier.)
So with our current TV system, it means there’s a possible 16.8 million colors (256x256x256=16,777,216). That may seem like a lot, but it’s actually not. The human eye can see way more.
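For the curious, that back-of-the-envelope math is easy to check in a couple of lines of Python:

```python
# 8 bits per primary color: how many shades, and how many total colors?
shades = 2 ** 8       # 256 shades each of red, green, and blue
total = shades ** 3   # every combination of the three primaries
print(shades, total)  # 256 16777216 -- our familiar "16.8 million colors"
```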
The biggest limitation of our 8-bit system is its limited dynamic range. Which is to say, if you want to take a shot from inside a dark room looking out at the bright exterior, your choices are making the interior of the room visible, but blowing out the windows into white blobs, or exposing the windows correctly so you can see outside, but making the room a dark mess of shadow.
The other limitation is with gradations and banding. Imagine a sunset, with what should be a smooth transition of colors from top to bottom. In the early days of HD, many TVs would exhibit obvious bands between colors. While this was often caused by some aspect of the TV not being a full 8-bit, banding can still be visible with certain source material on certain TVs.
Enter 10-bit, and HDR
The big push for 10-bit panels is due to the advent of High Dynamic Range content. This 10-bit content can have more detail in the bright areas of the image, and/or more details in the dark parts of the image. It can also have a greater range between the brightest and darkest parts of the image.
To use our example from before, in theory with HDR content, you could have the dark room while still seeing what was outside at the same time. Done right, the image looks far more realistic.
It also means more shades of color: 1,024 shades per primary, so 1,024×1,024×1,024 = 1,073,741,824. Yep, over a billion colors. Potentially.
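Run the same quick math at 10 bits per primary and you can see how big the jump from 16.8 million really is:

```python
# 10 bits per primary color: four times the shades, 64x the total palette.
shades = 2 ** 10     # 1,024 shades each of red, green, and blue
total = shades ** 3  # 1,073,741,824 -- just over a billion colors
print(f"{shades:,} shades per primary, {total:,} total colors")
```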
But what about…
For several years some TVs and computer monitors have been “faking” 10-bit color. It wasn’t true 10-bit as there was no 10-bit source material. Essentially what they’d do is flash two adjacent colors, and your brain would think there was a color shown in between those two. It’s (oddly) called Frame Rate Control (FRC), a form of dithering.
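The trick behind FRC is easier to see with a toy example. The numbers below are made up for illustration: a target shade that falls between two 8-bit values is approximated by alternating those values across frames, so your eye averages them into an in-between shade.

```python
# Toy sketch of Frame Rate Control (FRC), a form of temporal dithering.
# Each 8-bit step spans four 10-bit steps, so a 4-frame cycle can fake
# the in-between values an 8-bit panel can't actually display.
target = 100.25                      # desired shade, between 8-bit steps
frames = [100, 100, 100, 101]        # 8-bit values flashed over 4 frames
average = sum(frames) / len(frames)  # what the eye perceives over time
print(average)                       # 100.25
```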
Done right, the new TVs will have true 10-bit panels, and every step in the electronics chain, from the HDMI input to the panel itself, will be 10-bit.
It’s a short list of TVs that will support HDR this year, specifically: Samsung’s JS9500 series LED LCD TVs; Sony’s X930C and X940C series; LG’s EG9600 series OLED TVs; Panasonic’s TC-65CX850U; and Vizio’s Reference Series TVs. As to which use true 10-bit panels, the manufacturers are tight-lipped. We’ll update as soon as we know more. So how well they handle HDR, and how much of an improvement they show, remains to be seen at this point. This is cutting-edge technology, and there are bound to be growing pains.
As the cost of production of 10-bit TVs drops, expect to see more, and cheaper, models in the coming years.
The real benefit of all this is to get us HDR content. The reduced banding and extra shades of color are great, but that’s just the icing. The cake is the much more lifelike image possible with HDR.
Have a question for HD Guru? Email us.
Copyright ©2015 HD Guru Inc. All rights reserved. HD GURU is a registered trademark.