HDR vs SDR
Is it worth upgrading to a new TV?
When comparing today's TVs to the models of yesteryear, the biggest difference between the two is the widespread ability to read an HDR signal. The question then arises: how big is the impact of HDR? Does it make previous TVs obsolete?
Unlike the upgrade from 1080p to 4k, the difference between HDR and SDR doesn't depend on the environment the TV is watched in. The biggest factor is the picture quality of the TV itself, since HDR is essentially a new, more precise way of describing what the TV needs to display.
What is HDR?
HDR is an initialism for High Dynamic Range. The term has been around for a long time, but nowadays, when talking about HDR video, it is entirely about metadata. So what is metadata? HDR metadata is simply additional information sent alongside the video signal: complete descriptions of color and brightness that the TV can read and then display precisely. This metadata is distributed through two different standards, HDR10 and Dolby Vision. While reading this metadata is the only requirement to support HDR, TVs also need to meet other performance requirements to actually make use of the additional information and show a difference.
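To make the idea of metadata more concrete, here is a rough sketch of the kind of information HDR10's static metadata carries (mastering display description per SMPTE ST 2086, plus content light levels). The field names and example values are illustrative only, not any decoder's actual API:

```python
from dataclasses import dataclass

@dataclass
class HDR10StaticMetadata:
    # Mastering display description (SMPTE ST 2086):
    # color primaries and luminance range of the display the content was graded on
    red_primary: tuple      # CIE 1931 (x, y) chromaticity
    green_primary: tuple
    blue_primary: tuple
    white_point: tuple
    max_luminance: float    # cd/m² (nits)
    min_luminance: float
    # Content light levels:
    max_cll: int            # brightest single pixel in the stream, in nits
    max_fall: int           # highest frame-average light level, in nits

# Example values for content mastered on a 1000-nit display with DCI-P3 primaries
meta = HDR10StaticMetadata(
    red_primary=(0.680, 0.320), green_primary=(0.265, 0.690),
    blue_primary=(0.150, 0.060), white_point=(0.3127, 0.3290),
    max_luminance=1000.0, min_luminance=0.0001,
    max_cll=1000, max_fall=400,
)
```

The TV reads this description and decides for itself how to reproduce the content, rather than just receiving a power level to apply.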
Think of it this way: with SDR, a car would be ordered to apply "full throttle" or "50% throttle." With HDR, the car would instead be asked to "go to 120 mph" or "go to 40 mph." Some vehicles would get closer to that target than others, and many wouldn't reach it at all. TVs are the same. In the past, the signal described a level of power, while with HDR, it describes a specific goal.
That goal is often set beyond what current TVs are capable of, requiring levels of brightness and color the display might not be able to reproduce. With HDR, TVs with higher peak brightness and wider color gamuts gain a purpose: flagship models get closer to those targets and offer a more accurate picture than lower-end models with limited performance in these areas.
For the purpose of this test, we compare two different TVs across three scenarios: a high-end TV fed an HDR signal, the same TV sent an SDR signal, and a mid-range HDR TV in HDR mode. For the SDR signal, we used an HDFury Linker to replace the TV's EDID with one that hid HDR support, effectively making the player think it was connected to an SDR TV.
Wide Color Gamut
A TV that supports a wider color gamut can display a palette of colors with more saturation than a standard TV. While this isn't a necessity for HDR, the two go hand in hand. In the past, even if a TV supported a wider gamut, almost all content was produced to fit into a smaller one.
As you can see in this comparison, there is a noticeable difference between HDR and SDR, mostly in the greens and reds, where most of the gamut expansion happened. However, you can also see that this isn't inherently a factor of HDR: there is little difference in color saturation between the SDR picture and the low-end HDR one. Since many budget HDR TVs lack a wide color gamut, they will see no benefit from this aspect of HDR.
Winner: HDR, but only on a TV with a wide color gamut.
Color Depth
Much like color gamut, color depth refers to the different colors a TV can display. The difference between the two can be a bit confusing: color gamut refers to the level of saturation the TV can display, while color depth refers to the number of colors the TV can show within that palette. A limited color gamut would stop a TV from displaying the red of an apple accurately, while limited color depth would make the red gradients on that apple look uneven, with visible steps.
What is commonly called an 8-bit TV can display 256 shades each of red, green, and blue, or about 16.7 million colors in total. This seems like quite a small amount compared to 10-bit TVs, which have 1,024 shades per channel, or 1.07 billion colors.
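The arithmetic behind those figures is straightforward: each channel gets 2^bits shades, and the total is the product of the three channels. A quick sketch:

```python
def total_colors(bits_per_channel: int) -> int:
    """Total displayable colors for a given per-channel bit depth."""
    shades = 2 ** bits_per_channel  # shades per channel (R, G, B)
    return shades ** 3              # every combination of the three channels

print(total_colors(8))   # 16777216 (~16.7 million)
print(total_colors(10))  # 1073741824 (~1.07 billion)
```

So moving from 8 to 10 bits multiplies the number of available colors by 64 (4× per channel, cubed).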
Color depth affects gradients the most: a TV with a lower bit depth has to spread the same gradient over a far smaller number of steps. A limited bit depth can lead to blockiness and uneven gradients, which you can often see on skies, as in the SDR picture above. This is one of the rare cases where the HDR-related features of lower-end TVs find a purpose, since most of them have a 10-bit panel nowadays. Unfortunately, it's the least visually impactful one.
Dynamic Range
Finally, we take a look at dynamic range. This is where HDR TVs show the biggest difference. HDR content takes advantage of a TV's higher brightness capabilities to show lifelike highlights. A TV with limited dynamic range can only display bright highlights by crushing the dark elements, or vice versa; a TV with a higher dynamic range can display more of both at the same time. Peak brightness, contrast, and the quality of the tone mapping have the biggest impact on this aspect.
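Tone mapping is what a TV does when the content asks for more brightness than the panel can deliver. One common approach is a soft roll-off: reproduce luminance faithfully up to a "knee," then compress everything above it into the remaining headroom. The sketch below is a generic illustration, not any manufacturer's actual curve, and the knee placement is an assumption:

```python
def tone_map(nits_in: float, display_peak: float, knee: float = 0.75) -> float:
    """Map content luminance (nits) onto a display with a lower peak.

    Below the knee the signal passes through unchanged; above it,
    highlights are compressed so they approach but never exceed the peak.
    """
    knee_point = display_peak * knee
    if nits_in <= knee_point:
        return nits_in
    excess = nits_in - knee_point          # how far the content overshoots
    headroom = display_peak - knee_point   # brightness the panel has left
    return knee_point + headroom * (excess / (excess + headroom))

# A 4000-nit highlight on a 600-nit panel is compressed, not clipped:
print(tone_map(100, 600))    # midtones pass through unchanged
print(tone_map(4000, 600))   # compressed to just under the 600-nit peak
```

A better panel needs less compression, which is why peak brightness and tone-mapping quality together determine how much highlight detail survives.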
The difference is visible on the X930D, one of the brightest TVs we've tested this year. The amount of detail resolved in the sky and the mountain in the background is better than in the other two examples, all while maintaining detail in the shadows under the cars. The difference can still seem minimal, though, both because it's impossible to fully capture with a camera and because even the best TVs of today barely scratch the surface of what HDR brings forward.
Winner: HDR, but only if peak brightness is high enough to be noticeable.
Conclusion
In the grand scheme of things, HDR is a great advancement in the world of TVs, but it has only just started to gain traction. Current TVs, even the flagship ones, don't make full use of it yet. You won't gain much visually from sending an HDR signal to a mid-range TV, since any picture quality enhancement relies on the capabilities of the set itself. High-end models do see a benefit, but a limited one (see our recommended HDR TVs). If you bought a good TV recently and were thinking about upgrading for HDR support, it's not worth it just yet.
Questions & Answers
In the SDR photo the backlight isn't at max, but is reduced to produce the same average brightness (APL), resulting in deeper blacks. When viewing HDR content it is necessary to use the maximum backlight setting to produce the bright highlights. Due to limitations on the native contrast ratio of any LCD TV, this results in a slightly raised black level (visible in the picture). Ideally, good local dimming should counteract this effect.
While local dimming with an edge-lit backlight typically isn't as good as local dimming with a full-array backlight, and is certainly not as good as the perfect "local dimming" of an OLED, some edge-lit TVs like the X930D still have good local dimming. That said, the edge-lit local dimming of the KS8000 isn't very good. When a very bright area is shown, the black space above and below it is also lit, producing a bright column from the top to the bottom of the screen. This is visible in our local dimming video for the KS8000. However, this "blooming" effect isn't as noticeable during typical movie scenes, and we still recommend leaving local dimming on. The very high native contrast ratio of the KS8000 also helps mitigate this problem.
Overall the KS8000 still provides very good picture quality for HDR content. The LG B6 is a little better but it's not a huge difference.
Your site helped me purchase the MU8000 as it was on sale at BestBuy and it fit my budget perfectly.
HDR TVs still have a few areas they can improve on. Local dimming has been steadily improving year over year, and should be even better in two years. Another problem that remains with HDR (except HLG) is that it's mastered for a dark room, so it's often too dim when viewed in a bright room. So far only Vizio has given users the ability to drastically brighten everything for bright-room viewing, but it's possible other manufacturers will do so in the future. Overall, the MU8000 will likely still be a decent HDR TV in two years, so it's a toss-up whether it's best to keep it or buy cheaper now and save up for future improvements.
Dear team, since you mentioned (most probably correctly) that you consider an 8-bit display with FRC (dithering) to be on par with a 10-bit one in terms of HDR performance, I want to bring your attention to a contradicting statement in this article regarding color depth.
While the observation that lower bit depths may lead to "blockiness and uneven gradients" is certainly not wrong and is easy to witness in practice, it gives the average reader the distorted impression that this is a direct and necessary consequence of the bit depth. That it is not can be demonstrated with dithering: in theory there is no banding even when using only a single bit per color channel. Of course, the image becomes extremely noisy, but banding won't be an issue.
Thus, from my understanding, assuming an ideal dither implementation, a given bit depth doesn't limit the number of possible colors, the color space, or the gamut, but "only" the SNR. The lower the bit depth, the higher the noise floor will be, effectively limiting the dynamic range if one defines it as the ratio of the maximum not-yet-clipping brightness to the darkest parts before they are masked by noise.
At least for audio, this is definitely true (there, the bit depth doesn't determine the loudness levels but only the noise floor when dithering is applied), and since a video signal in its analog form is no different from an audio signal except for its (much) higher bandwidth, it should apply to video in general as well.
Hi and thanks for contacting us.
Since we only look at the final display performance (the way the gradient looks), we don't distinguish between a 10-bit panel and a good 8-bit panel with dithering. If a display can show a smooth 10-bit gradient using an 8-bit panel with FRC, we classify it the same as a native 10-bit panel. Sometimes dithered 8-bit panels even look better than native 10-bit panels when displaying a gradient.
You're correct that dithering theoretically raises the noise floor slightly; however, in practice we haven't seen any TV implementations bad enough for this dithering to be noticeable. This is likely because it happens at a high frequency and between two very similar colors.
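The trade-off described above (banding traded for noise, with the average level preserved) is easy to demonstrate numerically. This minimal sketch quantizes a mid-gray to a single bit per channel, with and without random dither; it is a toy model, not how any TV's FRC actually works:

```python
import random

def quantize(value: float, levels: int) -> float:
    """Plain quantization: round to the nearest level -> visible banding."""
    step = 1.0 / (levels - 1)
    return round(value / step) * step

def dithered_quantize(value: float, levels: int) -> float:
    """Add noise before quantizing: banding becomes noise, averages are preserved."""
    step = 1.0 / (levels - 1)
    noise = random.uniform(-0.5, 0.5) * step
    return quantize(min(max(value + noise, 0.0), 1.0), levels)

random.seed(0)
# A 30% gray quantized to 1 bit (two levels: 0.0 and 1.0)
print(quantize(0.3, 2))  # 0.0 -- every 30% pixel lands on black: a visible band
avg = sum(dithered_quantize(0.3, 2) for _ in range(100_000)) / 100_000
print(avg)               # close to 0.3 -- individual pixels flicker 0/1, but the
                         # area averages back to the intended gray
```

This is why a dithered 8-bit panel can render a smooth 10-bit gradient: the missing levels are reconstructed by the eye averaging nearby pixels (or successive frames, in the case of FRC).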