Gradients on TVs
Color bit depth

What it is: How finely levels of color can be displayed.
When it matters: Details in shadows, sky and skin tones. Matters more for HDR content.
Score components: Subjectively assigned

A color gradient is a stretch of gradually changing color. For example, the far left side might be a dark green that progressively changes to a lighter green on the right. If a TV is able to display a gradient smoothly, that means it is able to capture small differences in color and is therefore good at reproducing details in color. More detailed color is one of the promises of HDR media, so it’s worth getting a TV that performs well on this test if you’re interested in HDR.

For this test, we display an image with multiple horizontal color gradients and evaluate how smoothly the TV is able to reproduce all of them. We use this result to determine the color bit depth of the TV’s panel.

Test results

When it matters

No banding on sky
Significant banding on sky

Performance with gradients illustrates how well-equipped a TV is to reproduce fine details in color. In particular, a good performer on this test should have much less of the banding that can often be seen on wide spaces of gradually changing color. Compare a TV that performed well in this area (above-left) with a TV that performed poorly in this area (above-right). Since detailed color is meant to be one of the benefits of HDR video, the results of this test are quite important for people interested in that kind of media.

Poor performance with gradients is unlikely to be a deal-breaker for most, so this isn’t the most important category we test. However, as you can see from the side-by-side comparison above, there is a noticeable benefit to a TV that does well with capturing details in color, so if you want to watch HDR video, it’s still worthwhile to get a TV that performs well.

Our tests


Sony X850D - Good gradient - less banding
LG UH8500 - Worse gradient - more banding

Our picture test captures the appearance of gradients on a TV’s screen. This is meant to give you an idea of how well the TV can display slight differences in color, with worse reproduction taking the form of bands of color in the image. Note that because this photo's appearance is limited by the color capabilities of your computer, screen, browser, and even the type of file used to save the image, banding that is noticeable in person may not be as apparent in the image. Above, you can compare good gradient reproduction (left) with worse reproduction (right). If you look closely, you can see more obvious banding in the right image (particularly in the green gradients).

To evaluate gradient reproduction, we take a photo of our gradient pattern in a pitch black room, with the following camera settings: F4.0, ISO-200, 1/15 sec shutter time. The image file we use is a ‘.tiff,’ as most typical image files (JPG, PNG, etc.) either don’t support 10-bit color or don’t support it well.

To display the image, we connect our test PC to the TV via HDMI, with the signal output by an Nvidia GTX 1060 6GB graphics card. We display our gradient test image via the Nvidia ‘High Dynamic Range Display SDK’ program, as it is able to output a 1080p @ 60 Hz @ 10-bit signal, bypassing the Microsoft Windows environment, which is limited to 8-bit color.

After determining the highest possible color depth of the TV (see test below), we take a photo of the TV’s screen while it is displaying our gradient test image at that color depth.

Color depth

What it is: Number of bits per pixel to represent a specific color. Note: we consider 8-bit with dithering to be equivalent to 10-bit, as long as the 10-bit gradient looks smooth.
When it matters: HDR content, like UHD Blu-rays. Won't matter for cable TV, regular Blu-ray movies, video game consoles, or content displayed from a Windows PC, all of which are limited to 8-bit color.
Good value: 10-bit.
Noticeable difference: 1 bit.

Color depth is the number of bits of information used to tell a pixel which color to display. A TV with 10-bit color depth uses 10 bits for each of the three subpixels of every pixel, compared to the standard 8 bits. This allows a 10-bit TV to display many more colors: 8-bit TVs can display 2^(8×3) colors (16.7 million), versus 2^(10×3) (1.07 billion) for 10-bit. The images below provide an idea of what difference this makes.

LG UH7700 - 10-bit gradient - very little banding
Vizio D-series 4k 2016 - 8-bit gradient - obvious banding
8-bit vs 10-bit color gradation

10-bit color is capable of capturing more nuance in the colors being displayed because there is less of a 'leap' from one unique color to the next.
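The color counts quoted above follow directly from the bit math; a quick sketch to check the arithmetic:

```python
# Each pixel has three subpixels (red, green, blue), each driven by n bits,
# so the total number of representable colors is 2^(n*3).
def total_colors(bits_per_subpixel: int) -> int:
    return 2 ** (bits_per_subpixel * 3)

print(total_colors(8))   # 16,777,216 ("16.7 million")
print(total_colors(10))  # 1,073,741,824 ("1.07 billion")
```

Note that the jump from 8-bit to 10-bit multiplies the palette by 2^6 = 64, which is why the step between adjacent colors shrinks so noticeably.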

A 10-bit display is only useful if you are watching 10-bit content, which is still rare. Currently, almost everything is 8-bit, including Windows and game consoles. HDR takes advantage of 10-bit color, so getting a TV that supports it is only important if you intend to watch HDR video.

We verify color depth while performing our picture test. Using the Nvidia ‘High Dynamic Range Display SDK’ program, outputting a 1080p @ 60 Hz @ 12-bit signal, we display our 16-bit gradient test image, then analyze the displayed image and look for any sign of 8-bit banding. If we do not see any, the TV supports 10-bit color. We do not differentiate between native 10-bit color and 8-bit color + dithering, because we score the end result: how smooth the gradient is.

Note: Current TVs max out at 10-bit color, but sending a 12-bit signal gives the TV's processing (like white balance adjustments) headroom to work without adding banding.
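The inspection above is done by eye, but the idea behind it can be sketched in code: scan one row of a displayed 10-bit gradient and look at the step size between adjacent distinct values. Wide, regular jumps suggest the panel collapsed the signal to 8 bits. This is a simplified, hypothetical illustration (the `looks_8bit` helper and threshold are ours, not part of the actual test), and it ignores camera noise, which is part of why the real check is done visually.

```python
# A 10-bit ramp steps by 1 between adjacent levels; an 8-bit panel showing
# the same ramp steps by 4 (one 8-bit level spans four 10-bit levels).
def looks_8bit(row, jump_threshold=3):
    steps = [abs(b - a) for a, b in zip(row, row[1:]) if b != a]
    return any(s >= jump_threshold for s in steps)

smooth = list(range(1024))                 # ideal 10-bit ramp: steps of 1
banded = [v // 4 * 4 for v in smooth]      # same ramp collapsed to 8-bit steps
print(looks_8bit(smooth), looks_8bit(banded))  # False True
```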


Our gradient score is based on a subjective analysis of the visible banding in the gradient test picture. The test picture is divided into 24 distinct images, each of which is visually analyzed by one of our staff and scored according to whether or not it shows visible banding (including banding from an 8-bit panel). The final gradient score is calculated from these results. The maximum rating for an 8-bit TV is 9.0, since a 1.0-point penalty is automatically applied to any 8-bit TV showing visible 8-bit gradient banding.
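As a rough sketch of how such a score could be computed, assuming an even weighting of the 24 sub-images and the 1.0-point penalty described above (the function and weighting are illustrative assumptions, not the exact formula used):

```python
# Hypothetical scoring sketch: 24 sub-images, each flagged True if it shows
# visible banding; 8-bit panels with visible 8-bit banding lose 1.0 point,
# which caps them at 9.0.
def gradient_score(banding_flags, is_8bit_with_banding):
    assert len(banding_flags) == 24
    clean = banding_flags.count(False)
    score = 10.0 * clean / len(banding_flags)
    if is_8bit_with_banding:
        score -= 1.0                      # automatic penalty, so max is 9.0
    return round(max(score, 0.0), 1)

print(gradient_score([False] * 24, False))  # perfect panel: 10.0
print(gradient_score([False] * 24, True))   # perfect-looking 8-bit: capped at 9.0
```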

Generally, a TV supporting 10-bit color should score higher than a TV that only supports 8-bit color, but this is not always true. Some 10-bit TVs struggle to display gradients smoothly, and some 8-bit TVs are very good.

Additional information

Banding in gradients

Two things happen with ‘banding’: colors that are only fairly similar are made to look very dissimilar, and very similar colors that are meant to be reproduced uniquely are grouped together and made to look the same. This combination results in the appearance of bands of colors on the screen. Image processing can also create banding.

Therefore, if you see lots of banding in a gradient, it means one of three things:

  • The signal isn’t carrying enough bits to differentiate lots of similar colors.
  • The screen’s bit depth is not high enough to follow the detailed instructions of a high-bit-depth signal.
  • The TV’s processing is introducing color banding.

With a high bit-depth signal played on a TV that supports it (and minimal processing enabled), more information is being used to determine which colors are displayed. This allows the TV to differentiate between similar colors more easily, and thereby minimize banding.
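The first two causes above are both quantization: a smooth ramp of color gets snapped to the nearest representable level, and fewer bits means fewer levels, so neighboring pixels collapse into flat bands. A minimal sketch of that effect:

```python
# Quantize a smooth 0..1 ramp to n bits: only 2**n distinct output levels
# survive, so runs of adjacent pixels share one value and form "bands".
def quantize_ramp(width: int, bits: int) -> list:
    levels = 2 ** bits
    return [round(x / (width - 1) * (levels - 1)) for x in range(width)]

ramp_8 = quantize_ramp(4096, 8)     # collapses to 256 distinct values
ramp_10 = quantize_ramp(4096, 10)   # keeps 1024 distinct values
print(len(set(ramp_8)), len(set(ramp_10)))
```

Across the same 4096-pixel-wide ramp, the 8-bit version has bands four times wider than the 10-bit one, which is exactly the stair-stepping you see in the banded sky photo above.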


There are two kinds of dithering, both of which can simulate the reproduction of colors:

  • Spatial dithering. It is done by sticking two different colors next to each other. At a normal viewing distance, those two colors will appear to mix, making us see the desired target color. This technique is often used both in print and in movies.
  • Temporal dithering. Also called Frame Rate Control (FRC). Instead of sticking two similar colors next to each other, a pixel will quickly flash between two different colors, thus making it look to observers like it is displaying the averaged color.

With good dithering, the result can look very much like higher bit-depth, and many TVs use this process to smooth out gradients onscreen. 8-bit TVs can use dithering to generate a picture that looks very much like it has 10-bit color depth.
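Temporal dithering in particular is easy to illustrate: since each 8-bit level spans four 10-bit levels, a panel can hit an "in-between" color by showing the higher neighboring level for the right fraction of frames. This is a simplified sketch (the frame pattern and helper name are illustrative, and real FRC implementations vary by manufacturer):

```python
# Approximate a 10-bit level on an 8-bit panel by flashing between the two
# nearest 8-bit levels so the time-average over 4 frames matches the target.
def frc_frames(target_10bit: int, n_frames: int = 4) -> list:
    low, frac = divmod(target_10bit, 4)   # 10-bit = 8-bit level * 4 + remainder
    high = min(low + 1, 255)
    pattern = [high] * frac + [low] * (4 - frac)
    return (pattern * (n_frames // 4 + 1))[:n_frames]

frames = frc_frames(513)                  # 10-bit level 513 = 128.25 in 8-bit
print(frames, sum(frames) / len(frames))  # flashes 129 once per 4 frames
```

At normal frame rates the flashing is far too fast to see, so the eye integrates the frames into the intended in-between color, which is why good FRC can pass for native 10-bit.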

How to get the best results

Both the signal and the screen need a high bit depth for more detailed color, which means that for minimal banding you must watch a 10-bit media source on a TV with a 10-bit panel.

When watching HDR media from an external device, like a UHD Blu-ray player, you should also make sure that the enhanced signal format setting is enabled for the input in question. Leaving this disabled will result in banding.

If you have followed these steps and still see banding, try disabling any processing features that are enabled. Features like ‘Dynamic Contrast’ and 2 pt./10 pt. white balance calibration settings can introduce banding into the image.

Related settings

  • Wide color gamut: Allows a TV to display a wider range of colors than average. This is one of the other important elements of HDR and should be enabled with that kind of media. Regular media will become oversaturated with this setting. This setting operates independently of bit-depth, but enabling WCG can accentuate poor gradient reproduction. Learn more about color gamuts
  • Peak brightness: Makes highlights in an image extra bright. This is one of the other important elements of HDR and should be enabled with that kind of media. Often bundled with local dimming, so enabling can sometimes introduce light blooming into darker portions. This setting operates independently of bit-depth, but enabling peak brightness can exaggerate poor gradient reproduction. Learn more about peak brightness
  • Enhanced HDMI signal format: Allows for wider bandwidth signals to be received by the TV. HDR signals exceed the 10 Gbps bandwidth of typical HDMI signals, and it is necessary to enable this setting in order to prepare the TV for the wider bandwidth.
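A back-of-envelope calculation shows why HDR signals need the enhanced format: counting active pixels only (and ignoring blanking intervals and encoding overhead, so real link rates are higher still), a 4k @ 60 Hz @ 10-bit full-chroma signal already exceeds 10 Gbps.

```python
# Raw pixel data rate: width * height * fps * bits-per-subpixel * 3 subpixels.
# This understates the true HDMI link rate, which also carries blanking.
def signal_gbps(width, height, fps, bits_per_subpixel):
    return width * height * fps * bits_per_subpixel * 3 / 1e9

print(signal_gbps(3840, 2160, 60, 8))   # ~11.9 Gbps
print(signal_gbps(3840, 2160, 60, 10))  # ~14.9 Gbps, well past a 10 Gbps link
```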

Other notes

  • Pretty much everything is 8-bit. Windows, OSX, JPGs, video games, etc. 10-bit media is very rare.
  • For computers, only specific, professional graphics cards are guaranteed to be able to output a 10-bit signal. The Nvidia Quadro and AMD FirePro lines both support 10-bit, so if you need that capability with your PC, you should get one of those. It is possible to force 10-bit with regular high-end graphics cards that support HDR, but this is dependent on each model and brand, so your mileage may vary.
  • Some TVs will exhibit a lot of banding when the color settings are calibrated incorrectly. If you have changed the white balance or color settings and find your TV has banding, try restoring those settings to defaults and see if that solves the problem.


A TV’s reproduction of a color gradient indicates how well it can display details in color. It’s an important part of HDR pictures, so if you want something that will handle HDR well, you should make sure to get a TV that does well on this test. For this test, we determine a TV’s maximum color depth, photograph a gradient test image displayed at that color depth, and then assign a score based on how well the test image was reproduced.

For best results with color depth, you should get a TV that is capable of displaying 10-bit color, and then play HDR media on that TV. If you meet those requirements and still experience banding, try disabling any processing features that you still have turned on, as those can lead to banding as well.


Questions & Answers

Hello, I just purchased a Samsung 6300 series TV, it has HDR 10 with an 8 bit panel, would that be upgradeable to a 10 bit panel via firmware or its a hardware only?
This is a hardware difference and won't be changed by firmware.
Are you guys going to run this 10-bit test on last year's TVs? They're still relevant to the HDR discussion (thinking of the X930C in particular), so I know I'd be interested in your results!
Thanks again for all the great info and hard work :-)
We just performed it on many of the 2015 TVs we still have around the office. The X930C has a 10-bit panel, and was very good at producing a smooth gradient.
Do you have any recommendations on how to play 10 bit content over a Windows network on a Samsung KS8000 series TV? I have a Windows 10 HTPC hooked up with a GTX 950 graphics card, but I am forced to use 8 bit color because I prefer the text quality of 4:4:4. I have a Plex server and the application installed, but I do not know if it is actually displaying the video without transcoding. I know I can move content to a USB drive and play it directly, but I was wondering if there is a way to stream that content over the network to ensure it is being played at its native 10 bit color.
Plex appears to work with 10 bit content over the network, without transcoding. Ensure you're using the HEVC codec in an MP4 or MKV container.
Do you test the gradients with a YCbCr signal like a blu-ray player outputs or a RGB signal like a PC outputs? I've heard that many TV's don't display RGB as smoothly as YCbCr. Is that true?
Movies are usually rendered in YCbCr 4:2:0, so we do our gradient test using YCbCr. A Blu-ray player might render gradients better by converting to RGB first, though, so the better setting really depends on the source's capabilities. In the end, both settings look very close.
How are you able to tell the difference between an 8-bit panel using either type of dithering vs a true 10-bit panel when running your gradient test? If the dithering is good enough, won't you label it as a 10-bit even if it is really an 8 using dithering?
That's correct; we consider an 8-bit panel with good dithering to be equivalent to 10-bit. Some 10-bit panels struggle to display gradients smoothly, so the score is based on how well each panel can display the gradient.
I have the Sony X930C and it is showing serious banding in gradients. Is it the source or the TV? Can some units of the same TV have worse banding than others?
This is due to the source. Ensure you are watching a 'Deep Color' Blu-ray or HDR source for best performance. TVs can display a smooth gradient if they have a 10-bit panel and processing, or support dithering for the extra 2 bits. This depends on internal components and doesn't vary between units.