HDR10 vs Dolby Vision
Which is better?

There are two main HDR formats, HDR10 and Dolby Vision (DV), and they have different approaches to HDR (see HDR vs SDR). Here are the different ways these formats deal with the key aspects of HDR.


HDR10

What it is: Open standard for HDR.

Dolby Vision

What it is: Proprietary standard for HDR made by Dolby.

                        HDR10                      Dolby Vision
Bit depth               Good                       Great
Peak brightness         Great                      Great
Tone mapping            Varies per manufacturer    Better
Metadata                Static, for now            Dynamic
TV support              Great                      Limited
Content availability    Average, but growing fast  Limited

Bit Depth


HDR10

  • 10 bit
  • 1.07 billion colors

Dolby Vision

  • 12 bit
  • 68.7 billion colors

Bit depth describes the number of gradations of color in an image. SDR content is typically mastered in 8-bit, which allows for 16.7 million colors; HDR is changing that. For more information, have a look at our article on gradients.

Dolby Vision content allows for up to 12 bit color; HDR10 is limited to 10 bit. That might not sound like much, but the difference of 2 bits here is the difference between 1.07 billion colors and 68.7 billion. This means much smoother gradations between colors and far less visible banding in areas like skies.

12 bit is simply better than 10. That said, don't think a higher bit depth makes content more colorful. Its importance lies in displaying different tones of the same color in a gradient: the higher the bit depth, the smoother the gradient will be.
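The color counts above follow directly from the bit depth: each of the three subpixels (red, green, blue) gets 2^bits levels, so the total number of colors is 2^(3 × bits). A quick sketch:

```python
# Total displayable colors for a given per-channel bit depth:
# each of the three subpixels (R, G, B) can take 2**bits levels.
def total_colors(bits: int) -> int:
    return (2 ** bits) ** 3  # equivalently 2 ** (3 * bits)

for bits in (8, 10, 12):
    print(f"{bits} bit: {total_colors(bits):,} colors")
# 8 bit:  16,777,216     (~16.7 million, typical SDR)
# 10 bit: 1,073,741,824  (~1.07 billion, HDR10)
# 12 bit: 68,719,476,736 (~68.7 billion, Dolby Vision)
```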

Winner: Dolby Vision. However, even if Dolby Vision is capable of 12 bit, today's TV panels are a maximum of 10 bit. You would be hard-pressed to see a difference in current TVs. HDR10 will probably be updated to 12 bit by the time a TV that supports it appears.

Peak Brightness


HDR10

  • Mastered from 1000 to 4000 cd/m2

Dolby Vision

  • Always mastered at 4000 cd/m2

Brightness and black level matter most to image quality because the contrast between them is the biggest contributor to a high-quality picture.

DV content is mastered at 4000 cd/m2; HDR10 content is mastered at a variety of levels from 1000 to 4000 cd/m2 depending on the title.

Both standards allow for images of up to 10,000 cd/m2, although no display can currently reach that level. In practice, there is little difference between the formats here, since content for both is currently mastered at no more than 4000 cd/m2.
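Both formats are built on the SMPTE ST 2084 "PQ" transfer function, which is where the 10,000 cd/m2 ceiling comes from. As a sketch, the PQ EOTF (the curve that turns a normalized signal value back into absolute luminance), using the constants from the spec:

```python
# SMPTE ST 2084 (PQ) EOTF: maps a normalized signal in [0, 1]
# to absolute luminance in cd/m2, topping out at 10,000 cd/m2.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(signal: float) -> float:
    p = signal ** (1 / M2)
    num = max(p - C1, 0.0)
    den = C2 - C3 * p
    return 10000.0 * (num / den) ** (1 / M1)

print(pq_eotf(0.0))  # 0.0 cd/m2 (black)
print(pq_eotf(1.0))  # 10000.0 cd/m2 (the ceiling both formats share)
```

Note how the curve allocates most of its code values to darker tones; a signal value of 0.5 lands at under 100 cd/m2, not 5000.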

Winner: Dolby Vision. It wins by a small margin: Dolby Vision content will have more consistent mastering since HDR10 isn't as specific in its requirements. Keep in mind, though, that few 2016 TVs even go above 1000 cd/m2, so this doesn't matter much for now.

Tone Mapping


HDR10

  • Tones that extend past the TV's range are mapped using the PQ transfer function.

Dolby Vision

  • Tones that extend past the TV's range are mapped by a Dolby chip using the PQ transfer function.

How a TV with relatively low peak brightness deals with a film that was mastered on a much brighter display is crucial. If your TV has a maximum brightness of 1400 cd/m2, how does it handle the highlights of a film mastered at 4000 cd/m2?

The easiest way is clipping. In our example of a 1400 cd/m2 TV, everything from 1400 to 4000 cd/m2 would be clipped: no detail would be visible in that range of brightness, and there would be no discernible colors in that region, simply because the TV cannot reproduce anything above its maximum output. At this point in time, some manufacturers clip highlights that exceed their TVs' maximum brightness.

The alternative is tone mapping. On a 1400 cd/m2 TV, the highlights from 1400 to 4000 are remapped to fall below 1400 cd/m2. In practice, this means that there is some gentle roll off of color in the highlights starting around 1000 cd/m2.

This would mean that a TV that uses tone mapping would appear slightly dimmer than the same TV which employs clipping. While this is inevitable, a tone-mapped picture would show a lot more detail in the highlights than one which is clipped.
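As a toy sketch of the two strategies (not any manufacturer's actual algorithm) for our hypothetical 1400 cd/m2 TV, with an assumed knee point of 1000 cd/m2 where the roll-off starts:

```python
TV_PEAK = 1400.0       # display's maximum luminance, cd/m2
CONTENT_PEAK = 4000.0  # mastering peak of the film, cd/m2
KNEE = 1000.0          # where the roll-off starts (assumption for illustration)

def clip(luminance: float) -> float:
    """Everything above the TV's peak is crushed to the same value:
    detail between 1400 and 4000 cd/m2 is lost."""
    return min(luminance, TV_PEAK)

def tone_map(luminance: float) -> float:
    """Gentle roll-off: highlights above the knee are compressed so the
    whole 0-4000 range fits under the TV's peak, preserving detail."""
    if luminance <= KNEE:
        return luminance
    # Linearly compress [KNEE, CONTENT_PEAK] into [KNEE, TV_PEAK].
    ratio = (luminance - KNEE) / (CONTENT_PEAK - KNEE)
    return KNEE + ratio * (TV_PEAK - KNEE)

for nits in (500, 1400, 2500, 4000):
    print(nits, clip(nits), tone_map(nits))
```

Clipping maps both a 2500 cd/m2 and a 4000 cd/m2 highlight to the same 1400 cd/m2, losing the distinction between them; tone mapping keeps them apart (1200 vs 1400 cd/m2) at the cost of slightly dimmer highlights overall.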

With DV, a Dolby chip checks the TV's model and applies tone mapping using the TV's limitations as a reference. With HDR10, tone mapping is entirely the manufacturer's choice, which can lead to inconsistency.

Winner: Dolby Vision.


Metadata

Metadata is used to describe various facets of the content; it travels alongside the series or film and helps the display handle the content in the most effective way.

The two formats differ in their use of dynamic metadata. HDR10 only requires static metadata, while Dolby Vision's dynamic metadata can provide information on a frame-by-frame basis. What does this change? With static metadata, the brightness boundaries are set once for the entire movie.
For example: if you set the boundary at 0 to 1000 cd/m2, then in a very dark scene where no color reaches above 50 cd/m2, only 5% of the bit depth is effectively available, since the 1.07 billion colors are spread over the full 1000 cd/m2 range.

With dynamic metadata, the boundary can be adapted to the scene. In the same scenario, the dark scene would have the full 10 bit distributed in the much smaller amount it needs. To put it into perspective, the HDR10 scene would use just over 50M colors, which while being much better than SDR, is a lot less than the 1.07B the Dolby Vision would have.
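The arithmetic in the example above can be sketched as follows; like the example, it assumes code values are spread linearly across the luminance range (ignoring the PQ curve) to keep the illustration simple:

```python
# Simplified model from the example: a 10 bit signal (2**30 colors)
# spread over a 0-1000 cd/m2 static range vs. refit to the scene.
TOTAL_COLORS = 2 ** 30   # ~1.07 billion colors at 10 bit
STATIC_RANGE = 1000.0    # static-metadata boundary, cd/m2
SCENE_PEAK = 50.0        # brightest pixel in the dark scene, cd/m2

fraction_used = SCENE_PEAK / STATIC_RANGE          # 5% of the range
colors_static = int(TOTAL_COLORS * fraction_used)  # just over 50 million
colors_dynamic = TOTAL_COLORS                      # full 1.07 billion

print(f"static: {colors_static:,} vs dynamic: {colors_dynamic:,}")
```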

Winner: Dolby Vision. It's better at adapting to scenes that have very different lighting. This is probably short-lived, though, since TV manufacturers will presumably add their own dynamic metadata very soon.

Available Content & Playback Hardware

  HDR10 DV
UHD Blu-Ray Yes Planned for 2017
Netflix Yes Yes
Amazon Video Yes Yes
Vudu Yes Yes
PS4/PS4 Pro Yes No
Xbox One S Yes No
Samsung UBD-K8500 Yes No
Panasonic DMP-UB900 Yes No
Philips BDP-7501 Yes No
Nvidia Shield Yes No
Chromecast Ultra Yes Yes
Nvidia GTX 900 Series and up Yes Yes
AMD Radeon RX and up Yes No

Dolby Vision content is only available via streaming services at the time of writing; both Netflix and Amazon have several films and series encoded with DV, and Vudu has a limited amount of content for Vizio owners. To watch Dolby Vision content you need a TV that supports it; the Chromecast Ultra is currently the only external device that can output the format, and even then it won't help if the TV itself doesn't support Dolby Vision.

HDR10, on the other hand, is supported by virtually every platform that supports Dolby Vision, and a slew of Blu-ray discs and players is readily available. Find out where to find HDR content here.

Winner: HDR10.

Supported TVs

In the US, only a handful of TVs from Vizio and LG support Dolby Vision. All Dolby Vision TVs also support HDR10, as do many TVs from the other major manufacturers.

You shouldn't expect cheaper HDR TVs to make use of all the extra capabilities of these formats; on most of them, you won't even be able to see a difference. Only high-end TVs can take full advantage of them.

Winner: HDR10.

Further Developments

Samsung has shown an enhancement to HDR10 that adds dynamic metadata through a firmware upgrade. This development addresses one of the fundamental differences between DV and HDR10, although it is not currently implemented in any TVs.

There is also HLG, or hybrid log gamma, developed by the BBC and NHK for live broadcasts. LG has shown an E6 OLED TV running custom firmware that can decode an HLG stream. Since HLG doesn't use metadata, every TV that supports HDR today should be able to support it via a simple software update.


Conclusion

Dolby Vision can be considered the more advanced HDR format, but the lack of content and supported TVs is holding it back at the moment. HDR10 has the distinct advantage of more available content and support on TVs with higher peak brightness, effectively giving a better result in the end.

Ultimately, the difference between the two formats isn't that important. The quality of the TV itself has a much bigger impact on HDR (see our recommendations for the best HDR TVs). These are still early days for HDR: both formats can produce much more dynamic images than we are seeing on the best TVs today, limited by both the TV technology and the way titles are mastered. We can't yet reach the 10,000 cd/m2 maximum peak brightness or the expanded 12 bit color range.



Questions & Answers

I would love to grab a Sony x900E however I am afraid I will regret not grabbing a TV with Dolby Vision. Is this something I should be worried about? If I am watching something that is Dolby Vision will it still take advantage of HDR capabilities?
We still don't consider Dolby Vision to offer a significant advantage over the standard HDR10 presentation. Yes, a Dolby Vision source will simply fall back to a standard HDR10 signal if connected to a non-compatible TV.
Thanks for all the reviews! I have been considering the Samsung KS8000 or the Vizio P. I understand that the KS8000 is a better TV for the same price however I am really concerned that it does not support Dolby Vision (I don't want a useless TV in 2 years). I have heard different stories, mostly that the Samsung KS8000 actually has the chip for Dolby Vision and can be supported via a firmware update. Is this true? And what would you recommend considering this?
Both HDR10 and Dolby Vision currently exceed the capabilities of today's TVs, so you won't get worse picture quality by using HDR10. For future proofing, it's extremely unlikely that any content will be made for Dolby Vision but not HDR10, so you should be able to play all HDR content for the foreseeable future.

The KS8000 is better than the Vizio P in a bright room due to its higher peak brightness and amazing handling of reflections, but the Vizio P is better in a dark room due to its great local dimming. It is also better for gaming due to its low input lag and it can be better as a PC monitor because it can receive a 120 Hz input.

So a 10 bit panel using HDR10 will only get you 1.07 billion colors, but a 10 bit panel using Dolby Vision will not get you more colors? I can understand not getting 68.7 billion colors because it is not 12 bit, but I don't understand how it could not be more colors on 10 bit with Dolby Vision. Especially if it is a software update.
A native 10 bit panel can only produce 1.07 billion colors (2^(3*10)), no matter the hardware or software running it. Each subpixel can only become one of 2^10=1024 different shades. Actually many TVs have 8 bit panels but use techniques like dithering and FRC to give the appearance of a 10 bit panel, and are virtually indistinguishable from native 10 bit panels. No TV we've ever tested has been able to display more than 10 bit color depth, though theoretically a native 10 bit panel with FRC could display 12 bit color depth.
I have the EF9500 OLED. If I purchase an external streaming device that supports Dolby Vision, will the TV recognize the format?
No. A specific piece of hardware is needed in the TV to decode the Dolby Vision format.
I am just starting to wonder if there is any value in considering HDR10 vs Dolby Vision - TODAY, Nov 2017. I wonder because if the TV sets are unable to display the differences, why is there a debate going on? Does this mean that Dolby Vision is "that much better than HDR10" because of its better metadata handling? Or will someone have to be used to seeing the picture differences between two comparable $50,000 projector systems in order to distinguish the better one? It is starting to seem like logo separation: my device has a Dolby HDR logo on it and yours doesn't, but the pictures look the same to most viewers. Until higher bit panels become available, is the discussion of HDR(x) vs HDR(y) more a tech talk that includes numbers, or is there a viewable difference? Thanks.
As the new HDMI interface (2.1) includes support for 12 bit HDR and per-frame metadata, the differences between the two technologies are insignificant. Currently, we don't recommend spending more on a TV over your main choice only for Dolby Vision, as it doesn't offer an important advantage in picture quality (most differences tend to be due to mastering and not the actual performance).
Novice here - are HDR and HDR10 the same? And do I need an AV receiver that says it supports HDR10 or that it simply supports HDR?
There are a few different HDR video formats. HDR10 is the most popular format, but there is also Dolby Vision, HLG and a few others. To be qualified as "HDR Compatible" a device needs to support at least HDR10.
If a TV supports Dolby Vision, does it by default also support HDR10? In other words, is there a TV that only supports Dolby Vision but not HDR10?
It's technically possible for a TV to support Dolby Vision but not HDR10, however it's unlikely any TV on the market does this. After a manufacturer has put in the work to support Dolby Vision, adding HDR10 support isn't much more work.
You should be able to update your supported TVs table as the top 4 Sony TV models are getting an update to support Dolby Vision sometime later this month.
Thank you for contacting us. We still have all of our 2017 TVs and do continue to update the reviews with firmware updates. Once the Dolby Vision update for Sony comes out we will test it and update the reviews, however we only post things that we have tested ourselves - for example if the update is significantly delayed then we don't want to mislead readers.
Why is 4k in quotation marks when referring to the LG TV sets?
Many low end LG TVs use RGBW panels, where only half the pixels can show color information (they have RGB subpixels) while the other pixels can only show brightness information (they only have a white subpixel). This means that color detail won't be as sharp on these TVs as on other 4k TVs, but this isn't too noticeable in most cases.
Do you know if the LG EF9500 incorporates tone mapping or clipping for hdr10?
The LG EF9500 uses tone mapping up to a certain point, but after that it clips. Nearly all TVs have this behaviour.
Your HDR10 vs DV comparison shows that Nvidia cards support HDR10 but not Dolby Vision, but with a recent Driver update Nvidia does indeed support Dolby Vision! Source (Anandtech)
Thank you for the heads up! The article has been updated.
Hey, I just ordered an LG OLED 65 inch and I'm just wondering what are the maximum peak brightness levels of HDR or Dolby Vision content currently available on Netflix and Amazon Video? I understand OLEDs are known for their dimmer peak brightness but I'd imagine this shouldn't be an issue since I (and most people) only really watch TV or game at night, but if the LG OLED65E6P only has a peak brightness of around 650 cd/m2, then how bright do the brightest scenes in Netflix and Amazon Video even get?
Most HDR content is mastered for either 1000 cd/m2 or 4000 cd/m2 peak brightness, so the 2016 OLEDs will not be able to get bright enough for the really bright highlights in HDR content. This is an issue even in dark rooms because HDR content is intended to be shown at maximum brightness on the TV, even in a dark room (though you can turn it down if you find it too bright).

However in practice the 2016 OLEDs are some of the best TVs for producing bright highlights in HDR content. Our HDR Real Scene test is a good measure of how well a TV will be able to brighten highlights in HDR, and only one 2016 TV scored better than the OLEDs, the Sony X930D. Few TVs can beat the OLEDs because even though many LED TVs advertise very high peak brightness, they can only reach that brightness in very ideal cases, such as our 2% and 10% white window tests, and not when watching most HDR content.

If I am correct Dolby Vision will not be up to it full potential until we get 12bit panels and HDMI 2.1(since current HDMI 2.0b standards do not support dynamic metadata) Please someone correct me and explain if I am wrong. I am trying to learn.
12 bit screens will give a slight advantage to Dolby Vision when they do appear, but according to Dolby's own research, 10 bits is sufficient to cover most use cases without visible banding. We expect that open HDR formats will be updated to 12 bit when TVs that support it appear. Samsung has also released HDR10+, which implements dynamic metadata through current HDMI connections, so it is not necessary to wait for HDMI 2.1 equipped devices to make use of HDR dynamic metadata.
Is there any way to tell which HDR format your TV is using? For example, I have an LG OLEDB7A, it says HDR when viewing videos, but how do I know if it's Dolby Vision or HDR10?
Hi and thanks for contacting us. When watching an HDR movie (or any other HDR content), just check in which picture mode setting the TV is on. If it is in Dolby Vision HDR, the TV will list the available picture modes under 'Dolby Vision Picture Mode'. If it is in HDR10 HDR, it will simply list the available picture modes under 'HDR Picture Mode'.
The 2016 OLED B6P are not 12 bit panels?
Very unlikely, as 12 bit panels (or 10 bit panels with FRC or dithering) are usually only found in professional monitors. And though the B6's gradient performance was very good for a TV, it wasn't as good as most 10 bit monitors like the LG 27UD68P-B, which has a nearly perfect gradient score.