Shopping for a new TV has never been easy, but it seems like it’s only gotten more complicated lately.
As 4K TVs have become more common, so has a feature called HDR.
High Dynamic Range (HDR) offers brighter highlights, deeper shadows, a wider range of color detail, and a crisper overall picture. Almost all mid- and large-size televisions come with 4K HDR, and streaming services like Netflix are releasing more HDR TV shows and movies.
But how does HDR work? With several HDR formats to choose from, which is the right one for you? And is it worth it to buy a TV with HDR, or should you go with something else?
Back when I worked in one of those big-box electronics stores, I loved explaining new technology to people so they could make an informed decision about which product was best for them. So, in this article, we’re going to answer all of those questions, and a few others too.
Don’t worry…I’ll keep the technical jargon to a minimum.
Standard Dynamic Range (SDR) vs. High Dynamic Range (HDR)
Dynamic range is the ratio between the brightness of the lightest part of an image and the darkest.
So if you look at the image below, the darkest point in the photo is the area of trees next to the mountains, highlighted by the yellow circle. The brightest area of the photo is the snow-covered mountains highlighted by the red circle.
You’ve probably seen photos where the shadows are one big, black blob. Or photos of your friend’s ski trip where the light reflecting off the snow is so bright that it looks completely white, with no detail.
In both cases, the photos have areas that are outside the dynamic range of the camera.
Simply put, High Dynamic Range (HDR) increases the dynamic range of the camera so you get deeper darks and brighter lights, all while keeping the small details that Standard Dynamic Range (SDR) can hide.
The HDR standards were proposed by the International Telecommunication Union (ITU) back in 2016 in a recommendation titled Rec. 2100 (formally ITU-R BT.2100). These recommendations became the basis for HDR, HDR10, HDR10+ and HLG (Hybrid Log-Gamma).
What Does HDR Mean? (Simple Terminology)
HDR vs. Contrast Ratio
You’re probably thinking that explanation of Dynamic Range sounds a lot like a TV’s Contrast Ratio.
And you’d be right.
Contrast ratios are expressed as a ratio of some number to 1. Let’s take an example of a TV with a contrast ratio of 10,000:1.
That means that the brightness level of a completely white image is 10,000 times greater than the brightness level of a completely black image.
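The arithmetic behind that ratio is easy to check. Here’s a minimal Python sketch; the brightness values are made up for illustration, not taken from any real TV:

```python
# Contrast ratio: peak white luminance divided by black luminance.
# Example values in nits (cd/m^2) are illustrative only.
peak_white = 500.0   # brightness of a full-white screen, in nits
black_level = 0.05   # brightness of a full-black screen, in nits

contrast_ratio = peak_white / black_level
print(f"Contrast ratio: {contrast_ratio:,.0f}:1")  # Contrast ratio: 10,000:1
```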
While contrast ratio uses a linear scale, dynamic range is measured in stops, which use a logarithmic scale.
You’ve probably heard of F-stops when taking photos, and the two concepts are related.
F-Stops and T-Stops
Basically, a low F-stop number means the camera’s aperture opens wider to let more light in. That can cause details in bright areas to appear washed out. By contrast, a higher F-stop results in a smaller aperture opening and less light reaching the sensor.
T-stops normally track very close to F-stops, because they measure how much light actually gets through the lens to the sensor.
That can be a little confusing, so let’s make it clearer.
What really helped me understand F-stops was this amazing info-graphic from Click and Learn Photography.
However, unlike contrast ratios, the math for stops is a bit more complicated because stops are logarithmic: each stop represents a doubling of light.
Let’s take a simple example of a single photo taken on your smartphone, and assume the camera can capture about six stops of dynamic range.
That means that this phone camera can accurately capture an image with bright spots 64 times brighter than the darkest shadows.
That’s because each stop doubles the amount of light, so six stops is two raised to the sixth power: 2^6 = 64.
The moral of the math is that every extra stop doubles the brightness range the camera can handle. The more stops, the darker an image can get while still keeping the level of detail that you want.
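To make that powers-of-two relationship concrete, here’s a small Python sketch (the function names are just illustrative):

```python
import math

# Dynamic range in stops is logarithmic: each extra stop doubles
# the brightness ratio a camera can capture.

def stops_to_ratio(stops: float) -> float:
    """Brightest-to-darkest ratio covered by a given number of stops."""
    return 2.0 ** stops

def ratio_to_stops(ratio: float) -> float:
    """Inverse: how many stops a given brightness ratio spans."""
    return math.log2(ratio)

print(stops_to_ratio(6))   # 64.0 -- six stops is 2^6
print(ratio_to_stops(64))  # 6.0
```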
Having great contrast in an image is one thing, but what you probably notice right away about an HDR image is the bright, vibrant colors.
That’s because High Dynamic Range images have a wider color gamut than Standard Dynamic Range images.
Color gamut is simply the range of colors or visible “color space” the television can produce.
The triangle shows the peak colors available in the red, green and blue (RGB) spectrum. To put it simply, the bigger the triangle, the more individual colors available.
This is why HDR images look so vivid, especially when displaying greens and yellows. There’s a greater depth of color available, and our eyes tend to pick out yellows and greens more easily than other colors.
Now that you’ve got a basic idea of the terminology, let’s talk about how HDR actually works.
How Does HDR Work?
We’ve seen that High Dynamic Range helps you get those vivid images where the colors jump out and even the brightest and darkest areas still have details.
But did you know that it does this by taking multiple images and combining them into one final product?
Using a simple example, an HDR camera will take two images of the same shot. One is slightly overexposed, so it can bring out the bright colors and little details, like the photo below on the left. Another is slightly underexposed to really bring out the depths of the shadows, like the photo in the middle.
Finally, these images are combined, using the best elements of each to get something that is more representative of what our human eye sees, like the photo on the right.
In practice, the camera records an image multiple times in rapid succession, slightly adjusting the exposure level for each image captured. Then, software techniques blend them all into one clear and crisp image.
The software borrows the best details from each picture to create the most realistic range possible.
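As a rough illustration of that blending step, here’s a toy Python sketch that averages bracketed exposures, weighting each pixel toward whichever frame is best exposed. Real cameras use far more sophisticated alignment and tone mapping, so treat this as a conceptual demo only:

```python
import numpy as np

def fuse_exposures(frames: list) -> np.ndarray:
    """Toy exposure fusion: frames are same-size grayscale images in 0.0-1.0."""
    stack = np.stack(frames)                   # shape (n_frames, h, w)
    # Weight each pixel by how close it is to mid-gray (0.5), so clipped
    # highlights and crushed shadows contribute less to the result:
    weights = 1.0 - np.abs(stack - 0.5) * 2.0  # 1.0 at mid-gray, 0.0 at clip
    weights = np.clip(weights, 1e-6, None)     # avoid divide-by-zero
    return (stack * weights).sum(0) / weights.sum(0)

# Underexposed, normal, and overexposed versions of the same 2x2 scene:
under = np.array([[0.05, 0.10], [0.02, 0.40]])
mid   = np.array([[0.20, 0.40], [0.08, 0.95]])
over  = np.array([[0.60, 0.90], [0.30, 1.00]])
print(fuse_exposures([under, mid, over]).round(2))
```

Each fused pixel lands between the darkest and brightest capture of that spot, leaning toward the frame where it was best exposed.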
Challenges with HDR Images
Because an HDR camera takes multiple images, there are some potential challenges.
Any sort of movement can be a huge issue for HDR cameras. Because the images are captured one after another, anything that moves between frames, whether you’re capturing 3 images or 100, will end up in a slightly different position in each shot and ruin the combined image.
Pictures can become blurred, have lagging movement, or worse, a pixelated effect.
Additionally, the colors in HDR images can sometimes be too vivid. Sometimes you actually want the image to appear duller, and an image that is already colorful or vivid may end up looking unrealistic and uncomfortably neon.
Color gamut and contrast play a big role in how vibrant the image will look.
The same happens with interesting textures or contrasts. HDR, by nature, will sometimes reduce or eliminate textures and contrasts because it’s combining several images.
HDR can often look unnatural, even surreal, but those vivid details are frequently exactly what we want when taking artful photographs.
Alright…let’s bring this back to televisions.
So, Why Purchase An HDR TV?
If HDR photography has all these challenges, then why bother buying an HDR TV at all?
Well, here is the tricky part.
High Dynamic Range started becoming really popular when 4K televisions first hit the market. As manufacturers started releasing their new high-end TVs, adding HDR to them seemed like a natural choice.
That’s actually the reason why you don’t see any 1080p HDR televisions.
Instead of trying to promote both 1080p and 4K televisions, it was easier for manufacturers to link 4K and HDR to make it more enticing for customers to upgrade.
Additionally, more and more streaming services like Netflix and Amazon are pushing more HDR content.
Unlike 4K or 8K resolutions, you don’t need lightning-fast Internet service to see all those bright, vivid colors. You just need HDR content and the TV to display it.
That means it’s much easier to enjoy and consume HDR content than 4K, making it more accessible to the average consumer.
Finally, unlike proprietary formats such as Dolby Vision, where television manufacturers have to pay for the privilege of using the technology, basic HDR support is royalty-free. That’s why most televisions that are mid-size or bigger are already HDR TVs.
In short, if you’re shopping for a 4K TV, you’re not going to find one without HDR.
All you need to figure out is which HDR format to buy.
HDR Is HDR, Right?
Surprisingly, that’s not true.
As if you didn’t already have enough choice, we’ll break down the three main types of HDR, and what makes each one special.
HDR10
Chances are, if you buy an inexpensive HDR TV, HDR10 is the format that it’s using.
HDR10 is a free, open format supported by all 4K HDR TVs and UHD Blu-ray players. That also applies to all the HDR programming available on services like Netflix and Amazon.
In addition to the image, it sends data to your TV (called metadata) that lets your TV know what the brightest and darkest parts of the film are. This lets your TV fine-tune the image quality for each movie individually.
If you’re wondering what the ’10’ in HDR10 stands for, it refers to the 10-bit color depth that HDR10 supports.
Dolby Vision
You’ve probably heard of Dolby before. They’re one of the biggest names in home audio and home theater. Dolby Vision is a proprietary HDR standard found in many high-end 4K TVs. However, unlike HDR10, Dolby charges TV manufacturers a royalty fee for every television that includes it.
Dolby Vision takes HDR10 one step further by sending dynamic metadata to your TV. That means, instead of letting your TV know what the brightest and darkest parts of the entire movie are, it can do this for every scene, or even every frame!
Dolby Vision also supports up to 10,000 nits of brightness, which most TVs can’t actually produce, though it’s become more common on higher-end sets. It also supports 12-bit color depth: a whopping 4,096 shades of each primary color.
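Those bit-depth numbers are easy to verify: the shades per channel double with every extra bit. Here’s a quick Python check, treating SDR as the typical 8-bit baseline for comparison:

```python
# Each extra bit of color depth doubles the shades per primary color.
for bits, label in [(8, "8-bit (typical SDR)"),
                    (10, "10-bit (HDR10)"),
                    (12, "12-bit (Dolby Vision)")]:
    shades = 2 ** bits   # shades of each primary color
    total = shades ** 3  # combinations of red, green and blue
    print(f"{label}: {shades:,} shades per channel, {total:,} total colors")
```

That works out to 1,024 shades per channel for HDR10 and 4,096 for Dolby Vision.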
Most manufacturers, except for Samsung, support Dolby Vision on their higher-end televisions.
HDR10+
Samsung, to compete with Dolby Vision, created its own format: HDR10+.
Like Dolby Vision, it uses dynamic metadata, allowing TVs to fluctuate color and brightness levels with every scene or even every frame. Unlike HDR and HDR10, HDR10+ is a certification program that manufacturers have to sign on for. In addition to Samsung, it was founded by 20th Century Fox and Panasonic.
It supports up to 4,000 nits of brightness and 10-bit color depth, allowing for over 1,000 shades of each primary color. And yes, HDR10+ TV sets are backward compatible, working flawlessly with HDR10 content.
HDR Content is Different Too
Just because you watch your favorite movie on an HDR television doesn’t mean it will be an HDR image. You cannot take a movie that was not created for HDR display and expect HDR quality.
The question of which HDR format you need comes down to what content you want to watch, and where you’ll be getting it.
I still like purchasing physical media. You can usually find older titles on eBay or Amazon pretty inexpensively. Plus, it ensures that I always have access to my favorite movies without paying a monthly fee to a streaming service.
However, movies need to be in 4K in order to have HDR content.
That means, only 4K Ultra HD Blu-ray discs will play in HDR. Regular Blu-ray movies will not be HDR.
That said, all 4K/Ultra HD Blu-ray movies will have some version of HDR. So if you have a 4K Blu-ray player and an HDR TV, then it’s definitely worth investing in 4K Blu-ray movies.
According to Sound & Vision, the format used in most 4K Blu-ray discs is the basic HDR10. Dolby Vision is available on a growing number of titles, but HDR10+ is still trying to gain a foothold.
That means, if you like to purchase your own media, a 4K TV with HDR10 or Dolby Vision is what you should be considering.
If you prefer to stream your movies and TV shows, you’ll also find a growing library of HDR content across most of the major streaming services.
However, as you’d expect, the major players can’t seem to decide on a single standard.
- Netflix offers most of their content in HDR10, with a limited number of Dolby Vision titles as well. They do not currently support HDR10+.
- Amazon Prime Video is the opposite, offering most of their content in HDR10 and HDR10+. Only a handful of titles are available in Dolby Vision.
- Disney+ does not support HDR10+, but offers some content in HDR10 and Dolby Vision.
- YouTube, like Amazon, has signed on to offer its content in both HDR10 and HDR10+. It even allows creators to livestream in HDR.
Is HDR Worth It?
That’s a lot to throw at you, and you might be feeling a little overwhelmed.
So if you’re shopping for a new 4K HDR television, it’s worth spending a little time to make sure you get the right one.
Here are a couple of quick things to keep in mind when you’re shopping:
- Some form of HDR is already included with most new TVs, and every 4K or 8K television.
- Many TVs include more than one HDR format, giving you access to more HDR content.
- Manufacturers are able to push updates to their televisions to improve HDR quality and performance.
- Many streaming services, cable TV channels, movies, and physical Blu-ray discs include HDR content. However, the HDR format differs between each provider.
- If your TV is in HDR mode, it will automatically detect what type of HDR (HDR10, HDR10+ or Dolby Vision) is playing. You won’t need to change any settings except to put your TV in HDR mode.
With more and more HDR content being created every day, it’s definitely worth investing in a TV with the right HDR standard for how you watch.
Basically, it comes down to this: If you mainly watch your movies on 4K Blu-ray discs, Netflix, or Disney+, then choose a TV with the basic HDR10 or upgrade to one with Dolby Vision. But if you mainly watch YouTube or Amazon Prime Video, opt for a TV with HDR10+, so you can get the most out of that content.
Either way, the 4K TV you end up with will be the right TV for you, based on the HDR content you watch most.