What Is HDR?

A TV marked "HDR" displaying a beach sunset in High Dynamic Range.
Credit: Samsung


HDR stands for High Dynamic Range, and it’s one of the biggest improvements to video quality since the jump from standard definition (SD) to high definition (HD). Some even call it the best thing since sliced bread, but is it?

Nowadays, you'll find HDR in just about all of the best TVs, including some entry-level screens that won't cost you an arm and a leg. Paired with supported content, HDR games, movies, and TV series boast a deeper contrast with highlights that pop and colours that can even rival OLED displays.

There are quite a few different types of HDR, however, and there are some things you should know before putting a TV in your basket. Before we delve into the many varieties of HDR, let’s start by understanding dynamic range. What is it, and why does it matter?

Dynamic range

The dynamic range of an image on a television (TV) is the difference between its brightest and darkest points. The term often comes up in camera settings when video or still images are captured during production, and it is usually measured as a contrast ratio once the footage is gathered.
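Expressing dynamic range as a contrast ratio is simple division. Here's a quick Python sketch with purely illustrative numbers, not measurements of any particular TV:

```python
def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    """Dynamic range as a contrast ratio: peak white divided by deepest black."""
    return peak_nits / black_nits

# Illustrative figures: a typical SDR LCD might manage ~300 nits peak
# and ~0.3 nits black, while an HDR panel pushing 1,000 nits with
# 0.05-nit blacks achieves a far wider range.
print(f"SDR example: {contrast_ratio(300, 0.3):.0f}:1")     # 1000:1
print(f"HDR example: {contrast_ratio(1000, 0.05):.0f}:1")   # 20000:1
```

The wider that ratio, the more headroom the display has between shadow detail and bright highlights.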

Two types of dynamic range matter here: SDR (standard dynamic range) and our subject of the moment, HDR (high dynamic range).

HDR

Image of a brown mountain top shown at three different brightness levels.
Credit: Adobe

HDR has become the big buzzword around 4K TVs. The technology improves the brightness, contrast, and colour accuracy of video and still images.

The most attractive aspect is the technology's internal contrast: the ratio of white to black. The darker parts of an image look incredibly deep, while the lighter regions appear brilliantly lit, all without losing detail at either extreme.

But what is the relationship between HDR and colour gamut? Let’s find out.

HDR vs Colour gamut

While HDR deals with how much light a TV display is designed to put out, or luminance, the colour gamut is the range of colours a display can reproduce. But that’s not the whole story.

A monitor creates colour by mixing its three primary hues: red, green, and blue. The range of colours it can reproduce this way is called its colour gamut.

Pretty much any shade can be made by combining these foundational colours; however, monitors cannot produce colours outside their gamut.

The Rec.709 colour gamut, an international standard for HDTV formats, is typically the only one that conventional SDR displays can support.

Modern HDR displays push past that limit, expanding the gamut triangle on the CIE chromaticity chart to cover a wider range of the colours human eyes can see, using specialised panels that support the broader DCI-P3 colour gamut.
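The size difference between those gamut triangles can be worked out from each standard's published CIE xy primary coordinates. A short Python sketch using the shoelace area formula:

```python
def triangle_area(primaries):
    """Shoelace formula for the area of a gamut triangle on the CIE 1931 xy chart."""
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# Published CIE xy coordinates of each standard's red, green, and blue primaries.
REC_709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
DCI_P3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]

growth = triangle_area(DCI_P3) / triangle_area(REC_709) - 1
print(f"DCI-P3's triangle is about {growth:.0%} larger than Rec.709's")
```

In these xy terms, DCI-P3's triangle works out to roughly a third larger than Rec.709's, which is why P3-capable panels can show richer reds and greens.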

Image of a bird eating from a pink flower shown in 4K and 4K HDR.
Credit: Apple

Types of HDR

There are different types of HDR, each with its technical specifications and capabilities. The common ones are HDR10, Dolby Vision, and HLG (Hybrid Log-Gamma).

HDR10

HDR10 is the baseline format, compatible with all HDR-capable TVs. As the first-born of HDR technology, it offers a massive improvement over SDR. The HDR10 format supports a colour depth of 10 bits and a peak brightness of 1,000 nits (a unit of brightness).

Those figures mean little on their own, but in context, HDR10 allows an image to be over twice as bright as SDR, with a corresponding increase in contrast (the difference between the darkest blacks and the lightest whites) and a colour palette of over one billion shades, versus SDR's comparatively meagre 16 million.

As with all HDR formats, how well HDR10 comes across depends on the quality of the TV you watch it on. When implemented properly, HDR10 produces very high-quality video, but there are now bigger HDR fish to fry: HDR10 is no longer at the top of the food chain.
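Those shade counts follow directly from bit depth: each channel has 2 to the power of its bits in levels, and the three channels multiply together. A quick check in Python:

```python
def total_shades(bits_per_channel: int) -> int:
    """Total displayable shades: levels per channel, cubed across R, G, and B."""
    return (2 ** bits_per_channel) ** 3

print(f"8-bit SDR:  {total_shades(8):,} shades")    # 16,777,216 (~16.8 million)
print(f"10-bit HDR: {total_shades(10):,} shades")   # 1,073,741,824 (~1.07 billion)
```

So the jump from 8-bit to 10-bit colour multiplies the palette by 64, which is where the "billion shades" figure comes from.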

Dolby Vision

Dolby Vision takes things a notch higher. It’s a variation of HDR designed to preserve far more information from the moment video is created (say, at a Hollywood studio) until it reaches your TV or mobile device.

This data, known as metadata, contains brightness information for each frame of a movie or TV episode so that the TV (or phone or tablet) will know precisely how to display the picture throughout the entire production.

Because this data is present for every frame, it is called dynamic metadata, whereas the static metadata in standard HDR10 carries only a single data point for the whole programme. Simply put, Dolby Vision is an HDR standard that uses dynamic metadata, and the result is a noticeable step up in image quality.
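The static-versus-dynamic distinction can be pictured with two tiny data structures. These are hypothetical, heavily simplified shapes for illustration only; real HDR10 and Dolby Vision metadata carries many more fields than shown here:

```python
from dataclasses import dataclass


@dataclass
class StaticMetadata:
    """HDR10-style: one brightness target for the entire programme."""
    max_luminance_nits: float


@dataclass
class SceneMetadata:
    """Dolby Vision-style: a separate brightness target per scene or frame."""
    scene_id: int
    max_luminance_nits: float


# A dim night scene and a bright sunset each get their own target,
# instead of sharing one compromise value for the whole film.
static = StaticMetadata(max_luminance_nits=1000.0)
dynamic = [SceneMetadata(0, 120.0), SceneMetadata(1, 950.0)]
```

With the static record, the TV must tone-map every scene against the same 1,000-nit assumption; with the per-scene records, it can render the night scene and the sunset each on their own terms.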

HLG

The British Broadcasting Corporation (BBC) and the Japan Broadcasting Corporation (NHK) jointly created the HDR standard known as HLG (hybrid log-gamma). It is designed to bring HDR to broadcast TV signals without significantly increasing bandwidth requirements, keeping the broadcast data relatively simple.

It drastically reduces the cost and complexity of the broadcast signal, since it lets broadcasters transmit a single wide-dynamic-range signal that both HDR and SDR televisions can display. Unlike more specialised standards such as Dolby Vision, HLG is royalty-free, and unlike other HDR standards, it doesn't rely on metadata to tell the TV how to show HDR material.

HDR10+

Samsung is the creative force behind this type of HDR. It adds dynamic metadata, similar to Dolby Vision's, to HDR10 (that’s really what the “+” means). While it doesn't employ information customised for each individual screen, it still adjusts the range of light it instructs the TV to display for each scene or frame.

Like HDR10, it’s an open standard, though with a highly particular production workflow, and it can bring out more detail in your image than HDR10 alone.

HDR compatibility

HDR compatibility is an important aspect to consider when purchasing a device. Not all displays or content platforms support HDR, so ensuring that your display and the content you want to watch are HDR-compatible is crucial.

Many streaming services, gaming consoles, and Blu-Ray players now offer HDR support, enabling you to enjoy a wide range of HDR content. Remember that not all HDR formats are compatible with every device, so checking the specifications before purchasing is advisable.

HDR vs OLED

Flatscreen LG TV featuring a green and blue pattern on the display mounted to a grey wall.
Credit: LG

OLED TVs are not as bright as LCD TVs, and the best and brightest LED-backlit LCDs can also look more colourful than their OLED counterparts.

That is one factor limiting OLED TVs when displaying HDR (High Dynamic Range) content, which requires high brightness to properly render its expanded colour gamut and contrast ratio, both of which are needed for the vibrant colours and dynamic picture quality HDR can provide.

Most HDR video is graded at 1,000 nits of brightness, so a screen needs to reach around 1,000 nits for the material to display as intended. The LG G1, launched in 2021, showcased LG's OLED evo technology, which the company says makes it 20% brighter than earlier OLED panels.

Independent testing puts the G1 at about 870 nits of sustained brightness in its Vivid picture mode, more than the LG GX, which managed only 754 nits.

So, even though it's a lot brighter than most OLED panels, it still falls short of that baseline HDR brightness target. The result is that, compared with the best LED-backlit LCD TVs, OLED screens may lack detail in brighter scenes.

An optional tone-mapping mechanism on some OLED TVs can restore most or all of the lost detail, though at the cost of overall brightness and vibrancy. None of this means OLED displays cannot deliver HDR content, or that OLED technology is inherently inferior to LCD. In features like per-pixel lighting, viewing angles, and response time, OLED holds a significant advantage, and its HDR picture remains impressive despite the brightness ceiling.

Is HDR worth it?

Let’s spare you further exhaustive technical detail. Still, HDR is one of the most crucial characteristics to consider when purchasing a new TV, now that 4K sets dominate the market.

Although it's still not universal, HDR10 and Dolby Vision have shown compelling contrast and colour improvements over standard dynamic range, and there's a ton of material supporting both. If you want to upgrade to 4K and have the extra money, HDR will buy you a genuinely upgraded entertainment experience.
