HDR stands for High Dynamic Range. Although HDR can apply to many types of signals, it is mostly associated with imaging. HDR encodes a larger range of brightness and defers the final exposure calculation: the final image is computed with tone-mapping and then shown on either a standard dynamic range display or an HDR display.
In this article, we will focus on HDR for digital imaging, but the same principles apply to analog photography and image capture in general, although implementing HDR with digital devices is much easier.
HDR in Digital Cameras
Most image data containers (photo files, like JPG…) are encoded using color values that match the output of a monitor or a printer. Typically, each pixel's color value is represented as an integer per component (RGB for displays, CMYK for printers, etc.). These systems work well, but they don't have very high precision or range because of the limitations of the output device (monitor, paper print…).
When capturing HDR images, the emphasis is put on capturing the scene’s lighting as precisely as possible, rather than focusing on how it will be rendered later. While output devices are often limited to 256 values each of red, green, and blue, nature has much more amplitude than that. That amplitude is the “range” of color and brightness (luminance) that can be captured.
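To make that difference concrete, here is a minimal Python/NumPy sketch (the scene values are made up for illustration) contrasting a clipped 8-bit SDR-style encoding with a floating-point HDR-style encoding of the same scene luminances:

```python
import numpy as np

# Hypothetical scene luminances in nits: deep shadow, indoor wall,
# a desk lamp filament, and a glint of direct sunlight.
scene = np.array([0.5, 120.0, 4000.0, 1.0e6])

# SDR-style encoding: scale so ~100 nits lands near the top of the range,
# then clamp to 8-bit integers. Everything brighter is clipped to 255.
sdr = np.clip(scene / 100.0 * 255.0, 0, 255).astype(np.uint8)

# HDR-style encoding: store the measured luminance as floating point and
# defer the exposure/tone decision until display time.
hdr = scene.astype(np.float32)

print(sdr)  # [  1 255 255 255] -> the lamp and the sun become identical
print(hdr)  # full range preserved: 0.5, 120, 4000, 1000000
```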
For example, the sun could be a million times brighter than a desk lamp as seen from the camera. A non-HDR camera would either overexpose the image (washed out photo) or under-expose (dark photo). With HDR, it is possible to capture both bright and dark parts of a scene, then pick and choose which exposure and luminance to apply (to each pixel), to obtain a nicer photo of the scene.
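As a back-of-the-envelope calculation, a 1,000,000:1 brightness ratio corresponds to roughly 20 photographic "stops", since each stop is a doubling of light:

```python
import math

# A scene whose brightest spot is ~1,000,000x brighter than its darkest
# detail spans about 20 stops (each stop doubles the amount of light).
ratio = 1_000_000
print(f"{math.log2(ratio):.1f} stops")  # 19.9 stops
```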
HDR images can be captured in a couple of ways:
1/ Capture several standard dynamic range images using different exposures (see the sketch after this list)
Pros: works on most recent cameras
Cons: requires multiple shots, and moving subjects are more likely to appear blurry
2/ Capture a single image with an HDR sensor
Pros: capture is faster and less prone to subject motion
Cons: requires a more modern or more expensive sensor
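Here is a rough sketch of approach #1, exposure bracketing. It assumes the shots are already aligned and linear (no gamma), and it uses a simple "hat" weighting rather than recovering the camera response curve, so treat it as an illustration rather than a production merge:

```python
import numpy as np

def merge_exposures(images, exposure_times):
    """Merge aligned bracketed shots into one HDR radiance map.

    images: list of float arrays in [0, 1] (already linearized, no gamma)
    exposure_times: shutter time of each shot, in seconds
    Simplified sketch; real pipelines also align frames and recover
    the camera response curve.
    """
    num = np.zeros_like(images[0], dtype=np.float64)
    den = np.zeros_like(images[0], dtype=np.float64)
    for img, t in zip(images, exposure_times):
        # "Hat" weights: trust mid-tones, distrust clipped or noisy pixels.
        w = 1.0 - np.abs(img - 0.5) * 2.0
        num += w * (img / t)   # estimated scene radiance from this shot
        den += w
    return num / np.maximum(den, 1e-6)

# Usage: three hypothetical shots at 1/500s, 1/60s and 1/8s.
shots = [np.random.rand(4, 4) for _ in range(3)]
hdr = merge_exposures(shots, [1/500, 1/60, 1/8])
```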
In general, and from a consumer perspective, #2 is preferred since it leads to better results and lets people treat HDR photography just like normal photography. Those who are pickier and want to control the HDR conversion process can capture images in RAW format and use an application to tweak the image.
Most new smartphone cameras now have HDR integrated, but some require a setting to be activated. We usually recommend enabling it by default and letting the camera use it when needed. Only disable it when it is too slow, in which case the camera isn’t very good anyway.
HDR in Televisions
Most of what we said about photo capture in HDR also applies to video capture. After recording an HDR video, we would like to play it back with as much fidelity as possible. However, most televisions in circulation have a limited range of color and luminance, which is why HDR televisions were introduced. Sony has demonstrated HDR using the Hybrid Log-Gamma (HLG) technique, an HDR broadcast format that is compatible with both standard and HDR TVs.
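For the curious, the HLG transfer function (the OETF defined in ITU-R BT.2100) is compact enough to write down. The sketch below follows the published reference formula, but it is illustrative, not broadcast-grade code:

```python
import math

# HLG OETF (ITU-R BT.2100): maps linear scene light E in [0, 1] to a
# non-linear signal E' in [0, 1]. The lower segment behaves like a
# conventional gamma curve (SDR-compatible); the upper segment is logarithmic.
A = 0.17883277
B = 1 - 4 * A                   # 0.28466892
C = 0.5 - A * math.log(4 * A)   # 0.55991073

def hlg_oetf(e: float) -> float:
    if e <= 1 / 12:
        return math.sqrt(3 * e)
    return A * math.log(12 * e - B) + C

print(hlg_oetf(1 / 12))  # ~0.5, where the two segments meet
print(hlg_oetf(1.0))     # ~1.0, peak signal
```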
HDR movies on HDR televisions should look more life-like, especially in bright scenes (a sunny landscape) or very high-contrast scenes (nightlife in a busy city like Tokyo or Vegas). To achieve that, you need two things:
1/ Higher brightness/luminance
Ordinary TVs can output about 300-400 nits of brightness, a level similar to most high-end laptop displays at full power. In the real world, 400 nits is pretty good, but barely enough on a sunny day. That’s why displays are often hard to read when you’re out in the street: the surrounding light can overwhelm the display’s own brightness.
HDR TVs can push brightness to 1,000 nits, which makes bright scenes much more life-like and natural. Not all display technologies are created equal: LCD displays tend to produce higher brightness, but their black level (the darkest black they can show) is slightly gray.
OLED displays aren’t as bright, but because their blacks are truly black, the overall contrast ratio is visibly better. With OLED, each pixel can be turned on and off individually – that’s millions of tiny lights. With LCD technology, the backlight comes from dozens to hundreds of lights, so brightness bleeds into areas where it should not be.
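A quick back-of-the-envelope contrast-ratio comparison makes the point; the numbers below are purely illustrative, not measurements of any particular TV:

```python
# Contrast ratio = peak luminance / black level (both in nits).
# Illustrative numbers only: a bright LCD with some backlight bleed vs.
# a dimmer OLED whose pixels can switch (almost) fully off.
lcd_peak, lcd_black = 1000.0, 0.10
oled_peak, oled_black = 600.0, 0.0005

print(f"LCD  contrast: {lcd_peak / lcd_black:,.0f}:1")   # 10,000:1
print(f"OLED contrast: {oled_peak / oled_black:,.0f}:1")  # 1,200,000:1
```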
2/ Better colors
HDR TVs have better colors as well, because the specifications for 4K HDR TVs call for 10-bit or even 12-bit color, which works out to roughly 1 billion to 68 billion representable colors. That’s a far cry from the 16.7 million colors of older standards, if not less. If you are curious, read more about Color Depth and bits per pixel. Images with a lot of contrast are prone to color banding, especially on a large screen.
This is particularly noticeable where you have smooth gradients, like a sky going from dark blue to slightly lighter blue. On a standard TV, you sometimes notice bands of color, which is called color banding. With more colors, these bands go away and things look more natural.
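To see where the color counts come from, and why more bits mean less banding, here is a small Python sketch (the gradient values are made up for illustration):

```python
import numpy as np

# Colors representable at a given bit depth per channel (R, G, B).
for bits in (8, 10, 12):
    print(f"{bits}-bit: {(2 ** bits) ** 3:,} colors")
# 8-bit:  16,777,216
# 10-bit: 1,073,741,824
# 12-bit: 68,719,476,736

# Banding: quantizing a subtle sky gradient to 8 bits leaves only a
# handful of distinct levels across the screen; 10 bits gives 4x as many.
gradient = np.linspace(0.20, 0.25, 1920)               # dark blue -> lighter blue
steps_8  = np.unique(np.round(gradient * 255)).size    # distinct 8-bit levels
steps_10 = np.unique(np.round(gradient * 1023)).size   # distinct 10-bit levels
print(steps_8, steps_10)  # ~14 vs ~52 levels across the screen width
```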
There are many standards, including Dolby Vision at the high end, but most people will use the open, royalty-free HDR10, a standard that most companies have agreed upon. HDR10 doesn’t reproduce as many colors as Dolby Vision (10-bit vs 12-bit color), and Dolby Vision can go as high as 10,000 nits of luminance while HDR10 content is typically mastered around 1,000 nits. These are the main differences that people should care about.
History and Conclusion
If you want to learn about all the finer details, we recommend reading the Wikipedia pages for Dynamic Range and HDR, in that order.
HDR is not a new problem: ever since photography has existed (mid-1800s), photographers have had to pick a specific exposure when capturing photos. The wrong exposure makes the photo too bright or too dark.
Having to decide on the final exposure at shooting time made it very complicated to capture images in high-contrast situations. HDR provides a way to capture the scene without committing to a specific exposure right away: the exposure can be selected later, and modern algorithms can even pick a different ideal exposure for every pixel of the image.
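To illustrate what "deciding exposure later" can look like, here is a deliberately simple global Reinhard-style tone-mapping sketch; real camera pipelines use far more sophisticated local operators that effectively vary the exposure per region:

```python
import numpy as np

def reinhard_tonemap(hdr_luminance, exposure=1.0):
    """Compress an HDR radiance map into [0, 1] for an SDR display.

    Global Reinhard operator: L / (1 + L). Kept minimal on purpose;
    real pipelines adapt locally instead of using one global curve.
    """
    scaled = hdr_luminance * exposure
    return scaled / (1.0 + scaled)

# Usage: values spanning six orders of magnitude all land inside [0, 1).
hdr = np.array([0.01, 1.0, 100.0, 10000.0])
print(reinhard_tonemap(hdr))  # ~[0.0099, 0.5, 0.990, 0.9999]
```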
In the 1850s, Gustave Le Gray was among the first photographers to experiment with HDR techniques, combining multiple exposures to capture better seascape and landscape photos. It is an endeavor that has been simplified to the extreme for today’s digital photographers, thanks to the ability to store more lighting information and to the computational power we can carry in our pockets.