HDR in Forthcoming Halide 3
Ben Sandofsky (tweet, Hacker News):
Then the algorithms combined everything into a single “photo” that matches human vision… a photo that was useless, since computer screens couldn’t display HDR. So these researchers also came up with algorithms to squeeze HDR values onto an SDR screen, which they called “Tone Mapping.”
[…]
HDR and Deep Fusion require that the iPhone camera capture a burst of photos and stitch them together to preserve the best parts. Sometimes it goofs. Even when the algorithms behave, they come with tradeoffs.
[…]
After considerable research, experimentation, trial and error, we’ve arrived on a tone mapper that feels true to the dodging and burning of analog photography. What makes it unique? For starters, it’s derived from a single capture, as opposed to the multi-exposure approaches that sacrifice detail. While a single capture can’t reach the dynamic range of human vision, good sensors have dynamic range approaching film.
However, the best feature is that this tone mapping is off by default. If you come across a photo that feels like it could use a little highlight or shadow recovery, you can now hop into Halide’s updated Image Lab.
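As a rough illustration of what a global tone mapper does mechanically, here's a minimal Reinhard-style sketch. It's purely illustrative, not Halide's or Apple's actual pipeline: scale scene luminance toward a mid-grey "key," compress highlights with a smooth curve, then reapply the result to the color channels.

```python
# Minimal sketch of a global Reinhard-style tone mapper (illustrative only;
# not Halide's or Apple's actual algorithm).
import numpy as np

def reinhard_tone_map(hdr_linear, key=0.18, white=4.0):
    """Map linear HDR RGB (values may exceed 1.0) into [0, 1] for an SDR screen."""
    # Luminance from linear RGB (Rec. 709 weights).
    lum = (0.2126 * hdr_linear[..., 0]
           + 0.7152 * hdr_linear[..., 1]
           + 0.0722 * hdr_linear[..., 2])
    # Scale the scene so its log-average luminance lands on the chosen "key".
    log_avg = np.exp(np.mean(np.log(lum + 1e-6)))
    scaled = (key / log_avg) * lum
    # Reinhard curve: compresses highlights smoothly; values near `white` map to 1.
    mapped = scaled * (1.0 + scaled / (white * white)) / (1.0 + scaled)
    # Reapply the per-pixel luminance change to the color channels.
    ratio = mapped / np.maximum(lum, 1e-6)
    return np.clip(hdr_linear * ratio[..., None], 0.0, 1.0)
```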
I’m still disappointed by the options that recent versions of the iOS Camera app offer here. With iPhone 13 and later, there seems to be no way to turn off Smart HDR. Enabling ProRAW may give more control for post-processing in Lightroom, but it still combines multiple captures into one.
I like that Lux is trying to address this with Halide, but I’m not sure this is the right solution for me. First, is the sensor really good enough to get all the needed data with one capture? Second, I don’t want to go through all the photos on my iPhone; I prefer to process them on my Mac.
Previously:
- Halide 2.15: Process Zero
- iPhone Camera Over Processing
- ProRAW
- Halide Mark II
- iPhone XS Users Complain About Skin-Smoothing Selfie Camera
There is a complete disconnect on what “HDR” actually means, historically and technically. Historically, “HDR photography” meant taking multiple exposures and flattening the “high dynamic range” into … something, usually very odd-looking and unnatural. Playing around with shadows and highlights in Lightroom can be seen as a version of this, though obviously not as extreme: taking the relatively high dynamic range captured by modern sensors and compressing the image somewhat.
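To make the multi-exposure part concrete, here is a toy sketch of the classic merge step (assuming linear, already-aligned frames and illustrative exposure times; real pipelines also recover the camera response curve and handle alignment):

```python
# Toy merge of bracketed exposures into a single linear "radiance" estimate
# (simplified Debevec-style weighting; assumes linear, aligned frames).
import numpy as np

def merge_exposures(frames, exposure_times):
    """frames: list of HxWx3 arrays in [0, 1]; exposure_times: seconds per frame."""
    num = np.zeros_like(frames[0], dtype=np.float64)
    den = np.zeros_like(frames[0], dtype=np.float64)
    for img, t in zip(frames, exposure_times):
        # Trust mid-tones most; near-black and near-white pixels get little weight.
        weight = 1.0 - 2.0 * np.abs(img - 0.5)
        num += weight * (img / t)  # per-frame estimate of scene radiance
        den += weight
    return num / np.maximum(den, 1e-6)
```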
Real HDR photography is something else entirely. It’s a photograph displayed in a much wider-gamut colorspace, with a brightness range that has a much higher peak brightness target. Photographs can be mastered to a specific peak brightness value, such as 1000 nits, but the principle is the same. Such photographs and videos are displayed on what Apple calls “XDR” displays, displays whose peak brightness is much higher than that of classical displays. These include recent iPhones, iPads, and Macs.
What Apple calls “HDR” in the context of photography is an odd mix of the two (at least recently). As of iOS 11 (I think), Apple captures actual high-dynamic-range images that are far less compressed than what it used to call “HDR” back in the day, but that is what is described in the ad above. Technically, since Apple still supports JPEGs, which are a low-gamut, 8-bit format, the concept of a “gain map” was introduced: a base SDR image is stored in the JPEG, a “diff” gain map is stored as metadata, and a mathematical process combines the two to produce the resulting HDR image. Interestingly, they do the same for HEIFs, despite those supporting true 10-bit depths. When these images are displayed on an SDR display, Apple discards the gain map and only displays the base image.
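Roughly, that reconstruction can be sketched like this (an illustration of the idea only, not the exact math of any shipping gain-map format, which works with log-space values and per-image headroom metadata):

```python
# Rough sketch of reconstructing an HDR image from an SDR base plus a gain map.
# Not Apple's exact formula; illustrative only.
import numpy as np

def apply_gain_map(sdr_linear, gain_map, headroom=4.0):
    """sdr_linear: HxWx3 base image in linear light, values in [0, 1].
    gain_map: HxW map in [0, 1]; 0 = leave as SDR, 1 = full boost.
    headroom: how many times brighter than SDR white the display can go."""
    gain = np.power(headroom, gain_map)  # per-pixel multiplier, 1.0 .. headroom
    return sdr_linear * gain[..., None]

# On an SDR display the gain map is simply ignored and the base image shown as-is.
```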
Anyone who has seen modern Apple images on an SDR display knows their colors look odd and lacking, while on an HDR screen the colors are … better. That’s because tone mapping will only get you so far, with videos or images alike. Content mastered for HDR displays will never look right on SDR displays, and Apple has been mastering for HDR for almost a decade now. This is why HDR in Lightroom is a modality: either HDR or SDR.
Apple actually does a lot less frame processing in well-lit scenes in its HDR process, but it produces images that are too bright on HDR displays. I’ve moved my Sony photography to HDR almost exclusively, but in Lightroom (Classic) I can produce far more natural results that don’t burn my retinas. Apple seems to hide its SDR process on modern devices, which makes people dislike “HDR” even more, but for reasons unrelated to the technical specifications.
As usual, I have no idea what Halide is on about; "zero processing" gets more "processing." Yippee. I don't really see them addressing anything, though. They either do the over-processed classic "HDR" or they do real HDR, which they recommend against, and which looks exactly like Apple's over-bright HDR. Worth the $$$ subscription for sure.
"usually very odd looking and unnatural"
I'm always a bit perplexed when people say that HDR images look unnatural. They look closer to what I see with my own eyes than regular photographs. They look more natural than regular photographs, but we find them odd because we've become accustomed to the unnatural, limited dynamic range of regular photographs.
You don’t see a compressed dynamic range either. The quote above is about the compressed “H”DR style of photos. When you look at a scene with a very bright sky, for example, you are able to take in a very large contrast ratio, so you still see details in the shadow areas. But at any given moment, you are keenly aware that the sky is much, much brighter than the shaded salt shaker you can see from the window nearby. Flattening the dynamic range to show a blue sky and an almost-as-bright white salt shaker is not what your eye sees, hence unnatural. A real HDR photograph, run through a proper HDR pipeline, is a far closer representation of the scene than the compressed one. How close it gets depends on the peak brightness of your display, its contrast ratio, and the environment the display resides in. But my assessment stands.