Thursday, March 24, 2022

iPhone Cameras and Computational Photography

Sebastiaan de With:

When you take a photo on a modern iPhone — or any smartphone for that matter — you might like to think that what you saw was what you captured, but nothing could be further from the truth.

[…]

Automatic edits on photos can go wrong. When there are objects in motion, the ‘merging’ of photos creates artifacts, or ‘ghosting.’ All of this was worked out with smarter algorithms, more powerful chips, faster memory, and an iPhone that could simply take photos so fast that there were fewer gaps between frames.

Fast forward to today, and your iPhone goes way above and beyond HDR. HDR has not been a setting you can toggle for a while now. When you take a photo, the camera on the iPhone will merge many shots to get your final result. Today, your camera essentially always ‘edits’ your photos for you. And exactly how it edits them… is a bit of a mystery.

[…]

While the jump [in RAW] from the X to the 11 is noticeable, the move to the 13 Pro is much less so, despite having a faster lens and larger sensor. It’s possible there is a lot more detail that the sensor and lens can resolve, but we can’t really tell — possibly it’s because the sensor resolution has been the same 12 megapixels since the iPhone 6S, which launched seven years ago.

There are lots of nice photos in this post, though it made WebKit crash three times for me.
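De With’s ghosting point is easier to see in miniature. A multi-frame merge has to decide, per pixel, whether the exposures agree; anything that moved between frames either gets rejected or smears. Here is a minimal sketch of that idea, assuming grayscale frames that are already aligned (`mergeFrames` and `threshold` are made-up names for illustration; this is not Apple’s actual algorithm, which is proprietary):

```swift
// Illustrative sketch only; Apple's real pipeline is far more sophisticated.
// Idea: average aligned frames, but reject per-pixel samples that disagree
// with a reference frame, so moving objects don't smear across the merge.

/// Merge several aligned grayscale frames (values 0...1) into one image.
/// `threshold` is a hypothetical tuning knob: how far a sample may deviate
/// from the reference pixel before it is treated as motion and dropped.
func mergeFrames(_ frames: [[Double]], threshold: Double = 0.15) -> [Double] {
    guard let reference = frames.first else { return [] }
    return reference.indices.map { i in
        // Keep only samples that agree with the reference pixel.
        let samples = frames.map { $0[i] }.filter { abs($0 - reference[i]) < threshold }
        // If everything moved, fall back to the reference frame alone.
        // When this rejection step misfires, you get visible "ghosting".
        let kept = samples.isEmpty ? [reference[i]] : samples
        return kept.reduce(0, +) / Double(kept.count)
    }
}
```

The tradeoff is visible right in the fallback branch: rejected pixels get less noise reduction than merged ones, which is why motion areas can look different from the rest of the frame.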

Nick Heer:

Like de With, I think Apple’s processing choices are often too aggressively tuned for noise removal, even on my iPhone 12 Pro. A little grain is fine for my tastes and even more is acceptable in darker images.

Kyle Chayka:

But the 12 Pro has been a disappointment, she told me recently, adding, “I feel a little duped.” Every image seems to come out far too bright, with warm colors desaturated into grays and yellows. Some of the photos that McCabe takes of her daughter at gymnastics practice turn out strangely blurry. In one image that she showed me, the girl’s upraised feet smear together like a messy watercolor. McCabe said that, when she uses her older digital single-lens-reflex camera (D.S.L.R.), “what I see in real life is what I see on the camera and in the picture.” The new iPhone promises “next level” photography with push-button ease. But the results look odd and uncanny. “Make it less smart—I’m serious,” she said. Lately she’s taken to carrying a Pixel, from Google’s line of smartphones, for the sole purpose of taking pictures.

[…]

One expects a person’s face in front of a sunlit window to appear darkened, for instance, since a traditional camera lens, like the human eye, can only let light in through a single aperture size in a given instant. But on my iPhone 12 Pro even a backlit face appears strangely illuminated. The editing might make for a theoretically improved photo—it’s nice to see faces—yet the effect is creepy. When I press the shutter button to take a picture, the image in the frame often appears for an instant as it did to my naked eye. Then it clarifies and brightens into something unrecognizable, and there’s no way of reversing the process. David Fitt, a professional photographer based in Paris, also went from an iPhone 7 to a 12 Pro, in 2020, and he still prefers the 7’s less powerful camera. On the 12 Pro, “I shoot it and it looks overprocessed,” he said. “They bring details back in the highlights and in the shadows that often are more than what you see in real life. It looks over-real.”
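The “over-real” look Fitt describes comes from tone mapping that compresses dynamic range: shadows get lifted far more than highlights get touched. A toy gamma lift shows the asymmetry (a stand-in for illustration, assuming normalized luminance values; `liftShadows` and `gamma` are hypothetical names, not Apple’s pipeline):

```swift
import Foundation

// Rough illustration of shadow lifting, not Apple's actual tone mapping.
// A gamma exponent below 1.0 brightens dark regions (the backlit face)
// while leaving highlights mostly intact; push it far enough and the
// result starts to look "over-real".

/// Apply a simple gamma lift to normalized luminance values (0...1).
/// `gamma` < 1.0 brightens shadows; 1.0 leaves the image untouched.
func liftShadows(_ luminance: [Double], gamma: Double = 0.6) -> [Double] {
    luminance.map { pow($0, gamma) }
}

// A pixel at 10% brightness jumps to about 25%; one at 90% barely moves.
// That asymmetry is why a face in front of a sunlit window no longer
// appears darkened.
let lifted = liftShadows([0.1, 0.5, 0.9])  // ≈ [0.25, 0.66, 0.94]
```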

John Gruber:

I don’t think Chayka is being overly disingenuous, but for 99 percent of the photos taken by 99 percent of people (ballpark numbers, obviously) the iPhone 12 or 13 is a way better camera than an iPhone 7.

Nick Heer:

Right now, the iPhone’s image processing pipeline sometimes feels like it lacks confidence in the camera’s abilities. As anyone who shoots RAW on their iPhone’s camera can attest, it is a very capable lens and sensor. It can be allowed to breathe a little more.

[…]

Google’s Pixel was the phone that really kicked off this computational photography stuff. It seems its interpretation of images is seen — by this owner anyway — as less intrusive. But I do not see the iPhone 12 Pro issues she raises in my own 12 Pro’s photos, such as desaturated warm tones.

John Nack:

If anyone reads that New Yorker article and thinks they’ll prefer shooting on an iPhone 7, please show them these iPhone 7-vs-12 shots I took.

Via John Gruber:

A lot of times when new iPhones are reviewed — including my own reviews — camera comparisons are made to iPhones from just one or two years prior, and differences can seem subtle. Separate iPhones by five years, though, and the results are striking.

Previously:

Update (2022-04-19): Riccardo Mori:

But let’s go back to Chayka’s article. The point that is the most thought-provoking, in my opinion, is the emphasis given to one specific aspect of the newer iPhones’ computational photography — the mechanisation, the automation, the industrial pre-packaging of a ‘good-looking’ or ‘professional-looking’ photo. Much like with all processed foods produced on an industrial scale, which all look and taste the same, computational photography applies a set of formulas to what the camera sensor captures, in order to produce consistently good-looking results. The article’s header animation summarises this clearly: a newer iPhone passes by a natural looking still life with flowers in a vase, and for a moment you can see how the iPhone sees and interprets that still life, returning a much more vibrant, contrasty scene. Certainly more striking than the scene itself, but also more artificial and less faithful to what was actually there.

That’s why, in Chayka’s view, his iPhone 7 took ‘better’ photos than his iPhone 12 Pro. It’s not a matter of technical perfection or superiority.

[…]

Especially with low-light photography, what these newer iPhones (but also the newer Pixels and Samsung flagships) return are not the scenes I was actually seeing when I took the shot. They are enhancements that often show what is there even when you don’t see it. Sometimes the image is so brightened up that it doesn’t even look like a night shot — more like something you would normally obtain with very long exposures.

Jeff Carlson (tweet):

I do highly recommend that you read the article, which makes some good points. My issue is that it ignores—or omits—an important fact: computational photography is a tool, one you can choose to use or not.

Unfortunately, Apple doesn’t really give you that choice, except via third-party camera apps. There is no longer an option to save both the normal and HDR versions of a photo. I usually like the results of Smart HDR, so I leave it on, but sometimes it does a poor job and then I’m stuck without access to the original.
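For what it’s worth, the escape hatch those third-party apps use is asking AVFoundation for RAW output instead of the processed result. A minimal sketch, with session setup, authorization, and the delegate implementation omitted:

```swift
import AVFoundation

// Sketch: third-party apps can sidestep the processed Smart HDR result by
// requesting Bayer RAW. `photoOutput` is assumed to already be attached to
// a configured, running capture session.
func captureRaw(from photoOutput: AVCapturePhotoOutput,
                delegate: AVCapturePhotoCaptureDelegate) {
    // Ask which RAW pixel formats the current device/configuration supports.
    guard let rawFormat = photoOutput.availableRawPhotoPixelFormatTypes.first else {
        print("RAW capture is not available in this configuration")
        return
    }
    // Requesting a RAW format hands over sensor data before the merged,
    // processed result is baked in.
    let settings = AVCapturePhotoSettings(rawPixelFormatType: rawFormat)
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}
```

That gets you the unprocessed frame, but it doesn’t solve the problem I mentioned: the built-in Camera app offers no equivalent, so there’s no way to keep both versions of a Smart HDR shot.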

See also: TidBITS-Talk.

9 Comments

"for 99 percent of the photos taken by 99 percent of people (ballpark numbers, obviously) the iPhone 12 or 13 is a way better camera than an iPhone 7"

I think one problem here is predictability. Sure, 99% of the pictures are good, but if the 1% are bad in ways that seem weird to the person taking the picture, that might be much more noticeable, and a much bigger problem, than having a camera that is worse, but produces consistent output.

In the past, you knew pretty well what kind of picture you'd get in any given situation. That's no longer the case.

My non-iPhone has a low-light mode that it enables automatically, which means that I can take two pictures, one right after another, and get drastically different results, because some minuscule thing changed between the two shots that caused the phone to turn its low-light mode on or off. In this case, both pictures are usually good, just different, but the problem is that it is unpredictable, and thus much more noticeable than a camera that behaves consistently.

>I don’t think Chayka is being overly disingenuous

I think they are.

>Halide, a developer of camera apps, recently published a careful examination of the 13 Pro that noted visual glitches caused by the device’s intelligent photography, including the erasure of bridge cables in a landscape shot. “Its complex, interwoven set of ‘smart’ software components don’t fit together quite right,” the report stated.

From this paragraph, you might get the impression that Halide is making a statement about the camera in general. But that isn't what de With is saying at all. Sebastiaan is referring to one image in particular, and within it, quite a tight crop.

Their conclusion in general is "If you’re a serious photographer, the iPhone 13 Pro is a brilliant camera", whereas from Kyle's quote, you might get the impression that Sebastiaan considers the iPhone 13 to be a misstep.

"I think they are."

There are tons of threads from people upset with the pictures from their iPhone 13 Pro/Max on sites like macrumors, with complaints like "it's basically impossible to obtain a normal face without the "bad photoshop" effect." This seems to be a real thing that real people find upsetting.

I wonder how many bad photos a day are taken if we accept that 1% of 1% are bad.
Tens of thousands?

Old Unix Geek

I just wish my photos of sunsets looked like what I see. I know the reasons, but a good camera would faithfully record what I see.

“I think they are.”

There are tons of threads from people upset with the pictures from their iPhone 13 Pro/Max on sites like macrumors, with complaints like “it’s basically impossible to obtain a normal face without the “bad photoshop” effect.” This seems to be a real thing that real people find upsetting.

I wasn’t commenting on that either way. I was saying I disagree with John Gruber; i.e. I do think Kyle Chayka is rather disingenuous in summarizing Sebastiaan de With’s article with the quote “Its complex, interwoven set of ‘smart’ software components don’t fit together quite right”. That quote referred to a crop of one image in particular, not to camera performance in general.

There are certainly concerns to be had about computational photography making incorrect guesses. (One example a lot of US West Coast folks noticed a few years ago: during the wildfires, it “corrected” the sky to what it thought a sky should look like, not what the burning sky actually did look like.)

But I’m sure Apple has statistics on that. And I’m guessing they’ve concluded something in the high-90s of images looks better with their algorithms than without.

(The other layer is what constitutes “better”. Some people comment on John Nack’s tweet arguing the lighting on the iPhone 7 images looks more natural. Luckily, Apple has added some sliders to adjust this — sort of a ProRAW on easy mode.)

From Mori:

>They are enhancements that often show what is there even when you don’t see it. Sometimes the image is so brightened up that it doesn’t even look like a night shot — more like something you would normally obtain with very long exposures. And again, some people like this. They want to have a good capture of that great night at a pub in London or at a bistro in Paris, and they want their phone to capture every detail.

Indeed. Many Night Mode photos I've taken are great. But some of them are both great and also clearly "unnatural". They capture a kind of light and color that, to my literal eyes, wasn't actually there, and for such moments, I would've liked a slider that lets me tone down the artificiality just a tad. (I presume Photographic Styles helps here, but I haven't checked.)

But…

>The problem, as far as I’m concerned, is the approach of those who happily take advantage of all the capabilities of computational photography but want to pass the resulting photos as a product of their creative process.

Mori really loses me here. This sounds like old man yells at cloud-style gatekeeping, like "programmers these days only cobble together libraries instead of writing their own code".

>Mori really loses me here. This sounds like old man yells at cloud-style gatekeeping, like "programmers these days only cobble together libraries instead of writing their own code".

No, it's more like "programmers these days take fully AI-generated code, give it a cool name, and want to pass the result as an app they made". :-)
