Wednesday, February 22, 2023

The Limits of Computational Photography

Will Yager:

Every time I tried to take a picture of the engraved text, the picture on my phone looked terrible! It looked like someone had sloppily drawn the text with a paint marker. What was going on? Was my vision somehow faulty, failing to see the rough edges and sloppy linework that my iPhone seemed to be picking up?

[…]

Well, I noticed that when I first take the picture on my iPhone, for a split second the image looks fine. Then, after some processing completes, it’s replaced with the absolute garbage you see here.

[…]

Significantly more objectionable are the types of approaches that impose a complex prior on the contents of the image. This is the type of process that produces the trash-tier results you see in my example photos. Basically, the image processing software has some kind of internal model that encodes what it “expects” to see in photos. This model could be very explicit, like the fake moon thing, an “embodied” model that makes relatively simple assumptions (e.g. about the physical dynamics of objects in the image), or a model with a very complex implicit prior, such as a neural network trained on image upscaling. In any case, the camera is just guessing what’s in your image. If your image is “out-of-band”, that is, not something the software is trained to guess, any attempts to computationally “improve” your image are just going to royally trash it up.
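To make Yager’s distinction concrete, here’s a minimal Python sketch of the two approaches. The input file name and `load_learned_upscaler` are hypothetical placeholders, not any vendor’s actual pipeline: a content-agnostic sharpen only amplifies edges it can measure, while a prior-based enhancer fills in detail it expects to see.

```python
# Illustrative only: contrast a content-agnostic sharpen (no assumptions about
# what the scene contains) with a learned upscaler that encodes a prior from
# its training data.
from PIL import Image, ImageFilter

img = Image.open("engraving.jpg")  # hypothetical input photo

# Content-agnostic enhancement: boosts local contrast wherever it finds edges,
# without guessing what those edges depict.
sharpened = img.filter(ImageFilter.UnsharpMask(radius=2, percent=150, threshold=3))

# Prior-based enhancement: a model trained on "typical" photos fills in the
# detail it expects. On out-of-distribution input (fine engraved text), that
# guess can be badly wrong.
# model = load_learned_upscaler("generic_photo_prior")  # hypothetical stand-in
# enhanced = model(img)                                 # detail is inferred, not observed
```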

Via Nick Heer:

This article arrived at a perfect time as Samsung’s latest flagship is once again mired in controversy over a Moon photography demo. Marques Brownlee tweeted a short clip of the S23 Ultra’s one-hundredfold zoom mode, which combines optical and digital zoom and produces a remarkably clear photo of the Moon. As with similar questions about the S21 Ultra and S22 Ultra, it seems Samsung is treading a blurry line between what is real and what is synthetic.

[…]

Another reason it is so detailed is that Samsung specifically trained the camera to take pictures of the Moon, among other scenes.


Update (2023-03-14): ibreakphotos (MacRumors):

So, while many have tried to prove that Samsung fakes the moon shots, I think nobody succeeded - until now.

[…]

The moon pictures from Samsung are fake. Samsung’s marketing is deceptive. It is adding detail where there is none (in this experiment, it was intentionally removed). In this article, they mention multi-frames and multi-exposures, but the reality is that it’s AI doing most of the work, not the optics; the optics aren’t capable of resolving the detail that you see.
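The test, as described, boils down to feeding the phone an image whose fine detail has been deliberately destroyed, then checking whether that detail “comes back.” Here’s a rough Python sketch of the preparation step; the source file name, sizes, and blur radius are illustrative placeholders, not necessarily the values used in the actual experiment.

```python
# Sketch of the kind of degraded test image described: take a sharp moon
# photo, throw away its fine detail, and display the result on a monitor to
# photograph with the phone.
from PIL import Image, ImageFilter

moon = Image.open("moon_highres.jpg")  # hypothetical source image

degraded = (
    moon.resize((170, 170))                           # downsample away fine detail
        .filter(ImageFilter.GaussianBlur(radius=3))   # blur what little remains
        .resize(moon.size)                            # upscale back for full-screen display
)
degraded.save("moon_degraded.png")

# If the phone photographs this degraded image and craters reappear, that
# detail was inferred by a model, not resolved by the optics.
```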

Via John Gruber:

Have to say I’m surprised both Raymond Wong and Marques Brownlee were taken in by this. These “amazing” moon photos seem impossible optically, and, more tellingly, no one is able to get these Samsung phones to capture similarly “amazing” 100× zoom images of random objects that aren’t the moon.

It’s strange, since Brownlee’s own words are “It’s not an overlay,” but he then references the Huawei controversy, where it was established that the details came from an ML model rather than a bitmap overlay. So he knows that Samsung might be doing the same thing, yet he seems to assume it’s just a great lens.

Nick Heer:

Samsung has explained how its camera works for pictures of the Moon, and it is what you would probably expect: the camera software has been trained to identify the Moon and, because it is such a predictable object, it can reliably infer details which are not actually present. Whether these images and others like them are enhanced or generated seems increasingly like a distinction without a difference in a world where the most popular cameras rely heavily on computational power to make images better than what the optics are capable of.

Update (2023-03-16): Samsung (via Hacker News):

As part of this, Samsung developed the Scene Optimizer feature, a camera functionality which uses advanced AI to recognize objects and thus deliver the best results to users. Since the introduction of the Galaxy S21 series, Scene Optimizer has been able to recognize the moon as a specific object during the photo-taking process, and applies the feature’s detail enhancement engine to the shot.

When you’re taking a photo of the moon, your Galaxy device’s camera system will harness this deep learning-based AI technology, as well as multi-frame processing in order to further enhance details.
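For context, conventional multi-frame processing is about noise, not detail: averaging aligned exposures improves signal-to-noise but cannot recover structure the optics never resolved. Here’s a minimal sketch of that generic technique, not Samsung’s actual (unpublished) pipeline.

```python
# Minimal sketch of conventional multi-frame merging: average several aligned
# exposures to cut shot noise. Nothing here invents detail; it only recovers
# what the sensor actually captured.
import numpy as np

def merge_frames(frames):
    """Average a burst of aligned frames (H x W arrays) to reduce noise."""
    stack = np.stack([f.astype(np.float64) for f in frames])
    return stack.mean(axis=0)

rng = np.random.default_rng(0)
truth = np.zeros((64, 64))
burst = [truth + rng.normal(0, 10, truth.shape) for _ in range(16)]
merged = merge_frames(burst)

# Noise drops roughly with the square root of the frame count (~4x for 16
# frames), but no amount of averaging adds detail beyond what the lens and
# sensor can resolve.
print(round(np.std(burst[0]), 2), round(np.std(merged), 2))
```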

Update (2023-03-21): John Gruber:

There’ve been a couple of follow-ups on this since I wrote about it a few weeks ago. Marques Brownlee posted a short video, leaning into the existential question of the computational photography era: “What is a photo?” Input’s Ray Wong took umbrage at my having said he’d been “taken” by Samsung’s moon photography hype in this Twitter thread.

Samsung’s phones are rendering the moon as it was, at some point in the past when this ML model was trained.

And that’s where Samsung steps over the line into fraud. Samsung, in its advertisements, is clearly billing these moon shots as an amazing feature enabled by its 10× optical / 100× digital zoom telephoto camera lens. They literally present it as optically superior to a telescope. That’s bullshit. A telescope shows you the moon as it is. Samsung’s cameras do not.

2 Comments

Niall O’Mara

Computational photography, just like AI, is going to royally feck up future historians, as very little that is left behind from this era will be trustworthy.

All sorts of media will have been enhanced to give what was thought to be the desired result rather than reality.

Today is already more Orwellian than we hoped it would be and it’s just the beginning…

Reminds me of when people discovered PDF files used JBIG2 lossy compression, which could alter the numbers in small print. Anyway, this is why pro photographers shoot RAW files with no preprocessing.
