Monday, December 4, 2023

Computational Bridal Photo

Matt Growcoot:

A woman says that “the fabric of reality crumbled” after she looked at an iPhone photo of herself trying on a wedding dress and noticed that her reflection looked different.

Standing in front of two large mirrors, Tessa Coates struck a single pose, yet neither reflection returns it, and the two reflections also differ from each other.

While Coates held one arm up and the other down, the reflection on the left shows both arms down, and the reflection to her right shows both hands raised to her waist.

John Gruber:

Coates, in her Instagram description, claims “This is a real photo, not photoshopped, not a pano, not a Live Photo”, but I’m willing to say she’s either lying or wrong about how the photo was taken.

[…]

In a long-winded story post, Coates says she went to an Apple Store for an explanation and was told by Roger, the “grand high wizard” of Geniuses at the store, that Apple is “beta testing” a feature like Google’s Best Take. Which is not something Apple does, and even if it were, it would require her to have knowingly installed an iOS beta.

Nick Heer:

This is, as far as I can find, the first mention of this claim, but I would not give it too much credibility. Apple retail employees, in my experience, are often barely aware of the features of the current developer beta, let alone an internal build. They are not briefed on unannounced features. To be clear, I would not be surprised if Apple were working on something like this, but I would not bet on the reliability of this specific mention.

It’s almost unbelievable, the sort of loony things that my customers tell me they were told by Apple retail employees/geniuses, and I’ve been directly told some quite unlikely things myself. The best bet is that someone in this story is mistaken. But I’m not sure I’d rule anything out after hearing Apple brag on stage that they secretly beta tested APFS file system conversions during the regular software update process.

Wesley Hilliard (Hacker News):

What’s actually occurred here is a mistake in Apple’s computational photography pipeline. The camera didn’t realize it was photographing a mirror, so it treated the three versions of Coates as different people.

Because it was a panoramic capture, the camera swept across the scene when the shutter was pressed, recording many slightly different images while Coates was moving. Apple’s algorithm then stitches those images together, choosing the best regions for saturation, contrast, detail, and lack of blur.
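
To make that failure mode concrete, here is a minimal sketch of per-region “best frame” selection, assuming the sweep yields a stack of frames and using a crude gradient-based sharpness score. None of this is Apple’s actual pipeline; the frame representation and the heuristic are made up for illustration. If the subject moves between frames, different columns of the composite can come from different moments, so the left and right sides can end up showing different poses:

```swift
// Each "frame" is a tiny grayscale image captured at a slightly
// different moment as the camera sweeps across the scene.
typealias Frame = [[Double]]   // frame[row][column], values in 0...1

// Crude sharpness score for one column of one frame: the sum of
// absolute vertical gradients. Blurred (moving) content scores low.
func columnSharpness(_ frame: Frame, column: Int) -> Double {
    var score = 0.0
    for row in 1..<frame.count {
        score += abs(frame[row][column] - frame[row - 1][column])
    }
    return score
}

// Assemble a composite by taking each output column from whichever
// frame scores sharpest there. A subject that moved between frames
// can therefore appear in one pose on the left of the result and
// a different pose on the right.
func stitch(_ frames: [Frame]) -> Frame {
    guard let first = frames.first, !first.isEmpty else { return [] }
    let rows = first.count
    let cols = first[0].count
    var output = Array(repeating: Array(repeating: 0.0, count: cols), count: rows)
    for col in 0..<cols {
        let best = frames.max {
            columnSharpness($0, column: col) < columnSharpness($1, column: col)
        }!
        for row in 0..<rows {
            output[row][col] = best[row][col]
        }
    }
    return output
}
```

Real stitchers also align and blend overlapping regions and try to detect motion, but the selection step alone is enough to produce mismatched poses when it goes wrong.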

John Gruber:

The subject claims it wasn’t a Panoramic mode photo, but she didn’t snap the photo, and if a photo taken in Panoramic mode isn’t wide enough to reach some threshold, the Photos app does not identify/badge it as such. And conversely, a normal photograph cropped to a very wide aspect ratio will be badged as Panoramic — like this and this from my own library — even though it wasn’t snapped in Panoramic mode.

Those sound like bugs to me.

I think it’s quite likely Korkmaz is correct that this is the explanation for how this photo was created; I remain unconvinced that it wasn’t a deliberate publicity stunt.

4 Comments

There are already systems that take multiple pictures, and merge the photos so everybody's eyes are open, aren't there? Or am I misremembering this? It seems at least slightly plausible that a similar system could create a situation like this, even though I agree that it is a much less likely explanation than a panorama picture, or a publicity stunt.

I find this little mystery very much "Of Our Time". It's a dull dumb pointless version of The Dress. We're backsliding into stupidity.

>There are already systems that take multiple pictures, and merge the photos so everybody's eyes are open, aren't there? Or am I misremembering this?

You're thinking of Google's Best Take. (Which, yeah, brings back conversations about "what even is a photo", and "is this still the photographer's intent".)

Apple hasn't announced a similar feature, and I find the idea that this is a) a publicity stunt or b) an honest mistake where the photographer used panoramic mode (or otherwise stitched multiple pictures together) and she doesn't know and is now looking for post-hoc explanations _far_ more likely than her being rolled into some private beta.

>I’m not sure I’d rule anything out after hearing Apple brag on stage that they secretly beta tested APFS file system conversions during the regular software update process.

But that one was designed not only to be unnoticeable but also to do nothing permanent. It created APFS metadata from the HFS+ volume, validated it, then threw it away (a dry run; see the sketch after this comment). You could make the case that Apple used its customers’ computing resources without their consent (assuming this wasn’t something that only ran on devices with a diagnostics opt-in?), but not that they changed your data without it.

Whereas, in this scenario, her contention is that her photos were permanently altered.
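
For what it’s worth, the dry-run pattern described in that comment is easy to sketch. Everything below is a hypothetical stand-in, not Apple’s migration code: build the new metadata in memory, check it against the source, report the result, and discard it without touching the disk.

```swift
// Hypothetical stand-ins for illustration; not Apple's APIs.
struct HFSPlusVolume {
    let catalog: [String: Int]   // path -> file size, a toy "catalog"
}

struct APFSMetadata {
    let entries: [String: Int]
}

// Build the would-be APFS metadata from the existing HFS+ catalog.
func buildAPFSMetadata(from volume: HFSPlusVolume) -> APFSMetadata {
    APFSMetadata(entries: volume.catalog)
}

// Check the generated metadata against the source volume.
func isConsistent(_ metadata: APFSMetadata, with volume: HFSPlusVolume) -> Bool {
    metadata.entries == volume.catalog
}

// The dry run: convert in memory, validate, report the result, and
// throw everything away. The on-disk volume is never modified.
func dryRunConversion(of volume: HFSPlusVolume) -> Bool {
    let candidate = buildAPFSMetadata(from: volume)
    return isConsistent(candidate, with: volume)
}
```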

@Sören Yes, I think those explanations are far more likely. My point is only that here’s a case where everyone ended up running beta code without opting into anything. Maybe they do this for photo processing, too. If so, it would certainly be intended to be invisible to the user, but it’s not impossible for there to be a bug where the wrong version of the image gets saved. In fact, I recall linking to a bug where sharing a photo, which you would think would be a read-only operation, would modify the photo library in an unwanted way.
