FaceTime Attention Correction
A video call is a great way to connect with friends and family when you can’t physically be together. But even if you’re staring directly at your loved one’s face, there’s still something a little off about the whole process. Because the front camera sits above the screen rather than behind it, you’re never quite able to look your conversational partner squarely in the eye. Until now, that is. Apple is allegedly working on a new feature that subtly adjusts your gaze during video calls, so it appears as if you’re looking into the camera when you’re actually looking at the screen.
The new “FaceTime Attention Correction” feature, first spotted by Mike Rundle on Twitter, can be turned on and off in the FaceTime section of the Settings app, although it only appears to work on iPhone XS and XS Max devices in the third iOS 13 beta sent out to developers on Tuesday.
FaceTime Attention Correction appears to use an ARKit depth map captured through the front-facing TrueDepth camera to adjust where your eyes are looking, for a more personal and natural connection with the person you’re talking to.
Twitter users have discovered the slight eye warping that Apple is using to enable the feature, which can be seen when an object like the arm of a pair of glasses is placed over the eyes.
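Apple hasn’t documented how the warp works, but the glasses-arm trick above suggests a localized displacement applied only around the eyes. The toy sketch below (plain NumPy, not Apple’s actual pipeline; the function name, radius, and shift amount are all made up for illustration) shifts pixels inside a circular “eye” region, with the displacement fading toward the region’s edge. Run a straight horizontal line through the region and it comes out visibly bent, which is exactly the artifact people noticed:

```python
import numpy as np

def gaze_warp(image, eye_center, radius=8, shift=2):
    """Shift pixels inside a circular eye region upward, with the
    displacement strongest at the center and zero at the rim.
    A straight edge (like the arm of a pair of glasses) crossing
    the region comes out bent. Purely illustrative."""
    h, w = image.shape
    out = image.copy()
    cy, cx = eye_center
    for y in range(h):
        for x in range(w):
            d = np.hypot(y - cy, x - cx)
            if d < radius:
                # displacement fades linearly from center to rim
                dy = int(round(shift * (1 - d / radius)))
                src = min(max(y + dy, 0), h - 1)
                out[y, x] = image[src, x]
    return out

# A horizontal line standing in for the arm of a pair of glasses
img = np.zeros((20, 20), dtype=np.uint8)
img[10, :] = 255
warped = gaze_warp(img, eye_center=(10, 10))
```

Outside the eye region the line is untouched; inside it, the line bows upward, so a straight object placed over the eyes reveals the warp.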
I don't know if it's just me, but there's already something really off with the front-facing camera on the XS. So much so that I don't use it. It's like they did something strange to the perspective to make it work with FaceID (taking a guess here), and it makes the proportions look off. Any time I flip the camera around to the front, it's like my head is 2x smaller in proportion to my shoulders than it should be.
Point is, it seems like they've been tweaking their cameras, and it's kind of ruining the experience. This seems like it's going to add to that.
I tweeted about this the other day and have been collecting responses on my blog: http://interconnected.org/home/2019/07/04/attention_correction
The responses that came my way are pretty mixed. There’s a decent amount of concern which I feel would be alleviated if Apple made sure to signal (somehow) on the receiving end that the video had been filtered.
I'm wondering whether this might be a feature requested by an Apple SVP, so that someone can close their eyes during video conferences without it being noticed.
Sad to see Apple are happy to align themselves with those who will falsify images for aesthetic reasons - we're already close to not having an iota of trust in images we see - this sort of nonsense will only add to that perception.
Looking at picture comparisons between the iPhone X and the XS/XS Max, it does look like something has changed. The XS phones don't have the problem where the nose looks way too big compared to the rest of the face, which is a problem with almost all selfie cameras. From the pictures alone, it's impossible to say if the difference is from the lens, or if it's some kind of image manipulation that corrects the picture, but overall, it does seem like an improvement, even if it might look a bit odd to people who have gotten used to how front cameras have made their face look for the past decade.
Since it's only an option, and it's likely that it will be off by default, there is little reason for so much negative feedback.
It feels like it's a distraction from other more important issues.