Friday, November 2, 2018

Depth Capture With the iPhone XR

Ben Sandofsky:

The iPhone 7 Plus introduced a dual-camera system, which enables a stereoscopic way to construct depth. By taking two photos at the same time, each from a slightly different position, we can construct a disparity map.
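
The dual-camera depth pipeline is exposed to third-party apps through AVFoundation. Here’s a minimal sketch (my code, not Halide’s) of requesting depth data from the dual camera; it assumes a photo-preset session on iOS 11 or later and omits real error handling:

```swift
import AVFoundation

let session = AVCaptureSession()
session.sessionPreset = .photo
session.beginConfiguration()

// The dual camera pairs the wide and telephoto modules; iOS fuses their
// two simultaneous images into a disparity map.
guard let dualCamera = AVCaptureDevice.default(.builtInDualCamera,
                                               for: .video,
                                               position: .back),
      let input = try? AVCaptureDeviceInput(device: dualCamera),
      session.canAddInput(input) else {
    fatalError("Dual camera unavailable")
}
session.addInput(input)

let photoOutput = AVCapturePhotoOutput()
guard session.canAddOutput(photoOutput) else { fatalError() }
session.addOutput(photoOutput)

// Depth delivery is opted into on the output, then again per capture.
photoOutput.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliverySupported

let settings = AVCapturePhotoSettings()
settings.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliveryEnabled
session.commitConfiguration()
// photoOutput.capturePhoto(with: settings, delegate: /* your delegate */)
```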

[…]

With the iPhone X, Apple introduced the TrueDepth camera. Instead of measuring disparity, it uses an infrared projector to cast over 30,000 dots. […] However, depth data isn’t based entirely on the IR dots.
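
Whichever camera produced it, the depth arrives as an AVDepthData attached to the captured photo. A sketch of reading it back (again my code, not Halide’s; the TrueDepth camera is selected like any other capture device):

```swift
import AVFoundation

final class DepthCaptureDelegate: NSObject, AVCapturePhotoCaptureDelegate {
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard let depthData = photo.depthData else { return }

        // Normalize to 32-bit disparity; devices may deliver depth or
        // disparity natively, and AVDepthData converts between them.
        let disparity = depthData.converting(
            toDepthDataType: kCVPixelFormatType_DisparityFloat32)
        let map = disparity.depthDataMap
        print("Disparity map:",
              CVPixelBufferGetWidth(map), "×", CVPixelBufferGetHeight(map))
    }
}

// The TrueDepth camera is just another AVCaptureDevice:
let trueDepth = AVCaptureDevice.default(.builtInTrueDepthCamera,
                                        for: .video,
                                        position: .front)
```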

[…]

In a DPAF [Dual Pixel Auto Focus] system, each pixel on the sensor is made up of two sub-pixels, each with its own tiny lens. The hardware finds focus much like a rangefinder camera: when the two sub-pixel images are identical, it knows that pixel is in focus.
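
To make the rangefinder analogy concrete, here’s a toy version of that matching step (illustrative only, nothing like Apple’s actual implementation): slide one sub-pixel signal against the other and keep the shift with the smallest mean absolute difference. A best shift of zero means the pixel is in focus; a nonzero shift is, in effect, a one-dimensional disparity sample.

```swift
// Find the horizontal shift that best aligns the two sub-pixel signals.
func bestShift(left: [Double], right: [Double], maxShift: Int) -> Int {
    var best = 0
    var bestError = Double.infinity
    for shift in -maxShift...maxShift {
        var error = 0.0
        var count = 0
        for i in 0..<left.count {
            let j = i + shift
            guard j >= 0, j < right.count else { continue }
            error += abs(left[i] - right[j])
            count += 1
        }
        guard count > 0 else { continue }
        let meanError = error / Double(count)
        if meanError < bestError {
            bestError = meanError
            best = shift
        }
    }
    return best
}

// The same edge seen by the two sub-pixel sets, offset by two samples:
let leftSignal: [Double]  = [0, 0, 0, 1, 1, 1, 0, 0, 0, 0]
let rightSignal: [Double] = [0, 1, 1, 1, 0, 0, 0, 0, 0, 0]
print(bestShift(left: leftSignal, right: rightSignal, maxShift: 3))  // -2: out of focus
```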

[…]

If you captured two separate images, one for each set of sub-pixels, you’d get two images taken about one millimeter apart. It turns out that tiny baseline yields just enough disparity to get very rough depth information.
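
Back-of-the-envelope stereo geometry shows why a one-millimeter baseline is so limiting: depth is roughly focal length × baseline ÷ disparity, so a tiny baseline means the disparity collapses toward zero at normal shooting distances. The numbers below are my assumptions, not measured iPhone values:

```swift
// depth (m) = focalLength (px) * baseline (m) / disparity (px)
func depth(focalLengthPx: Double, baselineMeters: Double,
           disparityPx: Double) -> Double {
    return focalLengthPx * baselineMeters / disparityPx
}

let f = 2800.0   // assumed focal length in pixels
let b = 0.001    // ~1 mm sub-pixel baseline
for d in [4.0, 2.0, 1.0, 0.5] {
    let z = depth(focalLengthPx: f, baselineMeters: b, disparityPx: d)
    print("disparity \(d) px → depth ≈ \(z) m")
}
// 4 px → 0.7 m, 1 px → 2.8 m: past arm's length the signal is already
// sub-pixel, which is why the resulting depth is only "very rough".
```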

The iPhone XR combines this with machine learning to capture depth for Portrait mode.
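
On the API side, that ML-derived person mask is exposed to third-party apps like Halide as a portrait effects matte on iOS 12 and later. A sketch of opting in, continuing the capture setup above (as before, my code, not Halide’s):

```swift
import AVFoundation

let photoOutput = AVCapturePhotoOutput()   // the configured output from earlier
photoOutput.isDepthDataDeliveryEnabled =
    photoOutput.isDepthDataDeliverySupported
// Matte delivery requires depth delivery to be enabled as well.
photoOutput.isPortraitEffectsMatteDeliveryEnabled =
    photoOutput.isPortraitEffectsMatteDeliverySupported

let settings = AVCapturePhotoSettings()
settings.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliveryEnabled
settings.isPortraitEffectsMatteDeliveryEnabled =
    photoOutput.isPortraitEffectsMatteDeliveryEnabled

// Then, in the AVCapturePhotoCaptureDelegate callback:
// if let matte = photo.portraitEffectsMatte {
//     let personMask = matte.mattingImage   // high-resolution segmentation mask
// }
```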

John Gruber:

I’m so glad Halide offers this, but I can see why Apple hasn’t enabled it for non-human subjects in the built-in Camera app. It’s hit or miss.

Previously: iPhone XR Reviews.
