Acorn 6.3 Postmortem
Apple added a new feature to its latest iPhones in the iOS 12 update called “Portrait Matte”. It’s a secondary image embedded in HEIC files, generated from the depth data and some machine learning applied to your photo. You can then use this image as a mask to blur parts of your photo (which is what the iOS “Portrait” camera mode does), or you can use it to remove the background entirely.
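For the curious, here’s a rough sketch of how you can pull the matte out of a HEIC file using Image I/O and AVFoundation. The `portraitMatte(from:)` function name and structure are mine for illustration, not Acorn’s actual code:

```swift
import AVFoundation
import ImageIO

// Illustrative sketch: read the Portrait Effects Matte from a HEIC file.
func portraitMatte(from url: URL) -> CVPixelBuffer? {
    // The matte travels as auxiliary data alongside the primary image.
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let info = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
              source,
              0,
              kCGImageAuxiliaryDataTypePortraitEffectsMatte
          ) as? [AnyHashable: Any],
          let matte = try? AVPortraitEffectsMatte(fromDictionaryRepresentation: info)
    else {
        return nil
    }
    // A single-channel buffer, typically smaller than the photo itself,
    // so it needs to be scaled up before it's useful as a layer mask.
    return matte.mattingImage
}
```

Note that the matte comes back as a single-channel pixel buffer at a lower resolution than the photo, so you have to scale it up to full size before applying it as a mask.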
But how should Acorn expose this matte? My first stab was to have Acorn add the matte as an additional layer. After playing with it a bit, it just felt off. So I ended up adding the matte as a mask on the main layer when opening the image. But folks are obviously going to want to do more than just mask out the background, so I added new features to Acorn that let you easily drag and drop the layer mask into its own layer. I also made it easy to move an existing layer onto another layer’s mask via drag and drop. I can’t predict what people are going to want to do with the mask, but I might as well make it easy to move around.
It was also during this development that I found some bugs in Apple’s My Photo Stream. The matte was showing up rotated incorrectly when opening images out of Photos. At first I figured I was just reading the data wrong, but nope: under certain conditions, when images with the portrait matte were uploaded to MPS, the rotation data from the camera went missing. After some communication and a Radar filed with Apple, this bug was fixed in an OS update. Bug fixes like this don’t happen very often, but when they do it makes filing all the other Radars worth it. Mostly.