Archive for June 4, 2025

Wednesday, June 4, 2025

EagleFiler 1.9.17

EagleFiler 1.9.17 is a maintenance release of my digital filing cabinet and e-mail archiving app. This version improves capturing Web pages from Orion and Safari, works better with different font sizes, and improves tag auto-completion and searching.

Some interesting issues were:

Previously:

SwiftUI macOS Sheet Buttons

Sam Rowlands:

Sheet dialog buttons don’t meet the macOS human interface guidelines by default. I’ve tried some solutions in the past, but then I stumbled across a really simple way to do it, and I’m sharing that now.

[…]

Yes, it’s that simple: use a toolbar and the placement attributes to specify which buttons perform which action, and SwiftUI will not only place the buttons correctly but also resize the default and cancel buttons to match the Apple Human Interface Guidelines.

Except now the window shows a horizontal line for the toolbar. Still, this is the easiest way I’ve seen so far.
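Here’s a minimal sketch of what that looks like, assuming a simple sheet view (the view and button names are mine, not from Sam’s post):

import SwiftUI

// A sheet whose buttons are declared as toolbar items. The
// confirmationAction/cancellationAction placements let SwiftUI position
// and style them per the HIG (Return triggers Rename, Escape cancels).
struct RenameSheet: View {
    @Environment(\.dismiss) private var dismiss
    @State private var name = ""

    var body: some View {
        Form {
            TextField("Name", text: $name)
        }
        .padding()
        .frame(minWidth: 300)
        .toolbar {
            ToolbarItem(placement: .cancellationAction) {
                Button("Cancel") { dismiss() }
            }
            ToolbarItem(placement: .confirmationAction) {
                Button("Rename") { dismiss() }
            }
        }
    }
}

Present it with .sheet(isPresented:) and the buttons get the standard ordering, sizing, and key equivalents.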

Previously:

OmniFocus 4.6

Ainsley Bourque Olson (release notes):

Notes are a great way to add additional context to an item in OmniFocus, and OmniFocus 4.6 makes it easier than ever to add content from outside of OmniFocus to a note, without bringing unnecessary font styles along for the ride. With this update, OmniFocus now defaults to an improved “Merge Styles” paste behavior, preserving only essential styles like bold, italic, underline, and strikethrough, as well as links with titles, and attachments.

While we think the “Merge Styles” paste behavior will be a great fit for most workflows, OmniFocus 4.6 also allows for customization of this behavior with a new paste behavior setting. And the default paste behavior is now context aware, only stripping styles when pasting text copied from an external source—styles are retained when pasting text copied from within OmniFocus, allowing you to move styled note text between OmniFocus items with ease.

It also fixes a really annoying sync bug that could make the window move between spaces.
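To give a sense of what a “Merge Styles” paste involves, here’s a rough NSAttributedString sketch that keeps only bold/italic traits, underline, strikethrough, links, and attachments. This is purely illustrative, not Omni’s implementation, and the mergeStyles(_:into:) helper is hypothetical.

import AppKit

// Re-style pasted text onto a base font, preserving only the “essential”
// attributes and dropping everything else (fonts, sizes, colors, etc.).
func mergeStyles(_ pasted: NSAttributedString, into baseFont: NSFont) -> NSAttributedString {
    let result = NSMutableAttributedString(string: pasted.string)
    result.addAttribute(.font, value: baseFont, range: NSRange(location: 0, length: result.length))

    pasted.enumerateAttributes(in: NSRange(location: 0, length: pasted.length), options: []) { attrs, range, _ in
        // Carry over bold/italic as traits on the destination font.
        if let font = attrs[.font] as? NSFont {
            var traits: NSFontDescriptor.SymbolicTraits = []
            if font.fontDescriptor.symbolicTraits.contains(.bold) { traits.insert(.bold) }
            if font.fontDescriptor.symbolicTraits.contains(.italic) { traits.insert(.italic) }
            if !traits.isEmpty,
               let styled = NSFont(descriptor: baseFont.fontDescriptor.withSymbolicTraits(traits),
                                   size: baseFont.pointSize) {
                result.addAttribute(.font, value: styled, range: range)
            }
        }
        // Keep underline, strikethrough, links, and attachments.
        for key in [NSAttributedString.Key.underlineStyle, .strikethroughStyle, .link, .attachment] {
            if let value = attrs[key] {
                result.addAttribute(key, value: value, range: range)
            }
        }
    }
    return result
}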

Previously:

The iPhone 15 Pro’s Depth Maps

Mark Litwintschik (Hacker News):

Finn Jaeger, who is the head of VFX at Replayboys, a film production firm in Hamburg, Germany, posted a screenshot a few weeks ago showing how multiple depth maps were being produced by his iPhone.

He announced he was working on a project called HEIC Shenanigans. This project contains scripts to separate out images and their metadata from HEIC containers, as well as convert them into EXR files. As of this writing, the project contains 374 lines of Python.

In this post, I’ll walk through Finn’s codebase with an example image from an iPhone 15 Pro.

Uncorrelated:

Other commenters here are correct that the LIDAR is too low-resolution to be used as the primary source for the depth maps. In fact, iPhones use four-ish methods (that I know of) to capture depth data, depending on the model and camera used. Traditionally these depth maps were only captured for Portrait photos, but apparently recent iPhones capture them for standard photos as well.
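For comparison with Finn’s Python scripts, Apple’s own frameworks can pull the embedded depth/disparity map straight out of a HEIC container. Here’s a small Swift sketch using ImageIO and AVFoundation rather than Python; the file path is a placeholder.

import AVFoundation
import ImageIO

// Read the auxiliary depth or disparity map that the iPhone embeds
// alongside the main image in a HEIC file.
func readDepthMap(from url: URL) -> AVDepthData? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil) else { return nil }

    // Portrait-style captures usually store disparity; fall back to depth.
    for type in [kCGImageAuxiliaryDataTypeDisparity, kCGImageAuxiliaryDataTypeDepth] {
        if let info = CGImageSourceCopyAuxiliaryDataInfoAtIndex(source, 0, type) as? [AnyHashable: Any],
           let depthData = try? AVDepthData(fromDictionaryRepresentation: info) {
            // Normalize to 32-bit float disparity for further processing.
            return depthData.converting(toDepthDataType: kCVPixelFormatType_DisparityFloat32)
        }
    }
    return nil
}

let depth = readDepthMap(from: URL(fileURLWithPath: "photo.heic"))
print(depth?.depthDataMap as Any) // CVPixelBuffer with the per-pixel values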