Archive for September 28, 2018

Friday, September 28, 2018

PhotoKit’s Core Data Model

Ole Begemann:

In my quest to understand the Photos framework better (especially its performance characteristics), I wanted to inspect its data model. I found a file named PhotoLibraryServices.framework/photos.momd/ deep in the bowels of the Xcode 10.0 app bundle[…]


A .mom file is a compiled Core Data model. Xcode can’t open this directly, but it can import one into another Core Data model. Follow these steps to view the model in Xcode[…]
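Before importing the model, you first have to find it. A minimal sketch of the search step, assuming Xcode is installed at the default location (the helper name `find_momd` is mine, not from the article):

```shell
# find_momd: print any photos.momd bundles found under the given root.
find_momd() {
  find "$1" -name 'photos.momd' -print 2>/dev/null
}

# Default search root: the standard Xcode install location.
find_momd /Applications/Xcode.app
```

The same pattern with `-name '*.momd'` lists every compiled Core Data model shipped inside the bundle.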

Brian Webster:

The Mac version of Photos doesn’t use Core Data, but instead a custom SQLite database format that originated in Aperture. Looks like a similar number of tables/entities though.

Brian Ganninger:

That’s correct (former UI engineer here). The Aperture database format served as the basis for the shared library format between Aperture & iPhoto. That library format was then evolved for the Photos library and cloud integration.

Previously: The History of Aperture.

Update (2018-10-02): Guilherme Rambo:

The Core Data model used by “Find My Friends”.

Basecamp App Rejected for Including Help Link

David Heinemeier Hansson:

Apple is rejecting an update to the @basecamp app in part because our app includes a link to web-based help pages that have information about a paid version of Basecamp. Nothing changed in this app update from how that’s been since forever. Now a scramble to hide help links.

The capricious review process that Apple subjects app devs to is such a stain on the company’s relationship with its ecosystem. It feels so utterly unnecessary, with such little upside, and such serious downsides. Apple may be the most benevolent in Big Tech, but it’s still in it[…]

It also highlights what a glorious anomaly the web is as an application platform. Free from capricious overlords. Viva the open web. Viva email. Viva all open platforms.

This rule has never made sense to me. It’s even less understandable than the rules against mentioning which other platforms your app works with or which hardware and OS versions are compatible. And Basecamp’s intent is clearly not to bypass paying through Apple.

See also: Rejected for Mentioning a Pre-release macOS Version, Overcast Rejected for Listing Competing Podcast Apps, Purchasing From the Kindle App, iBookstore Rejects Book for Linking to Amazon.

How Swift’s Mirror Works

Mike Ash (now at Apple, but thankfully allowed to blog):

There isn’t a single universal way to fetch the info we want from any type. Tuples, structs, classes, and enums all need different code for many of these tasks, such as looking up the number of children. There are further subtleties, such as different treatment for Swift and Objective-C classes.

All of these functions will need code that dispatches to different implementations based on what kind of type is being examined. This sounds a lot like dynamic dispatch of methods, except that the choice of which implementation to call is more complicated than checking the class of the object the method is being used on. The reflection code attempts to simplify matters by using C++ dynamic dispatch with an abstract base class that contains a C++ version of the above interface, and a bunch of subclasses covering all the various cases. A single function maps a Swift type to an instance of one of those C++ classes. Calling a method on that instance then dispatches to the appropriate implementation.
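The dispatch scheme described above can be sketched in C++. All names here are invented for illustration and simplified stand-ins for the real runtime: an abstract base class defines the reflection interface, one subclass per kind of type implements it, and a single mapping function picks the right implementation before a virtual call dispatches to it.

```cpp
#include <cassert>
#include <cstddef>
#include <string>

// Stand-in for the runtime's type metadata.
enum class MetadataKind { Struct, Tuple, Enum, Class };

struct TypeInfo {
    MetadataKind kind;
    size_t fieldCount;
};

// Abstract base: the C++ version of the reflection interface.
struct ReflectionMirrorImpl {
    virtual size_t count(const TypeInfo &type) const = 0;
    virtual std::string displayStyle() const = 0;
    virtual ~ReflectionMirrorImpl() = default;
};

// One subclass per kind of type.
struct StructImpl : ReflectionMirrorImpl {
    size_t count(const TypeInfo &type) const override { return type.fieldCount; }
    std::string displayStyle() const override { return "struct"; }
};

struct TupleImpl : ReflectionMirrorImpl {
    size_t count(const TypeInfo &type) const override { return type.fieldCount; }
    std::string displayStyle() const override { return "tuple"; }
};

struct EnumImpl : ReflectionMirrorImpl {
    // Keep the sketch simple: report no children for enums.
    size_t count(const TypeInfo &) const override { return 0; }
    std::string displayStyle() const override { return "enum"; }
};

struct ClassImpl : ReflectionMirrorImpl {
    size_t count(const TypeInfo &type) const override { return type.fieldCount; }
    std::string displayStyle() const override { return "class"; }
};

// The single mapping function: examine the type, pick an implementation.
const ReflectionMirrorImpl &implFor(const TypeInfo &type) {
    static StructImpl structImpl;
    static TupleImpl tupleImpl;
    static EnumImpl enumImpl;
    static ClassImpl classImpl;
    switch (type.kind) {
    case MetadataKind::Struct: return structImpl;
    case MetadataKind::Tuple:  return tupleImpl;
    case MetadataKind::Enum:   return enumImpl;
    case MetadataKind::Class:  return classImpl;
    }
    return structImpl; // unreachable
}
```

Callers never branch on the type kind themselves; `implFor` centralizes that choice, and every subsequent call goes through ordinary C++ virtual dispatch.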


Looking up the elements in structs, classes, and enums is currently quite complex. Much of this complexity is due to the lack of a direct reference between these types and the field descriptors which contain the information about a type’s fields. A helper function called swift_getFieldAt searches for the appropriate field descriptor for a given type. This whole function should go away once we add that direct reference, but in the meantime it provides an interesting look at how the runtime code is able to use the language’s metadata to look up type information.
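A hedged sketch of the lookup problem that search works around (the structures and names below are invented for illustration, not the runtime's actual layout): because type metadata carries no direct pointer to its field descriptor, the runtime must search a table of registered descriptors, matching on the type's name, before it can read field names out.

```cpp
#include <cassert>
#include <cstddef>
#include <string>
#include <vector>

struct FieldRecord {
    std::string name;
};

// A field descriptor: which type it belongs to, plus its fields.
struct FieldDescriptor {
    std::string typeName;
    std::vector<FieldRecord> fields;
};

// All field descriptors registered by loaded images, in one flat list.
static std::vector<FieldDescriptor> allDescriptors;

// Linear search for the descriptor matching a type name. A direct
// type -> descriptor reference would make this search unnecessary.
const FieldDescriptor *findFieldDescriptor(const std::string &typeName) {
    for (const auto &desc : allDescriptors)
        if (desc.typeName == typeName)
            return &desc;
    return nullptr;
}

// The kind of query reflection needs to answer: the name of field i.
std::string getFieldName(const std::string &typeName, size_t index) {
    const FieldDescriptor *desc = findFieldDescriptor(typeName);
    if (desc && index < desc->fields.size())
        return desc->fields[index].name;
    return "";
}
```

The cost of the search is the point: adding the direct reference turns an O(n) scan into a single pointer load, which is why the helper is slated to go away.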

Why Did Apple Spend $400M to Acquire Shazam?

Daniel Eran Dilger:

Virtually every one of Apple’s recent acquisitions can be directly linked to the launch of serious, significant new features or to embellishing core initiatives designed to help sell its hardware, including Face ID (Faceshift, Emotient, and Perceptio); Siri (VocalIQ); Photos and CoreML (Turi, Tuplejump, Lattice Data, Regaind); Maps (Coherent Navigation and Mapsense); wireless charging (PowerbyProxi); and so on.

Further, the reported $400 million price tag on the Shazam acquisition puts it in a rare category of large purchases that Apple has made which involved revolutionary changes to its platforms. Only Anobit (affordable flash storage), AuthenTec (Touch ID), PrimeSense (TrueDepth imaging) and NeXT itself are in the same ballpark apart from Beats—Apple’s solitary, incomparably larger $3 billion purchase that delivered both the core of Apple Music and an already profitable audio products subsidiary paired with a popular, global brand.


Given Apple’s interest in building traction for ARKit, which launched last fall as the world’s largest AR platform, it seems pretty clear that Apple bought Shazam, not really for any particular technology as Apple has already developed its own core visual recognition engine for iOS, but because Shazam has developed significant relationships with global brands to make use of AR as a way to engage with audiences.


With the development of ARKit, Apple has now created a new “mixed reality” world of app experiences that mesh right into the real world. At its last two WWDC events, Apple has introduced various games as primary examples of using ARKit. However, Shazam has already developed marketing campaigns that take advantage of ARKit to build engaging experiences—very similar to the core concept of iAd many years ago.

Update (2018-10-02): Scott Perry:

You can tell Apple has finished its acquisition of Shazam because none of the Spotify integration seems to work anymore. Even basic stuff like deep linking is busted.