Friday, January 18, 2019 [Tweets] [Favorites]

Even More About Swift’s Codable

Ben Scheirman (via Kuba Suder):

Instead, we can use a special method to get a super-class ready encoder that already has a container attached to it[…]
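A minimal sketch of the `superEncoder()` pattern the quote describes (the `Animal`/`Dog` types here are illustrative, not taken from the article):

```swift
import Foundation

class Animal: Encodable {
    var name: String
    init(name: String) { self.name = name }

    private enum CodingKeys: String, CodingKey { case name }

    func encode(to encoder: Encoder) throws {
        var container = encoder.container(keyedBy: CodingKeys.self)
        try container.encode(name, forKey: .name)
    }
}

class Dog: Animal {
    var breed: String
    init(name: String, breed: String) {
        self.breed = breed
        super.init(name: name)
    }

    private enum CodingKeys: String, CodingKey { case breed }

    override func encode(to encoder: Encoder) throws {
        var container = encoder.container(keyedBy: CodingKeys.self)
        try container.encode(breed, forKey: .breed)
        // superEncoder() hands the superclass an encoder whose output
        // nests under a "super" key in this same container.
        try super.encode(to: container.superEncoder())
    }
}

let encoder = JSONEncoder()
encoder.outputFormatting = .sortedKeys
let data = try! encoder.encode(Dog(name: "Rex", breed: "Lab"))
print(String(data: data, encoding: .utf8)!)
// → {"breed":"Lab","super":{"name":"Rex"}}
```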


Here we have a migration_date field that has a different date format than the created_at field. Let’s also assume that the name property has since been changed to just name.

This is obviously not an ideal situation, but real-life happens and sometimes you inherit a messy API.
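One way to handle the mixed-format situation above is a custom `init(from:)` with one `DateFormatter` per field. This is only a sketch; the `Record` type and the format strings are my assumptions, not the article's:

```swift
import Foundation

struct Record: Decodable {
    let name: String
    let createdAt: Date
    let migrationDate: Date

    private enum CodingKeys: String, CodingKey {
        case name
        case createdAt = "created_at"
        case migrationDate = "migration_date"
    }

    private static func formatter(_ format: String) -> DateFormatter {
        let f = DateFormatter()
        f.locale = Locale(identifier: "en_US_POSIX")
        f.dateFormat = format
        return f
    }
    // Hypothetical formats for illustration.
    private static let createdFormatter = formatter("yyyy-MM-dd'T'HH:mm:ssZ")
    private static let migrationFormatter = formatter("yyyy/MM/dd")

    init(from decoder: Decoder) throws {
        let c = try decoder.container(keyedBy: CodingKeys.self)
        name = try c.decode(String.self, forKey: .name)

        func date(_ key: CodingKeys, _ formatter: DateFormatter) throws -> Date {
            let raw = try c.decode(String.self, forKey: key)
            guard let parsed = formatter.date(from: raw) else {
                throw DecodingError.dataCorruptedError(forKey: key, in: c,
                    debugDescription: "Unrecognized date: \(raw)")
            }
            return parsed
        }
        createdAt = try date(.createdAt, Record.createdFormatter)
        migrationDate = try date(.migrationDate, Record.migrationFormatter)
    }
}

let json = #"{"name": "Widget", "created_at": "2019-01-18T10:00:00+0000", "migration_date": "2019/01/18"}"#
let record = try! JSONDecoder().decode(Record.self, from: Data(json.utf8))
```

Because `JSONDecoder`'s `dateDecodingStrategy` applies to every `Date` in the payload, a per-field strategy like this is the usual escape hatch when two fields disagree.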


This is a listing of beer styles, but the keys are actually the name of the style. We could not represent every possible case with an enum as it could change or grow over time.

Instead, we can create a more dynamic implementation of CodingKey for this.
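The dynamic-key approach looks roughly like this: a `CodingKey` that accepts any string, so the JSON object's keys (the style names) become data. The value type here is a plain `String` for brevity; the article's may be richer:

```swift
import Foundation

// A CodingKey that accepts arbitrary string keys.
struct DynamicKey: CodingKey {
    let stringValue: String
    let intValue: Int? = nil
    init?(stringValue: String) { self.stringValue = stringValue }
    init?(intValue: Int) { return nil }
}

struct BeerStyles: Decodable {
    let styles: [String: String]

    init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: DynamicKey.self)
        var styles: [String: String] = [:]
        // Iterate whatever keys the payload actually contains.
        for key in container.allKeys {
            styles[key.stringValue] = try container.decode(String.self, forKey: key)
        }
        self.styles = styles
    }
}

let json = #"{"IPA": "hoppy", "Stout": "roasty"}"#
let parsed = try! JSONDecoder().decode(BeerStyles.self, from: Data(json.utf8))
```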

This is the most comprehensive guide to Codable and JSON that I’ve seen.

Russ Bishop:

The new Codable protocol is flexible enough to allow a different encoded representation from the in-memory representation which is a nice property to have in a serialization mechanism. Today I’m going to build SingleValueCodable to automate that work when dealing with RawRepresentable types.
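A sketch of the idea (Russ Bishop's actual implementation may differ): route coding through `rawValue`, so the encoded representation is the raw value alone while the in-memory type stays distinct.

```swift
import Foundation

protocol SingleValueCodable: Codable, RawRepresentable where RawValue: Codable {}

extension SingleValueCodable {
    init(from decoder: Decoder) throws {
        let raw = try decoder.singleValueContainer().decode(RawValue.self)
        guard let value = Self(rawValue: raw) else {
            throw DecodingError.dataCorrupted(.init(
                codingPath: decoder.codingPath,
                debugDescription: "Invalid raw value: \(raw)"))
        }
        self = value
    }

    func encode(to encoder: Encoder) throws {
        var container = encoder.singleValueContainer()
        try container.encode(rawValue)
    }
}

// In memory it's a distinct type; on the wire it's just an Int.
struct UserID: SingleValueCodable {
    let rawValue: Int
    init(rawValue: Int) { self.rawValue = rawValue }
}

let data = try! JSONEncoder().encode([UserID(rawValue: 7)])
// → [7]
```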

Ole Begemann:

So Dictionary seems to behave differently depending on its Key type, even though the enum values are ultimately encoded as strings. What’s going on here? We can find the answer in Dictionary’s implementation for the Encodable protocol.


There are three branches: only if the dictionary’s key type is String or Int does it use a keyed container. Any other key type results in an unkeyed container of alternating keys and values.
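The branch Ole describes is easy to see for yourself. With a `String` key you get a JSON object; with an enum key you get an array of alternating keys and values (unless the key type conforms to `CodingKeyRepresentable`, which Swift 5.6 later added as an opt-in):

```swift
import Foundation

enum BeerStyle: String, Codable { case ipa }

let encoder = JSONEncoder()

let stringKeyed: [String: Int] = ["ipa": 1]
print(String(data: try! encoder.encode(stringKeyed), encoding: .utf8)!)
// → {"ipa":1}

let enumKeyed: [BeerStyle: Int] = [.ipa: 1]
print(String(data: try! encoder.encode(enumKeyed), encoding: .utf8)!)
// → ["ipa",1]  (alternating key, value)
```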


SE-0167 introduced Codable conformance for some types in the standard library, but not the Range family of types. This proposal adds that conformance.

There’s quite an interesting discussion about this, because the details of how it works will end up affecting databases and APIs outside of Swift itself.
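As shipped in Swift 5, the conformance encodes a range through an unkeyed container as a lowerBound/upperBound pair, which is exactly the detail that matters to external databases and APIs:

```swift
import Foundation

let data = try! JSONEncoder().encode(1..<5)
print(String(data: data, encoding: .utf8)!)  // → [1,5]

let range = try! JSONDecoder().decode(Range<Int>.self, from: data)
// range == 1..<5
```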

Paul Samuels:

The two key takeaways here are

  • If you need to represent a collection that can have multiple types then you’ll need some form of wrapper and enums can perform that duty well when it makes sense.

  • Swift’s Codable is really powerful and helped remove a heap of issues that arise from manually parsing/creating objects.

Removing optionality, reifying types and using compiler generated code are great ways of simplifying our code. In some cases this also helps move runtime crashes into compile time issues, which is generally making our code safer. The benefits here are great and it shows that it’s really worth taking time to model your data correctly and then use tools like Codable to munge between representations.
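The first takeaway can be sketched as an enum wrapper with associated values. (This is the manual, 2019-era implementation; Swift 5.5 later began synthesizing Codable for such enums. The `Item` type and the JSON shape are illustrative.)

```swift
import Foundation

enum Item: Codable {
    case text(String)
    case number(Int)

    private enum CodingKeys: String, CodingKey { case type, value }

    init(from decoder: Decoder) throws {
        let c = try decoder.container(keyedBy: CodingKeys.self)
        switch try c.decode(String.self, forKey: .type) {
        case "text":   self = .text(try c.decode(String.self, forKey: .value))
        case "number": self = .number(try c.decode(Int.self, forKey: .value))
        case let other:
            throw DecodingError.dataCorruptedError(forKey: .type, in: c,
                debugDescription: "Unknown type: \(other)")
        }
    }

    func encode(to encoder: Encoder) throws {
        var c = encoder.container(keyedBy: CodingKeys.self)
        switch self {
        case .text(let s):
            try c.encode("text", forKey: .type)
            try c.encode(s, forKey: .value)
        case .number(let n):
            try c.encode("number", forKey: .type)
            try c.encode(n, forKey: .value)
        }
    }
}

// A mixed-type JSON array round-trips as [Item].
let json = #"[{"type": "text", "value": "hi"}, {"type": "number", "value": 3}]"#
let items = try! JSONDecoder().decode([Item].self, from: Data(json.utf8))
```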

Paul Samuels:

Testing Codable implementations isn’t particularly hard, but the boilerplate code required can get out of hand pretty quickly. I thought I’d run through a TDD process to get to the final solution as I find this stuff personally interesting and hopefully someone else might too. Hopefully I’ve highlighted some basic stuff to test when looking at custom Decodable implementations and shown that it’s useful to refactor not only the production code but the test code as well.

The challenge I see is how to make sure that you don’t break compatibility as you evolve your data model.
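One low-boilerplate guard against that: keep old payloads around as fixtures and assert they still decode. A minimal sketch (the helper and `User` type are my own, not from the article):

```swift
import Foundation

// Decode a JSON fixture string, surfacing any decoding error to the caller.
func decodeFixture<T: Decodable>(_ type: T.Type, from json: String) throws -> T {
    try JSONDecoder().decode(T.self, from: Data(json.utf8))
}

struct User: Decodable, Equatable {
    let id: Int
    let name: String
}

// A payload captured from an earlier release; if the model evolves
// incompatibly, this stops decoding and the test fails.
let user = try! decodeFixture(User.self, from: #"{"id": 1, "name": "Ada"}"#)
assert(user == User(id: 1, name: "Ada"))
```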

Stop Google Search Results Tracking

Jeff Johnson:

When you click on the link, the onmousedown action runs some JavaScript that swaps the original URL with a new tracking URL. Google does this as you click, right under your nose. Or finger.

By default, StopTheMadness has ⌘-Click and Drag and Drop protections enabled. A side effect of these protections is that you’re also protected from link hijacking. Why? Clicking on a link in a browser is preceded and triggered by mousedown and mouseup events. Thus, if a web site could hijack these events, it could prevent the link click from working as expected. This is why StopTheMadness prevents mousedown and mouseup events from getting hijacked when you ⌘-click on a link. But what about clicks without the ⌘ key? StopTheMadness prevents mousedown from getting hijacked whenever you click on a link, even without the ⌘ key, because dragging a link in a browser is preceded and triggered by a mousedown event. For full protection against link hijacking, then, you need both ⌘-Click and Drag and Drop protections enabled (as they both are by default).

I’ve stopped using Ghostery and other content-blocking/anti-tracking Safari plug-ins because I’m tired of them breaking sites. StopTheMadness provides less privacy protection, but it fixes the really annoying things that sites do while causing far fewer problems.

Airbnb and Security Camera Disclosure

Jeffrey P. Bigham (via Hacker News):

When my family and I stayed in an AirBnB this past winter break, we discovered this camera and another about a day into our stay. I was shocked, and immediately unplugged them. I don’t think we did anything particularly weird in front of that camera, but it’s very likely that my 2-year-old ran in front of this camera naked (the field of view of the camera was close to the exit of the bathroom).


A lot of other weird stuff happened during this trip stemming from this -- AirBnB told my host we asked about the cameras, he sent someone to snoop on us, he left us a bad review, etc.


Airbnb has re-re-re-reviewed my case, and now they agree that the cameras were not properly disclosed. Their position seems to be that the customer service representative(s) did not understand my concern, and/or they gave inaccurate information. While the reps I talked to before today repeatedly said that photo constituted disclosure, the senior person who reviewed the case says that it does not. 🤷

You can review Airbnb’s trust standards here.


Haptic Touch Bar


Haptic Touch Bar provides actual feedback when pressing buttons on your Touch Bar

  • Brings back the full Escape key experience!

  • Get back to touch typing—no more glancing at the Touch Bar as you type

  • Stop the self-doubt (did I hit the key?) with tactile & audible feedback

  • Configurable for intensity of feedback & sound

It vibrates the trackpad when you press a key on the Touch Bar. Of course, you still can’t actually feel where the keys are.

Thursday, January 17, 2019

Stack Allocation for Non-Escaping Swift Closures

aschwaighofer has a pull request for stack-allocating Swift closures.

Slava Pestov:

Short history of non-escaping functions:

- Swift 4.1 and earlier: type checker enforcement; same ABI as escaping
- Swift 4.2: new ABI - the context is a trivial pointer and not ref-counted like with escaping
- now: non-escaping contexts allocated on stack

The ABI change was key here - Arnold frontloaded the changes before we started locking down, now stack-allocation is “just” an optimization

And ancient pre-history for those who weren’t around at the time:

- Swift 2.2 and earlier: all function values escaping by default, opt-in @noescape attribute for parameter types
- Swift 3: @noescape becomes default for function parameters, @escaping added to opt-in

More trivia: In ancient Swift the accepted idiom to turn a non-escaping function into an escaping one was unfortunately an unsafeBitCast(). The compiler added a special withoutActuallyEscaping form and started screaming about casts in 4.0 so that we could stage in the ABI change
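The modern replacement for that `unsafeBitCast` idiom looks like this: `lazy.filter` stores its predicate, so it demands an escaping closure, and `withoutActuallyEscaping` bridges a non-escaping parameter across safely (the example function is mine, for illustration):

```swift
func containsMatch(in numbers: [Int], where predicate: (Int) -> Bool) -> Bool {
    return withoutActuallyEscaping(predicate) { escapable in
        // escapable has an @escaping type, but the compiler verifies
        // it cannot actually outlive this call.
        numbers.lazy.filter(escapable).first != nil
    }
}

// containsMatch(in: [1, 2, 3], where: { $0 > 2 }) == true
```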

Previously: Optional Non-Escaping Swift Closures.

Acorn 6.3 Postmortem

Gus Mueller:

Apple added a new feature to its latest iPhones in the iOS 12 update called “Portrait Matte”. It’s a special image embedded in HEIC images which is based off the depth data and some machine learning in your photo. You can then use this image as a mask to blur parts of your image (which is what the iOS “Portrait” camera setting does), or you can use this data to remove backgrounds.

But how should Acorn expose this matte? My first stab was to have Acorn add the matte as an additional layer. After playing with it a bit, it just felt off. So I ended up adding the matte as a mask to the main layer when opening the image. But folks are obviously going to want to do more than just mask out the background so I added new features to Acorn where you could easily drag and drop the layer mask into its own layer. I also made it easy to move an existing layer to another layer’s mask via drag and drop. I can’t predict what people are going to want to do with the mask, but I might as well make it easy to move around.

It was also during this development that I found some bugs in Apple’s My Photo Stream. The matte was showing up rotated incorrectly when opening images out of Photos. At first I figured I was just reading the data wrong, but nope: under certain conditions when images with the portrait mask were uploaded to MPS, the rotation data from the camera went missing. After some communication and a Radar filed at Apple, this bug was fixed in an OS update. Bug fixes like this don’t happen very often, but when they do it makes filing all the other Radars worth it. Mostly.

Big Win for Web Accessibility in Domino’s Pizza Case

Lainey Feingold (via Jared Spool):

Circuit Court of Appeals gave a big win to digital accessibility in a case against Domino’s Pizza. The lower court had ruled for Domino’s and tossed the case out of court. The appeals court reversed, ruling that the ADA covers websites and mobile applications and the case can stay in court.


The case will now go back to the lower federal court in California. As the appellate judges concluded, “We leave it to the district court, after discovery, to decide in the first instance whether Domino’s website and app provide the blind with effective communication and full and equal enjoyment of its products and services as the ADA mandates.”

How Facebook Keeps Messenger from Crashing on New Year’s Eve

Amy Nordrum (via Hacker News):

In addition to shifting loads, the Messenger team has developed other levers that it can pull “if things get really bad,” says Ahdout. Every new message sent to a server goes into a queue as part of a service called Iris. There, messages are assigned a timeout—a period of time after which, that message will drop out of the queue to make room for new messages. During a high-volume event, this allows the team to quickly discard certain types of messages, such as read receipts, to focus its resources on delivering ones that users have composed.


Georgiou says the group can also sacrifice the accuracy of the green dot displayed in the Messenger app that indicates a friend is currently online. Slowing the frequency at which the dot is updated can relieve network congestion. Or, the team could instruct the system to temporarily delay certain functions—such as deleting information about old messages—for a few hours to free up CPUs that would ordinarily perform that task, in order to process more messages in the moment.


“You can bundle some of those together into a single large request before you send it downstream. Doing that, you reduce the computational load on downstream systems.”

Batches are formed based on a principle called affinity, which can be derived from a variety of characteristics. For example, two messages may have higher affinity if they are traveling to the same recipient, or require similar resources from the back end. As traffic increases, the Messenger team can have the system batch more aggressively. Doing so will increase latency (a message’s roundtrip delay) by a few milliseconds, but makes it more likely that all messages will get through.
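Purely as an illustration of the affinity idea (not Facebook's actual system), grouping a queue by recipient is one such batching key:

```swift
struct Message {
    let recipient: String
    let body: String
}

let queue = [
    Message(recipient: "alice", body: "hi"),
    Message(recipient: "bob", body: "hey"),
    Message(recipient: "alice", body: "you there?"),
]

// Messages sharing an affinity key (here, the recipient) form one batch
// that can travel downstream as a single bundled request.
let batches = Dictionary(grouping: queue, by: { $0.recipient })
// batches["alice"] holds two messages; batches["bob"] holds one.
```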

Wednesday, January 16, 2019

Google Pixel’s Night Sight

Jeremy Burge:

Whatever Apple does with the iPhone camera this year, they need to be able to compete with Pixel night mode. All taken on 18 month old Pixel 2 in challenging / dark conditions, and no iPhone photo at night comes close[…]


Seriously Google camera team: great job. You took a tiny sensor, put magic in the software and the results are unbelievable for a phone camera.


For those who haven’t used night sight on Pixel, it’s not ~just~ the magic in software. Pics also look better because the mode takes 2-5 seconds to take a photo. More light is captured, and movement stabilised. That’s what gives it the edge. That’s why iOS should offer this mode


No before/after pics online will do justice to photographing a near pitch-black room and getting a usable photo. Genuinely mind blowing.

Vlad Savov:

Google’s Pixel phones have already changed and improved smartphone photography dramatically, but the latest addition to them might be the biggest leap forward yet. Night Sight is the next evolution of Google’s computational photography, combining machine learning, clever algorithms, and up to four seconds of exposure to generate shockingly good low-light images. I’ve tried it ahead of its upcoming release, courtesy of a camera app tweak released by XDA Developers user cstark27, and the results are nothing short of amazing. Even in its pre-official state before Google is officially happy enough to ship it, this new night mode makes any Pixel phone that uses it the best low-light camera.

Previously: Google Pixel 3 and 3 XL, iPhone XS Users Complain About Skin-Smoothing Selfie Camera, The iPhone XS and Its Camera, iPhone and Android Cameras.

Swift Community Podcast

Episode 1 (tweet):

Welcome to the Swift Community Podcast — a podcast for the Swift community, by the Swift community. On this initial episode, John Sundell, Garric Nahapetian and Chris Lattner introduce the concept of the show and why it was created — and recount their first impressions of Swift and the evolution of the community, starting with Chris’ initial prototype back in 2010.

Turning Type Sideways

Jonathan Hoefler (via John Gruber):

This month, researchers made official something that typeface designers have long known: that horizontal lines appear thicker than vertical ones. At left, a square made from equally thick strokes; at right, the one that feels equally weighted, its vertical strokes nearly 7% thicker than the horizontals. This phenomenon, central to typeface design, has implications for the design of logos, interfaces, diagrams, and wayfinding systems, indeed anywhere a reader is likely to encounter a box, an arrow, or a line.


Is it possible that all of typography’s many optical illusions can be correlated with misapplied learning from our experience of the real world? So much of perception involves reflexively adjusting for the effects of context, light, or perspective, in order to make quick judgments about size, distance, color, or mass. Do we perceive round letters as shorter than flat ones because we intuitively understand something about the weight of cubes and spheres? Is it a lifetime of looking at foreshortened things above us that leads us to expect a well-balanced letterform to be smaller on top than on the bottom?

On Public Bug Trackers

Brent Simmons:

Decisions about what to work on — and when, and by whom — are complicated. From the outside it might look like it’s as simple as picking the next feature request with the most votes, but it’s not that simple.


But if you have a public bug tracker, you’d likely find that you’re having to explain your decisions all the time. You’d be constantly defending your plans to people who remind you that Feature X has all these votes, so why hasn’t it shipped yet?

Smokey Ardisson:

Sadly, this is just as true for open-source software projects as for commercial ones, but there’s no way around it in an open-source case (there are, of course, some ways to ameliorate the effects). But at least now you can point everyone to Simmons’s list of reasons why you might not be working on that thing they’re interested in, instead of having to type out the reason(s) yourself ☺︎

Customer Support for Failing App Downloads

Bruno Virlet (tweet):

Now, if the customer selects the option “App fails to install or won’t download”, the customer correctly receives the suggestion to contact the app developer:

If you’re having issues with this app, please contact the app’s developer directly, they may have more specific troubleshooting steps for their app. Click on the App Site button to open the developer’s support page.

Of course, this isn’t due to a bug in the app because the App Store hasn’t even downloaded it yet. Despite my selling far more software through direct downloads than through the Mac App Store, customers report more download/installation problems when using the store. Another case where the App Store reality is not what one would have predicted. It’s galling that this remains unreliable when Apple controls every aspect of the process—and then blames the developer. These are some of the worst e-mails to get. The customer is rightly upset that I’ve just taken their money, yet they can’t download the app, and there’s not a lot I can do to help them.

Previously: Apple Support Tells Customers to Ask Developer for Refund.

Tuesday, January 15, 2019

DuckDuckGo Switches to Apple Maps for Location Searches


We’re excited to announce that map and address-related searches on DuckDuckGo for mobile and desktop are now powered by Apple’s MapKit JS framework, giving you a valuable combination of mapping and privacy. As one of the first global companies using Apple MapKit JS, we can now offer users improved address searches, additional visual features, enhanced satellite imagery, and continually updated maps already in use on billions of Apple devices worldwide.

Dieter Bohn:

Before today, DuckDuckGo used a mix of different services to power its results: sidebars and boxes used OpenStreetMap, while asking for directions meant getting a drop-down menu with options from Bing, Here maps, and Google. DuckDuckGo says that “Apple is providing all of the maps for our new maps experience,” though it will “continue to use a variety of providers to add additional data to these results, such as a direct integration with Yelp.”

John Voorhees:

DuckDuckGo explains elsewhere on its site that it uses GEO::IP lookup to determine users’ location by default. For better results, users can grant DuckDuckGo permission to use their browser location data, in which case DuckDuckGo says searches are still anonymous because the company does not store location data on its servers.

Previously: WWDC 2018 Links.

Signal v Noise Exits Medium

David Heinemeier Hansson:

Three years ago we embraced an exciting new publishing platform called Medium. It felt like a new start for a writing community, and we benefitted immensely from the boost in reach and readership those early days brought. But alas it was not to last.


These days Medium is focused on their membership offering, though. Trying to aggregate writing from many sources and sell a broad subscription on top of that. And it’s a neat model, and it’s wonderful to see Medium try something different. But it’s not for us, and it’s not for Signal v Noise.


Traditional blogs might have swung out of favor, as we all discovered the benefits of social media and aggregating platforms, but we think they’re about to swing back in style, as we all discover the real costs and problems brought by such centralization.

David Heinemeier Hansson:

Nice bonus from leaving @medium is to finally be able to kick those fucking Facebook like buttons off our posts

Previously: Moving to Medium, Preserving Permalinks.

Save Changes Before Quitting?

Niko Kitsakis:

This “Mac-like” feeling was at the core of the classic Mac OS era. It’s what gave the Mac its legendary status and its place in history. And while the first versions of OS X broke with some conventions, things became better as OS X progressed. That is to say, until 10.7 came out and started a trend of questionable design decisions that has been continuing ever since.

But it’s not only Apple that seems to have forgotten its own roots in making good Human Interfaces, the rest of the software industry too seems strangely preoccupied with reinventing the wheel while making it worse with every iteration.


But unfortunately, these things do not only evolve, sometimes they devolve. Fast forward around 25 years to 2018 and you’ll find this in Adobe Premiere[…] No icon, no verbs and the unsafe option is right in between the safe ones.

Previously: The Lost Art of Legendary Apple UX.

Update (2019-01-17): Dave Mark:

The Mac design language was so powerful, and so widely adopted, that any app that did not follow the rules stood out like a sore thumb. Mac applications were instantly recognizable, and apps from outsiders tended to look ugly, in comparison, as those outsiders did not know the rules to follow.

Does the modern macOS and iOS app universe still hew to a common standard? Are Apple’s Human Interface Guidelines lost in the incredible complexity of application creation?

Update (2019-01-18): Niko Kitsakis:

My English-language version of Photoshop tells me in German that it can’t open .psd files because they are in “Adobe Photoshop preference file format”

Announcing the FoundationDB Record Layer

FoundationDB (via Will Wilson, Hacker News):

The Record Layer stores structured data, just like a relational database. Databases managed by the Record Layer support records with fields and types, an evolving schema, complex primary and secondary indexes, and declarative query execution. The Record Layer also includes features not typically found in a traditional relational database, such as support for complex nested data types, indexes on the commit-time of records, and indexes and queries that span different types of records.

Built on top of FoundationDB, the Record Layer inherits FoundationDB’s strong ACID semantics, reliability, and performance in a distributed setting. The Record Layer also uses FoundationDB’s transactional semantics to provide features similar to a traditional relational database, but in a distributed setting. For example, the Record Layer’s secondary indexes are maintained transactionally, so they’re always up-to-date with the latest changes to the data. Transactions reduce the number of bugs in application code and greatly simplify application development.


Together, the Record Layer and FoundationDB form the backbone of Apple’s CloudKit. We wrote a paper describing how we built the Record Layer to run at massive scale and how CloudKit uses it. Today, you can read the preprint to learn more.

Previously: Apple Open Sources FoundationDB, Exploring the New iWork File Formats, Swift Protobuf.

Monday, January 14, 2019

AWS, MongoDB, and the Economic Realities of Open Source

Ben Thompson:

Basically, MongoDB sells three things on top of its open source database server:

  • Additional tools for enterprise companies to implement MongoDB
  • A hosted service for smaller companies to use MongoDB
  • Legal certainty


This leaves MongoDB Inc. not unlike the record companies after the advent of downloads: what they sold was not software but rather the tools that made that software usable, but those tools are increasingly obsolete as computing moves to the cloud. And now AWS is selling what enterprises really want.

Worse, because AWS doesn’t have access to MongoDB (it is only matching the API) it only supports MongoDB 3.6; the current version is 4.0.5.


This tradeoff is inescapable, and it is fair to wonder if the golden age of VC-funded open source companies will start to fade (although not open source generally). The monetization model depends on the friction of on-premise software; once cloud computing is dominant, the economic model is much more challenging.

GoDaddy JavaScript Injection

Igor Kromin (via Hacker News):

The technology that’s in use here is called Real User Metrics and GoDaddy has a page about it here - Why am I signed up for Real User Metrics?. If you happen to be a customer in US (which I am not but the website is hosted in a US data centre) then you are automatically opted into this service and all your website’s pages will have this JavaScript injected into them.


The worst part of it is GoDaddy, in their help article, admits that this could slow down or break your site! So much for a tool that is designed to improve performance and reliability!

It sounds like this only happens if you use GoDaddy as a Web host, rather than just for DNS.

The Lost Art of Legendary Apple UX

Marcin Krzyżanowski:

iPhone X in my case is not compatible with the website of the company that iPhone business is like what… 80% of revenue?

I recorded my annoyance: the bottom part of the website covers the part where “Accept” button is located. Also, scrolling is very hard (unlike iPhone scrolling at all).

Fun part: the website suggests opening App Store Connect, yea right!


The form has some fields that I don’t understand, and error messages mention fields that are missing (find SWIFT code field mentioned in the error message).

Update (2019-01-15): Brian:

I just had that same iTunes connect issue with an internal TestFlight link at my work. The kicker was after I couldn’t accept the new T&C on my iPhone, I tried on my PC and the link had expired because I’d already redeemed it.

Closing Down Coriolis Systems

Alastair Houghton:

The shift to APFS and the continuing lock down of the platform have meant that our existing products have become obsolete and their sales have declined to a trickle. Perhaps, if the full APFS documentation had been released somewhat before users’ machines were converted over to it, things might be different — though even then I’m not sure Apple’s current focus on security is conducive to a viable market for third-party utility software; solid state storage really doesn’t require defragmenting in 99% of cases; and in all likelihood the Mac line will at some point shift to ARM, at which point you won’t be able to run Windows on it except through emulation, which will substantially reduce the market for partitioning tools also.

Aura, our AC-3 compatible real-time encoder, doesn’t sell in any volume, and right now Coriolis isn’t covering its costs.

Previously: iDefrag and iPartition Discontinued.

Aliases, Hard Links, Symlinks, and Copies in Mojave’s APFS

Howard Oakley:

There are now five different types of copy/clone/alias/link: the regular copy, APFS clone (copy on write clone), symbolic link (symlink), hard link, and Finder alias. I’ll tackle them in that order.

Howard Oakley:

Bookmarks are a generalisation of Aliases which allow variants, including those saved as files, both the Finder Alias and alisma’s Bookmarks, which are similar but not identical. Bookmarks have been used extensively internally in macOS and applications since at least Mavericks 10.9 in 2013. They’re now used in a lot of preference files and other places, particularly by Launch Services in its SharedFileList files stored in ~/Library/Application Support/


The remaining issue with Bookmarks and Aliases is that they cannot ordinarily be resolved at the command line or in scripts. My free tool alisma should be a help, as it can return the absolute path from a Bookmark file or Alias.
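Programmatically, resolving a Bookmark with Foundation looks roughly like this, which is presumably the kind of call a tool like alisma wraps for scripts (macOS; the path here is illustrative):

```swift
import Foundation

let original = URL(fileURLWithPath: NSTemporaryDirectory() + "example.txt")
try! Data("hello".utf8).write(to: original)

// Save bookmark data now…
let bookmark = try! original.bookmarkData()

// …and resolve it back to an absolute path later, even if the file
// has since moved or been renamed.
var isStale = false
let resolved = try! URL(resolvingBookmarkData: bookmark,
                        bookmarkDataIsStale: &isStale)
print(resolved.path)
```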

Howard Oakley:

Here are a couple of tables which summarise the most important features of different types of copies, clones, links and aliases used in Mojave running on APFS (with a little reference to HFS+ too).

Howard Oakley:

The bug occurs if you select a Finder Alias to a missing folder in a window set in Column view. After an initial pause of a few seconds, the spinning beachball appears, and the only way to regain access to the Finder is to press Command-Option-Escape, then select Finder and restart it.

Howard Oakley:

One of the claimed advantages of Finder Aliases, and their parent Bookmarks, is their robustness in the face of change. Because they can use both an absolute path and the unique inode number to resolve the location of the item to which they point, they should be much more reliable than symbolic links, and easier to use than hard links. Indeed, since System 7 in 1991, they have been the only form of link which can be created in the GUI of Mac OS and macOS, the others requiring command line access.


When studying the contents of orphaned Finder Aliases and Bookmarks, whose original items had been removed, I noticed that some contained paths not to the original location of the item, but to it after it had been placed in the Trash, but others had retained the original location instead. A little further experimentation confirmed an interesting aspect of their behaviour: resolving an Alias or Bookmark will cause its saved paths to be updated if they have changed.


Given the information stored in an Alias about the file or folder to which it points, this is perhaps not surprising: the resolver has a lot more to work with than just the path, volume and inode number, and does appear to use that additional information to ensure that, most of the time, the link between the alias and original will work, and it won’t be fobbed off by another item posing as the original.

Howard Oakley:

QuickLook previews in other places, such as the Open File dialog, don’t appear to use the cache, though. At the same time that QuickLook was offering that thumbnail in the Finder, the preview offered in an Open File dialog was that for the imposter file which had replaced the original.

This may be related to the fact that when the resolver updates the contents of an Alias, that change isn’t readily detected by anything calling the resolver. There thus doesn’t appear to be a simple way for QuickLook to tell whether its cached data need to be refreshed because the item to which the Alias points has changed. It’s a subtlety which produces amusing demonstrations, but is of little importance to Mac users.

Howard Oakley:

The new features in this version of Precize which make this possible include separating the analysis of Alias contents from resolving them. There’s also a checkbox to avoid changing the Alias data when resolving it: although this shouldn’t make any difference, as it is working on extracted data from the Alias and not the Alias itself, I offer it as an option.

Previously: BookmarkData Exposed.

Friday, January 11, 2019

Strangers Watching Ring Security Cameras

Sam Biddle:

But for some who’ve welcomed in Amazon’s Ring security cameras, there have been more than just algorithms watching through the lens, according to sources alarmed by Ring’s dismal privacy practices.


Despite its mission to keep people and their property secure, the company’s treatment of customer video feeds has been anything but, people familiar with the company’s practices told The Intercept. Beginning in 2016, according to one source, Ring provided its Ukraine-based research and development team virtually unfettered access to a folder on Amazon’s S3 cloud storage service that contained every video created by every Ring camera around the world. This would amount to an enormous list of highly sensitive files that could be easily browsed and viewed. Downloading and sharing these customer video files would have required little more than a click.


At the same time, the source said, Ring unnecessarily provided executives and engineers in the U.S. with highly privileged access to the company’s technical support video portal, allowing unfiltered, round-the-clock live feeds from some customer cameras, regardless of whether they needed access to this extremely sensitive data to do their jobs.

See also: Nick Heer, MacRumors.

Previously: Nest Cam Waking in the Night.

Hacking With Private APIs on iPad

Guilherme Rambo:

The best development environment to work with private APIs is still Xcode on the Mac, but there’s a lot that can be done on iOS, especially the iPad. Of the three options shown in this article, it is hard to name a favorite because each one has advantages and disadvantages, but the one I’ve been using the most, especially because of its flexibility and integration with Shortcuts, is JSBox.

iOS Games Found Talking to Golduck Malware C&C Servers

Sergiu Gatlan:

Even though Apple has always been especially proud of its App Store app review process, it seems that some apps which are not exactly malicious but do exhibit risky behavior escape its review team’s scrutiny occasionally.

This is the case of over a dozen iOS applications found in Apple’s App Store which were observed while transferring data to command-and-control servers known to have been used by the Android Golduck Loader.

Jennifer Valentino-DeVries and Natasha Singer:

The Weather Channel app deceptively collected, shared and profited from the location information of millions of American consumers, the city attorney of Los Angeles said in a lawsuit filed on Thursday.


The government said the Weather Company, the business behind the app, unfairly manipulated users into turning on location tracking by implying that the information would be used only to localize weather reports. Yet the company, which is owned by IBM, also used the data for unrelated commercial purposes, like targeted marketing and analysis for hedge funds, according to the lawsuit.

Via Andrew Pontious:

It should also get them kicked out of the App Store, if Apple is committed to evenhandedness and fairness.

Previously: How to Game the App Store.

App Discovery, Downloading, and Purchasing

Ben Bajarin:

In collaboration with a few indie app developers, we ran a study looking to see how consumers discover, decide on which app to download, and some underlying economics around the app ecosystem. This study had respondents from the US and key parts of Europe. In total, 908 consumers participated in this study.

Reviews and price are very important. Most customers did not feel tricked into paying for IAPs or subscriptions. More than 40% of iOS customers had three or fewer paid apps.

Wednesday, January 9, 2019 [Tweets] [Favorites]

Google Assistant Coming to Google Maps for iOS

Dieter Bohn (MacRumors):

Manuel Bronstein, VP of product for Google Assistant, made the case that Google is building an entire ecosystem for Assistant that’s akin to the ecosystem it’s built for Android. It’s a platform play, basically, just like Alexa. And Google wants to ensure it’s everywhere.


Beyond Android Auto, partners like Anker are making little lighter plug-ins that work with Google Assistant. A bigger deal, though, is that Google is going to bake Google Assistant into Google Maps. It may not be able to convince iPhone users to install the Google Assistant app, but it has a huge install base for Maps. Google says that Assistant in Maps will let you “share your ETA with friends and family, reply to text messages, play music and podcasts, and get information hands free.”

Elgato Thunderbolt 3 Pro Dock

Joe Rossignol:

The dock is equipped with two USB-C ports with transfer speeds up to 10Gb/s, two USB-A ports with transfer speeds up to 5 Gb/s, two Thunderbolt 3 ports with transfer speeds up to 40 Gb/s, one DisplayPort 1.2, one Gigabit Ethernet port, a 3.5mm headphone jack and audio output, and SD and microSD card readers.

Marco Arment:

Finally! Someone has made a 1-to-4 USB-C hub!

Except it costs as much as an iPad. And has a bunch of other stuff you may not need. And it won’t work with the 12-inch MacBook or iPad Pro — it’s Thunderbolt-only.

Previously: The Impossible Dream of USB-C.

Adding a Command Line Tool Helper to a Mac App Store App

Timo Perfitt:

During testing, the command line tool continually crashed with an “Illegal Instruction: 4” both in the app and when I ran the tool outside the app on the command line. Turning off code signing (or not signing the app) made the issue go away, but code signing is required for submitting to the Mac App Store.


Long and short of it:

  1. The command line tool must have a Mach-O load command for LC_VERSION_MIN_MACOSX. It can be set using the GCC flag “-mmacosx-version-min=10.12” (change 10.12 to what makes sense).
  2. Command line tools must be signed with an entitlement that has exactly 2 rules: sandbox and inherit. It can be set with the codesign command. All other rules are inherited from the main app and should be set there.
  3. Pretty sure that the command line tool must be in the MacOS folder or perhaps a folder named “Helpers”. I put mine in the Executables folder in a Copy Files build phase[…]

The exception is if your app has a command-line tool that is meant to be invoked by the user. Then the “inherit” entitlement would get in the way because it’s not being run from your app.
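For reference, rule 2 above amounts to an entitlements file containing exactly those two keys, applied with codesign. A sketch (the file name, signing identity, and helper path are placeholders, not from the post):

```shell
# helper.entitlements: exactly the two rules, sandbox and inherit
cat > helper.entitlements <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
	<key>com.apple.security.app-sandbox</key>
	<true/>
	<key>com.apple.security.inherit</key>
	<true/>
</dict>
</plist>
EOF

# Sign the embedded helper with just those entitlements;
# everything else is inherited from the containing app.
codesign --force --sign "3rd Party Mac Developer Application: Example (TEAMID)" \
    --entitlements helper.entitlements \
    MyApp.app/Contents/MacOS/helper-tool
```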

The Toxic Fragility of Siri Shortcuts

Gabe Weatherhead:

I love both of these Shortcuts because I can use a simple voice command to trigger them and they make my life a tiny bit better. Well, they did until a couple of weeks ago.


Without predictable outcomes from Siri Shortcuts it might as well not exist. It’s not helpful to issue a command that worked yesterday and get a joke response back today. If I wanted that, I’d ask my kid to do it.

Via Nicholas Riley:

Very surprised this doesn’t get more press. Siri shortcuts reliability, like Siri overall, is so bad that I can’t rely on it.

Update (2019-01-11): Dave Verwer:

I hadn’t seen this until I was just catching up with @mjtsai’s blog, but this tweet thread from me sounds like the same bug.

I know it’s not good enough, but deleting the shortcut and recreating it with the same phrase does work.

No NVIDIA Drivers for Mojave


Developers using Macs with NVIDIA graphics cards are reporting that after upgrading from 10.13 to 10.14 (Mojave) they are experiencing rendering regressions and slow performance.

Apple fully controls drivers for Mac OS. Unfortunately, NVIDIA currently cannot release a driver unless it is approved by Apple.

Marco Chiappetta (via Hacker News, MacRumors):

And when Apple pushed macOS 10.14 out the door, it appears to have suspended support for some discrete NVIDIA GPUs. According to Apple’s website, only two aging “Mac Edition” discrete NVIDIA GPUs, the Quadro K5000 and GeForce GTX 680, are officially supported. Pre-Mojave though, many users had turned to newer, more powerful NVIDIA discrete GPUs based on the company’s Pascal architecture for workloads that can benefit from NVIDIA’s CUDA parallel computing platform and other proprietary development tools.


In the post, Diamond tags Jarred Land, a producer that also happens to be the president of RED Digital Cinema, who himself is an NVIDIA user. In fact, Land has a post on his wall showing a GeForce RTX Titan decoding 8K video in real-time at 23.98 frames per second, out to a Sharp 8K UHD TV. “Not allowing NVIDIA to put out drivers for OSX 10.14 hurts my business. We depend on NVIDIA drivers to keep our Macs flying through apps like Creative Cloud, Resolve and RED Workflows. We NEED these drivers to keep our pipelines from impacting our clients.”, said Jason.

Colin Cornaby:

I don’t really like Nvidia. But I’m tired of Apple making life difficult for GPU makers. eGPU was a great step. But now Apple is strangling adoption by restricting GPU drivers. Apple should allow Nvidia to release their Mojave drivers, and ideally make the driver layer public.

Previously: Removed in macOS 10.14 Mojave.

Update (2019-01-11): isaiah:

i gave up hope and gave my very nice 5K capable Nvidia 1080 card to my kids’ VR PC.

it’s tough to invest in a platform where the maintainer’s capricious decisions often cost you a thousand bucks.

Tuesday, January 8, 2019 [Tweets] [Favorites]

Overcast Premium Improvements


Two big improvements for Overcast Premium, which lets you upload your own audio files (DRM-free audiobooks, lectures, draft podcasts, etc.) and listen in your Overcast app:

  • More space: up from 2 GB to 10 GB
  • Multi-select file uploads! Finally!

Environmentally-Lit User Interface

Bob Burrough:

I’ve been working on an environmentally-lit user interface. It’s lit by the lighting around you rather than some arbitrary light source (or just blinding white).

Bob Burrough:

An environmentally-lit interface takes information from the environment around the device and uses it to render physically-accurate things on the screen. It appears as if the lights around you are shining on the things on the screen. If the lighting in your room is bright, then the things on your screen are brightly lit. They can even take on complex characteristics like mother-of-pearl or opal.

Now, this doesn’t mean you have to hold a flashlight over your phone to read the web in bed. What it means is designers are empowered to use the design language of the physical world to design their interfaces. Gloss, glitter, glow-in-the-dark, or any other visual quality may be used. In the case of reading a website in a darkened room, the web designer may apply elegant backlighting or glow-in-the-dark treatments to maintain legibility. This is far superior to today’s method of making your phone act like a spotlight that shines in your face.

This is really cool.

Bob Burrough:

Flat design results in higher cognitive load.

Dave Smith:

Burrough’s “Project Erasmus” is a user-interface (UI) implementation that uses the lighting in your immediate environment to light, shade, and reflect on the software elements in the device. The result is an incredible, immersive visual effect that would make you want to use your phone even more (as if that’s possible).

Andrew Orr:

For example, software toggles and menu bars develop drop shadows and highlights based on light sources in the room. He does this by attaching an Olloclip wide angle lens to capture the light, then the software renders that light as a scene. This is real-time rendering and it makes elements on the screen appear as physical objects.

See also: TMO Daily Observations.

The iOS Menu

Simon (tweet, Hacker News):

I realised six months ago as I was using my Mac, using the menus, that I need these things — menus — in Codea. I was trying to solve a problem that has been solved for decades.

So I set out to make the best menus I could make for iOS.


Compared to all the options I considered, menus are exactly that, discoverable. You pull down a list of named features complete with shortcut keys (if a keyboard is attached). Then you activate that feature by tapping on it, or by dragging your finger and releasing.

Hamburger menus, side-drawers, whatever you want to call them, are a conventional way to bury additional and often unrelated functionality into an app. But they are much heavier than the good old-fashioned menu bar. They often pull out a whole modal side-thingy, maybe they slide all your content to the right. It’s a context switch for your brain.

iOS really needs something like this. I get that Apple didn’t want to bring over everything from the Mac’s design. But, as with some other features, I feel like they’ve had their chance to show us a better way and haven’t delivered. So they may as well reinvent the wheel.

Previously: Proof That iOS Still Hasn’t Gotten Undo Right, Make the iPad More Like the Mac, Great Alternatives to Hamburger Menus.

Update (2019-01-11): Simon:

In this post I’m going to walk you through all the other details that make this work.

John Gruber:

What they’re doing here with Codea isn’t just putting the Mac menu bar on iOS. They’ve designed and built a very iOS-looking take on a menu bar, deeply informed by the aspects of the Mac menu bar that do work on a touch screen. Something like this is desperately needed as a standard interface element on iPad, and I think could work on iPhone too.

Riccardo Mori:

Speaking of iOS apps with menus, the first instance I remember seeing was TaskPaper on iOS 6. I still use this app, by the way.

Solution for Time Machine “Error While Restoring From the Backup”

Harry Fear (via Maxwell):

A few hours into the restore (about three-quarters of the way through the data transfer) the restoration would always fail with “An error occurred while restoring from the backup.”


Initially I needed a Finder and Terminal window, so I had to set up the new Mac as new with no user data so I could fully access the Time Machine backup to apply the fix. Then I connected the backup to the Mac.


Then I had to delete the problematic folder that was identified in the log[…]


Then go back into Recovery mode on the Mac and reattempt to restore from the modified backup.
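The post identifies the problematic folder from the log; note that deleting anything inside a backup should go through tmutil rather than rm, so the backup database’s hard-link structure stays intact. A sketch (the paths are placeholders, not from the post):

```shell
# Remove one problematic folder from a specific Time Machine snapshot.
# tmutil understands the Backups.backupdb structure; plain rm does not.
sudo tmutil delete "/Volumes/Backup/Backups.backupdb/MyMac/2019-01-07-120000/Macintosh HD/Users/me/Problem Folder"
```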

iCloud Leader Leaves Apple

Kevin McLaughlin (MacRumors):

Patrick Gates, an Apple senior director of engineering who led development of iCloud, FaceTime, and iMessage during nearly 14 years at the company, has left to join a stealth startup founded by two other former Apple employees. The startup, called Humane, announced Mr. Gates had joined as chief technology officer on Dec. 19.

Mr. Gates, who worked in an organization led by Internet services chief Eddy Cue, oversaw a project in 2015 that aimed to unify Apple products like iCloud and iTunes in a single cloud platform. But the effort was delayed by friction with another Apple group led by former engineering executive Eric Billingsley, who left the company last October.

Humane, co-founded by former Apple directors Imran Chaudhri and Bethany Bongiorno, is working on products that focus on “the next shift between humans and computing,” according to a note on its website.

Amir Efrati and Steve Nellis (in 2016):

Political infighting within Apple’s engineering ranks is holding back the company’s efforts to fix technical problems that have plagued iCloud and iTunes, say people with direct knowledge of the situation.

Two engineering teams working on new internal cloud-computing infrastructure to power Apple’s Web services are in open conflict, the people say. Already, the infighting has sparked at least one key employee departure, with more expected soon.

Via Dan Masters:

Noteworthy that repeated reports of dysfunction and infighting (culminating in both Apple cloud managers leaving within months of each other) aligned with personal experience of other employees as well[…]

Katharine Schwab:

Chaudhri left Apple in 2017, after spending almost two decades designing interfaces for the iPod, iPad, Apple Watch, and Apple TV as well as the iPhone, to pursue a still-under-wraps company of his own. I recently sat down with him to talk about his time at Apple, and had the chance to ask him how he views his legacy now that the downsides of smartphones have come into focus. He cited the challenges of working as a designer at a giant corporation, where his personal ethics didn’t always align with decision-making[…]

Previously: Inside the World of Eddy Cue, Apple’s Services Chief.

Monday, January 7, 2019 [Tweets] [Favorites]

GitHub Now Offers Unlimited Free Private Repos

GitHub (Hacker News):

GitHub Free now includes unlimited private repositories. For the first time, developers can use GitHub for their private projects with up to three collaborators per repository for free. Many developers want to use private repos to apply for a job, work on a side project, or try something out in private before releasing it publicly. Starting today, those scenarios, and many more, are possible on GitHub at no cost. Public repositories are still free (of course—no changes there) and include unlimited collaborators.

Update (2019-01-08): Paulo Andrade:

So Microsoft bought HockeyApp and are doing a pretty good job so far of turning it into @VSAppCenter. Then they bought @github and added free private repos. At this rate it looks like I’ll be coding Swift in @code soon

My question is:

What’s Apple doing with @buddybuild?

Previously: App Center Will Take It From Here, Apple Acquires Buddybuild.

iTunes Video and AirPlay on Samsung TVs

Eric Slivka:

Samsung today announced that it has worked with Apple to integrate iTunes movies and TV shows, as well as AirPlay 2 support, into its latest smart TVs. The features will roll out to 2018 models via a firmware update this spring and will be included on new 2019 models. iTunes movie and TV show access will come via a new dedicated app for Samsung’s TV platform, available in over 100 countries.

Eric Slivka:

- Apple says “leading TV manufacturers” will be including AirPlay 2 support in their TVs, indicating that this initiative will not be a Samsung exclusive. Apple has not, however, announced additional TV partners or a timeline for when AirPlay 2 will come to these other brands. Samsung’s support is rolling out in a firmware update for 2018 TVs and built into 2019 models “beginning this spring.”

- AirPlay 2-enabled TVs will act just like any other AirPlay 2 speaker, meaning you can send many different types of audio from an iOS device or your Mac to your TV. Music being sent to your TV via AirPlay 2 can also be synced with other AirPlay 2 speakers.

Great news. I’d been hoping they’d do this for a long time. It makes my purchased content seem more secure, given that I’ve had problems with the old Apple TV hardware and don’t want to buy a new Apple TV. Less reason to make the big jump to Amazon and fragment my library.

Benjamin Mayo:

In other words, Samsung TVs will be able to watch 4K iTunes content before Macs can.

Ryan Jones:

Hm, so are cross-platform Services the entry drug to Apple or the ecosystem around iPhone? I’m not sure.

But it is hard to imagine Music and TV competing well against Spotify, Netflix, YouTube TV without a native advantage.

Ryan Jones:

They just aren’t built for any of this. Really really feels like spreadsheet growth-hunting.

Josh Centers:

Everyone says I’m wrong here, but Samsung won’t be the last smart TV platform to get iTunes. And once it’s as universal as Amazon Video, you’d be crazy to drop $180 on an Apple TV.

Mitchel Broussard (in 2016):

HTC announced its new smartphone, the HTC 10, revealing that the Android device will have the ability to wirelessly play audio through devices and speakers that support streaming via Apple’s AirPlay feature (via SlashGear).

Previously: Cultural Insularity and Apple TV, Movies Anywhere, Amazon Offering Apple Products.

Update (2019-01-08): Colin Cornaby:

Still don’t understand why Apple doesn’t offer a built in option for a Mac to become an AirPlay 2 target.

Nilay Patel:

Apple tells me that no smart TV content tracking is allowed on AirPlay 2 streams on Vizio and LG TVs, in addition to preventing Samsung from tracking the iTunes app. Sounds like they pushed this policy with the industry, good for them

The wacky part is Apple can’t prevent TV makers from content tracking on HDMI inputs, so a smart TV can track what you watch on an Apple TV!

Joe Rossignol:

A few days ago, Apple announced that AirPlay 2–enabled smart TVs are coming soon from leading manufacturers, and we’ve since seen a series of announcements from Samsung, LG, Sony, and Vizio at CES 2019.

Update (2019-01-09): Benjamin Mayo:

This is great for everyone.


If Apple had licensed AirPlay video more liberally from the get-go, every TV screen and projector would have it built in already. No need to buy a $100+ peripheral. No need to switch to the Apple TV input. No setup needed.

I am so pleased that Apple has changed their stance here. These partnerships bind Apple customers more closely to the iPhones, iPads and Macs they already own, and improve customer satisfaction and loyalty rates. It will take time for the number of AirPlay 2-enabled TVs sold to be meaningful, but in the course of time, it will be commonplace.

Rene Ritchie:

I’m keeping my Apple TV until Eddy Cue pries it from my Hulk hands.

tvOS interface is much better for me than any of the smartTV stuff, and I trust it to have better updates faster, be more secure and private, and I dream of apps taking off one day. So help me. LOL.

Kirk McElhearn:

Hell is freezing over for Apple because the company has finally accepted that it cannot make enough money from its video offerings just with Apple devices (ie, the iPhone, iPad, and Apple TV). This also suggests that the Apple TV has seen its last iteration. If Apple can put the same apps on any smart TV – which is, of course, not complicated – why have a separate device?

Lee Bennett:

Coz Apple TV provides an app experience not available anywhere else! Long Live Apple TV!

Nick Heer:

But I am not sure that necessarily leads to the end of the Apple TV. I don’t see the company abandoning dedicated hardware just because it has a services business, even for a presently lower-priority product like the Apple TV. It seems to me that it’s more likely that Apple’s TV product may morph to become a full television that they have complete control over. Why not? Most televisions look awfully cheap and are privacy nightmares.

Update (2019-01-11): Josh Centers:

The HomeKit story is a bit more interesting: supported TVs will become HomeKit devices, and as such, you’ll be able to turn them on and off or change their inputs with Siri or Apple’s Home app. You’ll also be able to create HomeKit scenes with actions to control these TVs.


What’s curious is how only Samsung gets iTunes Movies and TV Shows, but Samsung is the only vendor not providing HomeKit support. Despite that confusion, these announcements may be great news for Apple users who own or plan to buy a supported TV set, but what does it mean for the rest of us, and for Apple’s TV plans going forward?

Why Doesn’t JSONEncoder Conform to the Encoder Protocol?

Kaitlin Mahar:

Inspecting the source code for JSONEncoder, we see it’s an open type that internally uses a private type _JSONEncoder, which does conform to Encoder.


But why were they designed that way? Why not just make JSONEncoder an Encoder too?

In short, the answer is that they provide very different APIs. The JSONEncoder API is designed to provide a single, simple entry point into encoding, and the Encoder protocol provides a completely different API for customizing how types are encoded.

This makes sense, though it’s kind of odd that the facade and the protocol for the private types both use the same word (Encoder/Decoder). Cocoa distinguishes between NSArchiver/NSUnarchiver, which you use directly, and NSCoder, which is passed to you. Although, that’s also a bit messy because the archivers are subclasses of NSCoder, and so all the other methods are still there.
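The contrast is easy to see side by side. A minimal sketch (the types here are illustrative, not from the article):

```swift
import Foundation

// JSONEncoder is the facade: a single entry point you call directly.
struct Beer: Codable {
    let name: String
    let abv: Double
}
let data = try! JSONEncoder().encode(Beer(name: "Example IPA", abv: 6.5))

// Encoder is the protocol handed *to you* inside encode(to:), for
// customizing how a type writes itself out.
struct Temperature: Encodable {
    let celsius: Double
    enum CodingKeys: String, CodingKey { case fahrenheit }
    func encode(to encoder: Encoder) throws {
        var container = encoder.container(keyedBy: CodingKeys.self)
        // Store the converted value under a different key than the property.
        try container.encode(celsius * 9 / 5 + 32, forKey: .fahrenheit)
    }
}
let json = String(data: try! JSONEncoder().encode(Temperature(celsius: 100)),
                  encoding: .utf8)!
// json encodes the converted value under the "fahrenheit" key
```

You never instantiate an Encoder yourself; JSONEncoder creates its private _JSONEncoder and passes it to your encode(to:).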

Swift Import Declarations

Mattt Thompson:

Import declarations have a form that can specify individual structures, classes, enumerations, protocols, and type aliases as well as functions, constants, and variables declared at the top-level:

import <#kind#> <#module.symbol#>

Here, kind can be any of the following keywords[…]


In practice, isolating imported declarations and submodules doesn’t confer any real benefit beyond signaling programmer intent. Your code won’t compile any faster doing it this way. And since most submodules seem to re-import their umbrella header, this approach won’t do anything to reduce noise in autocomplete lists.

If you gave up after finding that import Module.Class doesn’t work, you actually can do it with import class Module.Class.
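For example (the module and type choices here are just illustrative):

```swift
// `import Foundation.JSONEncoder` fails: JSONEncoder isn't a submodule.
// Prefixing the import with the declaration's kind makes it work:
import class Foundation.JSONEncoder
import struct Foundation.Data

let payload: Data = try! JSONEncoder().encode([1, 2, 3])
// Both scoped names resolve; per the article, much of the rest of the
// module is often still visible anyway, so this mainly signals intent.
```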