Archive for September 17, 2019

Tuesday, September 17, 2019 [Tweets] [Favorites]

iPhone 11 Reviews

John Gruber:

My biggest problem is that I wrote this review last year. I re-read my review of last year’s iPhones XS (regular and Max) and at almost every single paragraph I found myself wanting to say the exact same thing again this year. Not that these phones are the same as last year’s phones, but that the year-over-year improvements are equally impressive and at times mind-boggling.


The bad news is, Haptic Touch is a bit slower. With 3D Touch, when you pressed, the action was instantaneous. With Haptic Touch, because it’s sort of a long press with pressure, there’s a very brief pause before it fires. […] The iPhone 11 Pro and Pro Max have bigger batteries than their XS counterparts, using the space freed up by omitting 3D Touch.


Another bit of magic. There are two new options in Settings → Camera: “Photos Capture Outside the Frame” (off by default) and “Videos Capture Outside the Frame” (on by default). When these options are turned on, when you shoot with the 1x or 2x lenses (wide or telephoto), the Camera app will use the next widest lens to capture additional footage outside the frame of the lens you’re shooting with. In post, this allows you to rotate the photo or video — typically, to fix a crooked horizon — without cropping.


QuickTake seems the closest to marketing spite, by which I mean this is such a great idea that I really think it ought to be part of the Camera app for all iPhones running iOS 13.

Matthew Panzarino:

iPhone 11’s Night Mode is great. It works, it compares extremely well to other low-light cameras, and the exposure and color rendition are best in class, period.

Nilay Patel:

These are some of the most well-balanced, most capable phones Apple — or anyone — has ever made. They have excellent battery life, processors that should keep them relevant for years to come, absolutely beautiful displays, and a new camera system that generally outperforms every other phone, which should get even better with a promised software update later this fall.


My iPhone 11 and 11 Pro review units are running iOS 13.0, and iOS 13.0 is pretty damn buggy. I saw all kinds of glitches and crashes during my week of testing, as did Verge executive editor Dieter Bohn with his iPhone 11 review unit running iOS 13.

Rene Ritchie:

Apple says the batteries on the iPhone 11 Pro and iPhone 11 Pro Max will last — wait for it — 4 and 5 hours longer than last year’s iPhone XS and XS Max. And no, that’s not a typo. I checked. Thrice: Up to 18 hours of local video playback for the Pro and 20 for the Max, 11 hours and 12 hours of video streaming, and 65 and 80 hours of wireless audio.


Water resistance has improved on the Pro models as well. The XS was already IP68 and rated for up to 30 minutes at up to 2 meters. The Pros will go all the way down to 4 meters, though.


Apple’s latest, greatest system-on-a-chip, the A13 Bionic, manages to be both faster and less power hungry at the same time: 20% faster across the efficiency, performance, graphics, and neural engine cores, and 40%, 25%, 30%, and 15% less power hungry, respectively.

Michael Love:

N.B.: the highest Geekbench 5 single-core score for any Mac is 1262 (the 2019 3.6 GHz iMac). So the iPhone 11 now offers the fastest single-core performance of any computer Apple has ever made.

Sebastiaan de With:

The biggest notable change across the board this year is what we already teased: the max ISO of the sensors has gone up significantly. The Wide camera’s maximum ISO sensitivity is up 33%; the telephoto 42%!

See also: MacRumors, MacStories.


Update (2019-09-19): Austin Mann (Hacker News):

Of course, I’ve also been anxious to see what this Ultra Wide lens can do, so shortly after the performance I popped out to the countryside to find some epic landscapes and have been out exploring this big, beautiful country ever since.

Update (2019-09-25): John Gruber:

Is there a setting to make holding down the shutter shoot a burst instead of video? No, there is no setting for this. There should be, though.

See also: Joanna Stern, Brian X. Chen, John Gruber.

Juli Clover:

According to iMore’s Rene Ritchie, bilateral inductive charging wasn’t pulled from the iPhone 11 because it was never slated for production to begin with. Ritchie says there is no hardware in iPhone 11 models that would allow such a feature to be enabled later.

Matt Birchler:

Given the leaks and the marketing images, I was not prepared for how small the camera bump would be on the 11 Pro.

Rene Ritchie:

There’s a new Night King in town. At least that’s what some people are saying. As usual, Apple wasn’t first to computationally enhanced low light photography, just like they weren’t first to multiple cameras or depth effects, or even phones at all.

Now that they are, they’re doing it in typical Apple style. It doesn’t do everything. You can’t force it on manually. You can’t use it with the focus-pixel-free ultra-wide-angle. But what you can do, you can do well. With good detail recovery, texture preservation, and tone mapping.

In a really opinionated, maybe even controversial way.

Ryan Cash:

From left to right:

iPhone 5
iPhone 6
iPhone XS
iPhone 11 Pro

Handheld and unedited.

Jay Freeman:

I just spent an hour using an iPhone to take videos of iPhones taking video of an iPhone (with a fifth iPhone to take a video of the rest) to verify this: the iPhone 11 Pro Max on iOS 13 has an additional 50-66ms of latency in its camera preview vs. the iPhone XS Max on iOS 12.4.


I did this as I was having a subtle-yet-annoying feeling of motion sickness using the iPhone 11 Pro Max camera that I have never experienced with an iPhone before and wanted to be 100% sure I wasn’t making it up; a 100ms input latency was already “pushing it”: 166ms is “too far”.

Gannon Burgett:

Not all of the cameras are made equal though. In addition to not having optical image stabilization, it’s been revealed the ultra-wide camera unit on all three models isn’t yet capable of capturing Raw image data or manual focus, unlike the wide-angle camera (and telephoto camera on the iPhone 11 Pro models).

Nick Heer:

These are curious limitations that put the ultra-wide camera on a similar level to the fixed-focus front-facing camera, which only captures compressed image formats. That’s expected on the front, but a little disappointing and inconsistent for a back-mounted camera, especially since the other cameras don’t have these restrictions.

Update (2019-09-26): See also: Samuel Axon and Juli Clover.

Update (2019-10-04): John Gruber:

0.5× always uses the ultra-wide camera, because you can’t get that field of view otherwise. 1× always uses the wide angle, because that camera has the best sensor and fastest lens. But 2× doesn’t mean you’re always using the telephoto camera — in low light it will use the wide-angle camera and digital zoom. Previous iPhones with dual camera systems have done the same thing in low light conditions, but a lot of us — myself included — made the wrong assumption about Night Mode and “2× zoom”.

It occurs to me that this is why Apple has been somewhat obfuscatory about Night Mode working only with the regular wide angle camera, despite being very forthcoming about explaining other technical details (like Deep Fusion) at great length: it means the iPhone 11 can shoot the exact same “2×” Night Mode shots as the iPhone 11 Pro, because on both phones 2× Night Mode shots are cropped and digitally zoomed from the 1× camera sensor.
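To put the resolution tradeoff in concrete terms, here is a rough sketch (my own illustration, not Apple’s actual pipeline) of what a 2× digital zoom costs, assuming the usual 12-megapixel, 4032 × 3024 wide-angle sensor:

```swift
// Rough illustration: a 2x digital zoom keeps the center half of each
// dimension of the 1x sensor, then upscales the crop back up.
let sensor = (width: 4032, height: 3024)                         // 12 MP wide-angle
let crop = (width: sensor.width / 2, height: sensor.height / 2)  // 2016 x 1512
let cropMegapixels = Double(crop.width * crop.height) / 1_000_000
print(crop, cropMegapixels)  // roughly 3 MP of real data before upscaling
```

So a “2×” Night Mode shot starts from about a quarter of the pixels an optical 2× capture would have, which is why the distinction Gruber describes matters.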

Update (2019-10-18): John Gruber:

“HD” video is usually 1920 × 1080, but Quick Video shoots 1920 × 1440 because it always records with a 4:3 aspect ratio. That’s not what I expected, but you don’t lose anything — the 1920 × 1080 image recorded by default in the “Video” mode is a 16:9 center crop of the 4:3 sensor.
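The arithmetic checks out: a 16:9 center crop of the 4:3 Quick Video frame keeps the full 1920-pixel width.

```swift
// A 16:9 center crop of the 1920 x 1440 (4:3) Quick Video frame:
let quickVideo = (width: 1920, height: 1440)
let croppedHeight = quickVideo.width * 9 / 16
print(quickVideo.width, croppedHeight)  // 1920 1080 -- the default "Video" frame
```

In other words, Quick Video records everything the default mode does, plus extra rows above and below.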

Breaking the NSData.description Contract

Mattt Thompson (tweet via Cédric Luthi):

iOS 13 changes the format of descriptions for Foundation objects, including NSData:

// iOS 12
(deviceToken as NSData).description // "<965b251c 6cb1926d e3cb366f dfb16ddd e6b9086a 8a3cac9e 5f857679 376eab7c>"

// iOS 13
(deviceToken as NSData).description // "{length = 32, bytes = 0x965b251c 6cb1926d e3cb366f dfb16ddd ... 5f857679 376eab7c }"

Whereas previously, you could coerce NSData to spill its entire contents by converting it into a String, it now reports its length and a truncated summary of its internal bytes.


Was Apple irresponsible in making this particular change?

No, not really — developers shouldn’t have relied on a specific format for an object’s description.

The documentation promises—still, as of this writing—that description returns:

A string that contains a hexadecimal representation of the data object’s contents in a property list format.

Perhaps it would be a mistake to rely on the exact format of the string, e.g. where the spaces are inserted. But, clearly, it is supposed to contain the entire data’s contents, in a format that can be reconstituted by the property list API. That is no longer the case, and the fault for any resulting breakage lies with Apple, not with developers who were relying on the API to do what it said it would do.
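Developers who were relying on description for the documented hex dump can compute it directly from the bytes instead. A minimal Swift sketch (my own workaround, not an Apple-provided replacement):

```swift
import Foundation

extension Data {
    /// Hexadecimal dump of the full contents, stable across OS versions,
    /// unlike NSData.description on iOS 13.
    var hexString: String {
        map { String(format: "%02x", $0) }.joined()
    }
}

let deviceToken = Data([0x96, 0x5b, 0x25, 0x1c])
print(deviceToken.hexString) // "965b251c"
```

Unlike description, this never truncates and doesn’t depend on an undocumented format.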

Apple hasn’t explained why it made the change, or even documented it in the release notes. In fact, there don’t even seem to be Foundation release notes yet.


Update (2019-09-17): Joe Groff:

It doesn’t break anything until you build with Xcode 11. The new behavior is based on the linked SDK version, so existing binaries keep working. If you want to upgrade your Xcode, you need to fix your code, though.

Cédric Luthi:

Haha, I remember thinking “why did they introduce -UUIDString and not just use -description” for that purpose. Turns out, Apple thought about it too and changed the implementation of -[NSUUID description] in recent OS versions.

Seems like they should have provided a similar replacement on NSData for people relying on the old format.

Update (2019-09-18): Peter Steinberger:

Took the time to decompile -[NSData description] on iOS 13 GM v2 and can verify that Apple did the sensible thing here: output only changes if the linked SDK is > 12. Existing apps continue to work (it forwards to debugDescription); once you adopt Xcode 11, they need to be fixed, though.

See also: this thread.

Update (2019-12-23): Sarah Edwards:

Anyone know how to make Xcode/plutil stop truncating BLOBs? Seems to have started with macOS 10.15. Previously could see entire BLOB and export to hex editor, etc. Would prefer not to have to use 3rd party tools or conversion. This is driving me crazy.

This seems to be consequence of the change to NSData.

The Internet Relies on People Working for Free

Owen Williams (tweet):

But when software used by millions of people is maintained by a community of people, or a single person, all on a volunteer basis, sometimes things can go horribly wrong. The catastrophic Heartbleed bug of 2014, which compromised the security of hundreds of millions of sites, was caused by a problem in an open-source library called OpenSSL, which relied on a single full-time developer not making a mistake as they updated and changed that code, used by millions. Other times, developers grow bored and abandon their projects, which can be breached while they aren’t paying attention.


Survival of cURL is thanks to a set of sponsors who fund the project’s hosting and other costs — though Stenberg says no major company pitches in — and contributors like Stenberg who give their time away for free. Stenberg says he believes that it’s important that open source exists and that he has never regretted making cURL open source. What frustrates him is when companies demand his help when things go wrong.

Last year, a company overseas contacted him in a panic after they paused a firmware upgrade rollout to several million devices due to a cURL problem. “I had to explain that I couldn’t travel to them in another country on short notice to help them fix this […] because I work on cURL in my spare time and I have a full-time job,” Stenberg says.


When Stenberg asked the company that needed him to fly to a different country to troubleshoot their problem to pay for [a support contract], they refused.


Update (2020-01-30): See also: Igal Tabachnik.

Apple Tweaks Rules for Children’s Apps and Sign-in

Matthew Panzarino:

The changes announced at Apple’s developer conference in the summer were significant, and raised concerns among developers that the rules could handicap their ability to do business in a universe that, frankly, offers tough alternatives to ad-based revenue for children’s apps.


Both of those rules are being updated to add more nuance to their language around third-party services like ads and analytics. In June, Apple announced a very hard-line version of these rule updates that essentially outlawed any third-party ads or analytics software and prohibited any data transmission to third parties. The new rules offer some opportunities for developers to continue to integrate these into their apps, but also set out explicit constraints for them.


Third-party contextual ads may be allowed, but only if the companies providing the ads have publicly documented practices and policies and also offer human review of ad creatives. That certainly limits the options, ruling out most offerings from programmatic services.


Sign in with Apple will not be required in the following conditions[…] Most of these were sort of assumed to be true but were not initially clear in June.

It’s hard to write good rules.

Jacob Eiting:

I only wish they had done this in the first place. If they had talked to the top 20 apps in the Kids category beforehand, they would have realized what a mess an ambiguous ban was going to be.

But, props to them for listening and changing course.


Google’s Privacy Sandbox

Justin Schuh:

First, large scale blocking of cookies undermines people’s privacy by encouraging opaque techniques such as fingerprinting. With fingerprinting, developers have found ways to use tiny bits of information that vary between users, such as what device they have or what fonts they have installed, to generate a unique identifier which can then be used to match a user across websites. Unlike cookies, users cannot clear their fingerprint, and therefore cannot control how their information is collected.

This has been criticized unfairly, I think. Mass cookie blocking really did start an arms race that led to fingerprinting. And now we can’t turn back the clock to when the old privacy techniques worked. He’s not saying that turning off cookie blocking will improve your privacy; it’s a comment on the second-order effects of everyone blocking them.

Second, blocking cookies without another way to deliver relevant ads significantly reduces publishers’ primary means of funding, which jeopardizes the future of the vibrant web.

This part is also disputed, the implication being that advertisers are vastly overpaying for targeting that doesn’t actually work. I suppose that’s possible, but I don’t find it intuitive.

We want to find a solution that both really protects user privacy and also helps content remain freely accessible on the web. At I/O, we announced a plan to improve the classification of cookies, give clarity and visibility to cookie settings, as well as plans to more aggressively block fingerprinting. We are making progress on this, and today we are providing more details on our plans to restrict fingerprinting. Collectively we believe all these changes will improve transparency, choice, and control.

Bennett Cyphers:

But hidden behind the false equivalencies and privacy gaslighting are a set of real technical proposals. Some are genuinely good ideas. Others could be unmitigated privacy disasters. This post will look at the specific proposals under Google’s new “Privacy Sandbox” umbrella and talk about what they would mean for the future of the web.

Of course, none of this is to say that Google isn’t also doing all sorts of stuff to track you.