Thursday, November 10, 2016

How to Read the Swift Standard Library Source

Ole Begemann:

The bulk of the standard library’s code is in the stdlib/public/core directory in the Swift repository on GitHub. You’ll find the interfaces and implementations for all public types, protocols, and free functions there. You can of course just read the code directly in your browser or clone the repository and go through it on your local machine, but there’s one complication: you’ll notice that about a third of the files have the file extension .swift.gyb. If you open one of these files, e.g. FixedPoint.swift.gyb (this is where the integer types are defined), you’ll see a mixture of Swift and a templating language called GYB.
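
For a sense of what that looks like: GYB (“Generate Your Boilerplate”) is a small Python-based templating tool that ships in the Swift repository’s utils directory. Lines beginning with % are Python control flow, %{ … }% encloses a block of Python code, and ${…} splices in the value of a Python expression. A minimal sketch in the spirit of FixedPoint.swift.gyb (not the actual source; the property is made up for illustration) might stamp out one declaration per integer width:

```swift
%{
# Python block: the widths to generate code for.
int_widths = [8, 16, 32, 64]
}%
% for bits in int_widths:
// One copy of this extension is emitted per width: Int8, Int16, ...
extension Int${bits} {
    /// Hypothetical example property, generated once per type.
    public static var byteWidth: Int { return ${bits} / 8 }
}
% end
```

The build runs gyb over these templates to produce plain .swift files, so boilerplate that repeats across all the integer types only has to be written once.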

Erik Dietrich (via Reddit):

When it comes to getting up to speed in short order, there’s really little besides practice that will give you the needed skill set. As such, going into a bunch of codebases in your spare time and poking around is really the best way to become skilled at going into foreign codebases and figuring them out.

[…]

It’s not just the weighty architectural decisions that you’ll pick up, either. You’d be amazed at how many quick, little wins you snag with this practice. One day you’ll see a unit test naming scheme that you fall in love with. The next day you’ll notice a new language feature that will save you twenty lines of code in a lot of your classes. Or, maybe you’ll observe someone using semantics that make mistakes a lot harder.

Whatever the case may be, looking at a lot of code means taking advantage of a lot of others’ experience. It’s not quite the same as having tons of people review your code, but you can realize some of the same benefits.

Update (2016-11-16): Erica Sadun has more information about gyb.

The Best Confirmation Button Ever

Kyle Barrow shows a button from backup app Carbon Copy Cloner:

OK, I will jiggle the mouse every 20 seconds or so to keep the system awake

There are actually hardware mouse jigglers that can do this for you, although it looks like the same goal can be accomplished via software power assertions.

Bombich Software:

You’re right. I hadn’t considered using a DeclareUserActivity sleep assertion because it isn’t appropriate for backup tasks, but yeah, that’s the right use here, and it’s a supported mechanism. I’ll take a look at that for the next update, thanks!
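
For the curious, the assertion approach is only a few lines of IOKit. A minimal sketch in Swift, assuming a hypothetical runBackupTask() for the long-running work:

```swift
import IOKit.pwr_mgt

func runBackupTask() { /* hypothetical long-running work */ }

var assertionID: IOPMAssertionID = 0

// Prevent idle sleep for the duration of the task (the general-purpose
// power assertion mentioned above).
let result = IOPMAssertionCreateWithName(
    "PreventUserIdleSystemSleep" as CFString,
    IOPMAssertionLevel(kIOPMAssertionLevelOn),
    "Backup task in progress" as CFString,  // reason shown in `pmset -g assertions`
    &assertionID
)
if result == kIOReturnSuccess {
    runBackupTask()
    IOPMAssertionRelease(assertionID)  // always release when done
}

// The variant from the quote: declare that the user is active, much as
// if they had jiggled the mouse themselves.
var activityID: IOPMAssertionID = 0
_ = IOPMAssertionDeclareUserActivity(
    "User asked to keep the system awake" as CFString,
    kIOPMUserActiveLocal,
    &activityID
)
```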

Update (2016-11-10): See also Energy Efficiency Guide and QA1340 (via McCloudStrife).

Computational Photography and the Pixel

Sam Byford (via Nick Heer):

Clearly, this is by far the most competitive Google has ever been in mobile photography. But the Pixel phones, on paper, don’t have cutting-edge hardware, relying on an f/2.0 lens without optical image stabilization. Instead, and in typical Google fashion, Google has turned to complex software smarts in order to power the Pixel camera.

[…]

This no-compromise approach to HDR photography has partly been made possible by new hardware. The Hexagon digital signal processor in Qualcomm’s Snapdragon 821 chip gives Google the bandwidth to capture RAW imagery with zero shutter lag from a continuous stream that starts as soon as you open the app. “The moment you press the shutter it’s not actually taking a shot — it already took the shot,” says Levoy. “It took lots of shots! What happens when you press the shutter button is it just marks the time when you pressed it, uses the images it’s already captured, and combines them together.”
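
The “it already took the shot” trick is essentially a ring buffer of frames. A toy sketch of the idea in Swift (names and structure are hypothetical; this is not Google’s pipeline):

```swift
import Foundation

// Toy model of zero-shutter-lag capture: the sensor streams frames into
// a bounded buffer the whole time the app is open, and the shutter press
// merely selects frames that were already captured.
struct Frame {
    let timestamp: TimeInterval
    let rawPixels: [UInt16]  // stand-in for RAW sensor data
}

struct ZeroShutterLagBuffer {
    private var frames: [Frame] = []
    private let capacity: Int

    init(capacity: Int) { self.capacity = capacity }

    // Called for every frame the sensor produces.
    mutating func append(_ frame: Frame) {
        frames.append(frame)
        if frames.count > capacity {
            frames.removeFirst()  // drop the oldest frame
        }
    }

    // On shutter press: return the `count` frames captured nearest the
    // press time, in order, ready to be aligned and merged.
    func burst(near shutterTime: TimeInterval, count: Int) -> [Frame] {
        return Array(
            frames
                .sorted { abs($0.timestamp - shutterTime) < abs($1.timestamp - shutterTime) }
                .prefix(count)
        ).sorted { $0.timestamp < $1.timestamp }
    }
}
```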

[…]

The traditional way to produce an HDR image is to bracket: you take the same image multiple times while exposing different parts of the scene, which lets you merge the shots together to create a final photograph where nothing is too blown-out or noisy. Google’s method is very different — HDR+ also takes multiple images at once, but they’re all underexposed. This preserves highlights, but what about the noise in the shadows? Just leave it to math.
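
The math being appealed to is the standard error of the mean. For $N$ aligned frames with independent per-pixel noise of standard deviation $\sigma$, the merged frame’s noise falls to

$\sigma_{\text{merged}} = \frac{\sigma}{\sqrt{N}}$

so a burst of, say, nine underexposed shots cuts shadow noise roughly threefold before tone mapping brightens the result back up.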

[…]

Google also claims that, counterintuitively, underexposing each HDR shot actually frees the camera up to produce better low-light results. “Because we can denoise very well by taking multiple images and aligning them, we can afford to keep the colors saturated in low light,” says Levoy.

Whereas iOS won’t let me always take photos using HDR—which is the non-lossy choice since the phone also saves the non-HDR version—Google enables HDR by default and intends for you to leave it on.

Hardware Is Sexy, But It’s Software That Matters

Seth Godin:

For years, the Mac was merely a container for Mac software. It was the software that enabled the work we created, it was software that shifted our relationship with computers and ultimately each other.

Over the last five years, Apple has lost the thread and chosen to become a hardware company again. Despite their huge profits and large staff, we’re confronted with[…]

I agree about the importance of software, and Apple’s recent troubles with it, but I don’t see Apple as having made such a choice.

Via John Gruber (tweet):

Software, in general, is much better than it used to be. Unlike 1995, we don’t lose data due to bugs very often. (For me personally, I can’t even remember the last time I lost data.) But our hardware is so much better than our software, the contrast is jarring. An iPhone is a nearly perfect object. Sleek, attractive, simple. The hardware is completely knowable — there are only five buttons, each of them easily understood. iOS, however, is effectively infinite. The deeper our software gets, the less we know and understand it. It’s unsettling.

David Owens II:

As a hardware company, I would expect Apple to be able to iterate more quickly on fixing bugs and not worrying about huge features.

On the other hand, the iOS hardware sets the schedule for when the Mac software ships.