Archive for May 25, 2015

Monday, May 25, 2015

Camera and Photos on iOS

Daniel Eggert:

On the iPhone, we can only adjust the ISO and the shutter speed. We can therefore trade noise (affected by the ISO) against motion blur/sharpness (affected by the shutter speed) while maintaining the same level of exposure.

That explains why photos at night often look worse than those taken during the day: At night there’s a lot less light. In order to still have an acceptable shutter speed, the auto exposure will bump up the ISO, probably to the maximum of what the camera allows. And even that may not be enough to capture enough light, so the auto exposure will also lower the shutter speed. This combination results in an image that is both noisier and more blurred.
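The trade-off described above can also be made manually with the iOS 8 custom-exposure API. A minimal sketch, assuming `device` is an authorized `AVCaptureDevice` (the specific shutter speed and ISO values are illustrative):

```swift
import AVFoundation

// Sketch: locking shutter speed and ISO manually instead of letting
// auto exposure choose. A faster shutter needs a higher ISO to keep
// the same overall exposure.
func lockExposure(device: AVCaptureDevice) {
    let duration = CMTimeMake(1, 125)               // 1/125 s shutter speed
    let iso = min(800, device.activeFormat.maxISO)  // clamp to the sensor's range
    var error: NSError?
    if device.lockForConfiguration(&error) {
        device.setExposureModeCustomWithDuration(duration, ISO: iso, completionHandler: nil)
        device.unlockForConfiguration()
    }
}
```

Note that `lockForConfiguration` must succeed before any exposure property can be changed.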

Matteo Caldari:

The AVCaptureSessionPresetPhoto selects the best configuration for the capture of a photo, i.e. it enables the maximum ISO and exposure duration ranges; the phase detection autofocus; and a full resolution, JPEG-compressed still image output.

However, if you need more control, the AVCaptureDeviceFormat class describes the parameters applicable to the device, such as still image resolution, video preview resolution, the type of autofocus system, ISO, and exposure duration limits. Every device supports a set of formats, listed in the AVCaptureDevice.formats property, and the proper format can be set as the activeFormat of the AVCaptureDevice (note that you cannot modify a format).
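As a sketch of the format-selection approach described above: you iterate over `AVCaptureDevice.formats`, pick one by whatever criterion matters to you (here, illustratively, the widest ISO range), and assign it to `activeFormat`:

```swift
import AVFoundation

// Sketch: choosing a device format by hand instead of using a session
// preset. Formats themselves are immutable; you select among them.
func selectWidestISOFormat(device: AVCaptureDevice) {
    var best: AVCaptureDeviceFormat?
    for object in device.formats {
        if let format = object as? AVCaptureDeviceFormat {
            if best == nil || format.maxISO - format.minISO > best!.maxISO - best!.minISO {
                best = format
            }
        }
    }
    if let format = best {
        var error: NSError?
        if device.lockForConfiguration(&error) {
            device.activeFormat = format
            device.unlockForConfiguration()
        }
    }
}
```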

[…]

New in iOS 8 is the option to move the lens to a position from 0.0, which focuses on near objects, to 1.0, which focuses on far objects (although that doesn’t mean “infinity”).
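A minimal sketch of setting that lens position with the iOS 8 API:

```swift
import AVFoundation

// Sketch: moving the lens to a fixed position. 0.0 focuses the nearest
// objects, 1.0 the farthest the lens supports (not "infinity").
func focusAtLensPosition(position: Float, device: AVCaptureDevice) {
    var error: NSError?
    if device.lockForConfiguration(&error) {
        device.setFocusModeLockedWithLensPosition(position) { time in
            // The lens has finished moving at capture timestamp `time`.
        }
        device.unlockForConfiguration()
    }
}
```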

[…]

An interesting feature also introduced in iOS 8 is “bracketed capture,” which means taking several photos in succession with different exposure settings. This can be useful when taking a picture in mixed light, for example, by configuring three different exposures with biases at −1, 0, +1, and then merging them with an HDR algorithm.
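The three-exposure bracket described above can be sketched with the iOS 8 bracketed-capture API, assuming `output` is a configured `AVCaptureStillImageOutput` on a running session:

```swift
import AVFoundation

// Sketch: a three-shot auto-exposure bracket at -1, 0, +1 EV.
func captureBracket(output: AVCaptureStillImageOutput) {
    let connection = output.connectionWithMediaType(AVMediaTypeVideo)
    let biases: [Float] = [-1, 0, 1]
    let settings = biases.map {
        AVCaptureAutoExposureBracketedStillImageSettings
            .autoExposureSettingsWithExposureTargetBias($0)
    }
    output.captureStillImageBracketAsynchronouslyFromConnection(connection,
        withSettingsArray: settings) { sampleBuffer, stillSettings, error in
        // Called once per bracketed frame; collect the three frames and
        // feed them to an HDR merge step of your choosing.
    }
}
```

The HDR merge itself is not part of the API; the bracket only delivers the differently exposed frames.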

Saniul Ahmed:

PHAsset’s representsBurst property is true for assets that are representative of a burst photo sequence (multiple photos taken while the user holds down the shutter). It will also have a burstIdentifier value which can then be used to fetch the rest of the assets in that burst sequence via fetchAssetsWithBurstIdentifier(...).

The user can flag assets within a burst sequence; additionally, the system uses various heuristics to mark potential user picks automatically. This metadata is accessible via PHAsset’s burstSelectionTypes property. This property is a bitmask with three defined constants: .UserPick for assets marked manually by the user, .AutoPick for potential user picks, and .None for unmarked assets.
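Putting the two quoted properties together, a sketch of fetching a burst and filtering it down to the user’s and the system’s picks (using the Swift 1.x option-set idiom of that era):

```swift
import Photos

// Sketch: from a burst's representative asset, fetch every member of the
// sequence and keep those flagged by the user or auto-picked by the system.
func picksInBurst(asset: PHAsset) -> [PHAsset] {
    var picks = [PHAsset]()
    if asset.representsBurst {
        let options = PHFetchOptions()
        options.includeAllBurstAssets = true
        let burst = PHAsset.fetchAssetsWithBurstIdentifier(asset.burstIdentifier!,
                                                           options: options)
        burst.enumerateObjectsUsingBlock { object, _, _ in
            if let member = object as? PHAsset {
                let types = member.burstSelectionTypes
                if types & .UserPick != nil || types & .AutoPick != nil {
                    picks.append(member)
                }
            }
        }
    }
    return picks
}
```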

[…]

First, you need to register a change observer (conforming to the PHPhotoLibraryChangeObserver protocol) with the shared PHPhotoLibrary object using the registerChangeObserver(...) method. The change observer’s photoLibraryDidChange(...) method will be called whenever another app or the user makes a change in the photo library that affects any assets or collections that you fetched prior to the change. The method has a single parameter of type PHChange, which you can use to find out if the changes are related to any of the fetched objects that you are interested in.
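The registration-and-callback flow above can be sketched as a small observer class; `fetchResult` stands in for whatever you fetched earlier:

```swift
import Photos

// Sketch: observing photo-library changes that affect a prior fetch.
class LibraryObserver: NSObject, PHPhotoLibraryChangeObserver {
    var fetchResult: PHFetchResult

    init(fetchResult: PHFetchResult) {
        self.fetchResult = fetchResult
        super.init()
        PHPhotoLibrary.sharedPhotoLibrary().registerChangeObserver(self)
    }

    func photoLibraryDidChange(changeInstance: PHChange!) {
        // Ask the PHChange object whether our fetch was affected.
        if let details = changeInstance.changeDetailsForFetchResult(fetchResult) {
            fetchResult = details.fetchResultAfterChanges
            // Refresh the UI on the main queue here.
        }
    }

    deinit {
        PHPhotoLibrary.sharedPhotoLibrary().unregisterChangeObserver(self)
    }
}
```

Note that `photoLibraryDidChange(...)` is called on an arbitrary queue, so any UI work must be dispatched to the main queue.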

Sam Davies:

A user can chain incompatible photo edits together — if the adjustment data is not understood by the current extension, the pre-rendered image will be used as input. For example, you can crop an image using the system crop tool before using your custom Photo Editing extension. Once you have saved the edited image, the associated adjustment data will only contain details of the most recent edit. You could store adjustment data from the previous, incompatible edit in your output adjustment data, allowing you to implement a revert function for just your phase of the filter chain. The revert function provided by the Photos app will remove all the edits, returning the photo to its original state.
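The nesting trick described above — storing the previous, incompatible edit’s adjustment data inside your own — might be sketched like this. The format identifier and the archived dictionary keys are assumptions for illustration:

```swift
import Photos

// Hypothetical identifier for this extension's edit format.
let kFilterIdentifier = "com.example.my-filter"

// Sketch: wrap the previous edit's adjustment data inside our own, so a
// revert can restore just our phase of the filter chain.
func outputAdjustmentData(previous: PHAdjustmentData?, filterName: String) -> PHAdjustmentData {
    var payload: [String: AnyObject] = ["filterName": filterName]
    if let previous = previous {
        payload["previousData"] = previous.data
        payload["previousIdentifier"] = previous.formatIdentifier
    }
    let data = NSKeyedArchiver.archivedDataWithRootObject(payload)
    return PHAdjustmentData(formatIdentifier: kFilterIdentifier,
                            formatVersion: "1.0",
                            data: data)
}
```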

35 Years of Pac-Man

Chris Kohler (via Dave Dribin):

By creating a cute cast of characters and a design sensibility that appealed to wider audiences than the shoot-em-up Space Invaders, Iwatani broadened the appeal and marketability of games, creating what some call the first “casual game.”

[…]

“After that, I became a producer. Namco was a small company, and because the organization expanded, I was promoted to section chief. Someone had to coordinate the younger developers that we’d hired.

“So although I was still capable and wanted to keep developing games, I was told to serve as the supervisor — the manager of the baseball team, instead of a player.”

Corinne Segal (via Dave Dribin):

Today marks 35 years since Pac-Man debuted at a movie theater in the Shibuya area of Tokyo. Since then, the game has become one of the most popular of all time, producing more than eight other versions, a television series and more than 400 products. A few facts to think about the next time you’re playing Pac-Man at your local laundromat or on Google Maps.

[…]

Iwatani described his company’s reaction to the game in an interview with VH1 Games in 2007, saying: “I’m not sure if I should mention this or not. Well, um, the truth of the matter is, there were no rewards per se for the success of Pac-Man. I was just an employee. There was no change in my salary, no bonus, no official citation of any kind.”