Camera and Photos on iOS
On the iPhone, the aperture is fixed, so we can only adjust the ISO and the shutter speed. This lets us trade noise (driven by the ISO) against motion blur and sharpness (driven by the shutter speed) while maintaining the same level of exposure.
That explains why photos taken at night often look worse than those taken during the day: at night there is a lot less light. To keep an acceptable shutter speed, the auto exposure bumps up the ISO, probably to the maximum the camera allows. If even that doesn't let in enough light, it lowers the shutter speed as well. The combination produces an image that is both noisier and more blurred.
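To make the trade-off concrete, here is a minimal sketch, in iOS 8-era Swift, of taking manual control via AVFoundation's `setExposureModeCustomWithDuration(_:ISO:completionHandler:)`; the helper name and the bare-bones error handling are our own:

```swift
import AVFoundation

// Sketch: trade shutter speed against ISO manually instead of relying on
// auto exposure. `device` is the AVCaptureDevice backing the session.
func setCustomExposure(device: AVCaptureDevice, duration: CMTime, iso: Float) {
    var error: NSError?
    if device.lockForConfiguration(&error) {
        // Values outside the active format's ranges raise an exception, so
        // clamp the ISO (the duration should be clamped the same way against
        // minExposureDuration/maxExposureDuration).
        let format = device.activeFormat
        let clampedISO = max(format.minISO, min(iso, format.maxISO))
        device.setExposureModeCustomWithDuration(duration, ISO: clampedISO) { syncTime in
            // The custom exposure is in effect as of `syncTime`.
        }
        device.unlockForConfiguration()
    } else {
        println("Could not lock the device for configuration: \(error)")
    }
}
```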
The `AVCaptureSessionPresetPhoto` preset selects the best configuration for capturing a photo, i.e. it enables the maximum ISO and exposure duration ranges, the phase detection autofocus, and a full-resolution, JPEG-compressed still image output. However, if you need more control, the `AVCaptureDeviceFormat` class describes the parameters applicable to the device, such as still image resolution, video preview resolution, the type of autofocus system, and ISO and exposure duration limits. Every device supports a set of formats, listed in the `AVCaptureDevice.formats` property, and the proper format can be set as the `activeFormat` of the `AVCaptureDevice` (note that you cannot modify a format).
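As a concrete starting point, a sketch of selecting the format with the widest ISO range might look like this (iOS 8-era Swift; the helper name is ours):

```swift
import AVFoundation

// Sketch: choose the device format with the widest ISO range and activate it.
// Formats themselves are immutable; you select one rather than editing it.
func selectWidestISOFormat(device: AVCaptureDevice) {
    var best: AVCaptureDeviceFormat?
    for format in device.formats as [AVCaptureDeviceFormat] {
        if best == nil || format.maxISO - format.minISO > best!.maxISO - best!.minISO {
            best = format
        }
    }
    var error: NSError?
    if best != nil && device.lockForConfiguration(&error) {
        device.activeFormat = best!
        device.unlockForConfiguration()
    }
}
```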
[…]

New in iOS 8 is the option to move the lens to a position from `0.0`, focusing on near objects, to `1.0`, focusing on far objects (although that doesn’t mean “infinity”).
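A sketch of locking the lens at an explicit position (iOS 8-era Swift; the helper name is ours):

```swift
import AVFoundation

// Sketch: lock focus at a lens position between 0.0 (nearest) and
// 1.0 (farthest, which is not necessarily infinity).
func lockLensPosition(device: AVCaptureDevice, position: Float) {
    var error: NSError?
    if device.isFocusModeSupported(.Locked) && device.lockForConfiguration(&error) {
        device.setFocusModeLockedWithLensPosition(position) { syncTime in
            // The lens has finished moving as of `syncTime`.
        }
        device.unlockForConfiguration()
    }
}
```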
[…]

An interesting feature also introduced in iOS 8 is “bracketed capture,” which means taking several photos in succession with different exposure settings. This can be useful when taking a picture in mixed light, for example: configure three different exposures with biases of −1, 0, and +1, and then merge them with an HDR algorithm.
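A hedged sketch of such a bracket (iOS 8-era Swift), using `AVCaptureAutoExposureBracketedStillImageSettings` with the three biases mentioned above; the HDR merge itself is left out:

```swift
import AVFoundation

// Sketch: capture a -1/0/+1 EV exposure bracket from a running session's
// AVCaptureStillImageOutput.
func captureExposureBracket(output: AVCaptureStillImageOutput) {
    let biases: [Float] = [-1, 0, 1]
    let settings = biases.map {
        AVCaptureAutoExposureBracketedStillImageSettings.autoExposureSettingsWithExposureTargetBias($0)
    }
    let connection = output.connectionWithMediaType(AVMediaTypeVideo)
    output.captureStillImageBracketAsynchronouslyFromConnection(connection,
        withSettingsArray: settings) { sampleBuffer, bracketSettings, error in
        // Called once per frame in the bracket, in order.
        if sampleBuffer != nil {
            let jpegData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer)
            // Collect the three frames here, then merge them with your HDR algorithm.
        }
    }
}
```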
`PHAsset`’s `representsBurst` property is `true` for assets that are representative of a burst photo sequence (multiple photos taken while the user holds down the shutter). Such an asset also has a `burstIdentifier` value, which can then be used to fetch the rest of the assets in that burst sequence via `fetchAssetsWithBurstIdentifier(...)`.

The user can flag assets within a burst sequence; additionally, the system uses various heuristics to mark potential user picks automatically. This metadata is accessible via `PHAsset`’s `burstSelectionTypes` property. This property is a bitmask with three defined constants: `.UserPick` for assets marked manually by the user, `.AutoPick` for potential user picks, and `.None` for unmarked assets.
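A sketch of walking a burst sequence (iOS 8-era Swift; the function name is ours). Note the `includeAllBurstAssets` fetch option, without which only the picked assets are returned:

```swift
import Photos

// Sketch: given a burst's representative asset, fetch the whole sequence
// and check each frame's selection flags.
func enumerateBurst(representative: PHAsset) {
    if !representative.representsBurst {
        return
    }
    let options = PHFetchOptions()
    options.includeAllBurstAssets = true  // the default (false) returns picks only
    let burst = PHAsset.fetchAssetsWithBurstIdentifier(representative.burstIdentifier,
        options: options)
    burst.enumerateObjectsUsingBlock { object, index, stop in
        let asset = object as PHAsset
        let types = asset.burstSelectionTypes
        let userPicked = types.rawValue & PHAssetBurstSelectionType.UserPick.rawValue != 0
        let autoPicked = types.rawValue & PHAssetBurstSelectionType.AutoPick.rawValue != 0
        println("frame \(index): user=\(userPicked) auto=\(autoPicked)")
    }
}
```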
[…]

First, you need to register a change observer (conforming to the `PHPhotoLibraryChangeObserver` protocol) with the shared `PHPhotoLibrary` object using the `registerChangeObserver(...)` method. The change observer’s `photoLibraryDidChange(...)` method will be called whenever another app or the user makes a change in the photo library that affects any assets or collections that you fetched prior to the change. The method has a single parameter of type `PHChange`, which you can use to find out if the changes are related to any of the fetched objects that you are interested in.
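Put together as a minimal sketch (iOS 8-era Swift; the class and property names are ours):

```swift
import Photos

// Sketch: observe library changes that affect a fetch result we hold on to.
class AlbumViewModel: NSObject, PHPhotoLibraryChangeObserver {
    var assets: PHFetchResult = PHAsset.fetchAssetsWithMediaType(.Image, options: nil)

    override init() {
        super.init()
        PHPhotoLibrary.sharedPhotoLibrary().registerChangeObserver(self)
    }

    deinit {
        PHPhotoLibrary.sharedPhotoLibrary().unregisterChangeObserver(self)
    }

    func photoLibraryDidChange(changeInstance: PHChange!) {
        // Ask the PHChange whether our fetch result is affected. This is
        // called off the main queue, so dispatch before touching any UI.
        if let details = changeInstance.changeDetailsForFetchResult(assets) {
            assets = details.fetchResultAfterChanges
            // details.removedIndexes, details.insertedIndexes, and
            // details.changedIndexes describe the diff for UI updates.
        }
    }
}
```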
A user can chain incompatible photo edits together: if the adjustment data is not understood by the current extension, the pre-rendered image is used as input instead. For example, you can crop an image using the system crop tool before using your custom Photo Editing extension.

Once you have saved the edited image, the associated adjustment data will only contain details of the most recent edit. You could store the adjustment data from the previous, incompatible edit in your own output adjustment data, allowing you to implement a revert function for just your phase of the filter chain. The revert function provided by the Photos app removes all the edits, returning the photo to its original state.
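A hedged sketch of the two relevant pieces (iOS 8-era Swift; the format identifier, version, and payload keys are hypothetical):

```swift
import Photos

// Hypothetical identifiers for our extension's adjustment data.
let formatIdentifier = "com.example.my-filter"
let formatVersion = "1.0"

// Part of PHContentEditingController: if this returns false, Photos hands
// the extension the pre-rendered image rather than the original plus our
// saved settings.
func canHandleAdjustmentData(adjustmentData: PHAdjustmentData?) -> Bool {
    return adjustmentData?.formatIdentifier == formatIdentifier &&
           adjustmentData?.formatVersion == formatVersion
}

// Build our output adjustment data, embedding the previous (incompatible)
// edit's data so that our extension can revert just its own stage.
func outputAdjustmentData(filterSettings: NSData, previous: PHAdjustmentData?) -> PHAdjustmentData {
    let payload = NSMutableDictionary()
    payload["settings"] = filterSettings
    if let previousData = previous?.data {
        payload["previous"] = previousData
    }
    let archive = NSKeyedArchiver.archivedDataWithRootObject(payload)
    return PHAdjustmentData(formatIdentifier: formatIdentifier,
        formatVersion: formatVersion,
        data: archive)
}
```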