Nate Pfeiffer:
Portrait mode on the iPhone 7 Plus showed a lot of promise, but its flaws were plain to see. The depth maps it generated lacked fine detail and often assumed features like glasses or ears were part of the background, resulting in inaccurate blurring.
[…]
Let’s break down the iterative improvements that led to the 13 Pro being nearly indistinguishable from the Rebel T6 in our first example. First, the image processing brings out more fine detail and widens dynamic range, preventing highlights from overexposing. Second, the depth estimation does a better job of keeping the background blurred between small foreground objects and of cutting out foliage edges cleanly. Shallower bokeh simulation (which can be manually adjusted) and iOS 16’s foreground blur are the cherries on top.
[…]
Apple’s new, aggressive approach to processing faces is laid bare here. Ensuring even exposure on the skin takes priority above all else. In many ways, the 8 Plus’ photo is more realistic with regard to lighting, due to its contrasted look, while the 13 Pro’s photo just feels more perfect than reality.
The depth maps have really improved.
Via Nick Heer:
Portrait Mode has come a long way since its first iterations. […] That said, I still have not found Portrait Lighting very useful. It does not seem to have benefitted nearly as much from the significant investments in Portrait Mode.
John Voorhees:
So, what do you do if you’re in a shared library and want to join a different one? There’s a button in the Photos section of Settings to leave a library, so you can do so with one tap, saving all of the photos in the shared library to your personal library or keeping just those you originally contributed to the shared pool.
[…]
However, it was clear by this point that it was going to take hours, not minutes, to process 25,000 photos. Still, after only a half hour or so, several thousand photos had already been moved to the new library.
[…]
What I like most about the setup process is that it’s not a black box. Instead of flipping a toggle and hoping for the best, the multi-step process explains what will happen simply and thoroughly, building the user’s trust by explaining every option and the consequences of each. That’s critical when it comes to something as precious as your family photos. So, overall, the iCloud Shared Photo Library setup gets two enthusiastic thumbs-up from me. Every team at Apple that’s designing an onboarding process should look to this flow as one of the best on the company’s platforms.
Tim Hardwick:
However, in a footnote at the bottom of its iPhone 14 press release, Apple says “iCloud Shared Photo Library will be available in a future software update.” It therefore looks like the new Photos iCloud feature will not make it to the first official version of iOS 16.
Steve Landey:
There are three patterns I use in most of my UIKit projects that I’ve never seen anyone else talk about. I think they help readability a lot, so I’m sharing them here […]
[…]
Additionally, it’s not great to use force-unwrapped optionals to store anything. But if we use let instead, then all views will be created at init time instead of in loadView().
[…]
We can solve a lot of problems by moving all view creation to the bottom of the file and using lazy var.
The standard guidance is to use implicitly unwrapped optionals to reference views in a view controller, but I think lazy works much better. Aside from the delayed loading benefit, lazy lets you reference self, and thus other properties and helper methods, which you would not ordinarily be able to do from a Swift initializer. (The compiler does protect you from making circular references to other lazy properties.) And, because you are providing an initial value, the type of the property can be inferred.
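For illustration, here’s a minimal sketch of that pattern; the view controller and its views are hypothetical:

```swift
import UIKit

class ProfileViewController: UIViewController {
    // The closure can reference self (to set the button’s target) because
    // lazy properties are initialized on first access, after init completes.
    lazy var doneButton: UIButton = {
        let button = UIButton(type: .system)
        button.setTitle("Done", for: .normal)
        button.addTarget(self, action: #selector(done), for: .touchUpInside)
        return button
    }()

    override func viewDidLoad() {
        super.viewDidLoad()
        view.addSubview(doneButton)  // first access; the button is created here
    }

    @objc private func done() {
        dismiss(animated: true)
    }
}
```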
A downside is that there’s no lazy let, so the compiler can’t prevent you from accidentally mutating a variable that you meant to only set once. One way around this is to use a helper object with all the properties as let and to store that in the var, but I think the awkwardness this causes is worse than the problem it solves. Landey suggests:
You can at least prevent multiple writes to vars at runtime by using this simple property wrapper, which throws an assertion failure in debug builds if you write to the property more than once. It’s not as good as a compile-time guarantee, but it does double as inline documentation.
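His post includes the wrapper itself; a rough sketch of the idea (not his exact code) might look like this:

```swift
// Hypothetical write-once property wrapper. assert() is compiled out of
// release builds, so the check only fires in debug builds.
@propertyWrapper
struct WriteOnce<Value> {
    private var value: Value
    private var hasBeenWritten = false

    init(wrappedValue: Value) {
        self.value = wrappedValue
    }

    var wrappedValue: Value {
        get { value }
        set {
            assert(!hasBeenWritten, "Property was written more than once")
            hasBeenWritten = true
            value = newValue
        }
    }
}

final class Widget {
    @WriteOnce var title = ""  // a second assignment asserts in debug builds
}
```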
You can decide whether the protection is worth the noise. I’m more interested in something like this to prevent write access to conceptually immutable properties except from tests. However, that is more relevant for model properties, which often use Core Data, and property wrappers don’t work with @NSManaged. Core Data itself provides a more verbose way out, though: you can still set a property with a private setter by key.
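In other words (a sketch; the Item entity and its name attribute are hypothetical):

```swift
import CoreData

final class Item: NSManagedObject {
    // private(set) blocks ordinary writes from outside the class…
    @NSManaged private(set) var name: String
}

// …but a test can still reach the attribute through key-value coding.
func rename(_ item: Item, to newName: String) {
    item.setValue(newName, forKey: "name")
}
```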
Landey then recommends defining the properties at the top of the file and using helper functions at the bottom of the file to actually create the views. Personally, I prefer to see everything related to a property together at its declaration. If you use lazy and create the initial value right there, you can set up the whole property without having to repeat its name or type anywhere.
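Side by side, the two layouts look something like this (titleLabel is a hypothetical example):

```swift
import UIKit

// Landey’s layout: declaration up top, factory helper at the bottom.
class ArticleViewController: UIViewController {
    lazy var titleLabel = makeTitleLabel()

    // …lifecycle methods…

    private func makeTitleLabel() -> UILabel {
        let label = UILabel()
        label.font = .preferredFont(forTextStyle: .title1)
        return label
    }
}

// The alternative: the whole property, initial value included, in one place.
class ArticleScreenViewController: UIViewController {
    lazy var titleLabel: UILabel = {
        let label = UILabel()
        label.font = .preferredFont(forTextStyle: .title1)
        return label
    }()
}
```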
Rens Breur:
As clients of the SwiftUI framework, we can only create composed views. Composed views can have state, and can be seen as a function from this state to another view. Composed views use primitive views in their body. Primitive views are the building blocks of any type of view.
[…]
SwiftUI needs to perform diffing to find out how to change the view graph and how to update the layout tree. Primitive views play the most important role, and there are only a small number of structural views underneath which a view graph can change.
[…]
Remember the example in the introduction where it was ambiguous what the changes in a list of labels were? This is the same example! But the algorithm we created knows exactly whether a label’s text has changed or whether it was replaced with another label. It was able to do that by using static typing, and by making the view types themselves help determine what has changed.
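A toy sketch of those two ideas (not SwiftUI’s actual implementation): a composed view is a function of its state built from primitive views, and a diff defined per concrete type can never confuse “the text changed” with “the view was replaced by a different type.”

```swift
import SwiftUI

// A composed view: it holds state, and its body is a function of that
// state, built from primitive views.
struct Greeting: View {
    @State private var formal = false

    var body: some View {
        Text(formal ? "Good day" : "Hi")
    }
}

// A toy, statically typed diff: comparing two text nodes can only ever
// mean the text changed; a value of a different view type would not
// type-check against this function at all.
struct TextNode: Equatable {
    var string: String
}

enum TextChange {
    case unchanged
    case updated(from: String, to: String)
}

func diff(_ old: TextNode, _ new: TextNode) -> TextChange {
    old == new ? .unchanged : .updated(from: old.string, to: new.string)
}
```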