Thursday, April 18, 2019

Origins of the Apple Human Interface

Riccardo Mori (tweet):

Recently, the Computer History Museum has uploaded on its YouTube channel a lecture called Origins of the Apple human interface, delivered by Larry Tesler and Chris Espinosa. The lecture was held at the Computer History Museum in Mountain View, California, on October 28, 1997.

Being extremely interested in the subject myself, and seeing how apparently little thought is being given today to the subject, I wanted to quote a few selected excerpts from the talk, just to show what kind of hard work creating a user interface was back in the day when the Apple Lisa was being developed. It turns out that isolating this or that bit was futile, as the whole talk is made up of such cohesive, engrossing discourse. So I chose to transcribe it almost entirely, and add a few personal remarks here and there. I hope this turns out to be as interesting to you as it was to me.

I recommend watching the whole video. Mori’s transcription is a great companion that includes better images of the screen and context from a modern perspective.

A few parts I want to highlight:

[Larry Tesler is saying that at this stage of development of the Lisa interface, when you clicked on, e.g., the upward‐facing arrow, the content would move upwards too, in the same way “Natural scroll direction” works since its introduction in Mac OS X 10.7 Lion.[…]]


So we made a decision that had nothing to do with ease of use, nothing to do with ease of learning, nothing to do with error rates. It wasn’t a human factors decision at all in the traditional sense. It was a decision based on what customers liked.


And what I found was that the way we taught it made a lot of difference. You could take the same user interface and teach it in a different way, and people would get confused; or understand it; or make more mistakes; or fewer mistakes. And terminology made a difference also, so we then started a terminology project that Ellen Nold ran, which ended up with the FILE menu, the EDIT menu, etc., as you know today, and all the various commands that were in them. You know, choosing all the words for everything.


I remember very very very clearly that one of the massive controversies around the development for the Macintosh circa 1982–1983 was [that] developers would come up to us and say, You know, if you make the user interface consistent and if you put all that software in ROM that makes it— you know, if you make it hard to write to the screen directly, so that we have to use your user interface software to talk to the user, how are we ever going to make our applications unique and stand out and be different from each other in the marketplace?


[…] and so there’s constantly the dilemma (which you’ve seen historically in Mac system software) that the expert users want to put in the features they want to use, but the people who want to keep this system pure for the novices want to resist those.

And if you’re lucky, you get a system that is easy to approach for the novice, and gradually unfolds itself for the expert. And if you’re unlucky, you get a lukewarm mediocrity between the two, where it’s a little too complex for the beginning user to understand, but still not nearly powerful enough for the expert user.

[I think this is a perfect snapshot of the current situation with iOS.]

Update (2019-04-22): Colin Cornaby:

On the note at the end: I drafted a blog post a while back on how I wanted a macOS iPad. When I passed it around, I got unexpected feedback from a few people. While iOS isn’t complicated enough for people like me, iOS is already growing too complicated for novice users.

iOS sits in this weird place, especially on iPad, where it’s not really working for anyone quite right. And there is a feeling out there that making iOS more complicated will make it less accessible to users (which, to be clear, is not my personal preference.)


It saddens me how far Apple has strayed from this ethic. The more I dive into their modern apps, the more disappointed I become. What happened to being awed and pleasantly surprised by their design decisions, and the hidden power lurking below the surface?

I mostly like the new News+ service, since the price is good and there are many magazines that I enjoy, but the News app is just awful. I can’t believe this is a released product. It’s alpha-level at best. And of course there’s no other way to access the service. Today I discovered that when you “save” an article, it doesn’t actually save anything. If you try to access your saved articles while offline, the app tells you they can’t be displayed without an Internet connection. WTF? That’s a bookmark, not a “saved article”.

It seems like Apple is just hiring bottom-of-the-barrel app developers these days. Maybe their “pro” stuff is good (like Final Cut) — I don’t know because I don’t use it — but their consumer apps and services totally blow, especially considering how much money Apple has. Just look at how great the podcast app Overcast is (and its companion web service), compared with Apple’s podcast app. That’s from one guy!

How are the people at Apple who create this crap not totally embarrassed? Their quality lately reminds me of XP-era Microsoft.


I think it’s more likely to reflect management priorities and incentives than developer ability.


Case in point: the dreadful state of developer documentation isn’t due to hiring incompetent technical writers. Even incompetent technical writers don’t decide to mark everything “no longer being updated” and then stop adding new docs for new material.

It’s management that makes blunders that big.


Apple's pro apps can fail in interesting ways as well.

Last month I added a single video to a blank timeline, walked away from the computer for 15 minutes, and returned to find a frozen Mac, and a window saying that Final Cut was using ~63GB of memory. That's some memory leak!

A cursory search of the Apple Support Forums shows this issue has been around for years. Who knows how reproducible it is, but at the end of the day, it means I can't trust the software (or at least have to quit Final Cut whenever I have to step away from the computer).

