Tuesday, January 15, 2019

Save Changes Before Quitting?

Niko Kitsakis:

This “Mac-like” feeling was at the core of the classic Mac OS era. It’s what gave the Mac its legendary status and its place in history. And while the first versions of OS X broke with some conventions, things became better as OS X progressed. That is to say, until 10.7 came out and started a trend of questionable design decisions that has been continuing ever since.

But it’s not only Apple that seems to have forgotten its own roots in making good Human Interfaces, the rest of the software industry too seems strangely preoccupied with reinventing the wheel while making it worse with every iteration.

[…]

But unfortunately, these things do not only evolve, sometimes they devolve. Fast forward around 25 years to 2018 and you’ll find this in Adobe Premiere[…] No icon, no verbs and the unsafe option is right in between the safe ones.

Previously: The Lost Art of Legendary Apple UX.

Update (2019-01-17): Dave Mark:

The Mac design language was so powerful, and so widely adopted, that any app that did not follow the rules stood out like a sore thumb. Mac applications were instantly recognizable, and apps from outsiders tended to look ugly, in comparison, as those outsiders did not know the rules to follow.

Does the modern macOS and iOS app universe still hew to a common standard? Are Apple’s Human Interface Guidelines lost in the incredible complexity of application creation?

Update (2019-01-18): Niko Kitsakis:

My English-language version of Photoshop tells me in German that it can’t open .psd files because they are in “Adobe Photoshop preference file format”

14 Comments

It sounds like the author is blaming Apple for Adobe disregarding the HIG. In fact, Apple still follows the same dialog box conventions illustrated by the TeachText example.
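To make the convention concrete, here is a minimal sketch (not from the original post) of a “save changes” alert built with AppKit’s NSAlert. It still follows the classic rules the TeachText example illustrates: verb buttons instead of Yes/No, the safe default on the right, and the destructive “Don’t Save” kept well away from it.

```swift
import AppKit

// Minimal sketch of a HIG-style "save changes" alert.
// Buttons are laid out right-to-left in the order added:
// "Save" is the default (right-most), "Cancel" sits next to it,
// and the destructive "Don't Save" ends up far left.
let alert = NSAlert()
alert.messageText = "Do you want to save the changes made to the document?"
alert.informativeText = "Your changes will be lost if you don't save them."
alert.addButton(withTitle: "Save")        // default button
alert.addButton(withTitle: "Cancel")
alert.addButton(withTitle: "Don't Save")  // destructive option

switch alert.runModal() {
case .alertFirstButtonReturn:
    print("Save")
case .alertThirdButtonReturn:
    print("Don't Save")
default:
    print("Cancel")
}
```

Note how the API itself nudges developers toward the convention: the first button added becomes the default, and the standard layout naturally separates the unsafe choice from the safe ones — exactly the property the Premiere dialog lacks.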

For what it's worth, I think the changes to saving and quitting introduced in Lion are very much in the Mac spirit of intelligently streamlining some clunkier and more annoying aspects of file and window management. One big reason I avoid Office on the Mac is that Microsoft still insists on getting in my way with a Save dialog every time I close a file (including when I quit an app, which automatically closes all open files, despite my settings in the General section of System Preferences). Office apps also don't auto-terminate after all windows have been closed. Of course, Microsoft is the ultimate "Windows first" Mac developer, so none of this is a surprise.

Re: Anonymous ^^^

I've never liked the "Close the last open document, and the entire app closes" paradigm. I don't see the use in it -- it only gets in the way. What if I want to close a document and open another one? If I don't open the second document before I close the first one, the app quits? That's idiotic. For apps that aren't document-centric and only have one primary window, like System Preferences, it makes more sense. But not for MS Office.

Re: Anonymous

Hi, I’m the author of that text :-)

You say “It sounds like the author is blaming Apple for Adobe disregarding the HIG”

You might have read the text too fast, or you would have seen two places where I address exactly those issues:
1) “And while the first versions of OS X broke with some conventions, things became better as OS X progressed. That is to say, until 10.7 came out and started a trend of questionable design decisions that has been continuing ever since.”
2) “The dialogue box we looked at is not the main problem of course, but a symptom of a bigger trend. I can’t help but think, that many people who produce software like this simply don’t know what to look out for anymore.”

Maybe my writing was unclear, but the whole point of that piece was to pick one specific example which serves as a canary in the coal mine in regards to what is happening all around us in terms of bad interfaces. Adobe used to be a Mac-only developer, and even when they started to make Windows versions they were still a good Mac citizen. Ever since OS X, however, things have cooled down more and more between Apple and Adobe… You might remember that they took ages to port Photoshop to OS X. Back then, buying a Mac meant, in 80% of cases, buying a machine to run Photoshop on…

So showing how things have degraded in an Adobe application that runs on a Mac is not far-fetched at all as a reflection of the state of affairs on the Mac in general.

But regarding the issue you seem to be having: I might write another piece one day about the abysmal auto-save, which is an Apple feature. The crappy new Apple file system that produces bugs and misbehaviours for casual and pro users alike would be a candidate as well (Time Machine drives that alert the user to being full despite being almost empty – just because snapshots don’t work like they are supposed to, etc.)

However, those topics are rather more technical and I wanted to write about something which even a casual user – or graphic designer who uses Adobe applications every day – might appreciate and test quickly for themselves.

I hope this helps…

Re: Ben G

The way Automatic Termination works is not for an app to quit immediately upon the last or only window being closed (as in SysPref and other single-window utilities), but rather for it to quit only when all windows have been closed *and* the user switches to another app (implicitly signaling that they are done with that app and the OS can do what it needs to free up its allocated resources). If the user closes all windows and doesn’t switch away, the app keeps running to allow new windows to be opened.
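For reference, Automatic Termination is opt-in on a per-app basis. A sketch of the declaration (the key name is from Apple's documentation; an app can also suspend the behavior at runtime via ProcessInfo's `disableAutomaticTermination(_:)` / `enableAutomaticTermination(_:)`):

```xml
<!-- Info.plist: opt the app in to Automatic Termination,
     letting macOS quit it once all windows are closed and
     the user has switched to another app -->
<key>NSSupportsAutomaticTermination</key>
<true/>
```

This is why behavior differs app to app: developers who never set the key (Office, for instance) keep the old quit-manually model, while apps that opt in appear to "close themselves" once you move on.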

Re: Niko Kitsakis

Thank you for the thoughtful reply. It’s indeed lamentable re: Adobe, though the company’s former reputation as a premier Mac developer is ancient history by now; today’s Adobe bears little resemblance to the old one, and new companies like Pixelmator have emerged to take its place as proper Mac citizens. There are plenty of other Windows-centric developers — many of them small teams with very limited resources — who flout the HIG and do the bare minimum to port their apps to the Mac, but that’s been happening since at least the 90s and is just a natural consequence of Microsoft’s ubiquity on the desktop. The largest platform always gets the most attention (hence also the glut of Electron web apps).

Looking forward to your writing in the future.

>It sounds like the author is blaming Apple for Adobe disregarding the HIG

The problem isn't Adobe disregarding the HIG, the problem is Adobe just producing very, very badly designed applications, and making the most basic usability mistakes that I would have thought we'd rooted out decades ago.

>I've never liked the "Close the last open document, and the entire app closes" paradigm. I don't see the use in it

My guess is that this is based on real-world research done by Apple. I'm seeing a lot of people (the kinds of people who don't post comments on this blog) just never closing apps at all. They don't understand the concept of closing an app. They think in terms of documents, not apps (and they learned from iOS and Android that you don't have to quit out of apps). Auto-closing apps is perhaps not the best way this could have been addressed, but it's one way.

Re: Lukas

I agree with your first point.

On the second point however I’m with Ben G when he says “For apps that aren't document-centric and only have one primary window, like System Preferences, it makes more sense. But not for MS Office.” I would add “…or other apps”

For reasons that would be too complicated to go into here, I do a lot of comparisons between Western and Japanese culture. Most traditional craft items in Japan (water kettles, knives, sandals, etc.) are not what you would call optimally or ergonomically designed (from a Western perspective, mind you). They seem to stop at 50% of the way. These items seem to communicate something like “I take half a step towards you (the user), but you need to take half a step towards me.”

This leads to products that force you to learn a little about them in order to be able to use them. What it leaves the user with is a greater feeling of appreciation for the craft and (more importantly for our UX/UI example) a newfound power to use these items in a better way than their Western equivalents.

The example doesn’t hold 100% of course, but what I’m getting at is that – while I’m all for the “don’t make me think” approach – software which forces you just a bit to understand it works much better for you in the end than software that tries to do all the thinking for you. If the last document is closed but the application isn’t, then maybe people should put a wee bit of effort into understanding that a document does not equal an application.

Compare Final Cut X to Premiere (ignoring everything that sucks about Premiere for a moment). Final Cut X would be the fully-automated-put-a-2-inch-nail-in-a-brick-wall-machine. Premiere would be a hammer. The fully-automated-put-a-2-inch-nail-in-a-brick-wall-machine is very good for doing what it says but don’t you ever try to use 3 inch nails, another type of wall or even think about misappropriating it for anything else.
The hammer is harder to use initially but once you understand its flexibility you will hammer all sorts of nails into all sorts of walls and then (when creativity kicks in) you might even use the hammer on a chisel or in combination with a pocket knife to carve or whatever else you can think of.

For me, the basic underlying philosophy of the document windows and the nail-in-wall-tools is the same.

TLDR:
Some things you will never be able to teach people about. Don’t try unless you want to make the experience worse for everyone.

>This leads to products that force you to learn
>a little about them in order to be able to use
>them

Macs are tools. People use them to get things done, not to learn about operating systems. If they wanted to do that, they'd have gone with Linux instead. A doctor, for example, shouldn't have to understand how memory management works. A doctor should have to understand how the human body works, which is hard enough, I'd guess.

If I buy a tool and the maker of the tool intentionally and needlessly stops halfway at making it easy for me to use, I'd say that maker is hostile towards me, and doesn't deserve me as a customer.

Re: Lukas

I don’t agree, I’m afraid. I think you take my example too far. I’m not talking about learning the operating system on a level where you would have to be comfortable with using the terminal, etc.

To stay with my hammer metaphor: I’m talking about why the hammer is to be preferred to the automatic tool. You talk as if people should know how to forge a hammer. That goes too far.

Well, let me put it like this: some people do things as a hobby. Others have to do them as part of their job.

For example, I love 3D printers. I love printing things. It's not my job, it's my hobby. I'm fine with buying a $400 3D printer that only kind of works, and then investing my time into learning how it works, and fixing it so it prints well. On the other hand, I know industrial designers. They don't care about 3D printers, but they have to use them as part of their job. What they want to do is send a 3D file to the printer, and get a model 12 hours later. They don't want to learn how 3D printers work. That's why they pay $4000 for their printer, and then have somebody come in and fix things when they don't work.

The Mac is not a $400 3D printer where it's fine to ask its users to invest their own time into figuring out how things are supposed to work. The Mac is a $4000 3D printer that should just work, and if it doesn't, there's the Genius Bar that will fix things for you. If you want the computer equivalent of a $400 3D printer, you don't buy a Mac.

Most people don't buy Macs because they want computers to be their hobbies, they buy them because they need to get things done.

I don't want my dad to have to figure out how Apple's document model works. I want him to write articles for the newspaper he works for, that's his job, that's what he's good at. Not understanding why closing all windows of an application doesn't close the application, and why keeping applications in memory will fill the swap space on his SSD, or some such nonsense. I have to admit, it kinda breaks my brain when people suggest that it is somehow *good* if things are difficult to understand, that it is *good* if people have to learn extraneous, arcane implementation details that are completely irrelevant to their actual goals.

It's not.

If people *want* to learn those details, that's fine. Needlessly forcing them to learn these things is not.

@ Lukas:

I'm mostly with Niko here. Even if the Mac is a tool you would still have to learn something to be able to use it. And if that something is easy to learn and gives a lot of value it might be worth it and then you would be able to do things that would not be possible otherwise. We're not talking about learning the ins and outs of computers, just small steps.

Let's take the second mouse button as an example, giving a contextual menu. The Mac did not have it in the beginning and it made everything simpler, there's no arguing there. But over time most Mac users would agree, I think, that the needs and benefits of contextual menus outweigh the drawback of the small learning curve it requires. (Same goes for the scroll wheel. It adds even more complexity but you will quickly learn and appreciate it, and even your father, I suspect.)

Re: Adrian B

Spot on on paragraph one, thank you.

On paragraph two: it’s interesting that you should mention the second mouse button. I am using it as well, but I actually also like the first implementation by Apple, which was Control-click. The idea here was that you *modify* the click (which you already know) on a one-button mouse instead of introducing a second button. After all, Jobs once jokingly said that with only one button, it’s extremely hard to press the wrong button. This just as a thought.

Re: Lukas

It’s ironic that you should mention the examples that you did, because I always describe the Mac to people who don’t know me (and/or it) the following way: the PC and the Mac are both cars. The owner of the PC likes to spend his weekends in the garage beneath his car with an oily face and a wrench in his hand. The owner of the Mac likes to spend his weekends driving his car, picking up his girlfriend, and driving to a nice picnic location.

So you see, I don’t see the matter any differently from you in that respect. I wonder if you might hold a prejudice towards what I am saying because you may think me some sort of geek programmer who hasn’t got a clue about real-life work. After all, this is the second comment of yours that seems to go in that direction (no. 1: “Macs are tools. People use them to get things done,” no 2: “some people do things as a hobby. Others have to do them as part of their job.”)

I can assure you: I’m not programming at all. I don’t work in any other field in software and never have. I’m also not a hobbyist. I use Photoshop, Illustrator, After Effects, Premiere, and InDesign all day long to do mostly animations for temporary projects such as events and advertising for paying clients. My target media are mostly screens and my speciality is to make animations for LED boards with special resolutions (one project right now is 96px × 20480px) or animations that seamlessly flow over multiple screens that are installed physically apart from one another. That’s my work and it fills all my days. When it’s slow I have time for the blog and writing comments like this…

The fact, then, that I’ve written about this UI example (and why this probably won’t be the last time I do so) is simply because I grew up surrounded by Macs (as it says in my piece) and have therefore seen the “downfall” (if you will) of certain UI elements I see every day. Also, I like to know how things work and why.

I’ve written somewhere else: After the feather came the pen, then the typewriter, now the computer. No writing tool however is going to make you Shakespeare. For that you need (leaving talent aside for the sake of argument) enough drive to make you seek out guides and tutorials on how to write better, and an interest in literature in general. Every good writer is a voracious reader.

I think this is the last example I can come up with. If I haven’t gotten across what I mean by now, don’t expect me to be able to do so in the future…

I hope this helps, cheers.

> Even if the Mac is a tool you would still have to learn something to be able to use it

Exactly. So let's not artificially make people learn things that they don't absolutely *need* to learn. Computers are confusing enough as-is.

> No writing tool however is going to make you Shakespeare. For that you need (leaving
> talent aside for the sake of argument) enough drive to make you seek out guides and
> tutorials on how to write better

Again, exactly. Writers should learn how to write better. They shouldn't learn about what it means for applications to be kept in memory. Managing open applications is needless complexity. It's not an intrinsic aspect of the problem people are actually trying to solve. It's something their operating system needlessly imposes upon them. Learning about how applications work, and why and when you should manually quit them, doesn't make you a better writer. It takes away time you could use to do things that actually *will* make you a better writer.
