Monday, May 20, 2024

Swift FormatStyle Issues

Wade Tregaskis:

They’re terser than using their otherwise more powerful cousins, the Formatters, as they support a “fluent” style of property-based access, which tends to read more naturally and usually avoids having to define variables to hold the formatter.

[…]

They almost always break Xcode’s auto-complete, which is a problem since their syntax is non-trivial and unintuitive.

They’re hard to understand – and to even find in Apple’s official documentation – because there are so many protocols and so much indirection involved.

It’s particularly hard to tell where the inexplicable gaps are. e.g. Double doesn’t support ByteCountFormatStyle, even though logically it should and Xcode will sometimes auto-complete as if it does.

I haven’t used the new formatter API much because it isn’t available in the SDK that I’m targeting. I like that it’s terser and doesn’t require tracking a formatter instance. But it’s probably not terse enough that I would use it directly vs. via a more semantically named helper method. And I agree that it’s not actually that easy to use if you don’t already know what you’re doing.
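To make the comparison concrete, here’s a rough sketch (assuming a macOS 12 / iOS 15 or later deployment target) of the fluent call next to the equivalent NumberFormatter setup, plus the sort of semantically named wrapper mentioned above; the wrapper’s name is made up for illustration:

    import Foundation

    let price = 1234.56

    // Fluent FormatStyle (macOS 12 / iOS 15 and later): no formatter instance to keep around.
    let fluent = price.formatted(.currency(code: "USD").precision(.fractionLength(2)))
    print(fluent)  // e.g. "$1,234.56" in a US locale

    // The Formatter equivalent needs a configured object.
    let formatter = NumberFormatter()
    formatter.numberStyle = .currency
    formatter.currencyCode = "USD"
    formatter.maximumFractionDigits = 2
    print(formatter.string(from: NSNumber(value: price)) ?? "")

    // A semantically named wrapper, so call sites don't repeat the fluent chain.
    // (The property name here is made up for illustration.)
    extension Double {
        var formattedAsUSDPrice: String {
            formatted(.currency(code: "USD").precision(.fractionLength(2)))
        }
    }

    print(price.formattedAsUSDPrice)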

Wade Tregaskis:

Alas, they don’t always work correctly; some of these formatters contain egregious bugs.

In particular, ByteCountFormatStyle pretends to support multiple numeric bases – decimal and binary – but it doesn’t[…] Note how it still uses decimal units, “kB”. Decimal is not binary. I mean, duh, right? But apparently Apple don’t know this.

NSByteCountFormatter behaves the same way. I don’t think it’s a bug so much as Apple deciding to never display binary prefixes even though it is intentional about calculating memory sizes as binary and file sizes as decimal.
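A minimal way to see the behavior both are describing, assuming a recent SDK (the printed strings are what the post reports):

    import Foundation

    let bytes: Int64 = 1_048_576  // exactly 1 MiB

    // FormatStyle version: the .binary style divides by 1,024,
    // yet the unit label that comes out is still the decimal "MB", not "MiB".
    print(bytes.formatted(.byteCount(style: .binary)))

    // ByteCountFormatter (NSByteCountFormatter in Objective-C) does the same thing.
    let byteFormatter = ByteCountFormatter()
    byteFormatter.countStyle = .binary
    print(byteFormatter.string(fromByteCount: bytes))

    // Both print "1 MB" rather than "1 MiB".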


4 Comments

I mean… I think it *is* a bug. At this point it’s not really a matter of opinion; it’s just wrong: "kB" means "kilobytes", which is 1,000 bytes. "KiB" is similarly clear as "kibibytes", which is 1,024 bytes.

Pretending that this is not the case - that for some reason we should keep using "k" when we mean x1,024, like it's still 1980 - is just weirdly obstinate.

I suppose you can argue semantics - which is really beside the point - as to whether something done with _intention_ is still a "bug" or just stupid, but either way it's really frustrating. Especially from a platform provider that should be setting a good example (and mindful of the outsized influence their frameworks have on software broadly).

@Wade It’s a bug with respect to the newish IEC and NIST standards, which I wish Apple would follow, but I don’t think the binary prefixes have really entered common use, and they certainly aren’t universal among more technical folks. It’s hard to change meanings after the fact. I guess we could say that Apple is being descriptive rather than prescriptive.

I understand that argument; I just (respectfully) disagree. I don't see any practical benefit to taking that position, only downsides.

And I realise you (Michael) are not necessarily taking that position, just positing about Apple's apparent position. I'm only annoyed at Apple about this, not anyone else; I hope I didn't imply otherwise.

I feel like 25 years is long enough for something this simple and easy to adopt to actually be broadly adopted. 🤷‍♂️

Also, tangentially, in my post I originally included a pithy line about most programmers today being younger than binary prefix standardisation, but some research - while revealing wildly varying numbers from ~30 to ~50 - suggested that the median active programmer age is not quite as low as 25 yet. But maybe not far off - I found one survey that put it at about 32. In any case, it won't be long before these standard units have existed since before most programmers were born. It's very likely true that they've existed since before most current programmers learnt to program.

@Wade Yeah, I’m not agreeing with Apple, but I can imagine that they probably think it would confuse people without really adding value, since non-developers probably never work with the actual number of bytes of RAM, nor compare RAM and disk sizes. As you say, Apple could certainly help make the prefixes mainstream if they wanted to. But this is the company that no longer tells you the GHz (and sometimes even the RAM capacity) of their products.
