Tuesday, December 7, 2021

The State of External Retina Displays

Casey Liss (tweet):

In [effectively] 2022, there are four options for retina-quality monitors to attach to your Mac.

[…]

Only two are 5K or greater. The LG still seems to have problems, and the Apple Pro Display XDR costs $6,000.

Over the last year or two, Apple has been doing a phenomenal job of filling the holes in their product line. For my money, the completely embarrassing monitor situation is the lowest-hanging fruit. By a mile.

Jack Wellborn:

While I’d also love to see Apple release a not ridiculously expensive but still very expensive monitor, I don’t think that addresses the problem. Before Retina was a thing, Casey and I could choose from a variety of QHD displays. Ideally we’d have that same variety with 5K displays.

When I wrote about this, I noticed that macOS and Windows treat my 27” 4K display differently. By default, macOS uses the display’s native resolution while Windows scales to pseudo 5K. I wonder if there might have been a better display market had Windows not pretended 4K was 5K.

Here’s what I wrote (an attempt to explain why 4K 27” displays aren’t great).

Dan Moren:

It remains quite surprising that there isn’t an option for those who can’t afford and don’t need the $6000 reference-monitor quality of the Pro Display XDR to pair with a MacBook or Mac Pro, even two and a half years after Apple released its foray back into the external monitor market. Even the iPad can connect to external displays, though its utility remains a bit limited.

Josh Centers:

Macs don’t provide HiDPI (or Retina) scaling for sharp text on monitors with less than 4K resolution, including those with a 1440p resolution (2560 by 1440 pixels), and existing workarounds for Intel-based Macs don’t work with M1-based Macs. Macworld’s Jared Newman highlights BetterDummy, a clever utility that addresses this limitation in a roundabout way. It lets you trick macOS by mirroring the contents of a fake 5K display of the right aspect ratio onto your actual 1440p screen.
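A rough sketch of the arithmetic behind that trick (illustrative numbers only; BetterDummy’s actual implementation is more involved):

```python
# Why mirroring a fake 5K display onto a real 1440p panel yields HiDPI text.
physical = (2560, 1440)   # native pixels of a 27-inch QHD monitor
virtual = (5120, 2880)    # the dummy 5K display created in software

# macOS treats the virtual display as Retina: the UI is laid out at
# half the pixel dimensions (the "looks like" size) and rendered at 2x.
looks_like = (virtual[0] // 2, virtual[1] // 2)
assert looks_like == physical  # same logical size as the real panel

# The aspect ratios must match, or the mirrored image would distort.
assert virtual[0] / virtual[1] == physical[0] / physical[1]

# Mirroring then downsamples the 2x-rendered framebuffer 2:1 onto the
# physical panel, so text is rasterized at high resolution and scaled
# down, rather than drawn at 1x with no subpixel antialiasing.
print(virtual[0] / physical[0])  # 2.0
```

The aspect-ratio requirement is why the fake display has to be 5K specifically for a 1440p screen: any other logical size would either letterbox or stretch when mirrored.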

Fernando Cassia (via Hacker News):

Nobody can explain it better than the guy behind the code. So we decided to chat with him so he can tell us more about his project, where he thinks Apple could improve, and why Intel-based Macs are more flexible when it comes to supporting non-Apple monitors, among other things.

Previously:

Update (2021-12-13): Matt Birchler:

I’m using the LG 27” IPS 4K UHD Monitor (model #: 27UP600-W.AUS) which cost me about $399 and technically falls outside of Casey’s criteria, but I think works wonderfully. It’s 4K, supports the P3 color space, and has the inputs I need for my (relatively) basic needs.

It also happens to look a lot better, in my opinion, than the budget option in the article.

Update (2021-12-28): See also: Hacker News.

John Gruber:

This is why Apple needs to make its own prosumer-priced external display (or even better, displays) — it’s clear no one else is making them other than LG, and the LG displays aren’t great.

Update (2022-01-05): Matthias Gansrigler:

Which left me with the LG UltraFine 27UN880-B. And I have to say: it’s a choice I don’t regret one bit. I love it.

Unfortunately, it’s only 4K.

Update (2022-01-07): Tom Brand:

Say what you will about 5K vs 4K monitors, but ever since we switched from LG 5K displays and Caldigit Thunderbolt docks to Dell 4K USB Type-C displays, the kernel panics went away.

21 Comments

The LG UltraFine 5K was also removed from sale in Europe about a year ago. There’s effectively no option here apart from Apple’s overpriced 6K.

An iMac 5K screen without the internals is all we need, nobody asked for that 6K monitor :(

Ugh.

I'm not surprised at all that Apple has no offering in this area. There's no price tag that would satisfy both them and consumers. Monitors are a commodity. The price tag floated on ATP the other day was $2k, which is bananas for a monitor. I don't care if it's 120 Hz, mini-LED, 1500 nits, has nano-texture glare reduction and is 6K; that's still way more than almost any Mac user is ever going to spend on a monitor. The Ultrafine's price is already extremely premium. Folks, $400 is already high-end for a monitor. You can get a perfectly workable one for $120. And a 4K one for $250. But Apple isn't going to do a $120, or $400, or even $800 monitor. And they've realized that, and so they haven't done any monitor at all. Instead, they did the weird LG partnership.

So that part doesn't bug me. So what if Apple doesn't do it?

The part that bugs me is that third parties haven't picked up the slack either, and the reason that has happened is that Apple went its own way with how they handled HiDPI, in two respects:

1) they effectively killed off subpixel rendering. Various portions of this were removed in recent macOS releases. This makes much of the rendering pipeline simpler and was clearly considered a legacy codepath, and it never worked correctly with Core Animation anyway, because it was implemented on the CPU, rather than the GPU. But it means text looks like garbage on non-Retina screens now. Thanks, Apple. This decision was made without any concern for what MacBooks in offices actually look like. They get connected to one or two $100-200 24-inch 1080p screens via HDMI (hey, Apple has re-discovered that port), and then operated either with the lid shut or at a weird angle on a stand. And then everything looks wrong.

2) they only support integer scaling. This is much simpler to implement than the Tiger/Leopard-era experiments where you could enter fractional scale factors, but it also dramatically reduces your choices for good screens. On Windows, pick any screen, then set the font scaling to 100%, 125%, 150%, 200%, or 250%. You get the idea. This makes it perfectly feasible to get a 24-inch or 28-inch 4K display, rather than the 21 inches Apple would recommend, and simply set the scaling not to exactly 200%, but slightly less. Yes, many apps are still glitchy with anything that isn't exactly 100%, but it's gotten better, and there's an end in sight.
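To make that contrast concrete, here is a small sketch (the numbers are illustrative) of how a fractional scale factor maps a native panel resolution to usable UI space:

```python
# "Looks like" resolution for a given Windows-style scale factor.
def looks_like(native_w, native_h, scale_pct):
    return (round(native_w * 100 / scale_pct),
            round(native_h * 100 / scale_pct))

# A 27-inch 4K panel at the scale factors Windows offers:
for pct in (100, 125, 150, 175, 200):
    print(f"{pct}% -> {looks_like(3840, 2160, pct)}")

# 150% yields 2560x1440 of UI space -- the classic 27-inch layout --
# while still rendering at the panel's full 4K sharpness. macOS only
# renders pixel-perfectly at 1x (tiny UI) or 2x (huge UI); the
# in-between "scaled" choices go through framebuffer downscaling.
```

This is why a 27-inch 4K panel is a comfortable fit under Windows but an awkward one under macOS.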

With macOS, I don't see the light at the end of the tunnel. Instead, what I see is hubris and misunderstanding of how Macs are used. Yes, it's nice that Apple puts a lot of work into great displays. It is! They're gorgeous for both photo/video viewing and editing, I'm sure. But so many jobs out there — including software development — don't involve looking at pictures all day. They involve text. And for text, what on earth is Apple's story right now? That you buy a $3K MacBook Pro and then, on top of that, a $1,300 LG display, which by all accounts isn't even that great? They can't be serious, right?

This was somewhat forgivable in 2012 when the Retina MacBook Pro launched, but they really need this figured out by now. And no, a $2k Apple IAmRich Display XDR is not the answer.

I'll once again throw out my suggestion for how I feel the problem would best be solved: Apple needs to put its architectural expertise where its mouth is and do two things:
#1 Every new iMac that Apple ships should have an embedded Apple TV and be able to take DisplayPort input on its Type-C ports. This is non-negotiable if Apple wants to continue to beat its chest over its environmental bona fides. The community MUST make Apple make this happen; no iMac should again ship that cannot become a Reused (of "Reuse, Repair, Recycle" fame) monitor/TV until its LCD panel goes dim/dies.
#2 Apple should produce a value-driven replacement motherboard that Apple Authorized Repair Providers (and, perhaps, even the new Parts program) can install into the chassis of 2014 and up 5K and 4K iMacs, using the existing power supply and screen, to provide Apple TV and Type-C Alt Display Port input. For less than $500, these machines should gain rescue from landfills and e-waste piles in far-off lands.
These are both such common-sense approaches, IMHO, that it boggles my mind that Apple has not produced them. That Apple has not stands as a stark (again, IMHO) reminder that Apple today is a company 180° opposite the culture of the Apple of yore. The Reality Distortion Field is now just "Reality". I do not believe in the concept of Manifestation… but Apple is probably as close as I've ever seen the trick "work".

> Folks, $400 is already high-end for a monitor.

Not really. Try to price a quality 32” 4K monitor with Thunderbolt. You’re looking at $1200 minimum.

I’m pretty happy with my 34” LG: https://taoofmac.com/space/blog/2021/08/21/1600 - which is Retina-like and has Thunderbolt power delivery.

I do think that Casey and the rest are unwitting victims of a growing trend towards mid-to-low res, high display rate displays to cater to the gaming crowd (which is where most of the disposable income seems to be, and who are also going through GPU shortages that make it hard to justify going HIDPI instead of lower latency). But from my research prior to buying mine, there were (pre-announced) Samsung and LG displays coming that simply did not materialize, and the models I was looking at have already gone through a few hardware revisions over the years.

HIDPI monitors that are “Mac-grade” are just not enough of a priority for third-party manufacturers, but they do exist outside the brackets Casey laid out…

I've had decent luck with the LG Ultrafine 5k monitor; I have several, because of different locations I work in and the people I share them with in those places. Of five, one fried for unknown reasons, and when I took it in to be repaired, the repair shop noted they've gotten a few of them in with a similar problem. They *strongly* recommended not using the power delivery feature of the monitor - in other words, plug a power supply directly into your Mac. They think that frequent plugging/unplugging of the Thunderbolt cable, along with delivering 90 W of power, causes many of the issues they see, FWIW. And LG now says, "For optimal performance on the 16-inch MacBook Pro, connect to power using the MacBook Pro’s 96W adapter."

It would be nice to have other alternatives, though.

Kevin Schumacher

> Not really. Try to price a quality 32” 4K monitor with Thunderbolt. You’re looking at $1200 minimum.

As I said last time this came up (and was pooh-poohed), for anything approaching a normal consumer, $400 is high end. There’s a bubble of people here that won’t blink at a $1,200 monitor. That’s more than a good 65” TV costs. And it’s vastly more than 99%+ of users are going to spend on a monitor, which is who Sören was referring to.

Old Unix Geek

Ironic that this story came up. I spent some of yesterday testing out a new M1 mac for work. Connected it via HDMI to my 3008WFP Dell monitor... since the new one I ordered is "being produced" for the next month. Both the menu and the dock are cut off, which is very un-useful. Theoretically there is an "underscan" slider in system preferences' display panel, but it doesn't appear, no matter what sacrifices I offer Tim Cook. I guess he's too busy making his own offerings to the Chinese government.

Then I learned that if no underscan slider appears, one can hack the /var/db/.com.apple.iokit.graphics file. But there's no such file on my brand new M1.

Supposedly SwitchResX could fix this, but then again, no luck: most options stay greyed out, because "Apple Silicon".

Of course, my mini-display-port cable that works with my not-that-old mac mini also doesn't work because we've got to be fancy and use USB-C. Next step, I suppose, I'll try to order a new fancy USB-C to DisplayPort cable.

At worst, I guess I'll use this Mac via VNC, if that still works. And if that gets too annoying, I'll use it to compile this project and see whether I can hack it to get the screen to show me the menu bar and dock...

But, pardon my language, WTF??? Frankly, this is way worse than my experience on Linux was 30 years ago. Not even an xvidtune equivalent to let people work around Apple's incompetence. I thought Apple was supposed to be "just works". This is a mac mini... it doesn't come with a screen. Even a $40 Raspberry Pi using a set-top box video chip can drive a screen properly.

Anyway, if anyone has another workaround, I'm all ears...

@Soren & @Kevin: I agree, monitors today are commodity (<$400).

>Not really. Try to price a quality 32” 4K monitor with Thunderbolt. You’re looking at $1200 minimum.

Let me put the problem a different way: to use a keyboard well with a Mac, I don't need to buy a $150 Apple keyboard. Any $25 keyboard will do, and there's huge selection. Depending on your preferences, the way your hands are shaped, etc., such a keyboard may in fact work better for you than Apple's. I also don't need to buy a special printer to use AirPrint. A lot of printers support that now. Or a special Wi-Fi access point for AirDrop support. Any will do.

But to look at text (text!!) in macOS the way Apple intended, I have the choice to use a MacBook's internal display, or to buy from a very small selection of high-priced external displays. There is no middle segment for the masses there. You're either OK with a subpar experience (in which case, kind of a bummer that you got a Mac, huh?), or you're about to spend a lot of money.

If your $1200 display makes you happy, great! But there's no way I can tell my boss, "oh, you know those displays we usually get? Yeah, I'm gonna need a premium version of that, and by that I don't mean twice the price, but about six times".

Now, you might say, "yeah, but it's been that way for two decades! Why is this a problem now?" — that's not true. For a decade and a half, macOS had excellent support for subpixel rendering. It no longer does. So it's not just that Retina continues to be a premium experience you can choose (with a big wallet); it's that they made non-Retina a lot worse than it used to be.

> If your $1200 display makes you happy, great! But there's no way I can tell my boss, "oh, you know those displays we usually get? Yeah, I'm gonna need a premium version of that, and by that I don't mean twice the price, but about six times".

I get that. I was approaching this from the point of view of a consumer who already plunked down upwards of $1,000 of their own money for an M1 MacBook Air, or $2,000+ for a Pro. Unless that person intends to use the MacBook exclusively as a desktop computer, I would think they’d want the enormous benefits of Thunderbolt 3, allowing them to use their monitor as a rudimentary docking station/hub, elegantly connected with a single cable. And to access that feature in a larger 4K monitor, the price of admission is currently well north of $1,000.

For me the value of this far exceeds that of an incrementally higher resolution 5K display. And while I don’t have perfect eyesight, my eyes are still good enough that I only need to wear glasses for reading very small print at very short distances.

@ Old Unix Geek: so, have you played with BetterDummy?

(I tried it myself for entirely different reasons, and alas, it appears to play poorly with DisplayLink. Oh well.)

@ Freediverx:

>I was approaching this from the point of view of a consumer who already plunked down upwards of $1,000 of their own money for an M1 MacBook Air, or $2,000+ for a Pro.

Sure, but look at it a different way: you start out with "between a Dell, a Microsoft Surface, and a MacBook Pro, I think the MBP is really nicest, and the price isn't that different". But then you're shopping for a monitor, and your selection of decent choices on the Mac side are just slimmer.

> Unless that person intends to use the MacBook exclusively as a desktop computer, I would think they’d want the enormous benefits of Thunderbolt 3, allowing them to use their monitor as a rudimentary docking station/hub, elegantly connected with a single cable.

OK, but a few comments above, Nick Fabry advises against using that feature. Like, it sounds quite nice (if it works reliably), sure, but is it really that much nicer to justify such a steep price? Instead, you can:

a) plug in MagSafe, a $50 USB-C hub for most of the ports, and plug in HDMI on the other side. Or a pricier USB-C hub that already comes with HDMI. Now you have two or three cables instead of one, and you still have a "dock" in the sense that almost all your cables on the desk can just be permanently plugged into that.
b) leave MagSafe and instead plug in a $100 USB-C hub with sufficient PD (those typically come with HDMI, too). Just one cable, and any monitor you like.

Yeah, once you get into Thunderbolt territory, things become more expensive, but most people don't actually use or need that. It's great that Apple provides the option, but I wouldn't be surprised if only about 10% of MacBook Pro customers actually make use of Thunderbolt vs. mere USB-C.

(In fact, for all the flaws of the 2016 MacBook Pros, it's really nice how flexible those ports are, and how many third-party choices you get.)

@Sören: no, not yet. It's kinda hard to do much, including using Xcode, if one can't see the menu bar, or the top entries of the menus... I'm waiting first to see whether a USB-C to DisplayPort cable will work.

While I'm still unhappy about Apple abandoning sub-pixel rendering with Mojave, I personally benefitted from it: It helped me convince our IT department to make 4K displays the new standard for Mac users, while PC users are still stuck with 1080p screens.

Are 5K screens superior to 4K ones? Of course, but the image sharpness of my LG 27UL850-W is still exorbitantly better than that of any regular 1080p display. That even holds true if you run it at a non-integer scaled virtual resolution like 2304 x 1296 -- which I did for some time until I decided that things being slightly bigger at 1080p was actually not a bad thing.

USB C to mDP cable works perfectly for 60 Hz 4K from M1 to my Dell P2415Q. Been using it for donkey's years, back to my 2013 MacBook Pro. It’s not a high end monitor by any means but desktop retina in a small size (24 inch) and budget was what I needed then and all these years since. Think it was £400 when new. Lower midrange as far as I’m concerned (I recall the nineties…) which the design, build and finish all agree with.

I run it from my M1 Air nowadays via CalDigit TS3. Single cable to the Mac (use it as my desktop and charger) and rock solid ports aplenty instead of relying on a display’s hub. The Dell’s USB 3 hub is okay but you have to power cycle the display quite routinely on hooking up a host (bad Dell, bad!) which naturally drops your USB devices just to get a bloody picture. Separate Thunderbolt or USB hub is by far the better way. Works a treat as the only connection I ever need.

I don’t consider myself high end by any means. No chance I was buying one of those MacBook Pros, as my Air is just superb for me already. Apple’s XDR is as likely to show up on my desk as a racing car in the drive! But someday, if they ever make one of those iMac displays without the iMac, I could, I might. Kinda expect it to cost the same with or without the integrated computer, mind!

> I wonder if there might have been a better display market had Windows not pretended 4K was 5K.

Perhaps, but to my eye, Windows does a much better job with 150% (non-integer) zoom on a 27" 4K monitor, than the Mac. Lots of system text and icons are pixel-perfect, afaict. Really the only time I notice I'm not on an integer scale is when using old Win32 installers.

For my eyes, having most things seem pixel-perfect and a handful of things being blurrier is a more pragmatic approach than the Mac's approach of nothing being pixel-perfect at 4K/27". And as discussed above, the Windows approach opens the door to much more affordable monitors (a used 27"/4K/IPS/matte display often goes for under $300 on ebay).

Also, text on Windows generally looks great on standard DPI monitors, so long as you're not using UWP apps (which are still few and far between). No longer being able to use my existing standard DPI monitors for legible text rendering was one of many reasons I switched off of macOS last year.

4K at 27″ is pushing it but is by no means unusable. I ran dual side-by-side 4K displays with 2x (pixel-perfect) scaling for several years until the pandemic hit. Everyone focuses on that 220ppi number without really taking viewing distance into account. Push the displays a little further back and utilize text scaling, browser zoom, and other affordances to make up the difference. It’s better, in my eyes, than non-native HiDPI or 1080p and 1440p standard DPI at any size, regardless of OS.
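The viewing-distance point can be put into rough numbers using pixels per degree of visual angle (a sketch only; the distances are assumptions, and what counts as "Retina" in angular terms varies by source):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixel density of a panel from its resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

def ppd(panel_ppi, distance_in):
    """Pixels per degree of visual angle at a given viewing distance."""
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return panel_ppi * inches_per_degree

# A 27-inch 5K viewed at a typical 20 inches, vs. a 27-inch 4K
# pushed back to 28 inches:
print(round(ppd(ppi(5120, 2880, 27), 20)))  # ~76 ppd
print(round(ppd(ppi(3840, 2160, 27), 28)))  # ~80 ppd
```

Moving the lower-density panel roughly eight inches further away yields a comparable angular density, which is the commenter's point about distance mattering as much as the raw 220 ppi figure.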

That said, I agree that the real solution involves either (a) adopting non-integer scaling in the OS without relying on framebuffer scaling/non-native resolutions, or (b) the hardware market getting with the program. And part of pushing the hardware market forward starts with Apple offering 24″ 4.5K and 27″ 5K displays, even if the price is more than most people are willing to pay.

>And part of pushing the hardware market forward starts with Apple offering 24″ 4.5K and 27″ 5K displays, even if the price is more than most people are willing to pay.

So far (9 years into the Mac Retina transition!), that doesn't appear to be happening. I just don't think Apple offering a display of their own will move the needle much, especially not if it's four figures. If anything, it might backfire. LG will probably discontinue their option, since they don't really care to compete against Apple in that space, and others will point to Apple and say, "they have one of those; we have different displays".

The PC monitor market, unusually, has developed in an entirely different direction. The enthusiasts care about higher framerates, not higher resolutions (if anything, they're counterproductive for performance reasons). There's overlap in terms of color depth, though.

Windows does offer (non-integer) higher DPI, but mostly on internal laptop/tablet displays. Which leads to all kinds of funny bugs when moving from one display to another (if they have different DPIs); we're still quite far from a situation where that works seamlessly. (Including, still, some Microsoft apps, and also major third-party apps like Acrobat Reader.) It's theoretically supported in Windows, and they keep making improvements, but there's a lot of legacy code to comb through. Assumption A: the DPI will always be roughly 96. Assumption B: the DPI will always be the same after Windows login. Assumption C: OK, but the DPI will always remain the same while the app is running, right?

I used to be really happy with my new monitor. Could someone tell me why the Lenovo ThinkVision T27p-10 (27", 3840 x 2160, 16:9) isn't good enough?

I'm not doing full time video editing or photo editing, but from what I gather neither are most of those that complain about monitors for their macs.

@ Kristoffer: the problem is that Apple's UI is designed with a certain correlation between logical pixels and physical millimeters in mind. Your screen, at its native resolution, isn't close to that, so you no longer get "WYSIWYG", if you will. You can either set it to Retina, and everything will appear physically too large, or non-Retina, and everything will appear physically too small.

See the chart at https://bjango.com/articles/macexternaldisplays/

(Whether this is a *huge* problem is up for debate.)
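A quick illustration of that mismatch (the 22-point menu-bar height is a nominal figure for the sake of example, and macOS's design targets of roughly 110 and 220 ppi are approximate):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

def menu_bar_mm(panel_ppi, scale):
    """Physical height of a nominal 22-point menu bar, in millimeters."""
    points = 22
    return points * scale / panel_ppi * 25.4

# The Lenovo's 27-inch 4K panel sits between the 1x and 2x sweet spots:
lenovo = ppi(3840, 2160, 27)             # ~163 ppi, between ~110 and ~220
print(round(menu_bar_mm(lenovo, 1), 1))  # at 1x: ~3.4 mm, physically small
print(round(menu_bar_mm(lenovo, 2), 1))  # at 2x: ~6.8 mm, physically large
print(round(menu_bar_mm(ppi(5120, 2880, 27), 2), 1))  # 5K at 2x: ~5.1 mm
```

On a ~218 ppi 5K panel, 2x rendering lands UI elements at the physical size Apple designed for; at ~163 ppi there is no integer scale that does.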

Thanks! Very illuminating. Thankfully it's an issue I can easily ignore.

> LG will probably discontinue their option

Someone from that HN thread claims the LG 5K is already discontinued: “Sadly it's out of production, everything you can find (if you can find it) is new-old stock.” [1]

> The enthusiasts care about higher framerates, not higher resolutions

Anecdotally I’ve noticed a growing contingent of PC users lamenting the state of desktop monitors. LG offers 4K OLED TVs with Dolby Vision, 120Hz VRR, etc from 48″ and up. Likewise, Apple offers mini-LED, XDR, ProMotion, etc at tablet/laptop sizes. But even the best consumer desktop monitors aren’t anywhere within striking distance of TVs and mobile device displays. Not sure what my point is… I guess even PC users who aren’t affected by macOS pixel density issues aren’t happy?

I also don’t think an entry-level 4.5K Apple display would need to cost 4 figures. Clearly the panels are good enough for Apple, and available at sufficient volume to use in a $1299 computer. But for some reason you can only get that display with a computer grafted to it.

[1] https://news.ycombinator.com/item?id=29469837#29470396

