Archive for April 26, 2022

Tuesday, April 26, 2022

Studio Display 15.5 Firmware Beta

Juli Clover:

Apple in March promised a firmware fix for the Studio Display to address an issue with the webcam, and according to an Apple spokesperson who spoke to The Verge, the firmware beta released today does indeed fix the webcam problem.


The firmware brings updated “camera tuning, improved noise reduction, contrast, and framing.”

Jason Snell (tweet):

This video is from two Studio Displays, one placed directly in front of the other. One display has the original shipping firmware, the other updated firmware. Judge the differences for yourself!

Jason Snell (tweet):

In general, I’d say the new firmware generates a better picture. A lot of that is down to the fact that it seems to prefer a wider crop. That’s good, because it means it’s using a larger portion of the Center Stage camera’s 12 megapixel image. More pixels should equate to a better image.

I’m not sure I have figured out how the Center Stage framing algorithm may have been tweaked. An advantage of the new, wider crop that Center Stage seems to prefer is that it requires less panning in general—a slight shift in posture isn’t nearly as dramatic when there’s more room around a face in the frame.

James Thomson:

Comparing the 15.5 (1st pic) and 15.4 (2nd pic) firmware for the Studio Display camera. There’s a lot less noise, and a touch more contrast, but it’s still quite washed out compared to the iMac Pro camera (3rd pic, taken last month).

The iMac Pro one—using a 2 MP camera from 2017 without the benefit of an A13—is so much better. If this improved version is what Apple intended, it seems like they made a terrible design choice with the Studio Display.


Update (2022-04-27): Nilay Patel (tweet):

It’s not going to blow anyone’s mind, but it’s definitely not as immediately broken-looking as before. There’s a little more detail in my face, less noise in the background, and the colors are definitely more accurate. (I look like I have actually seen the sun in the past year.) It does seem like Apple’s decision to put a wide-angle lens on this camera for Center Stage support is working against image quality here — no matter what, it’s cropping down that sensor, giving the whole system less data to work with.

Juli Clover:

In practice, if you watch our video up above, there are noticeable quality updates, but the difference is subtle. Colors are not as pale, there’s improved contrast, and the overall look is a bit more vibrant. Depending on lighting, there isn’t a huge difference in sharpness, but the changes seem to be an overall net improvement.

Filipe Espósito:

When Apple announced Studio Display, it promised “sensational” webcam quality. However, as customers got their hands on the product, they noticed that the images captured by the built-in camera were not good. Apple is now rolling out beta software that promises to fix some of these issues – but the thing is, Studio Display’s poor webcam quality is not a software bug after all.


So while an iPhone is capable of taking a real 12-megapixel selfie, Center Stage cameras capture images at 12 megapixels using the ultra-wide lens and then digitally crop them to look like a regular photo or video. This process results in less-sharp images.
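The cost of that crop can be made concrete: pixel count falls with the square of the linear crop factor. A minimal sketch, using an illustrative crop factor (Apple does not publish the actual Center Stage crop):

```python
SENSOR_MP = 12.0  # Studio Display camera: 12 MP ultra-wide sensor

def effective_megapixels(crop_factor: float) -> float:
    """Megapixels remaining after a centered digital crop.

    crop_factor is the linear ratio between the full field of view and
    the cropped one (e.g. 2.0 keeps half the width and half the height),
    so the pixel count shrinks by crop_factor squared.
    """
    return SENSOR_MP / crop_factor ** 2

# A hypothetical 2x crop leaves only a quarter of the sensor's pixels:
print(effective_megapixels(2.0))  # 3.0
```

Even a modest crop quickly erodes the nominal 12 MP advantage, which is the trade-off Patel and Espósito describe.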

Mark Gurman:

In hindsight, it appears Apple knew this all along. Their original response: “We’ll be making improvements in a future software update.” The word “fix” is never mentioned. There were some sites that claimed adamantly it wasn’t a hardware issue though. Appears it is.

Steve Troughton-Smith:

I don’t understand how the biggest camera company in the world thinks it’s okay to put such mediocre cameras in its multi-thousand-dollar products.

Jason Snell:

Should it just be a standard 4K webcam that doesn’t move instead of a CS camera with all those issues? Probably. But someone at Apple fell in love with that CS hardware.

James Thomson:

I recorded video directly from the Studio Display with 15.5 (1st pic) and the iPad Pro (2nd pic), and they are definitely closer, though I think the iPad Pro is still a bit better contrast-wise.

Mike Rundle:

I’m still shocked that Apple produced this. It’s impossible that Apple tested it thoroughly and saw a completely different quality before it hit the manufacturing line: they knew it looked that bad during various stages of early development and gave it the green light.

Nick Heer:

Apple is usually great at making cameras perform well and its marketing emphasizes the new camera system in this display. Something is going very wrong here.

James Thomson:

[There’s] totally enough information in the output from the Studio Display to do a better job of rendering my face (I manually tweaked colours in Photoshop).

Maybe there’s still hope with some ML or even just basic calibration options?

Jasper Hauser:

What’s utterly weird to me is that the histogram clearly shows that the white balance is pretty off. It’s almost as if the iOS camera team was in no way involved in leveraging their image processing knowledge. Why?

Update (2022-04-28): Carolina Milanesi:

So I did a quick video comparison with my DSLR and the #AppleStudioDisplay after the update. A pretty clear improvement that serves most needs for a day spent in video meetings.

Infinite Mac

Mihai Parparita (tweet, Hacker News):

I’ve extended James Friend’s in-browser Basilisk II port to create a full-featured classic 68K Mac in your browser. You can see it in action at or


At this point I switched my approach to downloading pieces of the disk image on demand, instead of all upfront. After some false starts, I settled on an approach where the disk image is broken up into fixed-size content-addressed 256K chunks. Filesystem requests from Emscripten are intercepted, and when they involve a chunk that has not been loaded yet, they are sent off to a service worker who will load the chunk over the network. Manually chunking (as opposed to HTTP range requests) allows each chunk to be Brotli-compressed (ranges technically support compression too, but it’s lacking in the real world). Using content addressing makes the large number of identical chunks from the empty portion of the disk map to the same URL. There is also basic prefetching support, so that sequential reads are less likely to be blocked on the network.
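The chunking scheme described above can be sketched in a few lines. This is an illustrative reimplementation, not Parparita’s code, and it assumes SHA-256 as the content-addressing digest (the post does not name the hash function):

```python
import hashlib

CHUNK_SIZE = 256 * 1024  # fixed-size 256K chunks, as in the post

def chunk_disk_image(image: bytes) -> tuple[list[str], dict[str, bytes]]:
    """Split a disk image into fixed-size, content-addressed chunks.

    Returns the ordered list of chunk digests (a manifest describing the
    image) and a digest -> bytes store. Identical chunks, such as the
    empty portion of the disk, hash to the same digest and collapse to a
    single stored blob, so they map to the same URL and are fetched once.
    """
    manifest: list[str] = []
    store: dict[str, bytes] = {}
    for offset in range(0, len(image), CHUNK_SIZE):
        chunk = image[offset:offset + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        manifest.append(digest)
        store[digest] = chunk  # deduplicates identical chunks
    return manifest, store

# A mostly empty "disk" dedupes to just a few unique chunks:
image = b"boot" * 1024 + b"\x00" * (CHUNK_SIZE * 4)
manifest, store = chunk_disk_image(image)
print(len(manifest), len(store))  # 5 chunks in the manifest, 3 unique
```

On the serving side, each digest becomes an immutable URL, which is what lets the service worker cache aggressively and lets identical chunks be requested only once.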

Along with some old fashioned web optimizations, this makes the emulator show the Mac’s boot screen in a second, and be fully booted in 3 seconds, even with a cold HTTP cache.


While Emscripten has an IDBFS mode where changes to the filesystem are persisted via IndexedDB, it’s not a good fit for the emulator, since it relies on there being an event loop, which is not the case in the emulator worker. Instead I used an approach similar to uploading to send the contents of a third ExtFS “Saved” directory, which can then be persisted using IndexedDB on the browser side.

It includes BBEdit 2, CodeWarrior 4, GraphicConverter, Hotline, KPT Bryce, Netscape, and lots of games.


The Story of iBeer

Quinn Myers (via Nick Heer):

In the year leading up to the App Store’s debut, Apple sought out developers to create software that put the iPhone’s prowess on display. One of those developers was down-and-out, 37-year-old magician Steve Sheraton. “Apple started scouting for developers, and they approached me because I’d made a YouTube video where I made the phone look like a glass of beer,” Sheraton recalls. “They wanted me to make an app out of that because they obviously thought it would show off the phone pretty well.”

To be sure, Sheraton’s path to creating an app that would be featured in tandem with the new App Store was several years in the making. The idea itself harkened back to Sheraton’s career in magic. “Anything that uses visual effects to cause shock or humor is up my alley,” he tells me. “I built the very first iteration of this mechanism for the Palm Pilot, called E-spresso, which turned the little monochrome screen into a cup of coffee — but because it didn’t have an accelerometer, I just made it a video that you could time with your drinking motion.”


Before the App Store was even a concept, Sheraton started selling the beer-drinking video file for $2.99. “It was just a little video file that people had to hardwire in and download via iTunes,” he says. “But I probably made around $2,000 a day for the longest time from that.”

Dave Mark:

A Reddit AMA with the creator of one of the first iPhone apps that went viral and made millions.

Video Conferencing Apps Send Audio When Muted

Jason Daley (PDF, via Tristan Greene):

They used runtime binary analysis tools to trace raw audio in popular videoconferencing applications as the audio traveled from the app to the computer audio driver and then to the network while the app was muted.

They found that all of the apps they tested occasionally gather raw audio data while mute is activated, with one popular app gathering information and delivering data to its server at the same rate regardless of whether the microphone is muted or not.

The researchers then decided to see if they could use data collected on mute from that app to infer the types of activities taking place in the background. Using machine learning algorithms, they trained an activity classifier using audio from YouTube videos representing six common background activities, including cooking and eating, playing music, typing and cleaning. Applying the classifier to the type of telemetry packets the app was sending, the team could identify the background activity with an average of 82% accuracy.
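The classification step can be illustrated with a toy sketch. This is not the researchers’ pipeline (their features come from real audio telemetry and a trained ML model); it just shows the shape of the idea, using made-up two-dimensional summary features and a stdlib-only nearest-centroid classifier:

```python
import math
from collections import defaultdict

def train_centroids(samples):
    """samples: list of (label, feature_vector). Returns label -> centroid."""
    grouped = defaultdict(list)
    for label, features in samples:
        grouped[label].append(features)
    return {
        label: [sum(col) / len(col) for col in zip(*vectors)]
        for label, vectors in grouped.items()
    }

def classify(centroids, features):
    """Return the label whose centroid is nearest in Euclidean distance."""
    return min(centroids, key=lambda label: math.dist(centroids[label], features))

# Fabricated training data: two summary features per clip
# (say, loudness and spectral variability) for three activities.
training = [
    ("typing",  [0.2, 0.8]), ("typing",  [0.3, 0.9]),
    ("music",   [0.9, 0.4]), ("music",   [0.8, 0.5]),
    ("cooking", [0.5, 0.1]), ("cooking", [0.6, 0.2]),
]
centroids = train_centroids(training)
print(classify(centroids, [0.25, 0.85]))  # typing
```

The unsettling point of the study is that even coarse statistics like these, leaked while muted, were enough to identify background activity with 82% average accuracy.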


Update (2022-04-27): See also: Hacker News.

Update (2022-04-29): See also: Bruce Schneier.

Update (2022-05-09): Rogue Amoeba:

With SoundSource’s Input muting, you can be certain your mic is muted.