Tuesday, June 23, 2020

Apple Silicon

Apple (also: TidBITS, MacRumors, Hacker News):

Apple today announced it will transition the Mac to its world-class custom silicon to deliver industry-leading performance and powerful new technologies. Developers can now get started updating their apps to take advantage of the advanced capabilities of Apple silicon in the Mac. This transition will also establish a common architecture across all Apple products, making it far easier for developers to write and optimize their apps for the entire ecosystem.


To help developers get started with Apple silicon, Apple is also launching the Universal App Quick Start Program, which provides access to documentation, forums support, beta versions of macOS Big Sur and Xcode 12, and the limited use of a Developer Transition Kit (DTK), a Mac development system based on Apple’s A12Z Bionic System on a Chip (SoC).

Apple plans to ship the first Mac with Apple silicon by the end of the year and complete the transition in about two years. Apple will continue to support and release new versions of macOS for Intel-based Macs for years to come, and has exciting new Intel-based Macs in development.

I think this is going to be great in the long run (except for those who use abandoned apps or need Intel virtualization), but I’m not looking forward to the transition period or to converting all my apps on such a short schedule.
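The "common architecture" pitch concretely means shipping universal binaries that carry both x86_64 and arm64 slices in one Mach-O fat file. As a minimal sketch (not Apple's tooling — in practice you'd just run `lipo -archs` or `file`), here's how one might read which slices a classic 32-bit big-endian fat header advertises; thin binaries and the 64-bit fat format are ignored:

```python
import struct

FAT_MAGIC = 0xCAFEBABE  # big-endian magic for the classic fat format
CPU_TYPE_NAMES = {
    0x01000007: "x86_64",  # CPU_TYPE_X86 | CPU_ARCH_ABI64
    0x0100000C: "arm64",   # CPU_TYPE_ARM | CPU_ARCH_ABI64
}

def fat_archs(data: bytes) -> list:
    """Return the architecture names listed in a Mach-O fat header."""
    magic, nfat = struct.unpack(">II", data[:8])
    if magic != FAT_MAGIC:
        return []  # thin binary or some other format
    archs = []
    for i in range(nfat):
        # Each fat_arch entry: cputype, cpusubtype, offset, size, align
        cputype, _, _, _, _ = struct.unpack(
            ">IIIII", data[8 + 20 * i : 28 + 20 * i]
        )
        archs.append(CPU_TYPE_NAMES.get(cputype, hex(cputype)))
    return archs
```

Running `lipo -archs` on a universal binary reports the same information.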

Eric Slivka:

The program requires a brief application, with limited availability and priority given to developers with an existing macOS application. It costs $500 and includes access to beta software, developer labs, a private discussion forum, technical support, and other resources.

I applied yesterday. It took a while, as the site was hammered.

Apple (via Hacker News):

Rosetta can translate most Intel-based apps, including apps that contain just-in-time (JIT) compilers. However, Rosetta doesn’t translate the following executables[…]
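A process can ask whether it is currently being translated by Rosetta via the `sysctl.proc_translated` sysctl key. A small sketch — it assumes macOS, simply reports false elsewhere, and uses `sysctl -i` so that older macOS versions lacking the key return nothing rather than an error:

```python
import platform
import subprocess

def running_under_rosetta() -> bool:
    """Return True if this process is being translated by Rosetta 2."""
    if platform.system() != "Darwin":
        return False  # Rosetta only exists on macOS
    out = subprocess.run(
        ["sysctl", "-in", "sysctl.proc_translated"],
        capture_output=True, text=True,
    )
    # "1" = translated, "0" = native, "" = key absent (pre-Big Sur)
    return out.stdout.strip() == "1"
```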


Making a big deal of virtualization still being there is necessary, but the way it was presented gave the (false) impression that virtualizing Intel operating systems on Apple silicon would be possible.


OpenGL support will be present-but-deprecated from the start, which essentially means the full OpenGL stack (beyond OpenGL ES) is available.


Being able to use XPC to support Intel and ARM plugins separately is inspired.
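The pattern: keep the host app native and move Intel-only plug-ins into a separate XPC (or otherwise out-of-process) helper whose executable runs under Rosetta. A hypothetical sketch of choosing the launch command for such a helper — the helper path and parameter names are made up, but `arch -x86_64` is the real mechanism for forcing a universal binary's Intel slice:

```python
import platform

def helper_argv(helper_path: str, plugin_arch: str = "arm64") -> list:
    """Build the argv for an out-of-process plug-in host (hypothetical).

    On an Apple silicon Mac, prefixing with `arch -x86_64` asks macOS
    to run the helper's Intel slice under Rosetta 2, so a native host
    app can still talk to Intel-only plug-ins over XPC.
    """
    argv = [helper_path]
    if plugin_arch == "x86_64" and platform.machine() == "arm64":
        argv = ["arch", "-x86_64"] + argv
    return argv
```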


Don’t assume a discrete GPU means better performance. The integrated GPU in Apple processors is optimized for high performance graphics tasks.

This is a bold claim. Apple isn't just ditching Intel; it's taking on AMD and Nvidia, too.

I hope Parallels or VMware figures out how to make it work. I hate using Windows, but a few times a month I absolutely need it, either to configure some music hardware I own that doesn't have a Mac app, or to test how my website looks on Windows when I make updates. I've seen some people suggest using a cloud instance of Windows, but I don't see how that would work for configuring hardware that's sitting on my desk. Can cloud-virtualized Windows access the USB ports on my Mac as if they were connected to the remote computer? I can't imagine that's possible...

> I think this is going to be great in the long run

Perhaps a year ago I would have agreed, but right now, as competition between AMD and Intel is heating up again, this looks like a bad move. I don't see how Apple will be able to compete at the high end. This will be good for light notebooks, but terrible for everything else.

> Don’t assume a discrete GPU means better performance

I'm sure what Apple will be able to do in two years will easily be competitive with AMD's current offerings, but there's a reason discrete GPUs are so much faster than integrated ones, and that reason will still be true in two years. Prediction: Nvidia's mid- to high-end cards in two years (and probably AMD's as well) will be an order of magnitude more powerful than what Apple will offer by that same time.

>I hope Parallels or VMWare figures out how to make it work

They won't. Remember how "awesome" Intel emulation was on PPC Macs? It's probably going to be even worse now.

I get why Apple is doing this, and I'm sure it'll work out well for their bottom line, but for the people who actually use their devices, this is just bad on all fronts.

(Although now that I think about it, the reason most people have powerful GPUs is gaming. But this will probably pretty much end ports of PC games to the Mac, will make it impossible to boot into Windows to play these games, and will instead result in more mobile games being ported to the Mac. Whatever high-end gaming still existed on Macs will cease to exist. So maybe the lack of powerful GPUs will only be relevant to niches like graphic artists, for whom Apple already doesn't offer suitable hardware.)

> Don’t assume a discrete GPU means better performance. The integrated GPU in Apple processors is optimized for high performance graphics tasks.

It's a bit of a moot point, since you can't use PyTorch, Caffe, TensorFlow, or Theano for GPU compute on Apple hardware. At WWDC '18, some Apple developers claimed they were working on a Metal compute shader backend for TensorFlow. It never happened.

Imagine being a professional game developer confronted with the choice between SpriteKit or writing your own 3D game engine from scratch in low-level Metal. That's the situation scientific users find themselves in when confronted with Create ML and Metal compute shaders.

Scientific computing is a profession Apple has chosen not to cater to. Much like education, it gets lip service for branding reasons, but Apple doesn't follow through with software development and maintenance.

Scott Boone

I’m still not sure I understand why the DTK required a Mac mini enclosure. The Mac mini main board is pretty small. Most of the internal space is taken up with the power supply, cooling fan, and space for DIMMs and storage… all things we’d been led to believe an Apple ARM wouldn’t need. Certainly the iPad Pro with the same A12Z doesn’t need that stuff.
I really saw the "future" of low-cost pervasive computing being the Apple TV. This move somewhat diminishes my hopes that Apple "gets" it. If the next $200 Apple TV gets an A12X but not the ability to run macOS 11, my suspicions will be confirmed: profit above all else.

(What we get in the base iMac will be telling, too. Does Apple take the opportunity to embrace componentization and the "Reuse" part of "Recycling" to align with its "green" PR, or does it continue on the current path of planned obsolescence of expensive, highly non-recyclable chassis?)

Old Unix Geek

If you're wedded to Apple, this seems great. If you're not, it isn't. But the ultimate goal is to bind those who are wedded to Apple even more tightly.

For instance, as an iOS dev, your code should work exactly as it does on next-generation iPhones/iPads, making dev cycles faster and profiling easier. When developers use Apple's APIs instead of cross-platform code, Apple can accelerate various bits of functionality by adding hardware to its next silicon, helping it beat Moore's law. But of course, software is the complement to hardware, and companies maximise profit by commodifying their complements. Hence all the free apps on the iOS App Store. Bringing those to the Mac will commodify Mac apps... resulting in most Mac developers starving, without a way out, since their knowledge and software are so tied to macOS.

If you're not wedded to Apple, because you're programming OS-independent C++, targeting servers or games, or doing scientific programming... you are being shown the door. As it is, tools like valgrind or ghc had to "be updated" each time a new OS came out. Many ML frameworks rely on CUDA or AVX. It probably won't be worth continuing to port them. So I anticipate a parting of ways for all those relying on third-party ecosystems. The dream of the best of Unix and the best of a consumer environment seems to be dying. Choose one, or the other.

>If you're wedded to Apple, this seems great

I'm actually genuinely surprised by the positive reaction (e.g. Gus: "Anyway, the more I think about the ARM transition, the more I get excited about it"). From the consumer's point of view, there are very few advantages to this, and a lot of disadvantages. It's not like Apple is going to pass the savings on to us.

>The dream of the best of Unix and the best of a consumer environment seems to be dying

Linux is actually excellent these days (e.g. Pop!_OS), and WSL pretty much makes Windows as good a Unix as Mac OS.

@Lukas Yeah, I would love to be surprised, but my guess is that the ARM Macs are not going to be any less expensive.

@Scott Boone

"I’m still not sure I understand why the DTK required a Mac mini enclosure."

Apple is still being raked over the coals by developers for giving its notebooks just four identical ports. And these are desktop systems, which require more ports, aimed at developers, who need more ports (dual or triple monitors, for instance) than regular users do. Putting a Mac inside an ATV case would not leave anywhere near enough room for ports.

Also, as with the last transition DTK Apple released, they used the simplest, cheapest enclosure on hand that required minimal engineering effort. An ATV enclosure would have thermal constraints they'd have to work around, while with the Mac mini box they can just slap a logic board in there and feel confident the stock fan will do the job.

>> I’m still not sure I understand why the DTK required a Mac mini enclosure.

It doesn't. It's an existing enclosure for a machine that will never ship, meant only for developers to test on (hence Developer Transition Kit), containing an off-the-shelf processor intended for iPad Pros. Apple said all this. Developers aren't even purchasing this hardware; they're borrowing it: signing the application and accepting the hardware shipment means you're promising to send the DTK back by a certain date.

Apple's not going to go to any extra effort to build an enclosure for it. The DTK for the Intel transition was a Power Mac G5 enclosure.
