Catching Native Apps
Daniel Jalkut, in 2010:
If you imagine a world where the sum of all things you can do with a computer is exactly matched, and locked down for all time with what you can do inside a browser, then the arguments for the web are persuasive. Why write for a specific platform when you can write for all platforms at once and gain the other advantages as well?
The error is in disregarding the many unmatchable attractions of “the desktop.”
[…]
But if I want to write a truly great app, it has to be a desktop app. And this will be true forever, or until there is no difference between the web and the desktop.
Apple fixed the hardware problems with the Mac, now they must address software. We need M1-level software platform differentiation, and three competing app frameworks won’t create it. Are they even aware how tentative their footing in consumer software is? They’re not showing it.
12 years ago, I wrote “Can’t Catch Me”, wherein I proclaimed with confidence that the Mac would continue to outpace web platforms. That cockiness presupposed a much greater level of commitment from Apple than we’ve seen.
Since then, Apple has slowed the pace of improvements to the frameworks for writing native Mac apps. It added technical (sandboxing, TCC, SIP, kernel extension restrictions) and policy (App Review) roadblocks that make it harder to develop apps that go beyond what can be done with Web technologies. Apple switched to an annual release cycle, increasing the proportion of time native developers spend testing and working around bugs. For the most part, that doesn’t affect Electron apps, which are insulated from the OS with a layer of middleware, or apps that don’t take advantage of OS-specific features. And it doesn’t affect apps that run purely in the browser. So it has the effect of holding back the types of apps that push the envelope, that increase the distance between Web and desktop.
Meanwhile, Apple is no longer leading by example, at least not in a good way, as its recent Mac apps have been Catalyst ports or weird hybrids that feel more Web or iOS than Mac. Former role model apps were rewritten for iOS, then brought back to the Mac, losing features and desktop-oriented design in the process.
Automation has been a major platform-specific advantage. We once hoped for a successor to AppleScript; now we are grateful that it is at least still on life support. Automator never got much follow-through. Shortcuts for Mac is finally here but is currently rough and lacking some capabilities of the iOS version. The Mac’s Unix layer has been withering, and built-in scripting languages are being removed. Developer tools used to come free on a CD with the OS. Now, you need a paid account to ship an app that isn’t accompanied by a malware warning, and even then you have to upload each build to Apple first. Web app developers don’t need permission to deploy their code.
Apple stopped maintaining an online directory of Mac apps, so it’s harder for customers to find what’s available if it’s not in the Mac App Store. The more distance there is between your app and what a Web app could do, the less likely it is to be allowed in the store. (Even for apps in the store, browsing is more difficult than with the old directory.) Apple also stopped offering affiliate commissions on apps, reducing the incentives for third-party coverage that would help people find a Mac-only app. Web apps, however, get to share marketing across multiple platforms, and they don’t have to pay Apple 30%.
In short, it feels like the distance has closed somewhat since 2010. This is partially because Web technologies got better, but also because of inattention and poor incentives from Apple.
To head off any critics who might ask, “OK, smartass, what would YOU do to improve the Mac as a platform?” I say: I don’t know, I look to historic innovators like Apple for that. I would probably start by picking three intrinsic advantages of web apps and strategize against them.
Never gonna happen, but:
- provide means for updating/crash catching of non-MAS apps
- provide means for paying/licensing of non-MAS apps
- make cross-device document storage suitable for shoebox-style apps the default (indexing, full text search, conflict handling just work)
[…]
Isn’t it ironic how the Mac App Store promised a quick, secure, and easy way for developers to get their apps discovered, installed, and paid for—and it turned out to be the exact opposite with its sandboxing requirements, malware infestation, and bogus review process?
Uggh, Electron. I’m now getting bug reports for my Mac app that the keybindings don’t work like Windows.
This past summer we narrowly avoided a major user interface regression on Apple devices. The story ended well, but I think it’s important to look back on the situation and ask a simple question:
Why did this happen in the first place?
My answer is something I call “consistency sin”. Understanding the cause lets us avoid similar situations in the future.
Previously:
- Music.app and TV.app Use JET in macOS 12.2 Beta
- The Persistent Gravity of Cross Platform
- Apple Execs on the Mac App Store
- 10th Anniversary of the Mac App Store
- Sketch on Native Mac Apps
- Catalyst and Cohesion
- Desktop Apps Post-Catalyst
- Scripting Languages to Be Removed
- Mojave’s rsync From the Days of Tiger
- Is There Hope for the Mac App Store?
- Electron and the Decline of Native Apps
- Apple Removes Apps From Their Affiliate Program
- An Aging Collection of Unix Tools
Update (2022-01-14): See also: Hacker News.
Update (2022-01-17): Steve Troughton-Smith:
I think a big part of what Apple needs to do is give AppKit apps an onramp to the future of Apple’s universal app ecosystem. SwiftUI is decidedly not that, and I think that’s a huge mistake. The Mac is going to be minimized further when Apple’s next major OS/platform ramps up
My 2¢:
- Documentation: Apple’s is the worst of any platform I use.
- Bugs: fix ’em. Also, the secret bug tracker is stupid.
- @SwiftLang: make it good on every platform; nobody wants an(other) OS-specific language.
Update (2022-01-24): Francisco Tolmasky:
I think the history of tabs serves as a fascinating case study of how Apple’s neglect for its own UI frameworks assisted the rise and acceptance of cross-platform frameworks like @electronjs and the corresponding decline in the importance of “nativeness” and “the HIG”.
[…]
Apple did eventually end up adding an API for tabs to AppKit, but not until 2017, 12 years after they shipped their first tabs in Safari! By that point, it was more of a pain to convert legacy code than anything, and it wasn’t helped by a woeful lack of documentation.
For a long time, the best resource on how to use AppKit’s tabs was a single WWDC video. And of course, being new, it lacked many of the features existing apps had already implemented.
[…]
In the last decade, macOS has slowly been allowed to degrade into an environment where there are increasingly fewer downsides to counteract the benefits of going non-native, and that’s not on developers. It’s been a long time coming, and it isn’t going to be easy to fix.
Update (2022-01-25): Colin Cornaby:
Apple has made the Mac experience much more web like. And that makes it easier to ship a Mac app as a web app.
Back in the day, Mac apps used to have a lot of rich behaviors that were best exposed by system frameworks. Stuff like sheets, utility panels and windows, rich customizable toolbars, etc. As those were all stripped away, it provided less of a reason to use the system frameworks.
I think there is a decent argument that this maybe traces back to the iOS-ification of the Mac experience. Apple trying to unify the macOS UX with the iOS UX led to a lot of those behaviors being stripped out, as they weren’t as easily implementable on an iPhone or iPad.
[…]
Also in case it wasn’t clear: Catalyst and iPhone apps on Mac are clearly not solutions to this problem, and may be exacerbating this problem. These frameworks should probably exist but the core Mac experience should not be built around them.
Yeah, unfortunately the calendar APIs are a good example where things have gone wrong, with Apple losing interest in devs and purely focusing on making the features ship before WWDC… 😢 None of the stuff they added in recent years made it into the dev API.
[…]
Of course the big problem here is that if you can’t do more than what Apple has thought of for their Apps, you can’t expect to innovate beyond what’s there either. […]
Sadly, in the first instance this might even lead to Apple believing that they are the only ones out there doing real innovation, not seeing that this and things like the sandbox simply curb us developers. Ultimately it will simply mean a death by a thousand cuts…
If even die-hard Apple devs like myself start to long for the web world, where your creative idea is just one upload away from anyone who can type a web address (just like it felt to make Mac apps for OS X in 2001), it should raise alarm bells, but I doubt it does in Cupertino.
Update (2022-02-11): Orta Therox:
I love the care & attention in solid native apps but on the Mac I only use ~4 3rd party apps now.
As the Mac becomes more iOS-y, x-plat apps have made regularly switching between it and Linux easier.
It’s a trade of polish/convenience for long-term value alignment.
Which is a shame because I have the strongest value alignment with the sorts of folks who make native apps, and I’m very happy to pay for good software.
Yet they don’t have much say in how the OS and ecosystem shift, and everyone has to adapt in their own ways.
One reason why people “live in the browser” is that Apple has totally undermined the native Mac software ecosystem. The crap store race to the bottom. Gatekeeper. Notarization. Catalina Vista TCC, implementing what Apple parodied in “Get a Mac”.
Look at the flaming hoops you have to jump through just to install Audio Hijack on M1 Macs. It’s absurd!
No wonder people live in the browser. Apple doesn’t let you live anywhere else.
Don't forget to blame Swift: the new language that was supposed to be available on many platforms (contrary to Objective-C) and which ended up being exclusively used on Apple platforms (like Objective-C).
@someone Swift is an interesting case because it’s better than Objective-C in so many ways. But is it as much better as today’s JavaScript (and associated ecosystem) is compared with JavaScript in 2010? I don’t think so. Certainly not prior to Swift Concurrency, which just landed, while JavaScript had async/await in 2017.
IMHO the problem is simple. Reducing investment in a (pretty stable) product helps to improve ROI, which helps to increase profitability, which helps to keep share prices up, which helps the CEO claim multi-million dollar bonuses.
"But if I want to write a truly great app, it has to be a desktop app. And this will be true forever, or until there is no difference between the web and the desktop."
This has always been wrong, because it completely discounts the advantages of the web. Cross-platform compatibility makes an app greater. No installation makes an app greater. Simple collaboration and sharing with anyone, including people on weird devices like Chromebooks, makes an app greater. Automatic updating makes an app greater. Being able to open the app I normally use on my desktop on my phone in a pinch makes an app greater. Not locking me into using a Mac makes an app greater.
The advantages native apps have over web apps pale in comparison to the advantages web apps have over native apps. And while web apps can - and do - get better in the areas they are currently worse than native apps, that same thing doesn't apply to native apps. I'll never be able to just open any native app I use on my phone.
As of right now, if you want to create a truly great app, it has to be a web app. And this will *actually* be true forever.
This is a tricky and important subject.
If we concede for a moment that Web apps are not only good enough but better in many ways, then the Mac is kind of moot, because you might as well just use a Chromebook. Therefore, we must instead look at what a Mac could hypothetically do better than the Web.
Michael makes some great points. I would add that what Apple has lost, compared to the early 2000s, is their drive to bundle great apps. A Mac used to come with arguments in favor right on the box: iLife, iWork, and third-party bundled apps such as OmniGraffle. Many of those apps still exist, of course, but what's missing is the same kind of "you'll buy a Mac because these great apps ship right with it — and plenty more like them exist for the platform" sales pitch.
> Daniel Jalkut: To head off any critics who might ask, “OK, smartass, what would YOU do to improve the Mac as a platform?” I say: I don’t know, I look to historic innovators like Apple for that.
Unfortunately that’s also the only option, which didn’t use to be the case. I’ve said many times already, this is what you get if you force people to play in a Sandbox. Doesn’t exactly allow you to think outside the box, now does it?
I honestly don't see why this is tricky or important. Most people don't care about the details devoted developers love to sweat over.
But, at the same time people care about nice things, so as long as Apple keeps building the best Chromebooks money can buy, they'll be able to sell.
@Plume Yes, I quoted Joe making a similar point in the web3 post. Installation and updating are disadvantages (in some ways), though Apple could have done more to make them better. Collaboration and sharing are huge for certain kinds of apps, though there’s no reason in principle that we couldn’t have native clients for Web APIs.
@Sören @Kristoffer I think this is basically right and was going to mention it in the post yesterday but ran out of time. It used to be that Apple had different, and often slower and/or more expensive, hardware and was missing native apps that Windows had, so it needed good native apps to help sell the platform. Now, with so much moved to the Web, those apps run anywhere, so the OS matters much less (to most people). Apple likes to think of Safari as a differentiator, but despite being bundled it’s not even the leading browser (by marketshare) on macOS. The point is that software, in general, is no longer as much of a selling point for Apple hardware. Good design, build quality, performance, and battery life are more important differentiators for many people.
@Alexander Yes, restricting innovation from third parties means that it has to come from Apple or other platforms first.
@ Michael:
>It used to be that Apple had different, and often slower and/or more expensive, hardware and was missing native apps that Windows had, so it needed good native apps to help sell the platform. Now, with so much moved to the Web, those apps run anywhere, so the OS matters much less (to most people). Apple likes to think of Safari as a differentiator, but despite being bundled it’s not even the leading browser (by marketshare) on macOS. The point is that software, in general, is no longer as much of a selling point for Apple hardware. Good design, build quality, performance, and battery life are more important differentiators for many people.
Yes. And I'll agree with Daniel that they've done a lot to improve the hardware situation.
But, on the "platform as a whole" side, I worry that education is a canary in the coalmine. Kids grow up with Chromebooks being good enough, because schools want to save money and really just need a very simple platform to run basic (and heavily restricted and centrally managed) software. Maybe the solution there is to push the envelope on what education software can be, but Apple isn't doing that either.
Then, kids get their office job, and the requirements aren't that much different: a whole lot of enterprise software is really just CRUD stuff. Having the local computer do computation is neither desirable (because you want consistent state) nor necessary (because computation typically isn't heavy, nor is required bandwidth). So you give workers dumb terminals.
Some of these use cases may be lost for good. But others aren't. Listening to ATP the other week (unfortunately, I can't quite remember which episode this was) about how creating a photo book for the family used to be easy and no longer is was frustrating. This is exactly the kind of use case where Apple could be great, and has instead decided (more or less explicitly) to give up. Make the iMac a fantastic living room computer not just in hardware, but also software, and the desire will come for people to have that same kind of experience at work (and maybe even at school).
And, yep, my preferred browser is Safari, for a number of reasons, but it definitely isn't much of a differentiator. Nobody goes, "I should switch to the Mac because Safari is great".
@ Alexander:
>Unfortunately that’s also the only option, which didn’t use to be the case. I’ve said many times already, this is what you get if you force people to play in a Sandbox. Doesn’t exactly allow you to think outside the box, now does it?
Yep, and I think Apple underestimates this problem. Or they're painfully aware, but don't know how to square that circle.
Apple splitting their resources to support AppKit, SwiftUI, and Catalyst probably doesn't help. For me, this makes me not want to make Mac apps anymore. Who wants to spend a year or more working on a quality app when you don't know if the technology you choose will be deprecated out from underneath you? Ironically, SwiftUI and Catalyst were created to convince more developers to come to the Mac. I don't have any numbers, but I'm interested in knowing whether they have succeeded in that goal. Is Mac app development up or down since these new frameworks were introduced?
The cost of not installing software includes
* less privacy
* doesn't work whenever there's no internet
* goes down whenever the server or the intermediate route goes down
* disappears if the company goes bankrupt / kicks you off
* need to continually pay for a subscription
And all that to save what? Dragging and dropping an application?
Ownership has distinct advantages: you buy it, then you have it. If you need it years later, you can still use it.
Federighi enthusiastically hinted at this year’s WWDC that AppleScript is probably going away in a few years.
1:25:59 “The Mac has a long, deep history of automation, with command line, shell scripts, AppleScript and Automator.” — https://youtu.be/0TD96VTf0Xs?t=5159
1:27:20 “Shortcuts is the future of automation on Mac and this is just the start of a multi-year transition. Automator will continue to be supported.” — https://youtu.be/0TD96VTf0Xs?t=5240
See the omission?
It's weird to realize that there are fans of web applications here. I guess if you started using a Mac in this iOSified/webified era of inconsistent user interfaces where any UI element might be a button or editable field, it makes sense. This goes along with Sören's "good enough" point and what kids have grown up with, but it's anathema to those of us who grew up with Mac OS 7/8/9/X and wrote free Cocoa applications for users who would have a fit if button placement was 2 points off per the Aqua HIG.
People who grow up using MS Teams or experiencing Apple Mail/Photos data corruption and loss issues aren't going to understand what a good experience is like. OmniGraffle, OmniOutliner, DataTank, DataGraph, Igor Pro, BibDesk, TIFFany3, iPhoto, TextMate, TextWrangler, TeXShop…all great applications I've used over the years (disclaimer: I wrote parts of BibDesk, back in the day). I'm forced to switch between a Mac and Windows at work, and Emacs keybindings, menus vs. weird scrolling toolbars, Spotlight, and Quick Look are killer features that I miss on Windows.
I'm with Old Unix Geek on this as far as data ownership and network problems. The fastest broadband I can get is 5 Mbps, and I know people who are lucky to get 512 kbit/s, when it works. Web applications and cloud syncing/storing are basically unusable, unless you have a higher tolerance for UI latency and data corruption than I do.
I tried to get into Mac app development, and there are a few good courses, but lots of complaining about the state of documentation and not enough leadership from Apple.
Now I’m still writing web services.
"I guess if you started using a Mac in this iOSified/webified era of inconsistent user interfaces where any UI element might be a button or editable field, it makes sense"
It's funny to experience this trend where my peer group goes from "it's all our parents' fault" to "it's all the kids' fault."
My first computer was a Mac Classic running System 6 and MultiFinder. If you don't like the web, it's probably your fault. Our generation made it.
OmniGraffle is great. It would be even greater if it was a web app.
And yeah, I don't mind paying a subscription fee for a good product, because then at least the company will have a reliable revenue stream, and not abandon the product two years down the line. Also, it's possible to make web apps that run offline, and that run well on slow connections.
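The usual way web apps run offline is a service worker that pre-caches the app shell and answers requests cache-first. The sketch below factors the worker wiring into a plain function so the logic is visible outside a browser; `installOfflineShell`, the asset paths, and the cache name are all made up for illustration, and a real page would also register the worker with `navigator.serviceWorker.register()`.

```javascript
// Minimal cache-first app shell, sketched as a plain function.
// `sw` stands in for the worker global (`self`) and `cacheStorage`
// for the `caches` object, so nothing here is tied to a browser.
function installOfflineShell(sw, cacheStorage, assets, cacheName = 'shell-v1') {
  sw.addEventListener('install', (event) => {
    // Pre-cache the app shell while the worker installs.
    event.waitUntil(
      cacheStorage.open(cacheName).then((cache) => cache.addAll(assets))
    );
  });
  sw.addEventListener('fetch', (event) => {
    // Cache first; fall back to the network only on a cache miss.
    event.respondWith(
      cacheStorage.match(event.request).then((hit) => hit || fetch(event.request))
    );
  });
}
```

In a real deployment this would be served as, say, `/sw.js` and wired up with `installOfflineShell(self, caches, ['/', '/app.js'])`; once installed, the listed assets load with no network at all.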
I agree with the fact that notarization (even outside the App store) and the $99 developer fee (AKA "you must be at least THIS committed") are significant stumbling blocks, since my kids currently write Catalyst apps (they have the patience) and I briefly considered going through the entire rigmarole of setting up a family dev account.
But for me, the reason there simply aren't any new mainstream Mac native apps of note (besides niche apps and cross-platform stuff like OBS) is a combination of the mediocre expectations towards computing experiences that the Web bestowed upon developers, the unfortunate creation of Electron (don't get me started on JavaScript), and the mediocre state of Apple developer documentation in general -- I had to do some Swift the other day for a CLI tool (but which had to do screen captures), and it took me _forever_ to do something I could have found and done in literally _seconds_ on Windows or Linux.
As someone who dabbled in Win16 and Mac OS 8 development and then moved to Qt and GTK before giving up GUI development, I find the state of Apple's documentation to be atrocious -- it is even worse than Javadoc in the sense that it doesn't provide any examples, doesn't clarify the relationships between some classes and types, and doesn't even cover all the arguments that I can see, for a fact, being used in code samples in the wild.
As to AppleScript and automation, I used the PyObjC bridge for a _long_ time, and have routinely tried to find a "modern", forward-looking way to keep doing some automation tasks. JXA is woefully undocumented (and seems dead), AppleScript is stalled, and Shortcuts (though usable for some things) is a Chicco-inspired mess that was designed for a mobile device, not for a desktop environment.
This has all happened before (I survived COM/OLE on Windows, and watched as Visual Basic stopped being relevant in the Windows automation space as well), but it doesn't mean that we have a better computing experience today--in fact, if you were to pick up Tog On Interface or any other UI/UX book from that time, it would seem like a long lost tale of an ancient advanced civilization in comparison to my typing this in on Firefox here...
@Adam Maxwell, you nailed it. I went to the Mac as a main machine in mid-2020 and pretty much everything great that I heard about the Mac is either gone or actually was never that great.
The only app I came across that was a delight, in part because of how Mac-like it is, is Things (which uses custom elements). I didn’t try the apps you list (but thanks for the list, I’ll investigate them), but I tried Panic’s Nova as a code editor. In trying to be very idiomatic to the Mac, it provides a subpar experience in a number of ways. Two examples: you can’t use cmd+shift+right when text is selected, same as in Xcode (at least with a French keyboard), and by default folders and files are sorted together by name instead of folders being at the top, because that’s how the Finder does it, I guess, but it makes no sense in a code editing context.
Regarding stuff that is said to be great but is actually not that great, the best example might be the menu bar. It’s out of sight, it’s bloated with default options, and most of its non-default options are repetitive with what’s in the in-sight main window. It’s a hamburger menu on steroids.
Also the menu bar sub-menus disappear if you go in an instinctive diagonal to access them. It takes time getting used to it. I remember reading an article on how Amazon fixed that problem (on their *web site*) around ten years ago: https://bjk5.com/post/44698559168/breaking-down-amazons-mega-dropdown/amp
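The Amazon fix linked above boils down to a small geometric test: keep the current submenu open as long as the pointer stays inside the triangle formed by its last position and the two near corners of the submenu, since that means the user is plausibly moving toward it. A minimal sketch of that test (the function names and coordinate setup are mine, not from any actual menu framework):

```javascript
// Is point p inside triangle (a, b, c)? Uses signed areas (cross products):
// p is inside when all three cross products share a sign (or lie on an edge).
function inTriangle(p, a, b, c) {
  const cross = (o, u, v) =>
    (u.x - o.x) * (v.y - o.y) - (u.y - o.y) * (v.x - o.x);
  const d1 = cross(a, b, p);
  const d2 = cross(b, c, p);
  const d3 = cross(c, a, p);
  const hasNeg = d1 < 0 || d2 < 0 || d3 < 0;
  const hasPos = d1 > 0 || d2 > 0 || d3 > 0;
  return !(hasNeg && hasPos);
}

// Keep the submenu open while the cursor is heading toward it:
// lastPos is where the pointer was, topCorner/bottomCorner are the
// submenu's edge nearest the parent menu item.
function shouldKeepSubmenuOpen(cursor, lastPos, topCorner, bottomCorner) {
  return inTriangle(cursor, lastPos, topCorner, bottomCorner);
}
```

A menu would run `shouldKeepSubmenuOpen` on each mouse move and only switch the highlighted item once it returns false, so a quick diagonal into the submenu no longer closes it.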
The performance is terrible. On the early 2020 MacBook Air, it takes 200 ms for cmd+tab to appear. Coupled with the legacy decision of separating windows from apps (and letting apps be open without windows), it makes app switching hell. It’s very well done and performant (on much weaker hardware) in Windows. We know that the entry MacBook Air is by far Apple’s best-selling computer, so the 2020 model I’ve got is more performant than most Macs in use today.
Regarding native app development, the AppKit docs and the atrocious performance of Xcode (it takes two seconds to highlight a syntax problem. Write a line, wait two seconds to check that you’ve got it right as a Swift beginner, that’s the workflow) is just laughable.
The WWDC 2021 video “Write a great Mac app with SwiftUI” highlights that you can differentiate the style of your app by giving it an accent color, which is ridiculous. Seeing that garden app later on, I thought for a split second that it was showing the Finder. The presenter shows what he’s doing in Xcode really fast (unlike every video tutorial out there, which feels quite slow), and when you want to go back because you haven’t had time to memorize the line of code he typed, Apple’s official Developer app takes you 15 seconds back in the video, doubling your watch time anyway. But they thought to put carrots next to him to go along with the narrative that he likes to garden, how delightful.
The performance implications of Electron are, I feel, overstated. A simple app takes 80 MB of RAM; so what? Every Mac has 8 GB now. An app installer being 300 MB is supposedly so gross; well, please take a look at the size of macOS security fixes (2.5 GB). Regarding launch time, no native app is a delightful screamer anyway.
I’m Booboo the fool for having given native Mac development a try. Just embracing a 20% hit in UX (for most people) and going with Electron (versus even WKWebView) will allow me, as a web dev who believes in attention to detail and has a cross-platform (including Linux) GUI app to bring to the world, to develop the UI ~6 times faster.
Those are just a few examples, but what I’ve experienced is really a fractal of bad design. I had high expectations as delivered by Mac fans these past 15 years, and I resent the platform.
My understanding is that in the past ten years macOS didn’t get UI improvements, though it got lots of UI regressions, while Windows did get UI improvements. I’ve come to terms with the fact that to experience just how fucking great the Mac is, I’ll have to run a Snow Leopard/Mountain Lion (for Retina) VM and peruse torrent sites for old versions of commercial software. And that I’ve lost €1200 and much productivity.
@Old Unix Geek I strongly agree that owning software has advantages. I buy physical copies of Switch games for my kids (often 2nd hand). But I strongly disagree that this: "And all that to save what? Dragging and dropping an application?" is a fair description of the advantages that web apps deliver.
In the twelve-year-old post linked at the top, Daniel Jalkut lists the top ones:
* Cross-platform. You can run a web app on any computer that has a capable browser.
* Ubiquity. You don’t need to be near your computer to run a web app. You just “log in.”
* Instant updates. No need to coerce customers to update, just change the code on the server and they’re quietly updated to the latest running code.
* Open, standard technologies. You have control over and access to most of the source code at the heart of your application.
(Given the state of current web development, I don't fully agree on the last point, though.)
On a Mac I find it easier to click a link and be off, than using the Mac App Store.
"Given the state of current web development, I don't fully agree on the last point though"
I agree that current web app frameworks are overly convoluted, but still: the fact that I can click on the little "Code Injector" icon in my Firefox bookmark bar and just add random features to web apps, or change how they look, is a clear plus in my book. Just yesterday, I added a "save all documents" button to a web app that didn't have that feature. It took me ten minutes.
I'm never going to do that with a native app, and if I am, it's going to take way more than ten minutes.
Re: Subscriptions/Auto-Updates/etc
I mind all these "features". I use Xara (on Windows) once every... 2 years or something. Then I use it intensively for a couple of weeks. I still use 5.x. Why?
* I own it, and don't have to decide to pay again for it, or leave it draining my bank account in case I'll need it later.
* The UI is exactly what I remember, so I can get on with it.
* It's on the windows box (which is not attached to the network because... I don't want it to auto-update and break itself, or to get a virus and break itself).
There's plenty software that I do not want updated, do not want to pay for all the time, do not need to be "ubiquitous", and do not need to be cross platform. Just like, I don't use my circular saw every day, but it's there, and I don't want to learn to use a different interface each time I use it, so I bought it instead of renting it.
People making software cannot imagine that for most of us it's just a tool. Not something we want to be "delighted by" after we first meet it: "Don't make me think!" to quote Steve Krug. A pencil either does what we expect when we scratch it on a piece of paper, or it's broken.
As to "open, standard, technologies", that sounds good, in theory, as you mention. Except the web is moving so damn quickly that lots of websites do not work with an older web browser. Google et al are demonstrating that you don't have to "own" a standard to control it. You just have to mutate it quickly enough and auto-update the devices you still want your users to use. (Planned obsolescence.)
Also, the price of those "open, standard, technologies" is the use of the internet, which you have to rent. And it uses a lot of energy. It accounts for ~4% of all CO2 production, like air travel does.
If you stop and think about it, the modern world is rather unhinged. Making a computer uses a lot of energy and releases a lot of toxins (silicon). A computer can be built to last a couple of decades. (For extreme examples, people still run computers from the 1980s, and for some purposes, they're absolutely good enough such as a camping ground reservation system on the Atari ST). Yet, your modern computer is deprecated after a couple of years, your phone even less. Your modern computer is many times more powerful than a supercomputer in the 1980s and your software could run locally at a fraction of the energy cost. It's not like you need a mainframe. But instead we run the software on a server, and convert our local supercomputer into a dumb terminal. All this at a time when the world had its hottest year (2021) and we desperately need to cut back on CO2 emissions. (Don't get me started on bitcoin). A saner species might make different choices: to make things last longer and use less energy.
The cause is obviously incentives. The web is winning, not only for the advantages people mention, but because of the advantages it provides corporations, large and small:
* write once, on a single OS
* cheaper to maintain (commodity programmers, only the "latest" browsers to test on)
* can force users down paths they don't want to go
* subscriptions, no piracy
* easier to do A/B testing
* can force hardware & software upgrades ("Please upgrade to the latest browser" that only runs on MacOS 999 that doesn't work on your machine... oh and that means all the rest of your software has to be upgraded too, again for more profit...)
It's so weird. I started programming in the early days of micro-computers. Wasn't it nice to own your own software, and not rely on some mainframe somewhere? And now, we're back to that. Perhaps the incentives will change again as disasters (floods, wildfires, etc) become more common.
There is a class of software that uses a web-browser only as a UI, but runs on your computer. Jupyter lab, SyncThing, etc. Unlike Electron, it's not written in JS and doesn't take up half your RAM. But it has the UI inconsistencies people mention. On the other hand, people do know how to use the web. I'm undecided on this, although the amount of code between deciding to draw the UI and actually drawing the UI makes something want to die in me. For simple UIs, however, it might be a good compromise.
Part of the problem is that Apple often breaks OS X so that old apps don't run correctly. So the argument about buying software and then using it years later is kinda moot. What happens when you buy a new Mac? You can't downgrade to a previous OS version in order to make old software work. This is what I really like about Windows -- it will run almost anything I throw at it, including old esoteric MIDI software from 20 years ago (too bad the UI still sucks in Win 11). Good luck getting something like SoundDiver to run on a modern Mac.
This is a great point:
> Since then, Apple has slowed the pace of improvements to the frameworks for writing native Mac apps. It added technical (sandboxing, TCC, SIP, kernel extension restrictions) and policy (App Review) roadblocks that make it harder to develop apps that go beyond what can be done with Web technologies.