Mark Alldritt and Shane Stanley (Mastodon):
January 2025 marks Script Debugger’s 30th anniversary. It’s been a very long run for a two-person effort. Script Debugger began as a classic Mac OS product, survived Apple’s near-death experience, transitioned to Mac OS X, and migrated across four CPU architectures. We are so grateful for the support we’ve received over these years. This support allowed us to keep working on Script Debugger much longer than we ever imagined.
Shane and I are retiring and the effort and costs associated with continuing Script Debugger’s development are too great for us to bear any longer.
[…]
In June 2025, Script Debugger will no longer be offered for sale and all support and maintenance will cease.
At this time, Script Debugger will become a free download.
This is really sad news. Script Debugger is an excellent app that I use nearly every day, and there’s nothing else like it. Alldritt had hinted at retirement before, but I had hoped that they would sell the app or that, with AppleScript not changing very quickly these days, it wouldn’t be too much of a burden to maintain. But with a constant stream of new OS bugs, new privacy and security requirements, and deprecated APIs, it’s impossible for an app to stand still. You have to keep updating it or it will break over time.
In any case, I thank them for spending decades developing an app that belongs in the Mac hall of fame.
Update (2025-01-06): Uli Kusterer:
Pretty sure I used Script Debugger to do some extensive reworks of EyeTV’s AppleScript support, and it was so much more helpful than just waiting for Script Editor to abort with an error.
Brian Webster:
Sad to see Script Debugger going away, though I totally understand the decision. This tool has saved me sooooo many hours of time over the years, I very much do not look forward to whatever future macOS update that ultimately ends up breaking it. 😩
Update (2025-01-08): Jason Snell:
There are many great independent Mac apps out there that have been developed for decades by a single developer or a small team; I admit that I’ve been worried about the fate of those apps for a while now. Developers deserve to retire just like anyone else, but as happy as that moment can be for the people involved, I also selfishly dread the loss of another indie Mac app I’ve relied on for years.
Update (2025-02-11): Zsolt Benke:
I don’t know the current state of AppleScript inside Apple, but I know that the difference between Script Debugger and Script Editor is night and day. Script Debugger should be part of the system, which is why I feel sad that another great Mac app, especially one with such a long history, is getting retired.
See also: Mac Power Users Talk.
Update (2025-06-03): Mark Alldritt (forum):
The day has finally come. After 30 years of continuous development, Script Debugger has been retired and will no longer be available for sale.
I’m still really sad about this.
Gus Mueller:
I still use Script Debugger to this day.
30 years of development is a long time, and Script Debugger is such a great app. Congrats Mark - you made something awesome.
John Gruber:
30 years ago was 1995 — which was so solidly in the classic Mac era that the OS was still named “System 7”, not “Mac OS 7”. I forget when I first started using Script Debugger, but it was definitely in the classic Mac era.
[…]
Script Debugger isn’t just a spectacularly good Mac developer tool. (Indispensable, I would say. A lot of the problems many scripters have with AppleScript aren’t just mitigated by using Script Debugger instead of Apple’s free Script Editor — they go away.) It has also always come with spectacularly thorough and exceedingly well-written documentation — a good user manual describes what a product does, but a great one also explains how to use it.
But even better than that, the product always fostered a community of users. You could email tech support for help and get world-class expert personal assistance, or, you could participate in their (still vibrant!) user forum.
[…]
I haven’t really spent much time thinking about “apps” retiring, even while at the top of their game, but here we are.
Scott Knaster:
I worked in Silicon Valley for many years with brilliant people at amazing companies that changed the world. A lot of my stories are about those people and places. But some of them are about something unexpected I saw on a walk around my neighborhood. Stuff like that.
I tell stories face to face, over a meal, in online posts, and on stage. And now I’m trying this new way! Here I’ve written a bunch of stories in this Google Doc, like a little book. I’ve told some of them before and refreshed them a bit for this book. Others are brand new and I’m telling them here for the first time.
[…]
Steve entered the little interview room and sat down 3 feet away from me across a tiny round table. He leaned forward and said: “Are you the best technical writer in the world?”
I was stunned into silence for a few seconds, as I tried to figure out what to say. And then, like an idiot, I gave a direct, thoughtful answer. “No. The best technical writer in the world is my friend Caroline Rose, and she already works here at NeXT.”
Via Dave Mark:
I’ve known my buddy Scott Knaster for a VERY long time. He and I wrote some of the earliest Apple developer books and became fast friends in that surprisingly small universe.
Scott just released a Google doc with a draft of his memoirs. Scott is a very entertaining writer, and the doc is chock full of pictures and wonderful anecdotes.
If you are a techie of any stripe, this is worth your time.
Some of his excellent books are How to Write Macintosh Software (PDF) and Macintosh Programming Secrets (PDF).
Update (2025-03-14): See also: his Adjacent to Greatness video and Substack and story about Douglas Adams at WWDC.
Update (2025-03-18): See also: Adam Engst.
Jeff Johnson (Mastodon, Hacker News, Reddit, 2, The Verge, Yahoo):
This morning while perusing the settings of a bunch of apps on my iPhone, I discovered a new setting for Photos that was enabled by default: Enhanced Visual Search.
[…]
There appear to be only two relevant documents on Apple’s website, the first of which is a legal notice about Photos & Privacy:
Enhanced Visual Search in Photos allows you to search for photos using landmarks or points of interest. Your device privately matches places in your photos to a global index Apple maintains on our servers. We apply homomorphic encryption and differential privacy, and use an OHTTP relay that hides IP address. This prevents Apple from learning about the information in your photos. You can turn off Enhanced Visual Search at any time on your iOS or iPadOS device by going to Settings > Apps > Photos. On Mac, open Photos and go to Settings > General.
The second online Apple document is a blog post by Machine Learning Research titled Combining Machine Learning and Homomorphic Encryption in the Apple Ecosystem and published on October 24, 2024. (Note that iOS 18 and macOS 15 were released to the public on September 16.)
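Both documents lean heavily on the phrase “homomorphic encryption” without unpacking it. The core idea is that a server can compute on data it cannot read. Here is a minimal toy sketch in Swift, using textbook RSA’s multiplicative homomorphism purely to illustrate the property; this is not Apple’s actual construction, and parameters this small are hopelessly insecure:

```swift
// Toy illustration only: textbook RSA is multiplicatively homomorphic, so a
// server can multiply two ciphertexts without ever decrypting them. This is
// NOT Apple's scheme; the point is just to show what "computing on
// encrypted data" means.

func modPow(_ base: Int, _ exponent: Int, _ modulus: Int) -> Int {
    var result = 1
    var b = base % modulus
    var e = exponent
    while e > 0 {
        if e & 1 == 1 { result = (result * b) % modulus }
        b = (b * b) % modulus
        e >>= 1
    }
    return result
}

// Textbook RSA toy keypair: p = 61, q = 53, so n = 3233, e = 17, d = 2753.
let n = 3233, pubExp = 17, privExp = 2753

func encrypt(_ m: Int) -> Int { modPow(m, pubExp, n) }
func decrypt(_ c: Int) -> Int { modPow(c, privExp, n) }

let x = 7, y = 11
// The "server" multiplies the ciphertexts; it never sees 7, 11, or 77.
let blindProduct = (encrypt(x) * encrypt(y)) % n
print(decrypt(blindProduct)) // 77: only the key holder learns the answer
```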
As far as I can tell, this was added in macOS 15.1 and iOS 18.1, not in the initial releases, but it’s hard to know for sure since none of Apple’s release notes mention the name of the feature.
It ought to be up to the individual user to decide their own tolerance for the risk of privacy violations. In this specific case, I have no tolerance for risk, because I simply have no interest in the Enhanced Visual Search feature, even if it happened to work flawlessly. There’s no benefit to outweigh the risk. By enabling the “feature” without asking, Apple disrespects users and their preferences. I never wanted my iPhone to phone home to Apple.
Remember this advertisement? “What happens on your iPhone, stays on your iPhone.”
Apple is being thoughtful about doing this in a (theoretically) privacy-preserving way, but I don’t think the company is living up to its ideals here. Not only is it not opt-in, but you can’t effectively opt out if it starts uploading metadata about your photos before you even use the search feature. It does this even if you’ve already opted out of uploading your photos to iCloud. And “privately matches” is kind of a euphemism. There remains no plain English text saying that it uploads information about your photos and specifically what information that is. You might assume that it’s just sharing GPS coordinates, but apparently it’s actually the content of the photos that’s used for searching.
Ben Lovejoy:
One piece of data which isn’t shared is location. This is clear as several of my London skyline photos were incorrectly identified as a variety of other cities, including San Francisco, Montreal, and Shanghai.
Nick Heer:
What I am confused about is what this feature actually does. It sounds like it compares landmarks identified locally against a database too vast to store locally, thus enabling more accurate lookups. It also sounds like matching is done with entirely visual data, and it does not rely on photo metadata. But because Apple did not announce this feature and poorly documents it, we simply do not know. One document says trust us to analyze your photos remotely; another says here are all the technical reasons you can trust us. Nowhere does Apple plainly say what is going on.
[…]
I see this feature implemented with responsibility and privacy in nearly every way, but, because it is poorly explained and enabled by default, it is difficult to trust. Photo libraries are inherently sensitive. It is completely fair for users to be suspicious of this feature.
In a way, this is even less private than the CSAM scanning that Apple abandoned, because it applies to non-iCloud photos and uploads information about all photos, not just ones with suspicious neural hashes. On the other hand, your data supposedly—if there are no design flaws or bugs—remains encrypted and is not linked to your account or IP address.
jchw:
What I want is very simple: I want software that doesn’t send anything to the Internet without some explicit intent first. All of that work to try to make this feature plausibly private is cool engineering work, and there’s absolutely nothing wrong with implementing a feature like this, but it should absolutely be opt-in.
Trust in software will continue to erode until software stops treating end users and their data and resources (e.g. network connections) as the vendor’s own playground. Local on-device data shouldn’t be leaking out of radio interfaces unexpectedly, period. There should be a user intent tied to any feature where local data is sent out to the network.
Apple just crowed about how, if Meta’s interoperability requests were granted, apps the user installed on a device and granted permission to would be able to “scan all of their photos” and that “this is data that Apple itself has chosen not to access.” Yet here we find out that in an October OS update Apple auto-enabled a new feature that sends unspecified information about all your photos to Apple.
I’m seeing a lot of reactions like this:
I’m tired with so much privacy concerns from everyone without any reason… Yes it sends photo data anonymously to make a feature work or improve it. So what? Apple and iOS are the most private company/software out there.
But I’m tired of the double standard where Apple and its fans start from the premise of believing Apple’s marketing. So if you’re silently opted in, and a document somewhere uses buzzwords like “homomorphic encryption” and “differential privacy” without saying which data this even applies to, that’s good enough. You’re supposed to assume that your privacy is being protected because Apple is a good company who means well and doesn’t ship bugs.
You see, another company might “scan” your photos, but Apple is only “privately matching” them. The truth is that, though they are relatively better, they also have a history of sketchy behavior and misleading users about privacy. They define “tracking” so that it doesn’t count when the company running the App Store does it, then send information to data brokers even though they claim not to.
Eric Schwarz:
With Apple making privacy a big part of its brand, it is a little surprising this was on by default and/or that Apple hasn’t made a custom prompt for the “not photo library, not contact list, not location, etc.” permissions access. Some small changes to the way software works and interacts with the user can go a long way toward building and keeping trust.
Matthew Green:
I love that Apple is trying to do privacy-related services, but this just appeared at the bottom of my Settings screen over the holiday break when I wasn’t paying attention. It sends data about my private photos to Apple.
I would have loved the chance to read about the architecture, think hard about how much leakage there is in this scheme, but I only learned about it in time to see that it had already been activated on my device. Coincidentally on a vacation where I’ve just taken about 400 photos of recognizable locations.
This is not how you launch a privacy-preserving product if your intentions are good, this is how you slip something under the radar while everyone is distracted.
Jeff Johnson:
The issues mentioned in Apple’s blog post are so complex that Apple had to make reference to two of their scientific papers, Scalable Private Search with Wally and Learning with Privacy at Scale, which are even more complex and opaque than the blog post. How many among my critics have read and understood those papers? I’d guess approximately zero.
[…]
In effect, my critics are demanding silence from nearly everyone. According to their criticism, an iPhone user is not entitled to question an iPhone feature. Whatever Apple says must be trusted implicitly. These random internet commenters become self-appointed experts simply by parroting Apple’s words and nodding along as if everything were obvious, despite the fact that it’s not obvious to an actual expert, a famous cryptographer.
Update (2025-01-02): See also: Hacker News.
Franklin Delano Stallone:
If it were off by default that would be a good opportunity for the relatively new TipKit to shine.
Jeff Johnson:
The release notes seem to associate Enhanced Visual Search with Apple Intelligence, even though the OS Settings don’t associate it with Apple Intelligence (and I don’t use AI myself).
The relevant note is that in 15.1 the Apple Intelligence section says “Photos search lets you find photos and videos simply by describing what you’re looking for.” I’ve seen reports that the setting was not in 15.0, though its release notes did include: “Natural language photo and video search: Search now supports natural language queries and expanded understanding, so you can search for just what you mean, like ‘Shani dancing in a red dress.’”
Eric deRuiter:
There are so many questions. Does disabling it on all devices remove the uploaded data? Is it only actually active if you have AI on? Does it work differently depending on if you have AI enabled?
My understanding is that there is nothing to remove because nothing is stored (unless in a log somewhere) and that there is no relation to Apple Intelligence.
Rui Carmo:
I fully get it that Photos isn’t really “calling home” with any personal info. It’s trying to match points of interest, which is actually something most people want to have in travel photos–and it’s doing it with proper masking and anonymization, apparently via pure image hashing.
But it does feel a tad too intrusive, especially considering that matching image hashes is, well, the same thing they’d need to do for CSAM detection, which is a whole other can of worms. But the cynic in me cannot help pointing out that it’s almost as if someone had the feature implemented and then decided to use it for something else “that people would like”. Which has never happened before, right?
thisislife2:
I was going through all the privacy settings again today on my mom’s iPhone 13, and noticed that Apple / iOS had silently re-enabled this feature (Enhanced Visual Search in the Photos app), even though I had explicitly disabled it after reading about it here on HN the last time.
This isn’t the first time something like this has happened - her phone is not signed into iMessage, and to ensure Apple doesn’t have access to her SMS / RCS, I’ve also disabled “Filter messages from unknown senders”. Twice, over a period of roughly a year, I’ve found that this feature has silently been enabled again.
These settings that turn themselves back on or that say they will opt you out of analytics but don’t actually do so really burn trust.
Update (2025-01-07): Thomas Claburn:
Put more simply: You take a photo; your Mac or iThing locally outlines what it thinks is a landmark or place of interest in the snap; it homomorphically encrypts a representation of that portion of the image in a way that can be analyzed without being decrypted; it sends the encrypted data to a remote server to do that analysis, so that the landmark can be identified from a big database of places; and it receives the suggested location again in encrypted form that it alone can decipher.
If it all works as claimed, and there are no side-channels or other leaks, Apple can’t see what’s in your photos, neither the image data nor the looked-up label.
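Claburn’s step-by-step maps onto a fairly simple client-side pipeline. Here is a sketch of that flow in Swift, with every function a hypothetical stub—none of this is public Apple API; it only mirrors the steps described above:

```swift
import Foundation

// Hypothetical stubs standing in for Apple's private on-device ML and crypto.
struct EncryptedQuery { let payload: Data }
struct EncryptedReply { let payload: Data }

func detectLandmarkRegion(in photo: Data) -> Data? { nil }   // on-device model
func embed(_ region: Data) -> [Float] { [] }                 // vector representation
func heEncrypt(_ vector: [Float]) -> EncryptedQuery { EncryptedQuery(payload: Data()) }
func sendViaOHTTPRelay(_ query: EncryptedQuery) -> EncryptedReply { EncryptedReply(payload: Data()) }
func heDecrypt(_ reply: EncryptedReply) -> String? { nil }

func enhancedVisualSearch(photo: Data) -> String? {
    // 1. Locally outline what looks like a landmark; if nothing is found,
    //    nothing leaves the device.
    guard let region = detectLandmarkRegion(in: photo) else { return nil }
    // 2-3. Homomorphically encrypt a representation of that region so the
    //      server can match it against its landmark database without being
    //      able to read it.
    let query = heEncrypt(embed(region))
    // 4. The OHTTP relay hides which IP address the query came from.
    let reply = sendViaOHTTPRelay(query)
    // 5. Only this device holds the key to read the suggested label.
    return heDecrypt(reply)
}
```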
Fazal Majid:
There are two issues with this, even before considering possible bugs in Apple’s implementation, or side-channels leaking information:
1) as with the CSAM scanning case, they are creating a precedent that will allow authoritarian governments to require other scanning
2) uploading the hash/fingerprint reveals to someone surveilling the network that someone has taken a photo.
[…]
In a previous breach of trust and consent, they also turned on without consent in Safari the Topics API (Orwellianly called “privacy-preserving analytics/attribution” when it is nothing but an infringement of privacy by tracking your interests in the browser itself). Even Google, the most voyeuristic company on the planet, actually asked for permission to do this (albeit with deliberately misleading wording in the request, because Google).
Fred McCann:
Even if the results are encrypted you don’t control the keys - best case scenario is Photos is generating [a key] without telling you and placing it somewhere(?). And the server side could store encrypted results for which they or some other party could have a backdoor or just store them until advances render the enc scheme defeatable. Who gets to audit this?
Roland:
It is quite easy to see how governments will order Apple to abuse this feature in future, without any need to sabotage or compromise any Apple-supplied homomorphic-encryption / private-query / differential-privacy / ip-address-anonymizing security features[…]
[…]
That government instructs Apple to match “landmark” searches against (non-landmark) images (archetypes) which the government cares about.
[…]
When a match is found, Apple sets a “call the secret police” flag in the search response to the client device (iPhone, Mac, whatever).
[…]
Everyone can analyze the heck out of the Apple anonymous search scheme-- but it doesn’t matter whether it is secure or not. No government will care. Governments will be quite satisfied when Apple just quietly decorates responses to fully-anonymous searches with “call the secret police” flags-- and Apple will “truthfully” boast that it never rats out any users, because the users’ own iPhones or Macs will handle that part of it.
Jeff Johnson:
With Enhanced Visual Search, Apple appears to focus solely on the understanding of privacy as secrecy, ignoring the understanding of privacy as ownership, because Enhanced Visual Search was enabled by default, without asking users for permission first. The justification for enabling Enhanced Visual Search by default is presumably that Apple’s privacy protections are so good that secrecy is always maintained, and thus consent is unnecessary.
My argument is that consent is always necessary, and technology, no matter how (allegedly) good, is never a substitute for consent, because user privacy entails user ownership of their data.
[…]
The following is not a sound argument: “Apple keeps your data and metadata perfectly secret, impossible for Apple to read, and therefore Apple has a right to upload your data or metadata to Apple’s servers without your knowledge or agreement.” There’s more to privacy than just secrecy; privacy also means ownership. It means personal choice and consent.
[…]
The oversimplification is that the data from your photos—or metadata, however you want to characterize it—is encrypted, and thus there are no privacy issues. Not even Apple believes this, as is clear from their technical papers. We’re not dealing simply with data at rest but rather data in motion, which raises a whole host of other issues. […] Thus, the question is not only whether Apple’s implementation of Homomorphic Encryption (and Private Information Retrieval and Private Nearest Neighbor Search) is perfect but whether Apple’s entire apparatus of multiple moving parts, involving third parties, anonymization networks, etc., is perfect.
See also: Bruce Schneier.
David Nield:
Honey, which is owned by PayPal, is a popular browser extension—with 19 million users on Chrome alone—but the shopping tool is being accused of some seriously shady practices, including keeping users away from the lowest online prices and blocking creator affiliate links to deprive them of revenue. The scandal surfaced through a comprehensive video posted by MegaLag, who calls it “the biggest influencer scam of all time” based on an investigation that’s apparently been ongoing for several years. MegaLag claims to have reviewed masses of documents, emails, and online ads in the course of the investigation, as well as having spoken to victims and personally falling foul of Honey’s methods.
Wes Davis:
Honey works by popping up an offer to find coupon codes for you while you’re checking out in an online shop. But as MegaLag notes, it frequently fails to find a code, or offers a Honey-branded one, even if a simple internet search will uncover something better. The Honey website’s pitch is that it will “find every working promo code on the internet.” But according to MegaLag’s video, ignoring better deals is a feature of Honey’s partnerships with its retail clients.
MegaLag also says Honey will hijack affiliate revenue from influencers. According to MegaLag, if you click on an affiliate link from an influencer, Honey will then swap in its own tracking link when you interact with its deal pop-up at check-out. That’s regardless of whether Honey found you a coupon or not, and it results in Honey getting the credit for the sale, rather than the YouTuber or website whose link led you there.
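What makes the swap lucrative is nothing exotic: affiliate programs typically use last-click attribution, crediting whichever tag the merchant saw most recently. A minimal Swift model (all names hypothetical) of why firing a new tracking link at checkout captures the commission:

```swift
// Minimal model of last-click affiliate attribution; names are hypothetical.
struct ShoppingSession { var lastAffiliateTag: String? }

func recordClick(_ session: inout ShoppingSession, tag: String) {
    session.lastAffiliateTag = tag            // last click wins
}

func commissionGoesTo(_ session: ShoppingSession) -> String {
    session.lastAffiliateTag ?? "no affiliate"
}

var session = ShoppingSession()
recordClick(&session, tag: "influencer-123")  // viewer follows the video's link
// ...at checkout, the extension's pop-up fires its own tracking link,
// whether or not it actually found a coupon...
recordClick(&session, tag: "honey")
print(commissionGoesTo(session))              // "honey": the influencer gets nothing
```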
The official response denies nothing:
Honey is free to use and provides millions of shoppers with additional savings on their purchases whenever possible. Honey helps merchants reduce cart abandonment and comparison shopping while increasing sales conversion.
Update (2025-01-02): See also: Wladimir Palant and Marques Brownlee.
Preetham Narayanareddy:
Honey sponsored Mr. Beast in 3 videos, gaining a total of 140M views after spending approximately $120,000.
Update (2025-01-06): Elliot Shank:
Lawyer YouTuber is starting a class-action lawsuit against PayPal/Honey.
Update (2025-03-12): Jay Peters (via Dare Obasanjo):
Google has updated its affiliate ads policy for Chrome extensions after creators accused PayPal’s popular Honey browser extension of being a “scam.”
[…]
It is not permitted to inject affiliate links without related user action and without providing a tangible benefit to users.