Brent Simmons reported to me that my blog’s RSS feed wasn’t updating in recent versions of NetNewsWire. The app recently added support for the Cache-Control
response header, and, for reasons unknown, my site was returning an interval of 2 days:
$ curl --head https://mjtsai.com/blog/feed/
HTTP/2 200
date: Wed, 08 Jan 2025 14:28:24 GMT
server: Apache
vary: Accept-Encoding,Cookie,User-Agent
link: <https://mjtsai.com/blog/wp-json/>; rel="https://api.w.org/"
etag: "9efc6f6ed8885592fcee58bc1685dcaf"
cache-control: max-age=172800
expires: Fri, 10 Jan 2025 14:28:24 GMT
content-type: application/rss+xml; charset=UTF-8
even though plain HTML content was only cached for 10 minutes:
$ curl --head https://mjtsai.com/blog/
HTTP/2 200
date: Wed, 08 Jan 2025 14:34:20 GMT
server: Apache
vary: Accept-Encoding,Cookie,User-Agent
cache-control: max-age=3, must-revalidate
content-length: 307509
last-modified: Wed, 08 Jan 2025 14:30:02 GMT
cache-control: max-age=600
expires: Wed, 08 Jan 2025 14:44:20 GMT
content-type: text/html; charset=UTF-8
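Those max-age values are in seconds; a quick sketch (with the values copied from the two responses above) shows how they map to the intervals mentioned:

```shell
# max-age values from the two responses above, in seconds
feed_max_age=172800
html_max_age=600

# 86400 seconds per day; 60 seconds per minute
echo "feed: $((feed_max_age / 86400)) days"    # feed: 2 days
echo "html: $((html_max_age / 60)) minutes"    # html: 10 minutes
```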
I spent a while trying to figure out why WordPress would do that, but it turns out to be a default set by my server provider, DreamHost. RSS feeds fall under the default file type even though they are more likely to change frequently.
There are various ways to override this using Apache’s .htaccess file. Simmons is using this for his feed:
<Files "rss.xml">
<IfModule mod_headers.c>
Header set Cache-Control "max-age=300"
</IfModule>
</Files>
But I don’t want to list each file separately because this blog has many feeds, e.g. one for the comments on each post. What seems to work is setting the expiration by MIME type:
<IfModule mod_expires.c>
ExpiresActive on
ExpiresByType application/rss+xml "access plus 300 seconds"
ExpiresByType application/atom+xml "access plus 300 seconds"
</IfModule>
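WordPress can also emit other feed flavors (e.g. an RDF feed); if those are served with different MIME types, the same approach should extend to them. A sketch, assuming the types below are what the server actually sends:

```apache
<IfModule mod_expires.c>
ExpiresActive on
ExpiresByType application/rdf+xml "access plus 300 seconds"
ExpiresByType text/xml "access plus 300 seconds"
</IfModule>
```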
Please let me know if you run into any problems with this.
Previously:
Apache Atom Syndication Format DreamHost NetNewsWire RSS This Blog Web
Graham Fraser:
Apple Intelligence, launched in the UK earlier this week, uses artificial intelligence (AI) to summarise and group together notifications.
This week, the AI-powered summary falsely made it appear BBC News had published an article claiming Luigi Mangione, the man arrested following the murder of healthcare insurance CEO Brian Thompson in New York, had shot himself. He has not.
Imran Rahman-Jones:
A news summary from Apple falsely claimed darts player Luke Littler had won the PDC World Championship - before he even played in the final.
The incorrect summary was written by artificial intelligence (AI) and is based on a BBC story about Littler winning the tournament semi-final on Thursday night.
Within hours on Friday, another AI notification summary falsely told some BBC Sport app users that tennis great Rafael Nadal had come out as gay.
Nick Heer:
The ads for Apple Intelligence have mostly been noted for what they show, but there is also something missing: in the fine print and in its operating systems, Apple still calls it a “beta” release, but not in its ads. Given the exuberance with which Apple is marketing these features, that label seems less like a way to inform users the software is unpolished, and more like an excuse for why it does not work as well as one might expect of a headlining feature from the world’s most valuable company.
[…]
Apple has also, rarely, applied the “beta” label to features in regular releases which are distributed to all users, not just those who signed up. This type of “beta” seems less honest. Instead of communicating this feature is a work in progress, it seems to say we are releasing this before it is done. Maybe that is a subtle distinction, but it is there. One type of beta is testing; the other type asks users to disregard their expectations of polish, quality, and functionality so that a feature can be pushed out earlier than it should.
[…]
This all seems like a convoluted way to evade full responsibility of the Apple Intelligence experience which, so far, has been middling for me. Genmoji is kind of fun, but Notification Summaries are routinely wrong. Priority messages in Mail is helpful when it correctly surfaces an important email, and annoying when it highlights spam. My favourite feature — in theory — is the Reduce Interruptions Focus mode, which is supposed to only show notifications when they are urgent or important. It is the kind of thing I have been begging for to deal with the overburdened notifications system. But, while it works pretty well sometimes, it is not dependable enough to rely on.
Kirk McElhearn:
I don’t think that the vast majority of people know what beta means. Apple has been promoting the shit out of these features, and putting beta in a footnote.
Xe Iaso (via Hacker News):
This phrases a literal scam message in ways that make me think immediate action is required. You can see how this doesn’t scale, right?
[…]
Even more, if you have Apple Intelligence enabled for some of the other features but disable notification summaries because you find them worthless, you can get your notifications delayed up to five seconds. It’s kind of depressing that telling your computer to do less work makes the result take longer than doing more work.
Additionally, none of the summarization features work on my iPhone and I can’t be bothered to figure out why and fix it. I personally don’t find them useful. I just leave them enabled on my MacBook so that notification delivery is not impacted.
Eric Schwarz:
[The] whole vibe of Apple Intelligence is off-putting and feels like a not-ready-for-primetime suite of features that make the user experience worse.
Juli Clover:
Apple is working on an update for Apple Intelligence that will cut down on confusion caused by inaccurate summaries of news headlines, Apple told BBC News. In a statement, Apple said software coming soon will clarify when notifications have been summarized by Apple Intelligence.
[…]
There have been several prior events where Apple Intelligence provided incorrect details from incoming news app notifications. In November, Apple Intelligence suggested Israeli Prime Minister Benjamin Netanyahu had been arrested, incorrectly interpreting a story from The New York Times.
[…]
Apple Intelligence notification summaries are an opt-in feature and they can be disabled.
My understanding is that they are opt-out: once you opt into Apple Intelligence in general, you have to opt out of the notification summaries if you don’t want them. And, crucially, this is at the user level. There is no way for an app developer such as the BBC to prevent its app’s notifications from being summarized.
John Gruber (Mastodon):
Apple is promoting the hell out of Apple Intelligence to consumers, and its advertisements hide, rather than emphasize, its “beta” quality.
The promotion of a feature is an implicit encouragement to, you know, actually use it.
[…]
Apple Intelligence notification summaries are marked with an icon/glyph, sort of like the “↪︎” Unicode glyph with a few horizontal lines to suggest text encapsulated by the arrow — a clever icon to convey an abstract concept, to be sure.
The meaning of that icon/glyph is not at all obvious unless you know to look for it, and most users — even those who opted in to Apple Intelligence understanding that it was “beta” and might produce erroneous results — don’t know to look for that particular glyph.
[…]
I can also see why Apple doesn’t want to offer such an option to developers. To whom do notifications belong — the developer of the app that generates them, or the user who is receiving them?
Jason Snell:
The statement uses the beta tag it has placed on Apple Intelligence features as a shield, while promising to add a warning label to AI-generated summaries in the future. It’s hard to accept “it’s in beta” as an excuse when the features have shipped in non-beta software releases that are heavily marketed to the public as selling points of Apple’s latest hardware. Adding a warning label also does not change the fact that Apple has released a feature that at its core consumes information and replaces it with misinformation at a troubling rate.
Apple is shipping these AI-based features rapidly, and marketing them heavily, because it fears that its competitors are so far out in front that it’s a potentially existential issue. But several of these features simply aren’t up to Apple’s quality standards, and I worry that we’ve all become so inured to AI hallucinations and screw-ups that we’re willing to accept them.
[…]
So what can Apple do now? A non-apology and the promise of a warning label isn’t enough. The company should either give all apps the option of opting out of AI summaries, or offer an opt-out to the developers of specific classes of apps (like news apps). Next, it should probably build separate pathways for notifications of related content (a bunch of emails or chat messages in a thread) versus unrelated content (BBC headlines, podcast episode descriptions) and change how the unrelated content is summarized.
John Gruber:
I side with Apple in not giving developers the option to opt out of notification summaries, and I’m a bit more of the mind that Apple can address this by somehow making it more clear which notifications are AI-generated summaries. Like, perhaps instead of their “↪︎” glyph, they could use the 🤪 emoji.
Guy English:
If Apple Intelligence summarizes your notifications then Apple should badge it with their Apple logo. Not some weird cog or brain or some other such icon. Put your name on it! Apple is the one presenting this information to you and they should be held accountable for the veracity of it. Put your highly regarded Apple logo on your AI work or get outta here. It’s either an Apple product or it’s not.
Jason Snell:
The problem with Apple’s approach is that it’s summarizing a headline, which is itself a summary of an article written by a human being. As someone who has written and rewritten thousands of headlines, I can reveal that human headline writers are flawed, some headlines are just not very good, and that external forces can lead to very bad headlines becoming the standard.
Specifically, clickbait headlines are very bad, and an entire generation of headline writers has been trained to generate teaser headlines that purposefully withhold information in order to get that click.
[…]
Summarizing summaries isn’t working out for Apple, but more broadly I think there’s something to the idea of presenting AI-written headlines and summaries in order to provide utility to the user. As having an LLM running all the time on our devices becomes commonplace, I would love to see RSS readers (for example) that are capable of rewriting bad headlines and creating solid summaries. The key—as Artifact learned—is to build guardrails and always make it clear that the content is being generated by an LLM, not a human.
Craig Grannell:
Starting to think Apple might regret sticking its name in front of ‘Intelligence’ for all its AI stuff. Notifications are a disaster. Email categories are a disaster. And so on. Then again, the ad campaign is somehow even worse than all of that.
The sad thing is, there are good elements to Apple AI/ML. Prompt-based memories in the Photos app. Auto-tagging. Accessibility features like Personal Voice. But so much attention has been grabbed by flashy stuff that did not – and in some cases could not – work.
Steve Troughton-Smith:
The Apple Intelligence vs BBC story is a microcosm of the developer story for the feature. We’re soon expected to vend up all the actions and intents in our apps to Siri, with no knowledge of the context (or accuracy) in which it will be presented to the user. Apple gets to launder the features and content of your apps and wrap it up in their UI as ‘Siri’ — that’s the developer proposition Apple has presented us. They get to market it as Apple Intelligence, you get the blame if it goes awry.
Tim Hardwick:
Apple plans to scale up its News app by adding new countries to the platform beyond the US, Canada, the UK, and Australia, according to the Financial Times.
The plans reportedly include building out its locally focused news coverage in the UK, as well as bringing its puzzles section, currently limited to the US and Canada, to the country.
With Apple News, Apple does have access to the full article text. Maybe it will use this to dogfood a way of making this available for notification summaries.
Previously:
Apple Intelligence Apple News Artificial Intelligence iOS iOS 18 Mac macOS 15 Sequoia Notification Center The Media United Kingdom
Michael Burkhardt:
Sonuby is a different kind of weather app, designed for users who often partake in outdoor activities. For example, if you often snowboard, you can have a weather forecast that places snow conditions front and center. Weather needs can be very individualistic, which is why Sonuby allows you to tailor the app to what you care about.
I like that Sonuby lets you customize the display to choose which data to emphasize. My favorite feature is that you can make collections of locations and then easily switch between locations within one of these subsets. Most other apps offer a flat list that becomes unwieldy or has a limit, so that I have to keep deleting and re-adding locations depending on which are most important at any time.
The data is from a combination of sources provided by meteoblue, which I don’t think I’ve used before, so I don’t know how accurate it is.
The app’s overall design is not really my cup of tea, and I ran into some problems adding locations. Some names that I searched for were not available, and others did show up but were lower in the list of matches—the app prioritized similar names that were thousands of miles away from me.
None of the weather apps I’ve tried, including Sonuby, really offers the kind of workflow I’d like for planning an outdoor activity. It’s not just that I want to know the forecast for a certain location on a certain day. I also want to compare several potential mountains to decide where to go based on the weather. Bonus points if it can also compare multiple weather data providers.
Weathergraph is my preferred app for home, and it lets me switch data providers without having to dig into the settings, but it’s useless for this purpose since it only supports one location. Apple Weather and Mercury Weather require a bunch of taps to switch locations and then get back to the right screen. I wish I could navigate to the display I want—say, precipitation next Saturday—and then swipe to see that exact data but for different locations and data providers. (Or, wild idea, how about showing the same data for multiple locations on the same screen at the same time?)
Previously:
iOS iOS 18 iOS App Mercury Weather Sonuby Weather Weather.app Weathergraph
James Thomson (Mastodon):
On the 5th of January 2000, Steve Jobs unveiled the new Aqua user interface of Mac OS X to the world at Macworld Expo.
[…]
The version he showed was quite different to what actually ended up shipping, with square boxes around the icons, and an actual “Dock” folder in your user’s home folder that contained aliases to the items stored.
I should know – I had spent the previous 18 months or so as the main engineer working away on it.
[…]
I didn’t design the dock – that was Bas Ording, a talented young UI designer that Steve had personally recruited. But it was my job to take his prototypes built in Macromind Director and turn them into working code, as part of the Finder team.
[…]
I figured if anybody was finally going to kill off DragThing, it might as well be me.
After DP3, he resigned because Apple wanted him to move to Cupertino. Apple fired all the software engineers in Cork, and then they rewrote all his code before shipping Mac OS X 10.0. It’s remarkable how little the Dock has outwardly changed in the years since.
Jason Snell:
The timeline is interesting. James wrote his classic Mac utility DragThing before working at Apple, then was hired by Apple, then ended up working on the Dock, and then left Apple… to resume working on DragThing.
Also: James’s story about Apple trying to hide James’s location from Steve Jobs is an all-time classic.
Jason Snell:
When I watch the video back, it’s almost surreal how Steve Jobs keeps doing utterly normal, boring things in Mac OS X while the crowd completely loses its collective mind. Viewed by someone without any historical context, it would seem like a cult being whipped into a frenzy by its leader.
But I was there, and I can tell you that it wasn’t that. This was the moment, after 16 years of classic Mac OS–and let’s face it, the last five of those were pretty rough–when all the failings of the Mac were swept away and replaced with something modern, ready for the challenge of the 21st century.
[…]
It’s a bit of a head trip to watch Jobs explain how windows now have three buttons in the top left corner, colored “like a stoplight,” with symbols that appear when you roll the mouse pointer over them. Those buttons have become as much symbols of the Mac as the menu bar itself, but this was the first time anyone saw them.
Joe Groff:
In honor of the 25th anniversary of Mac OS X DP3 and the first public reveal of Aqua, this year’s MacBooks will feature an Apple-logo-shaped notch in the center of the menu bar.
Mario Guzmán:
Full height sidebars and inspectors also contribute to unnecessary waste of space in the toolbar. Dividing toolbars to match column widths (like Mail and Notes) wastes even more toolbar space.
I’m ready for a Mac OS UI redesign that raises the bar for Desktop OS design. The way Aqua did.
Even going back to the old Aqua toolbar design would be fine. The new Big Sur way—where there’s lots of empty space, yet the window title gets truncated and important buttons, and sometimes even the search field, get stuffed into the overflow menu—is a regression.
See also: John Siracusa (in 2000), Stephen Hackett, Nick Heer.
Previously:
Anniversary Design Dock DragThing History Mac Mac OS X DP 3 macOS 11.0 Big Sur macOS 15 Sequoia Steve Jobs
Adi Robertson (Hacker News):
Apple has agreed to a $95 million settlement with users whose conversations were inadvertently captured by its Siri voice assistant and potentially overheard by human employees. The proposed settlement, reported by Bloomberg, could pay many US-based Apple product owners up to $20 per device for up to five Siri-enabled devices. It still requires approval by a judge.
If approved, the settlement would apply to a subset of US-based people who owned or bought a Siri-enabled iPhone, iPad, Apple Watch, MacBook, iMac, HomePod, iPod touch, or Apple TV between September 17th, 2014 and December 31st, 2024. A user would also need to meet one other major criterion: they must swear under oath that they accidentally activated Siri during a conversation intended to be confidential or private.
Juli Clover:
The lawsuit alleges that Apple recorded conversations captured with accidental Siri activations, and then shared information from those conversations with third-party advertisers.
Two plaintiffs claimed that after speaking about products like Air Jordan shoes and Olive Garden, their devices showed ads for those products, while another said he received ads for a surgical treatment after discussing it privately with his doctor.
[…]
While the lawsuit initially focused on Apple’s lack of disclosure, the first filing was dismissed in February 2021 because it did not include enough concrete data about the recordings that Apple allegedly collected. An amended complaint that focused on Siri recordings used for “targeted advertising” was refiled in September 2021, and that was allowed to move forward.
[…]
Apple says that it “continues to deny any and all alleged wrongdoing and liability, specifically denies each of the Plaintiffs’ contentions and claims, and continues to deny that the Plaintiffs’ claims and allegations would be suitable for class action status.” Apple is settling to avoid further costs of litigation.
I had thought this controversy was about contractors hearing the audio. The advertising angle is new to me. If Apple actually did that, it would be one of the biggest Apple news stories ever. I think it’s much more likely that a third-party app was listening to the microphone or that the ads were not based on audio at all. That said, given that privacy is so important to Apple’s brand, and that it seems so unlikely that Apple’s actually guilty of this, it’s a bit of a mystery why it would want to settle. I would think that proving its innocence would be well worth the legal fees, unless it fears the exposure of other information that would become public in discovery.
Ashley Belanger (Hacker News):
While the settlement appears to be a victory for Apple users after months of mediation, it potentially lets Apple off the hook pretty cheaply. If the court had certified the class action and Apple users had won, Apple could’ve been fined more than $1.5 billion under the Wiretap Act alone, court filings showed.
[…]
It was also possible that the class size could be significantly narrowed through ongoing litigation, if the court determined that Apple users had to prove their calls had been recorded through an incidental Siri activation—potentially reducing recoverable damages for everyone.
Or, maybe they fear a combination of the class being enlarged—almost every iOS user probably had some accidental activations—and a court deciding that the users don’t have to prove anything. Then the damages could really multiply.
Apple probably figures correctly that the advertising allegation will be quickly forgotten. But it’s not a very satisfying resolution. We don’t get to learn the details of what went on, and the compensation is ridiculously low for the people who were actually harmed.
Previously:
Update (2025-01-07): See also: Slashdot.
Iain Thomson:
After being questioned about privacy in a letter from Congress, Cook stated unequivocally that Apple doesn’t collect audio recordings of users without consent.
“Far from requiring a ‘clear, unambiguous trigger’ as Apple claimed in its response to Congress, Siri can be activated by nearly anything, including ‘[t]he sound of a zip’ or an individual raising their arms and speaking,” the complaint reads. “Once activated, Siri records everything within range of the Siri Devices’ microphone and sends it to Apple’s servers.”
[…]
Google is also facing a similar lawsuit after Belgian journalists reportedly found that the Chocolate Factory’s Assistant was also listening in without authorization. That case is still unresolved, and a German investigation into the matter is also ongoing.
Damien Petrilli:
IMHO people should stop giving a pass to Apple and just assume the worst, like for Meta and Google.
Year after year we are fed the koolaid that Apple “cares” about privacy. And every year there is a controversy like this: privacy issues, “bugs”.
Nick Heer:
The original complaint (PDF), filed just a couple of weeks after Hern’s story broke, does not once mention advertising. A revised complaint (PDF), filed a few months later, mentions it once and only in passing (emphasis mine)[…] This is the sole mention in the entire complaint, and there is no citation or evidence for it. However, a further revision (PDF), filed in 2021, contains plenty of anecdotes[…]
[…]
I am filing this in the needs supporting evidence column alongside other claims of microphones being used to target advertising. I sympathize with the plaintiffs in this case, but nothing about their anecdotes — more detail on pages 8 and 10 of the complaint — is compelling, as alternative explanations are possible.
[…]
Yet, because Apple settled this lawsuit, it looks like it is not interested in fighting these claims. It creates another piece of pseudo-evidence for people who believe microphone-equipped devices are transforming idle conversations into perfectly targeted ads.
None of these stories have so far been proven, and there is not a shred of direct evidence it is occurring — but I can understand why people are paranoid.
John Gruber:
Apple doesn’t serve well-targeted ads based on text you type, describing exactly what you’re looking for, in the search box in the App Store, but a million gullible idiots believe they’re serving uncannily accurate ads based on snippets of random conversations secretly recorded from across the room.
Juli Clover:
No Siri data has ever been used for marketing purposes or sold to a third-party company for any reason, Apple said today in response to accusations that conversations Siri has captured were used for advertising.
Advertising Apple iOS iOS 12 Lawsuit Legal Privacy Siri
Encyclopedia Macintosh (p. 65, via Alex Rosenberg, rezmason):
HFS and MFS disks can be distinguished by the presence or absence of the HFS pixel. You can tell if a drive or disk is formatted as HFS or MFS by looking for the “HFS pixel” in the upper-left corner of any window from the drive or disk. If this pixel is on, the drive or volume uses the HFS; if it is off, the drive or volume uses the MFS.
[…]
The HFS pixel can be seen in the left window between the two horizontal lines just above the folder icon. In the center window it is not present. An enlargement of the pixel is presented at right.
This reminds me of Norton Disk Light, which used a single flashing pixel in the top-left corner of the display (back when the menu bar was rounded) to indicate disk activity.
Mihai Parparita:
Looks like it went away in System 7, even with the B&W window frame.
Alex Rosenberg:
Seems equally likely they didn’t carry over the feature when rewriting the Finder in C++ for System 7.
Jim Luther:
MFS was so ignored in the Finder’s System 7 rewrite that the Finder crashed if you mounted a MFS volume with a long volume name. I found and reported that bug when learning about the File Manager when I switched from Apple II to Macintosh Developer Technical Support.
Update (2025-01-06): Josh Justice:
Who remembers positioning the cursor in System 7 so that it showed 1 pixel between it and the progress bar, so you could tell if it had progressed?
Who remembers trying this in Mac OS 8+ and being frustrated that the beautiful gradient made it harder to tell if there was progress? 😄
Update (2025-01-07): HACKTRIX (via Josh Hrach):
The XYZZY code is a simple cheat code for Minesweeper that helps you find the mines without clicking on the cells. To use this code, open Minesweeper, then type the letters xyzzy and hold the shift button for three seconds. Then minimize all open programs and look closely in the top left corner of your monitor screen. You will see a single pixel turned white.
File System Finder History Mac System 6 System 7
Charles Rollet (Hacker News):
Bench, a Canada-based accounting startup that offered software-as-a-service for small and medium-sized businesses, has abruptly shut down, according to a notice posted on its website.
[…]
The company’s entire website is currently offline except for the notice, leaving thousands of businesses in the lurch. Bench touted having more than 35,000 U.S. customers just hours before it was shut down, according to a snapshot saved by the Internet Archive.
Bench, which had raised $113 million from high-profile backers such as Shopify and Bain Capital Ventures, developed a software platform to help customers store and manage their bookkeeping and tax reporting documents.
[…]
Bench’s notice says its customers should file a six-month extension with the IRS to “find the right bookkeeping partner.” It also says customers will be able to download their data by December 30 and will have until March 2025 to do so.
Ian Crosby:
I’ve avoided speaking publicly about Bench since just over 3 years ago when I was fired from the company I co-founded.
[…]
In November 2021 I went out for what I thought would be a regular lunch with one of my board members. We had just raised a Series C and turned down a highly lucrative acquisition offer. We had budding partnerships with companies like Shopify that were interested in the technology we were developing. We were winning.
The board member thanked me for bringing the company to this point, but that they would be hiring a new professional CEO to “take the company to the next level.”
Charles Rollet (Hacker News):
The San Francisco-based HR tech company Employer.com focuses on payroll and onboarding, in contrast to Bench, which specializes in accounting and tax. Employer.com’s chief marketing officer Matt Charney told TechCrunch the company will revive Bench’s platform and provide instructions for customers to log in and obtain their data.
Dare Obasanjo:
12,000 small businesses who were left in a lurch just before tax time may have been saved.
This reminds me of Synapse, whose customers lost money when it failed and weren’t covered by FDIC insurance. This is the risk of betting on startups for your financial needs.
Bench (Hacker News):
This acquisition ensures that Bench customers can continue relying on the same high-quality service they’ve always received, while also opening the door to future enhancements and capabilities powered by Employer.com’s extensive resources. Employer.com is committed to empowering small businesses with the tools and support they need to thrive, and Bench’s expertise in financial management aligns perfectly with that mission.
wdaher:
For Bench customers that want to look elsewhere, Pilot is doing free migrations from Bench to QBO, even if you don’t want to use Pilot. (So you can even take advantage of it if you want to instead DIY or work with some local firm.)
Previously:
Update (2025-01-08): Nicholas C. Zakas (via Ruffin Bailey):
Here’s @bench clarifying that no one is getting refunds.
Acquisition Business Datacide Financial Sunset Taxes Web
Mark Alldritt and Shane Stanley (Mastodon):
January 2025 marks Script Debugger’s 30th anniversary. It’s been a very long run for a two-person effort. Script Debugger began as a Classic MacOS product, survived Apple’s near-death experience, transitioned to Mac OS X and migrated across 4 CPU processor types. We are so grateful for the support we’ve received over these years. This support allowed us to keep working on Script Debugger much longer than we ever imagined.
Shane and I are retiring and the effort and costs associated with continuing Script Debugger’s development are too great for us to bear any longer.
[…]
In June 2025, Script Debugger will no longer be offered for sale and all support and maintenance will cease.
At this time, Script Debugger will become a free download.
This is really sad news. Script Debugger is an excellent app that I use nearly every day, and there’s nothing else like it. Alldritt had hinted at retirement before, but I had hoped that they would sell the app or that, with AppleScript not changing very quickly these days, it wouldn’t be too much of a burden to maintain. But with a constant stream of new OS bugs, new privacy and security requirements, and deprecated APIs, it’s impossible for an app to stand still. You have to keep updating it or it will break over time.
In any case, I thank them for spending decades developing an app that belongs in the Mac hall of fame.
Previously:
Update (2025-01-06): Uli Kusterer:
Pretty sure I used Script Debugger to do some extensive reworks of EyeTV’s AppleScript support, and it was so much more helpful than just waiting for Script Editor to abort with an error.
Brian Webster:
Sad to see Script Debugger going away, though I totally understand the decision. This tool has saved me sooooo many hours of time over the years, I very much do not look forward to whatever future macOS update that ultimately ends up breaking it. 😩
Update (2025-01-08): Jason Snell:
There are many great independent Mac apps out there that have been developed for decades by a single developer or a small team; I admit that I’ve been worried about the fate of those apps for a while now. Developers deserve to retire just like anyone else, but as happy as that moment can be for the people involved, I also selfishly dread the loss of another indie Mac app I’ve relied on for years.
AppleScript Mac App macOS 15 Sequoia Script Debugger Sunset
Scott Knaster:
I worked in Silicon Valley for many years with brilliant people at amazing companies that changed the world. A lot of my stories are about those people and places. But some of them are about something unexpected I saw on a walk around my neighborhood. Stuff like that.
I tell stories face to face, over a meal, in online posts, and on stage. And now I’m trying this new way! Here I’ve written a bunch of stories in this Google Doc, like a little book. I’ve told some of them before and refreshed them a bit for this book. Others are brand new and I’m telling them here for the first time.
[…]
Steve entered the little interview room and sat down 3 feet away from me across a tiny round table. He leaned forward and said: “Are you the best technical writer in the world?”
I was stunned into silence for a few seconds, as I tried to figure out what to say. And then, like an idiot, I gave a direct, thoughtful answer. “No. The best technical writer in the world is my friend Caroline Rose, and she already works here at NeXT.”
Via Dave Mark:
I’ve known my buddy Scott Knaster for a VERY long time. He and I wrote some of the earliest Apple developer books and became fast friends in that surprisingly small universe.
Scott just released a Google doc with a draft of his memoirs. Scott is a very entertaining writer, and the doc is chock full of pictures and wonderful anecdotes.
If you are a techie of any stripe, this is worth your time.
Some of his excellent books are How to Write Macintosh Software (PDF) and Macintosh Programming Secrets (PDF).
Previously:
Apple Apple II Book Documentation Google Hiring History Mac Microsoft NeXT Programming Steve Jobs
Jeff Johnson (Mastodon, Hacker News, Reddit, 2, The Verge, Yahoo):
This morning while perusing the settings of a bunch of apps on my iPhone, I discovered a new setting for Photos that was enabled by default: Enhanced Visual Search.
[…]
There appear to be only two relevant documents on Apple's website, the first of which is a legal notice about Photos & Privacy:
Enhanced Visual Search in Photos allows you to search for photos using landmarks or points of interest. Your device privately matches places in your photos to a global index Apple maintains on our servers. We apply homomorphic encryption and differential privacy, and use an OHTTP relay that hides IP address. This prevents Apple from learning about the information in your photos. You can turn off Enhanced Visual Search at any time on your iOS or iPadOS device by going to Settings > Apps > Photos. On Mac, open Photos and go to Settings > General.
The second online Apple document is a blog post by Machine Learning Research titled Combining Machine Learning and Homomorphic Encryption in the Apple Ecosystem and published on October 24, 2024. (Note that iOS 18 and macOS 15 were released to the public on September 16.)
As far as I can tell, this was added in macOS 15.1 and iOS 18.1, not in the initial releases, but it’s hard to know for sure since none of Apple’s release notes mention the name of the feature.
It ought to be up to the individual user to decide their own tolerance for the risk of privacy violations. In this specific case, I have no tolerance for risk, because I simply have no interest in the Enhanced Visual Search feature, even if it happened to work flawlessly. There’s no benefit to outweigh the risk. By enabling the “feature” without asking, Apple disrespects users and their preferences. I never wanted my iPhone to phone home to Apple.
Remember this advertisement? “What happens on your iPhone, stays on your iPhone.”
Apple is being thoughtful about doing this in a (theoretically) privacy-preserving way, but I don’t think the company is living up to its ideals here. Not only is it not opt-in, but you can’t effectively opt out if it starts uploading metadata about your photos before you even use the search feature. It does this even if you’ve already opted out of uploading your photos to iCloud. And “privately matches” is kind of a euphemism. There remains no plain English text saying that it uploads information about your photos and specifically what information that is. You might assume that it’s just sharing GPS coordinates, but apparently it’s actually the content of the photos that’s used for searching.
Ben Lovejoy:
One piece of data which isn’t shared is location. This is clear as several of my London skyline photos were incorrectly identified as a variety of other cities, including San Francisco, Montreal, and Shanghai.
Nick Heer:
What I am confused about is what this feature actually does. It sounds like it compares landmarks identified locally against a database too vast to store locally, thus enabling more accurate lookups. It also sounds like matching is done with entirely visual data, and it does not rely on photo metadata. But because Apple did not announce this feature and poorly documents it, we simply do not know. One document says trust us to analyze your photos remotely; another says here are all the technical reasons you can trust us. Nowhere does Apple plainly say what is going on.
[…]
I see this feature implemented with responsibility and privacy in nearly every way, but, because it is poorly explained and enabled by default, it is difficult to trust. Photo libraries are inherently sensitive. It is completely fair for users to be suspicious of this feature.
In a way, this is even less private than the CSAM scanning that Apple abandoned, because it applies to non-iCloud photos and uploads information about all photos, not just ones with suspicious neural hashes. On the other hand, your data supposedly—if there are no design flaws or bugs—remains encrypted and is not linked to your account or IP address.
jchw:
What I want is very simple: I want software that doesn’t send anything to the Internet without some explicit intent first. All of that work to try to make this feature plausibly private is cool engineering work, and there’s absolutely nothing wrong with implementing a feature like this, but it should absolutely be opt-in.
Trust in software will continue to erode until software stops treating end users and their data and resources (e.g. network connections) as the vendor’s own playground. Local on-device data shouldn’t be leaking out of radio interfaces unexpectedly, period. There should be a user intent tied to any feature where local data is sent out to the network.
Apple just crowed about how, if Meta’s interoperability requests were granted, apps the user installed on a device and granted permission to would be able to “scan all of their photos” and that “this is data that Apple itself has chosen not to access.” Yet here we find out that in an October OS update Apple auto-enabled a new feature that sends unspecified information about all your photos to Apple.
I’m seeing a lot of reactions like this:
I’m tired with so much privacy concerns from everyone without any reason… Yes it sends photo data anonymously to make a feature work or improve it. So what? Apple and iOS are the most private company/software out there.
But I’m tired of the double standard where Apple and its fans start from the premise of believing Apple’s marketing. So if you’re silently opted in, and a document somewhere uses buzzwords like “homomorphic encryption” and “differential privacy” without saying which data this even applies to, that’s good enough. You’re supposed to assume that your privacy is being protected because Apple is a good company who means well and doesn’t ship bugs.
You see, another company might “scan” your photos, but Apple is only “privately matching” them. The truth is that, though they are relatively better, they also have a history of sketchy behavior and misleading users about privacy. They define “tracking” so that it doesn’t count when the company running the App Store does it, then send information to data brokers even though they claim not to.
Eric Schwarz:
With Apple making privacy a big part of its brand, it is a little surprising this was on by default and/or that Apple hasn’t made a custom prompt for the “not photo library, not contact list, not location, etc.” permissions access. Some small changes to the way software works and interacts with the user can go a long way toward building and keeping trust.
Matthew Green:
I love that Apple is trying to do privacy-related services, but this just appeared at the bottom of my Settings screen over the holiday break when I wasn’t paying attention. It sends data about my private photos to Apple.
I would have loved the chance to read about the architecture, think hard about how much leakage there is in this scheme, but I only learned about it in time to see that it had already been activated on my device. Coincidentally on a vacation where I’ve just taken about 400 photos of recognizable locations.
This is not how you launch a privacy-preserving product if your intentions are good, this is how you slip something under the radar while everyone is distracted.
Jeff Johnson:
The issues mentioned in Apple’s blog post are so complex that Apple had to make reference to two of their scientific papers, Scalable Private Search with Wally and Learning with Privacy at Scale, which are even more complex and opaque than the blog post. How many among my critics have read and understood those papers? I’d guess approximately zero.
[…]
In effect, my critics are demanding silence from nearly everyone. According to their criticism, an iPhone user is not entitled to question an iPhone feature. Whatever Apple says must be trusted implicitly. These random internet commenters become self-appointed experts simply by parroting Apple’s words and nodding along as if everything were obvious, despite the fact that it’s not obvious to an actual expert, a famous cryptographer.
Previously:
Update (2025-01-02): See also: Hacker News.
Franklin Delano Stallone:
If it were off by default that would be a good opportunity for the relatively new TipKit to shine.
Jeff Johnson:
The release notes seem to associate Enhanced Visual Search with Apple Intelligence, even though the OS Settings don’t associate it with Apple Intelligence (and I don’t use AI myself).
The relevant note is that in 15.1 the Apple Intelligence section says “Photos search lets you find photos and videos simply by describing what you’re looking for.” I’ve seen reports that the setting was not in 15.0, though its release notes did include: “Natural language photo and video search: Search now supports natural language queries and expanded understanding, so you can search for just what you mean, like ‘Shani dancing in a red dress.’”
Eric deRuiter:
There are so many questions. Does disabling it on all devices remove the uploaded data? Is it only actually active if you have AI on? Does it work differently depending on if you have AI enabled?
My understanding is that there is nothing to remove because nothing is stored (unless in a log somewhere) and that there is no relation to Apple Intelligence.
Rui Carmo:
I fully get it that Photos isn’t really “calling home” with any personal info. It’s trying to match points of interest, which is actually something most people want to have in travel photos–and it’s doing it with proper masking and anonymization, apparently via pure image hashing.
But it does feel a tad too intrusive, especially considering that matching image hashes is, well, the same thing they’d need to do for CSAM detection, which is a whole other can of worms. But the cynic in me cannot help pointing out that it’s almost as if someone had the feature implemented and then decided to use it for something else “that people would like”. Which has never happened before, right?
thisislife2:
I was going through all the privacy settings again today on my mom’s iPhone 13, and noticed that Apple / ios had re-enabled this feature silently (enhanced visual search in Photos app), even though I had explicitly disabled it after reading about it here on HN, the last time.
This isn’t the first time something like this has happened - her phone is not signed into iMessage, and to ensure Apple doesn’t have access to her SMS / RCS, I’ve also disabled “Filter messages from unknown senders”. Two times, over a period of roughly a year, I find that this feature has silently been enabled again.
These settings that turn themselves back on or that say they will opt you out of analytics but don’t actually do so really burn trust.
Update (2025-01-07): Thomas Claburn:
Put more simply: You take a photo; your Mac or iThing locally outlines what it thinks is a landmark or place of interest in the snap; it homomorphically encrypts a representation of that portion of the image in a way that can be analyzed without being decrypted; it sends the encrypted data to a remote server to do that analysis, so that the landmark can be identified from a big database of places; and it receives the suggested location again in encrypted form that it alone can decipher.
If it all works as claimed, and there are no side-channels or other leaks, Apple can’t see what’s in your photos, neither the image data nor the looked-up label.
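The flow Claburn describes—the client encrypts a representation of the photo, the server computes on the ciphertext without ever decrypting it, and only the client can read the result—can be illustrated with a toy additively homomorphic scheme. This is a minimal sketch using textbook Paillier with deliberately tiny, insecure parameters; it is not Apple’s actual system (Apple uses BFV lattice-based homomorphic encryption with private nearest-neighbor search), and the `embedding` and `db_vector` names are purely illustrative:

```python
# Toy Paillier sketch of the client/server flow: the server computes a
# similarity score against encrypted data WITHOUT decrypting it.
# Insecure demo parameters only; real HE uses very different schemes.
import math
import random

# --- key generation (toy primes; a real key would be thousands of bits) ---
p, q = 47, 59
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)
g = n + 1
mu = pow(lam, -1, n)  # modular inverse of lambda mod n

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    # L(x) = (x - 1) // n recovers the plaintext exponent
    return ((pow(c, lam, n2) - 1) // n) * mu % n

# --- client: encrypt a small "embedding" of the outlined photo region ---
embedding = [3, 1, 4]
enc = [encrypt(v) for v in embedding]

# --- server: score against a database entry, never seeing plaintext ---
db_vector = [2, 7, 1]
score_ct = 1
for c, w in zip(enc, db_vector):
    # E(m)^w = E(w*m), and multiplying ciphertexts adds plaintexts,
    # so this accumulates E(sum of w_i * m_i)
    score_ct = (score_ct * pow(c, w, n2)) % n2

# --- client: only the key holder can decrypt the similarity score ---
assert decrypt(score_ct) == sum(v * w for v, w in zip(embedding, db_vector))  # 17
```

The point of the sketch is the trust argument in the surrounding quotes: the server learns nothing about the photo from `enc` or `score_ct`, yet the critics’ concern stands—none of this math governs *when* the upload happens or whether the user consented to it.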
Fazal Majid:
There are two issues with this, even before considering possible bugs in Apple’s implementation, or side-channels leaking information:
1) as with the CSAM scanning case, they are creating a precedent that will allow authoritarian governments to require other scanning
2) uploading the hash/fingerprint reveals to someone surveilling the network that someone has taken a photo.
[…]
In a previous breach of trust and consent, they also turned on without consent in Safari the Topics API (Orwellianly called “privacy-preserving analytics/attribution” when it is nothing but an infringement of privacy by tracking your interests in the browser itself). Even Google, the most voyeuristic company on the planet, actually asked for permission to do this (albeit with deliberately misleading wording in the request, because Google).
Fred McCann:
Even if the results are encrypted, you don’t control the keys - best case scenario is Photos is generating them without telling you and placing them somewhere(?). And the server side could store encrypted results for which they or some other party could have a backdoor, or just store them until advances render the encryption scheme defeatable. Who gets to audit this?
Roland:
It is quite easy to see how governments will order Apple to abuse this feature in future, without any need to sabotage or compromise any Apple-supplied homomorphic-encryption / private-query / differential-privacy / ip-address-anonymizing security features[…]
[…]
That government instructs Apple to match “landmark” searches against (non-landmark) images (archetypes) which the government cares about.
[…]
When a match is found, Apple sets a “call the secret police” flag in the search response to the client device (iPhone, Mac, whatever).
[…]
Everyone can analyze the heck out of the Apple anonymous search scheme-- but it doesn’t matter whether it is secure or not. No government will care. Governments will be quite satisfied when Apple just quietly decorates responses to fully-anonymous searches with “call the secret police” flags-- and Apple will “truthfully” boast that it never rats out any users, because the users’ own iPhones or Macs will handle that part of it.
Jeff Johnson:
With Enhanced Visual Search, Apple appears to focus solely on the understanding of privacy as secrecy, ignoring the understanding of privacy as ownership, because Enhanced Visual Search was enabled by default, without asking users for permission first. The justification for enabling Enhanced Visual Search by default is presumably that Apple’s privacy protections are so good that secrecy is always maintained, and thus consent is unnecessary.
My argument is that consent is always necessary, and technology, no matter how (allegedly) good, is never a substitute for consent, because user privacy entails user ownership of their data.
[…]
The following is not a sound argument: “Apple keeps your data and metadata perfectly secret, impossible for Apple to read, and therefore Apple has a right to upload your data or metadata to Apple’s servers without your knowledge or agreement.” There’s more to privacy than just secrecy; privacy also means ownership. It means personal choice and consent.
[…]
The oversimplification is that the data from your photos—or metadata, however you want to characterize it—is encrypted, and thus there are no privacy issues. Not even Apple believes this, as is clear from their technical papers. We’re not dealing simply with data at rest but rather data in motion, which raises a whole host of other issues. […] Thus, the question is not only whether Apple’s implementation of Homomorphic Encryption (and Private Information Retrieval and Private Nearest Neighbor Search) is perfect but whether Apple’s entire apparatus of multiple moving parts, involving third parties, anonymization networks, etc., is perfect.
See also: Bruce Schneier.
iOS iOS 18 Mac macOS 15 Sequoia Photos.app Privacy
David Nield:
Honey, which is owned by PayPal, is a popular browser extension—with 19 million users on Chrome alone—but the shopping tool is being accused of some seriously shady practices, including keeping users away from the lowest online prices and blocking creator affiliate links to deprive them of revenue. The scandal surfaced through a comprehensive video posted by MegaLag, who calls it “the biggest influencer scam of all time” based on an investigation that’s apparently been ongoing for several years. MegaLag claims to have reviewed masses of documents, emails, and online ads in the course of the investigation, as well as having spoken to victims and personally falling foul of Honey’s methods.
Wes Davis:
Honey works by popping up an offer to find coupon codes for you while you’re checking out in an online shop. But as MegaLag notes, it frequently fails to find a code, or offers a Honey-branded one, even if a simple internet search will uncover something better. The Honey website’s pitch is that it will “find every working promo code on the internet.” But according to MegaLag’s video, ignoring better deals is a feature of Honey’s partnerships with its retail clients.
MegaLag also says Honey will hijack affiliate revenue from influencers. According to MegaLag, if you click on an affiliate link from an influencer, Honey will then swap in its own tracking link when you interact with its deal pop-up at check-out. That’s regardless of whether Honey found you a coupon or not, and it results in Honey getting the credit for the sale, rather than the YouTuber or website whose link led you there.
The official response denies nothing:
Honey is free to use and provides millions of shoppers with additional savings on their purchases whenever possible. Honey helps merchants reduce cart abandonment and comparison shopping while increasing sales conversion.
Update (2025-01-02): See also: Wladimir Palant and Marques Brownlee.
Preetham Narayanareddy:
Honey sponsored Mr. Beast in 3 videos, gaining a total of 140M views after spending approximately $120,000.
Update (2025-01-06): Elliot Shank:
Lawyer YouTuber is starting a class-action lawsuit against PayPal/Honey.
App Store Scams Bargain Business Honey Mac Mac App Store macOS 15 Sequoia PayPal Safari Extensions Shopping Web