Archive for March 12, 2025

Wednesday, March 12, 2025

ChatGPT Now Integrates Directly With Xcode

Tim Hardwick:

OpenAI has updated the ChatGPT app for macOS with the ability to directly edit code in popular development environments including Xcode, VS Code, and JetBrains tools.

The new feature allows the AI chatbot to make changes to code without requiring users to copy and paste between applications. ChatGPT can now read code from these environments and make edits directly within them.

Marcin Krzyzanowski:

macos developers going to mock macOS System Settings, yet ship app settings like this

people, that is an anti-pattern. please don’t spread that.

Steve Troughton-Smith:

I had a lot of success using ChatGPT to selectively convert my Objective-C code to Swift across a portfolio of projects. Couple hundred thousand lines. It’s fluent in everything.

Previously:

Whither Swift Assist?

Apple (June 2024):

Swift Assist serves as a companion for all of a developer’s coding tasks, so they can focus on higher-level problems and solutions. It’s seamlessly integrated into Xcode, and knows the latest software development kits (SDKs) and Swift language features, so developers will always get the latest code features that blend perfectly into their projects. With Swift Assist, tasks like exploring new frameworks and experimenting with new ideas are just one request away.

kironet:

Xcode 16.3 and still no Swift Assist. WWDC25 is around the corner....

Swift Assist was supposed to arrive in 2024, but it never even appeared in a beta. Apple hasn’t announced that it’s postponed or cancelled. It’s not even mentioned in the release notes.

Previously:

Update (2025-03-13): John Gruber (Mastodon):

If anyone else who was in those WWDC briefings remembers whether Swift Assist was actually demoed, please let me know. I’m genuinely curious if Swift Assist was another thing — like all of “more personalized Siri” — that wasn’t even in demonstratable shape at WWDC.

Anyone besides John Voorhees?

See also: Tim Hardwick and John Voorhees.

Apple Delays “More Personalized Siri” Apple Intelligence Features

John Gruber (Mastodon, Dithering, MacRumors, Slashdot):

Reading between the lines, and based on my PhD-level fluency in Cupertino-ese, what Apple is saying here is that these “more personalized Siri” features are being punted from this year’s OS cycle to next year’s: to iOS 19 and MacOS 16.

[…]

It was already pretty obvious these features weren’t coming in iOS 18.4/MacOS 15.4, because they’re not in the developer betas that are already out. And if these features were coming in iOS 18.5/MacOS 15.5, or even iOS 18.6/MacOS 15.6, Apple wouldn’t have felt the need to issue this “It’s going to take us longer than we thought...” statement today. Delivering the “more personalized Siri” in iOS 18.5 or even 18.6 would have been delivering them on the originally announced schedule — the “coming year” that began with WWDC 2024. That would have been the tail end of that “year”, but within it.

[…]

This is the “App Intents” stuff — the features requiring Siri to have access to and, effectively, understanding of your personal information, in your apps, stored privately on your devices. These are the most helpful-sounding, most practical, most futuristic features of Apple Intelligence. And they’re the sort of features Apple is almost uniquely positioned to offer, alongside only Google, as the provider of a mobile platform and device ecosystem.

[…]

Most of Apple Intelligence has felt like Apple has pushed it out a year ahead of the company’s usual level of baked-ness. I’ve held all year long that if the entire industry — along with Wall Street — weren’t in the midst of a generative-AI/LLM mania, that “Apple Intelligence” wouldn’t have been announced until this year’s WWDC, not last year’s.

Jason Snell:

Those Apple Intelligence announcements at WWDC 2024 were vitally important for Apple. The company felt that it had to show that it hadn’t completely missed the boat on the hottest topic in the tech industry, and that it was working hard to infuse the power of AI through all its products. I would argue that succeeded at doing so, and its barrage of Apple Intelligence marketing the past six months has reinforced the point. People in the know might criticize that Apple’s behind, or that its tools aren’t close to the state of the art, but the general perception is that Apple’s in the game—which was a real question last year at this time.

[…]

Apple got exactly what it wanted out of WWDC 2024. The penalty for failing to ship some of those features will be the equivalent of a slap on the wrists. But this all increases the stakes for WWDC 2025, when Apple will still need to show that it’s capable of creating useful Apple Intelligence features—and its audience should be more skeptical about the company’s ability to ship them.

Juli Clover:

After announcing that some Apple Intelligence Siri features promised for iOS 18 will be delayed, Apple has tweaked the wording on its iOS 18, iPadOS 18, and macOS Sequoia webpages to remove mentions of the Siri capabilities that are being pushed back.

Zac Hall:

Since last fall, Apple has been marketing the iPhone 16 and Apple Intelligence with an unreleased Siri feature. After confirming today that the more personal version of Siri isn’t coming anytime soon, Apple has pulled the ad in question.

The commercial starred Bella Ramsey who should probably win an award for acting like Siri worked.

John Gruber (Mastodon):

I think that was the only TV commercial Apple ran showing the “personalized Siri through App Intents” feature that Apple has now admitted won’t ship in iOS 18, but I saw that commercial a lot during the baseball playoffs and NFL season.

[…]

Apple’s product pages for Apple Intelligence, iOS 18, and MacOS 15 Sequoia are lousy with references to these “new era for Siri” features that we now know aren’t going to ship this year. This is a marketing fiasco.

Jimmy Callin:

It’s particularly daunting when reading your words in 2011 on how Apple would never publish future concept videos. It feels like Apple crossed a line here.

Cabel Sasser:

i don’t mean to be too pessimistic, but i’m genuinely curious: when will the first apple intelligence false advertising class action drop?

Nick Heer:

Unsurprisingly, this news comes on a Friday and is announced via a carefully circulated statement instead of on Apple’s own website.

Unsurprising but still disappointing. This is like burying a correction to a front-page news story in the back of the newspaper. Maybe someday there will be a “Thoughts on Siri”, blame Forstall for Maps, or “So why the f**k doesn’t it do that” moment.

It is a single feature, but it is a high-priority one showcased in its celebrity infused ads for its flagship iPhone models. I think Apple ought to have published its own news instead of relying on other outlets to do its dirty work, but it fits a pattern. It happened with AirPods and again with AirPower; the former has become wildly successful, while the latter was canned.

[…]

Instead of monolithic September releases with occasional tweaks throughout the year, Apple adopted a more incremental strategy. I would like to believe this has made Apple’s software more polished — or, less charitably, slowed its quality decline. What it has actually done is turn Apple’s big annual software presentation into a series of promises to be fulfilled throughout the year.

[…]

Apple only just began including the necessary APIs in the latest developer betas of iOS 18.4. No matter when Apple gets this out, the expansiveness of its functionality is dependent on third-party buy-in. There was a time when developer adoption of new features was a given. That is no longer the case.

Steve Troughton-Smith:

Delayed or not, Apple’s proposed Intents-based Apple Intelligence features require a ton of buy-in from developers for it to be of any real use, and the incentives really aren’t there — donating your app’s content and functionality to enrich Apple’s AI alone, bypassing your UI, UX, branding, metrics, et al, to be delivered in a content soup alongside your competitors.

A third party developer base that won’t even bother recompiling Netflix or YouTube for visionOS won’t care much for that.

Jeff C.:

This sounds like OpenDoc all over again.

Todd Heberlein:

Apple’s WWDC 2024 was dominated by Apple Intelligence. Being able to run Apple Intelligence things like the new Siri was the main marketing pitch to get people to buy the new iPhones this year. The iPhone SE was replaced with a much more expensive iPhone 16e, so it could run Apple Intelligence.

I think this is Apple’s biggest product blunder since Apple’s inability to ship a modern OS in the 1990s (which led to Apple’s acquisition of NeXT).

Simon Willison:

I have a hunch that this delay might relate to security.

These new Apple Intelligence features involve Siri responding to requests to access information in applications and then performing actions on the user's behalf.

This is the worst possible combination for prompt injection attacks!

Via John Gruber (Mastodon):

So Apple had promised for this year — and oft promoted — an entire set of features that they not only have now acknowledged will not ship this year, but which they might, in fact, never be able to ship. Makes me wonder how many people inside Apple were voicing these concerns a year ago, and why they lost the debate to start promising these features last June and advertising them in September.

Federico Viticci (Mastodon):

Willison has been writing about prompt injection attacks since 2023. We know that Mail’s AI summaries were (at least initially?) sort of susceptible to prompt injections (using hidden HTML elements), as were Writing Tools during the beta period. It’s scary to imagine what would happen with a well-crafted prompt injection when the attack’s surface area becomes the entire assistant directly plugged into your favorite apps with your data. But then again, one has to wonder why these features were demoed at all at Apple’s biggest software event last year and if those previews – absent a real, in-person event – were actually animated prototypes.

[…]

Regardless of what happened, here’s the kicker: according to Mark Gurman, “some within Apple’s AI division” believe that the delayed Apple Intelligence features may be scrapped altogether and replaced by a new system rebuilt from scratch.

[…]

From his story, pay close attention to this paragraph:

There are also concerns internally that fixing Siri will require having more powerful AI models run on Apple’s devices. That could strain the hardware, meaning Apple either has to reduce its set of features or make the models run more slowly on current or older devices. It would also require upping the hardware capabilities of future products to make the features run at full strength.

Matthew Cassinelli:

iPhone hardware being the choke point is going to be brutal.

And anyone like us with lots of apps and intents is going to stress their devices constantly.

I already force quit my apps often to free up slivers of RAM.

[…]

I think the fundamental problem with App Intents as Apple Intelligence is in theory it works and in practice it doesn’t scale unless you’ve got a great chip and lots of power to give.

Sebastiaan de With:

Siri is doing actual damage to Apple’s brand/credibility. It shouldn’t be delayed, it should be canned next year.

The company’s never been flat-out dishonest on its products like this before. The Mac Studio page still showcases a use case that flat out does not work

it has somehow gotten worse with Apple Intelligence, too. I asked my phone via CarPlay to open the Books app and it replied “I don’t know an app named Books. You can search for it on the App Store”.

How… I am sorry, how is this so bad 14 years later?

Adam Engst:

I don’t think most people would actually use those features because Siri has let them down for too many years. Tonya and I talk to Siri all the time because of our HomePods and many HomeKit devices, but our usage is extremely limited. I trust Siri to turn lights on and off (sometimes leading to “How many digital assistants does it take to screw in a light bulb?” jokes), set timers, make reminders, and play music, but that’s about it. Even with these use cases, I’m unsurprised—but annoyed—when Siri fails to hear a command, gets part of it wrong, or responds completely inappropriately.

I don’t trust—or use—Siri for anything else because so many attempts have failed. Like Lucy, Siri frequently pulls the football away. Unlike Charlie Brown, I long ago gave up embarrassing myself.

I still wish they had just focused on fixing the basics first. Not only would I not trust it to handle sophisticated tasks when, say, the music controls don’t work, but it also makes it seem like they’re in denial about how Siri exists today.

Dave Mark:

I wonder if people at Apple are puzzled at our reaction to Siri? If they think Siri is terrific and working perfectly, or if they recognize the things that drive us crazy, but can’t seem to find the path to fix them?

Kirk McElhearn:

The thing is, it has never worked well. It’s impossible that they could think that it’s working as anyone expects.

Warner Crocker:

Yes, Apple also does have a history of turning some poorly received rollouts around. The best examples of that are Apple Maps and the Apple Watch. Even so, once a product launched becomes a product laughed at, it’s difficult to erase the echos of that laughter.

[…]

Siri has never fulfilled Apple’s bold promises with any consistent value beyond setting a timer or adding a reminder. Even that fails enough of the time to earn users’ distrust and provide late night comedians with jokes so easy to make that the shrewder jokesters have moved on.

John Gruber:

“Product knowledge” is one of the Apple Intelligence Siri features that, in its statement yesterday, Apple touted as a success. But what I asked here is a very simple question, about an Apple Intelligence feature in one of Apple's most-used apps, and it turns out Siri doesn't even know the names of its own settings panels.

Dare Obasanjo:

I’ve previously argued they need to delete all the Siri code and start fresh with a ChatGPT wrapper but I expect pride will make them fight that to the bitter end.

It’s their best choice if they believe competing with Gemini on Android is critical for iPhone sales.

Basic Apple Guy:

But from the moment it was announced, Apple Intelligence has felt like it was rushed to market, suggesting that Apple was unprepared for the sector’s explosive growth and how quickly competitors would integrate these features into their products. From the outside looking in, Apple appeared so focused on advancing its silicon, launching Vision Pro, and figuring out what to do with its now cancelled Project Titan vehicle project that the talent and resources to devote to AI weren’t there.

Apple’s saving grace, for now, is the belief that many users are still oblivious to the full potential of AI in their daily lives and that Friday’s announcement will create the most frustration among those deeply embedded in Apple’s ecosystem who are also the cutting edge of technology.

I mention that last point because, for many of my friends and family, AI is either 1) a non-issue or 2) simply synonymous with asking ChatGPT stuff. But features will trickle down fast, and soon, manually combing through a dozen emails to find that one PDF or determining who said they’d bring what to the potluck from several messages will feel prehistoric relative to any other platform. And at that point, if Apple still runs a version of Siri that struggles to set multiple timers, they’re f***ed.

[…]

My first move would be to work on integrating more with AI services like ChatGPT, DeekSeek, Co-Pilot, or Gemini. Right now, Apple makes launching out to ChatGPT a very stilted experience. But if there’s a way to enable more seamless query handling, it could significantly improve the experience and satisfaction and placate most users until you can roll out the version of Siri you intended to.

Steve Troughton-Smith:

Realistically, this AI race is a giant ship anchor Apple has willingly strapped to its own neck, that will mire it dreadfully. A game they can’t ever win, and a distraction that is eating up all the OS engineering energy that some of their product lines desperately need. It has added literally nothing of value to this year’s OS releases.

Apple needed to do two things: make Siri not suck, and provide an LLM through API so that devs could make their own features and choices. It did neither.

ChatGPT didn’t need access to all your local or sensitive data to be incredibly useful. It didn’t need to be wired into every corner of the OS or outside of the sandbox. It didn’t need special privileges or entitlements given to only it. It didn’t need the biggest and most powerful Apple chips, or the latest OS to run on.

Apple has lost itself completely in the weeds, when a Siri app and API that works just like ChatGPT’s would have done 90% of what anybody is actually looking for

David Heinemeier Hansson:

After reliving that Ballmer moment, it’s uncanny to watch this CNBC interview from one year ago with Johny Srouji and John Ternus from Apple on their AI strategy. Ternus even repeats the chuckle!! Exuding the same delusional confidence that lost Ballmer’s Microsoft any serious part in the mobile game.

Ostyn Hyss:

Still trying to figure out what they were laughing about.

Steve Troughton-Smith:

Pundits are going to keep telling you ‘there’s no moat’ around generative AI, and yet still Apple appears to be falling further and further behind after being caught flat-footed.

The moat is Nvidia, the moat is data, the moat is doing it in the cloud at the expense of privacy, the moat is trying to tie any of it together for massive monolithic OS updates that come but once a year.

All of which Apple, at a DNA level, will struggle massively to compete with.

Alexis Gallagher:

I thought Apple would do their usual thing, of being the last to adopt a new technology but the first to use it to create polished user experiences. but instead they’ve shipped promises.

Dave B.:

Apple used to promise the impossible, and then it would actually deliver.

Apple still makes the same promises, but no longer has the institutional competence to deliver.

It is like an aging sports GOAT that is in denial and refuses to accept that he isn’t the same guy anymore.

Ian Betteridge:

The pattern is depressingly familiar in successful businesses. A company achieves spectacular success, then gradually shifts its focus from transformative innovation to shareholder value – dividends, stock splits, and the comforting predictability of incremental improvements. Meanwhile, the future takes shape elsewhere, often in messier, less immediately profitable corners of the industry.

Apple has long been characterised as a “fast follower” rather than a pioneering innovator. It wasn’t the first to make an MP3 player, smartphone, or even a personal computer. This strategy served Apple brilliantly in the past – observing others’ mistakes, then delivering exquisitely refined products with unmatched attention to design, usability, and integration. The first iPhone wasn’t novel in concept, but revolutionary in execution because it had a unique interface: multitouch. In fact, I would argue this was the last time Apple’s user interfaces went in a bold direction.

But AI presents a fundamentally different challenge. This isn’t merely a new product category to be perfected – it’s a paradigm shift in how humans interact with technology. Unlike hardware innovations where Apple could polish existing concepts, AI is redefining the entire computing experience, from point-click or touch-tap to conversations. The interface layer between humans and devices is transforming in ways that might render Apple’s traditional advantages increasingly irrelevant.

More troubling still is the misalignment between Apple’s business model and AI’s trajectory. Apple thrives on high-margin hardware in a controlled ecosystem, while AI is primarily software and services-driven, often benefiting from openness and scale.

Ben Thompson:

The number one phrase that has been used to characterize Apple’s response to the ChatGPT moment in November 2022 is flat-footed, and that matches what I have heard anecdotally. That, by extension, means that Apple has been working on Apple Intelligence for at most 28 months, and that is almost certainly generous, given that the company likely took a good amount of time to figure out what its approach would be. That not nothing — xAI went from company formation to Grok 3 in 19 months — but it’s certainly not 17 years!

[…]

Still, it’s worth pointing out that exclusive access to data is downstream of a policy choice to exclude third parties; this is distinct from the sort of hardware and software integration that Apple can exclusively deliver in the pursuit of superior performance. This distinction is subtle, to be sure, but I think it’s notable that Apple Silicon’s differentiation was in the service of building a competitive moat, while Apple Intelligence’s differentiation was about maintaining one.

[…]

At the same time, it’s not as if Siri is new; the voice assistant launched in 2011, alongside iMessage. In fact, though, Siri has always tried to do too much too soon; I wrote last week about the differences between Siri and Alexa, and how Amazon was wise to focus their product development on the basics — speed and accuracy — while making Alexa “dumber” than Siri tried to be, particularly in its insistence on precise wording instead of attempting to figure out what you meant.

To that end, this speaks to how Apple could have been more conservative in its generative AI approach (and, I fear, Amazon too, given my skepticism of Alexa+): simply make a Siri that works.

Gus Mueller:

As a Mac user, I have this incredible wealth of GPU and CPU power, which in turn allows me to run LLMs locally.

A few weeks ago, before a trip out of the country for my daughter’s spring break, I set up a local instance of DeepSeek and made sure I could connect to it via Tailscale running on my Mac.

[…]

A week or so ago I was grousing to some friends that Apple needs to open up things on the Mac so other LLMs can step in where Siri is failing.

[…]

The crux of the issue in my mind is this: Apple has a lot of good ideas, but they don’t have a monopoly on them. I would like some other folks to come in and try their ideas out. I would like things to advance at the pace of the industry, and not Apple’s.

With iOS being more locked down, it would be even harder for third parties to step in. I don’t see Apple ever opening up the intents system so that other AIs could consume the data rather than just provide it. (I’d love to even see Spotlight on the Mac opened up in this way.) But there should at least be an API so that you can invoke other AI assistants with the same ease as Siri: via a hardware button, “Hey,” etc.

Federico Viticci (Mastodon):

The idea is a fascinating one: if Apple Intelligence cannot compete with the likes of ChatGPT or Claude for the foreseeable future, but third-party developers are creating apps based on those APIs, is there a scenario in which Apple may regain control of the burgeoning AI app ecosystem by offering their own native bridge to those APIs?

Essentially, I’m thinking of a model similar to what Cursor, Perplexity, and dozens of other AI companies do: instead of necessarily bringing your own API key, you can use an abstraction layer in the middle that absorbs all the costs of the API you’re calling – usually, for a monthly fee and within certain limits. What if Apple followed a similar approach in iOS 19/macOS 16 with an Apple Intelligence API that is actually a bridge between native apps and other cloud-based AI providers?

[…]

Of course, the alternative is for Apple to bide their time, wait until they have a proper Apple Intelligence LLM to offer as an API for all kinds of features (that is, beyond summaries alone), and let third-party developers continue building primarily through other providers’ SDKs and APIs. But if the enemy of my enemy is my friend, I wouldn’t be surprised to see Apple offer something along these lines in the near future.

Previously:

Update (2025-03-13): Steve Troughton-Smith:

It bugs me when people suggest Apple spending resources on AR/VR was a waste and they should have spent it on Siri/AI. Here’s the thing — AI, especially the way Apple is doing it, is a bolt-on. They could wait 10 years, then add the latest & greatest, and be no worse off than today. But a whole new platform built around AR/VR (which I think is going to be huge) is the kind of thing that only Apple can build, and it needs significant multi-year buy-in and investment from Apple and third parties.

Craig Hockenberry:

It kills me that we’re sitting on top a huge pile of content with Tapestry and can’t explore how to mine it using the models that are already on my device.

All kinds of interesting UI would emerge: what folks are talking about, meme timelines, etc. App Intents only give that data away and don’t work to our advantage.

AI Summaries of App Store Reviews

Filipe Espósito (October 2024, MacRumors):

As seen by 9to5Mac in an unlisted App Store article, Apple has developed a new system that will use all user reviews on the App Store to create a summary highlighting “the most common customer feedback” about each app. According to the article, the summaries will be updated every time new reviews are added.

[…]

The idea is to make it easier for users to identify when an app doesn’t deliver what it promises in the App Store. Since these summaries will be auto-generated, Apple says developers will be able to report when they consider a summary to be “inaccurate.”

Adam Overholtzer:

The best thing Apple could do with AI is detect when a customer is writing a support request into the App Store review box and prompt them to send it to the developer instead.

Apple:

Starting in iOS 18.4 and iPadOS 18.4, people can see a new review summary on your App Store product page to more easily learn about apps and games at a glance. Generated using large language models (LLMs), each summary compiles highlights and key information from individual user reviews into a short paragraph. Summaries are refreshed at least once a week for apps and games with enough reviews to provide a summary. People can tap and hold the summary to report a problem, and developers can also use App Store Connect.

As part of a phased rollout, review summaries are currently available in English for a limited number of apps and games on the App Store in the United States. This feature will expand to all apps with a sufficient number of reviews in additional storefronts and languages over the course of the year.

Via Sarah Perez (MacRumors):

Apple is not the only tech giant to look to AI for analyzing reviews. Amazon introduced AI summaries for product reviews on its platform back in 2023. Google’s Gemini AI can also be used for product review summaries, as one developer tutorial explains. The company also added AI-powered review summaries in Google Maps last year.

Previously: