Apple Intelligence in macOS 15.1 and iOS 18.1
Apple will launch iOS 18.1 next week, bringing its much anticipated generative-AI tools to the iPhone 15 Pro models and the new iPhone 16 lineup. It will be available for most newer iPads and Macs, too.
If you’re expecting AI fireworks, prepare for AI…sparklers. Back in June, at the company’s annual developers conference, executives showed off do-it-yourself emojis, ChatGPT integration and a Siri that can recall the name of a person you met months ago. Apple has even been running ads for some features. None are in this release.
[…]
I’ve been testing Apple Intelligence on my iPhone and iPad. Apple’s ability to build tools right into the operating systems is undeniably powerful and convenient. But many of them are half-baked. I asked Federighi to explain the features—and Apple’s broader AI strategy.
It’s a very good interview, and also available on YouTube.
[…]
But as Stern herself points out in the article, the features that are shipping are genuinely useful. Notification summaries are good — the occasional mistakes can be funny, but overall it’s solid, and especially helpful for batches of notifications from the same app or group text. The Clean Up unwanted-object-remover in Photos is great.
The first version of Apple Intelligence, which has been in beta testing for a few months now and is rolling out broadly next week, is pretty underwhelming. There’s just not much there. Not a lot beyond perhaps notification summaries that you’re going to be using all the time.
The most significant Siri enhancements are scheduled for iOS 18.4 around March 2025. These include onscreen awareness for contextual commands, personal context for better understanding of user data, and expanded app control capabilities. Initially, Apple Intelligence will only support U.S. English, with additional languages planned for next year.
If my memory serves me correctly, it’s roughly the same number Apple shared during the WWDC keynote. So, essentially flat. While 1.5 billion might appear big, when it comes to internet scale, it isn’t such a large number for a company the size of Apple. I looked up the number of active Apple devices. That number is estimated to be 2.2 billion devices — I assume this includes phones, computers, watches, headphones, TV-streaming devices, and speakers. So 1.5 billion requests a day is actually far less than one daily request per active device.
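Malik’s back-of-the-envelope math above can be checked in a couple of lines. This is just a sketch; both figures are the estimates quoted in the post (1.5 billion daily Siri requests, roughly 2.2 billion active devices), not official Apple counts:

```python
# Estimates from the post, not official Apple numbers.
daily_requests = 1.5e9   # Siri requests per day
active_devices = 2.2e9   # estimated active Apple devices

# Average requests per active device per day.
requests_per_device = daily_requests / active_devices
print(f"{requests_per_device:.2f} requests per active device per day")
# → 0.68 requests per active device per day
```

At roughly 0.68 requests per device, the figure indeed works out to well under one daily request per active device, as Malik concludes.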
Writing Tools don’t generate new content themselves; they use the original text to produce derivatives. I’m particularly looking forward to the proofreading feature, which can suggest improvements that I can choose to ignore, or adapt to my own style, as I wish.
I’ve relied on Grammarly for years for proofreading. It catches typos, doubled words, and extra spaces, and its newer AI-powered features sometimes make helpful suggestions for recasting awkward sentences. I’m slightly annoyed that Grammarly’s proofreading tools are so helpful, but it’s challenging to edit your own text to a professional level, and Grammarly can identify errors much faster than I can. Don’t assume that tools like Apple Intelligence’s proofreading capabilities, which help with grammar, word choice, and sentence structure, are necessarily a crutch. They may be for some people, but even people who care about their writing can benefit from some suggestions while ignoring unhelpful ones.
AI suggested a total of six changes to my piece, of which three were duplicates – adding periods to bullet-point text.
[…]
Overall, though, a really excellent job.
I am very likely underselling how valuable the new writing tools might prove to people trying to write in a second language, or who simply aren’t capable of expressing themselves well in their first language.
This guide goes over everything you can do with Writing Tools, where you can use them, and what you need to access the feature.
Previously:
- Apple Intelligence in macOS 15.2 and iOS 18.2
- What Is a Photo?
- macOS 15 Sequoia
- iOS 18
- Apple’s Hidden AI Prompts
- Beta for Apple Intelligence in Apple Mail
- The First Apple Intelligence Beta
- Apple Intelligence for Siri in Spring 2025
- Apple Intelligence Announced
Update (2024-10-28): Apple (Hacker News):
Apple today announced the first set of Apple Intelligence features for iPhone, iPad, and Mac users is now available through a free software update with the release of iOS 18.1, iPadOS 18.1, and macOS Sequoia 15.1. Apple Intelligence is the personal intelligence system that harnesses the power of Apple silicon to understand and create language and images, take action across apps, and draw from personal context to simplify and accelerate everyday tasks while taking an extraordinary step forward for privacy in AI. Today marks the availability of the first set of features, with many more rolling out in the coming months.
It’s unquestionable that Apple is putting its weight behind these efforts, but what’s been less clear is just how effective and useful these tools will be. Perhaps unsurprisingly, for anybody who has used similar generative AI tools, the answer is a definite maybe.
[…]
Despite Apple’s marketing of a new and improved Siri, the voice assistant hasn’t changed that much with this first set of Apple Intelligence capabilities. The most obvious “new feature” is actually a new look: on iOS and iPadOS, instead of the little glowing orb that used to indicate Siri had been activated, you’ll now see a colorful wash over the entire screen, accompanied by a “ripple” effect.
Update (2024-10-30): Kirk McElhearn:
These writing tools could be useful for non-native speakers or people with limited writing skills. But for any serious writing, they are limited and problematic. Someone in a hurry may accept rewrites without checking and later discover that their text has been corrupted, their style flattened, and their message obfuscated.
iA:
Apple’s Intelligence misses a crucial step in the process. Writing Tools simply replace your original text, so you can’t see the edits. The more you use it, the more you risk losing control over what you wrote.
Update (2024-10-31): Joe Rosensteel:
- Clean Up (YIKES)
- Pixelmator (yuck)
- Retouch (ok, and what I used originally)
- Lightroom (many removal options that all seem viable)
Regarding Malik's breakdown of Siri usage, I have 9 devices registered with Apple and only one of them gets used for Siri, and that's for a handful of timers being set and math calculations every day. I've got dozens of friends and family who use me as tech support, most of whom have multiple devices, and I've never heard of any of them ever using Siri.
No doubt, Siri is feeble, as are the other voice assistants out there. That’s kind of okay, since it seems to be where the technology is at; what stands out to me is the over-marketing they’ve all been accorded. It doesn’t make sense to me that voice assistants have received this level of promotion from manufacturers when their limitations and unreliability are inescapable.
@NaOH Overall, I’m pretty frustrated with Siri, but I was surprised to see the low usage numbers. I probably do about 25 very basic requests per day. I’m sure others do far more.
I am not sure about the use of AI for something that we have not had before, such as summaries. But there are several use cases that AI could vastly improve: Siri is a tool, and in my house it is a glorified light switch. Siri is not even capable of turning on my bedside light with the consistency of a light switch; sometimes it turns on all the lights instead. Another use case is the HomePod. I have an extensive music collection from all over the world and I am fed up with mispronouncing French artist names just so that Siri can find them. Ditto for autocorrection in some of the languages I speak that require the knowledge of context and grammar (unlike English, seemingly the only language Apple bases its rules on). This is a perfect playing field for AI.
So please Apple, stop trying to find new use cases. Instead use AI to improve existing ones.
I wanted to use Siri to play/stop a podcast while doing yardwork with thick winter gloves on. It didn't end up being as useful as I had imagined it would be. I gave up on it and haven't used it since.
@Old Unix Geek Yeah, I have tried Siri in situations like that, and sometimes it works, but sometimes it gets wedged and won’t let me do anything without unlocking the phone first.
Writing Tools > Concise is fine. Friendly and Professional are word salads, which is awful writing.
Proofread should be named "Fix all". A proofreader helps you understand your mistakes or offers insight to improve your writing.
These features are to written communication what Instagram filters are to photos: a proliferation of inauthentic material and embellishments from people who want to sound impressive (but usually aren't). Dunning–Krugers are going to fire off soulless texts devoid of intent or deliberation, and other Dunning-Krugers will soon hit summarize on everything they receive. What a future.
Better written communication is a lifelong pursuit that requires effort and changes in thinking. We've gone from "a bicycle for the mind" to "a rickshaw for the braindead".
"Friendly and Professional are word salads, which is awful writing."
It depends on your audience. What I perceive as "concise" often comes across as aggressive and insulting to others.
A lawyer friend of mine uses ChatGPT to "translate" her emails so they're digestible by her company's marketing team. Is it word salad, or is it just friendly, professional communication? It's in the eye of the beholder.
@Michael Tsai I'm actually surprised the number is that high. As Malik pointed out, we don't know if Apple includes erroneous Siri requests in this number, but I'd be shocked if not. Literally the only time I even try it is when I'm in the car and I cannot easily perform the task with my hands. This results in a burst of requests, because it usually takes me 5-6 tries to get Siri to play the album I want or to get my message right. If others' experience is similar, it means the number of actual successful interactions with Siri should be about 0.3 billion per day, or 1 in 7 active devices.
@kku It probably counts as a request if you receive an iMessage in the car, Siri asks whether you want to reply, and you say “No.” If you say “Yes,” speak your reply, and then confirm, that’s probably three requests.
>The first version of Apple Intelligence, which has been in beta testing for a few months now and is rolling out broadly next week, is pretty underwhelming. There’s just not much there. Not a lot beyond perhaps notification summaries that you’re going to be using all the time.
The notification summaries (and e-mail summaries, in Apple's Mail app) I've found to be mostly good, and a neat, useful feature.
There was _one_ case, though, where the summary got the meaning wrong. Summary:
"(Their name) requests to connect with (my name) on LinkedIn."
This is almost correct, except for one critical error: no, they didn't. The actual mail starts:
"Do you know (their name)? Request to connect with members you know and trust"
IOW, it flipped the meaning of "requests to connect".
But, to MG's point, _most_ Apple Intelligence features just aren't a big deal to me. And maybe that's fine. I do like that Apple continues to put an emphasis on privacy. Whether you use a service like Grammarly (it amazes me what kinds of people are OK with doing that) or have your writing analyzed and improved _locally_ is a huge difference.
>I probably do about 25 very basic requests per day.
Huh.
For years, I did, what, one request a month?
Now that macOS has easily accessible type to Siri, though, I enjoy doing (tap cmd twice) "timer three minutes" (tap return). Done. Timer. Without having to say anything or move my hands away. The one annoyance is that the Siri prompt first shows on one display, then moves to another.
For Sören and anyone else... I don't have Type to Siri to try this as my macOS predates that capability, but on iOS "Timer three minutes" can be reduced to "Three minutes" and it knows to set a timer.*** Maybe worth a try on the Mac.
***I encountered this Siri inconsistency for the first time the other day. Any of these statements to iOS Siri will set a corresponding timer:
Five minutes
Thirty minutes
Half an hour
One-and-a-half-hours
Ninety minutes
But once you hit 2.5 hours, saying "Two-and-a-half hours" (or any longer time with some number of hours spoken along with a fractional amount) elicits a reply from Siri asking what you'd like to convert. Switching to saying "150 minutes" does set a timer, and though I have a good enough memory to remember this, the inconsistency is weird.
It would be neat if there were a toggle to enable only the Siri/AI features that can be processed on-device. I know Apple makes a big deal about privacy, but they've designed the features to make it nebulous whether a given feature — or a given invocation of a feature — accesses the cloud.
The Siri request numbers are probably inflated, as you regularly have to ask Siri multiple times, rephrasing until it does what you intended, particularly when asking for specific songs or how much time is left on a timer.