Adam Engst:
However, the real win in centralizing newsreading in email has come from RSS-to-email services. I’ve tried numerous RSS readers over the years but have never settled down with one because they require me to devote specific time to reading news. That requires remembering to do so and switching context. I actively want to see what’s new in my email every morning and throughout the day, but I never even think to launch an RSS reader. I have the same issue with Apple News, which languishes on my Mac and iPhone for weeks or months between launches. By employing an RSS-to-email service, I can have new posts from blogs and other sites that provide RSS feeds appear in my email automatically.
Which one to use? I’ve been testing three: Blogtrottr, Feedrabbit, and Follow.it. Although the interfaces vary a bit, the basics are similar—enter a feed URL, configure a few options, and then sit back and receive an email for each new post. Each of these services offers a free account with paid upgrades that remove limits and provide additional features. Here’s how they compare.
I prefer an actual RSS reader, since I find it more efficient for following large numbers of feeds. However, I can see the appeal of having both RSS and e-mail news in the same app, and e-mail is nice in that it naturally creates an offline archive that can be searched later.
Alas, it does not look like any of these services supports non-RSS sites like Facebook and Twitter.
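The mechanism these services share—poll a feed, detect entries you haven’t seen, send one email per post—is simple enough to sketch. This is a minimal illustration using only the standard library, not any of the three services’ actual implementations; the sample feed, the recipient address, and printing instead of an SMTP send are all assumptions for the demo.

```python
import xml.etree.ElementTree as ET
from email.message import EmailMessage

def new_entries(feed_xml, seen_links):
    """Parse RSS 2.0 XML and return items whose links haven't been emailed yet."""
    root = ET.fromstring(feed_xml)
    items = []
    for item in root.iter("item"):
        link = item.findtext("link", "")
        if link and link not in seen_links:
            items.append({
                "title": item.findtext("title", "(untitled)"),
                "link": link,
                "description": item.findtext("description", ""),
            })
    return items

def to_email(entry, recipient):
    """Build one email message per new post, as the RSS-to-email services do."""
    msg = EmailMessage()
    msg["Subject"] = entry["title"]
    msg["To"] = recipient
    msg.set_content(f'{entry["description"]}\n\n{entry["link"]}')
    return msg

# Demo with a tiny hand-written feed (hypothetical URL and content).
FEED = """<rss version="2.0"><channel><title>Example Blog</title>
<item><title>Hello</title><link>https://example.com/1</link>
<description>First post.</description></item>
</channel></rss>"""

seen = set()
for entry in new_entries(FEED, seen):
    seen.add(entry["link"])
    print(to_email(entry, "me@example.com")["Subject"])  # a real service would hand this to SMTP
```

A real service would persist `seen` between polling runs and deliver via an outgoing mail server; everything else is bookkeeping.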
Previously:
E-mail Facebook Mailing Lists RSS Twitter Web
Matthew Ball (Hacker News):
The Vision Pro is arguably the highest-profile and most important device debuted by Apple since the iPhone in January 2007. The company spent more time (eight years versus the iPhone’s three) and money (see point #2) developing the device than any other in its history. The Vision Pro is clearly the most ambitious of their product launches since the iPhone, the first to be wholly developed under the purview of CEO Tim Cook (though various head-mounted display prototypes were underway as early as 2006), and reporting suggests that its viability was controversial internally (with some employees arguing that Head-Mounted Displays (“HMDs”) impart harm by isolating their wearers from other people and, ultimately, the world around them).
[…]
In the very sentence before Apple announced the price of the Vision Pro at WWDC23, Rockwell explained—rationalized—that “If you purchased a new state-of-the-art TV, surround sound system, powerful computer with multiple high-definition displays, high-end camera, and more, you still would not have come close to what Vision Pro delivers.” Given this, we have to evaluate the Vision Pro with the fullest of expectations. And to that end…
[…]
A few months later, there is a wider understanding that while Apple has built some brilliant technology (inclusive of software and hardware), much of its relative spectacle stemmed from the high-end components Apple chose to use and which Meta has thus far opted against.
[…]
EyeSight was not wholly unique—Meta had even publicly demonstrated a similarly minded prototype in 2021—but culturally, it seemed uniquely Apple. When marketing the Apple Watch, for example, Cook had emphasized the way it reduced digital isolation by keeping users from pulling out their phones and tilting their heads down to it. In time, we may come to consider EyeSight (or similar technologies) essential to the mainstream adoption (and, further, use) of HMDs. Thus far, however, the feature seems like a costly mistake.
[…]
The Vision Pro is best-in-class when it comes to “spatial mapping” of real-world environments. Its passthrough functionality is also best-in-class in latency, precision, and image quality. It was also important to Apple that the device be seen as a “mixed-reality” or “spatial computing” device, not a virtual reality one. At the same time, the device is, functionally speaking, a virtual reality device.
Sylvia Varnham O’Regan and Wayne Ma (via Slashdot, MacRumors, Hacker News):
Meta Platforms has canceled plans for a premium mixed-reality headset intended to compete with Apple’s Vision Pro, according to two Meta employees.
Meta told employees at the company’s Reality Labs division to stop work on the device this week after a product review meeting attended by Meta CEO Mark Zuckerberg, Chief Technology Officer Andrew Bosworth and other Meta executives, the employees said.
Ryan Christoffel:
Apple’s Vision Pro seems to have scared Meta off from entering the premium headset market. But in this case, that’s not exactly a win.
Previously:
Update (2024-09-13): Adam Engst:
Nothing I’ve read about the Vision Pro, nor my in-person demo at an Apple Store several months ago, has made me wish I had bought one. The hardware is impressive, and it works largely as advertised, though I was highly perturbed by several of the spatial photos and videos that put me too close to their subjects, making me feel like I was invading their personal space. I’m sure others have different opinions and experiences, but I still can’t see where a Vision Pro would fit into my current world of computing and media consumption.
Ryan Christoffel:
It’s too early to call the Vision Pro a success or flop, but to mark six months, I’d like to explore what the device’s success ultimately hinges on. And I think it all comes down to Apple’s own words: ‘spatial computer.’
[…]
Apple needs to prove that the Vision Pro is actually a computer. And one that does computer-y things better than traditional alternatives.
Update (2024-09-23): Michael Love:
I’ve become convinced that the original sin of Vision Pro was not allowing developers access to the camera, not even with, say, a permission dialog every time the app starts using it.
I’m skeptical that the Vision Pro would have become a rousing success even if that wasn’t the case, but to the extent that the project was relying on developers giving it a reason to exist, a whole lot of potential reasons were closed off by that decision.
Previously:
Apple Vision Pro Mac Virtual Display visionOS visionOS 1
Juli Clover:
Apple today announced the launch of a Podcasts on the web feature, which works in Safari, Chrome, Edge, and Firefox on Macs, PCs, and other devices. Podcasts on the web allows users to search for, browse through, and listen to podcasts with access to the Up Next queue and library when signed in to an Apple Account.
John Voorhees:
The UI is essentially the same as Apple’s native app but with the added flexibility of working on non-Apple devices.
[…]
Links opened on Apple devices will open in the native Podcasts app and in the browser on other devices, although on the Mac, it is possible to play episodes in a browser if you prefer.
John Gruber:
The only use case for something like this is for users who spend a lot of time on Windows — presumably at work — and wish they could listen to their own podcast queue. That’s a big use case though!
I continue to use Overcast, but I’m considering adding Apple Podcasts as a second app to manage podcasts that we listen to in the car as a family. That would let me keep the subscription lists separate, and perhaps the Web version would make it possible to make additions from other devices (alas, not from iOS devices) that aren’t logged into my account. Of course, it would be better to have actual family sharing support within the Podcasts app. And it’s still clunky and doesn’t support OPML.
Adam Engst:
Those who don’t wish to sign in can listen to millions of free podcasts, browse Top Charts, and take advantage of Apple’s editorial collections. Signing in with your Apple ID gives you access to your Library, Up Next Queue, and subscriptions. Signed-in users can also follow shows and save play progress.
I must admit some curiosity as to why Apple has suddenly started producing Web versions of some of its apps and services. Nothing prevented Apple from doing this years ago—Google and Spotify have produced capable Web apps for ages.
Tim Hardwick:
Apple Podcasts, once the dominant platform for podcast listening, is experiencing a significant decline in popularity as competitors like YouTube and Spotify gain ground, according to a recent study by Cumulus Media and Signal Hill Insights.
[…]
YouTube is now the most popular platform for podcast consumption in the United States, with 31% of respondents reporting it as their primary choice. Spotify follows at 21%, while Apple Podcasts has dropped to third place with only 12% of the market share.
This is in stark contrast to Apple’s position just a few years ago. In July 2019, 29% of weekly podcast listeners primarily used Apple Podcasts.
Previously:
Update (2024-09-12): M.G. Siegler:
Well beyond the obvious element – video, more on this in a moment – a big part would seem to be discovery. That is, YouTube, as you might expect given the parent company, is a great search engine for content. Apple is... well, Apple.
Update (2024-09-13): My initial experience using Apple Podcasts, the iOS app, was a mixed bag. It was nice to have a separate app with separate podcasts for family consumption. But the interface for browsing and downloading individual episodes is clunky, and CarPlay integration was unreliable: the displayed playback position frequently got out of sync with the audio, and playback sometimes restarted at the beginning of the episode. That has never happened to me with Overcast.
Previously:
Apple Podcasts Car CarPlay Family Sharing OPML Podcasts Web
Hartley Charlton:
A Reddit user discovered the pre-prompt instructions embedded in Apple’s developer beta for macOS 15.1, offering a rare glimpse into the backend of Apple’s AI features. They provide specific guidelines for various Apple Intelligence functionalities, such as the Smart Reply feature in Apple Mail and the Memories feature in Apple Photos. The prompts are intended to prevent the AI from generating false information, a phenomenon known as hallucination, and ensure the content produced is appropriate and user-friendly.
Andrew Cunningham:
The files in question are stored in the /System/Library/AssetsV2/com_apple_MobileAsset_UAF_FM_GenerativeModels/purpose_auto folder on Macs running the macOS Sequoia 15.1 beta that have also opted into the Apple Intelligence beta. That folder contains 29 metadata.json files, several of which include a few sentences of what appear to be plain-English system prompts to set behavior for an AI chatbot powered by a large-language model (LLM).
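Anyone on the 15.1 beta can inspect these files themselves, since they are plain JSON. A minimal sketch of doing so follows; note that the exact schema of the `metadata.json` files is undocumented, so rather than guessing the key the prompts live under, this simply collects long string values anywhere in each file. The word-count heuristic is my assumption, not Apple’s format.

```python
import json
from pathlib import Path

# Path quoted by Cunningham; only present on Macs running the macOS 15.1 beta
# with the Apple Intelligence beta enabled.
ASSET_DIR = Path("/System/Library/AssetsV2/"
                 "com_apple_MobileAsset_UAF_FM_GenerativeModels/purpose_auto")

def extract_prompts(asset_dir):
    """Collect string values that look like prompt text from metadata.json
    files under asset_dir. Walking every value avoids depending on the
    undocumented key names."""
    prompts = []
    for meta in sorted(asset_dir.rglob("metadata.json")):
        stack = [json.loads(meta.read_text())]
        while stack:
            node = stack.pop()
            if isinstance(node, dict):
                stack.extend(node.values())
            elif isinstance(node, list):
                stack.extend(node)
            elif isinstance(node, str) and len(node.split()) > 8:
                prompts.append((meta, node))  # long strings are likely prompts
    return prompts

if __name__ == "__main__":
    if ASSET_DIR.is_dir():
        for path, text in extract_prompts(ASSET_DIR):
            print(f"--- {path}\n{text}\n")
```

The files are read-only (they live on the sealed system volume), which matches Davis’s note that you can’t alter them.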
Wes Davis (Mastodon):
They show up as prompts that precede anything you say to a chatbot by default, and we’ve seen them uncovered for AI tools like Microsoft Bing and DALL-E before. Now a member of the macOS 15.1 beta subreddit posted that they’d discovered the files containing those backend prompts. You can’t alter any of the files, but they do give an early hint at how the sausage is made.
Nick Heer:
But, assuming — quite fairly, I might add — that these instructions are what underpins features like message summaries and custom Memories in Photos, it is kind of interesting to see them written in plain English. They advise the model to “only output valid [JSON] and nothing else”, and warn it “do not hallucinate” and “do not make up factual information”.
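The “only output valid JSON and nothing else” instruction makes sense from the consuming side: JSON output is machine-checkable, so the client can reject a malformed reply instead of displaying it. A hedged sketch of what such validation might look like—the function name, the Markdown-fence workaround, and the return-None-and-retry policy are my assumptions, not Apple’s code:

```python
import json

def parse_model_output(raw):
    """Validate that a model reply obeys an 'only output valid JSON' prompt.
    Returns the parsed object, or None so the caller can retry or fall back
    (e.g., skip a smart-reply suggestion rather than show garbage)."""
    raw = raw.strip()
    # Models sometimes wrap JSON in a Markdown fence despite instructions.
    if raw.startswith("```"):
        raw = raw.strip("`").removeprefix("json").strip()
    try:
        return json.loads(raw)
    except json.JSONDecodeError:
        return None

print(parse_model_output('{"replies": ["Sounds good!", "See you then."]}'))
print(parse_model_output("Sure! Here are some replies..."))  # returns None: reject
```

Prompt-level constraints reduce how often this fallback path is hit; they can’t eliminate it, which is why the validation layer exists at all.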
Dare Obasanjo:
I find it fascinating that what were science fiction tropes from Asimov’s “I, Robot” series of books are now real.
Telling AI to perform tasks and not make stuff up is the new programming.
Steve Troughton-Smith:
Apple’s system prompts for Apple-Intelligence-backed features show that the company’s ‘special sauce’ is just a carefully-crafted paragraph of text, hacked together just like everybody else is doing. Can’t wait to see the ‘you are Siri’ system prompt.
Tony West:
You are Siri. On HomePod devices, you pop up with “uhuh?” randomly. You start playing music without warning because you thought you heard someone ask for it. If someone asks you about a sports event on today, give them a detailed answer about the event from (perform random number calculation) years ago, but tell them you can’t display information on the current event.
Steve Troughton-Smith:
I guess this isn’t common knowledge, based on the reaction to the Apple Intelligence system prompts, but I read months ago that it was benchmarked that using ‘please’ and ‘thank you’ and telling an LLM not to hallucinate ‘improves results’. If that kind of language has made it into Apple’s own prompts, it’s likely not for no reason.
And no, telling it not to hallucinate isn’t going to stop it hallucinating. But if it on average improves a meaningful % of results, it’s worth including. This is how prompt engineering works.
Previously:
Apple Intelligence Apple Mail Artificial Intelligence iOS iOS 18 Mac macOS 15 Sequoia Photos.app Siri Writing Tools