Monday, January 29, 2024

Apple Podcasts Transcripts


This spring [with iOS 17.4], Apple is introducing transcripts on Apple Podcasts, making it easier for anyone to access podcasts.

With transcripts, your audience can read the full text of an episode, search the episode for a specific word or phrase, and tap the text to play from that point in the episode. As an episode plays, each word is highlighted, making it easy to follow along. Transcripts can also be accessed from the episode details page. Touch and hold a podcast episode to reveal an option to view a transcript.

Apple automatically generates transcripts after a new episode is published. Your episode will be available for listening right away, and the transcript will be available shortly afterwards.


When you submit your show to Apple Podcasts, transcripts will be automatically created. If you would like to provide your own transcripts, you can change the transcript setting for your show in the Availability tab on your show page in Apple Podcasts Connect.

John Voorhees:

I’ve experimented with OpenAI’s Whisper for creating transcripts of MacStories’ podcasts, and although the results are good enough for creating a searchable episode database for our internal use, they haven’t been good enough to publish. As a result, I’m very keen to see how well Apple’s solution works. If they prefer, podcasters will be able to upload their own transcripts, too.

The transcripts generated by Apple are saved as VTT files — WebVTT is a W3C standard for displaying timed text via HTML5’s track element. I looked at AppStories, and sure enough, there’s a transcript available for the latest episode already. As one of the show’s creators, I can access, download, edit, and re-upload the transcript. Based on my preliminary scan of the latest episode, the transcription is very good, including timestamps and identification of each speaker, although not by name, which isn’t surprising.
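As a rough illustration of what a WebVTT transcript looks like, here is a minimal sketch that parses a hypothetical cue list with Python’s standard library. The sample cues, the speaker voice tags, and the `parse_cues` helper are my own assumptions modeled on the W3C WebVTT format, not Apple’s actual output, which may use different cue settings.

```python
import re

# Hypothetical sample in the WebVTT format (W3C standard).
# Speaker identification uses <v ...> voice tags; Apple's real
# files may structure speaker labels differently.
SAMPLE_VTT = """WEBVTT

00:00:00.000 --> 00:00:04.500
<v Speaker 1>Welcome back to the show.

00:00:04.500 --> 00:00:09.000
<v Speaker 2>Thanks, happy to be here.
"""

# One cue: start --> end, then an optional voice tag and the cue text.
CUE_RE = re.compile(
    r"(\d{2}:\d{2}:\d{2}\.\d{3}) --> (\d{2}:\d{2}:\d{2}\.\d{3})\n"
    r"(?:<v ([^>]+)>)?(.+)"
)

def parse_cues(vtt: str):
    """Return (start, end, speaker, text) tuples for each cue."""
    return CUE_RE.findall(vtt)

for start, end, speaker, text in parse_cues(SAMPLE_VTT):
    print(f"{start} [{speaker or 'unknown'}] {text}")
```

Because each cue carries its own start and end timestamps, a player can highlight the current cue as audio plays and seek to a cue’s start time when its text is tapped — which is presumably how the follow-along and tap-to-play features described above work.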

Ben Thompson:

Apple’s new ToS for podcasts claims that you can opt out of transcripts, but there is no option to do so in Podcasts Connect.

It seems to be a per-episode setting, though that sounds like it creates a race condition: you would have to wait long enough for Apple Podcasts to see the episode so that you could opt out, but not so long that the episode and transcript have already been distributed to listeners.

I carefully consider every word I write down. Podcasts are a freer medium in part because they are not screen-shottable.

I don’t think that will be the case long-term, regardless of what Apple does here.

Jason Snell:

Speaking as a prolific podcaster, I’m really happy that Apple has provided this feature because it dramatically improves the accessibility of our podcasts. Transcription technology has only recently gotten good enough to make automated transcripts readable, but the ideal solution to the problem was always to have platform owners like Apple build in this technology themselves.

Update (2024-03-08): Jason Snell:

As was foretold back in January, with the release of iOS 17.4, Apple’s Podcasts app now supports podcast transcripts. This is a pretty big breakthrough in terms of access to podcast content and accessibility of podcasts to audiences who might not be able to listen.


Apple’s not just running that podcast through a standard transcription engine like the one I use to generate transcripts on my Mac, but one that’s been built to detect some detailed information about how the podcast is structured.


The only thing that’s really missing is support for private podcast feeds, which is where most members-only versions of podcasts live these days.

Update (2024-04-11): Marcin Krzyzanowski:

The Apple Podcasts app got text transcripts, but Voice Memos did not?

Update (2024-06-18): John Voorhees:

Ari Saperstein, writing for The Guardian, interviewed Ben Cave, Apple’s global head of podcasts, and Sarah Herrlinger, who manages accessibility policy for the company, about Apple Podcasts transcripts. […] “Apple’s journey to podcast transcripts started with the expansion of a different feature: indexing.”

Matt Birchler:

My one big hope for this is that it can one day get the ability to transcribe your custom feeds. As it stands today, transcripts will not appear for shows whose feed URL you manually enter into the app. I’m guessing this is because the transcript processing happens on Apple’s servers, and feeds you add yourself never touch Apple’s servers, so Apple doesn’t know about them. If they could move transcription on-device, as they do for Voice Memos in iOS 18, that would be a great feature for people like me who have a few Patreon feeds that don’t get the benefit of this awesome feature today.

2 Comments

"I don’t think that will be the case long-term, regardless of what Apple does here."

It's already not: I'm deaf, and podcasts are increasingly accessible to me thanks to tools like Whisper. For podcasts I'm interested in reading, like some of the recent ATPs talking about Apple's DMA changes, I'm already at the point where I'll just snag the MP3 from whatever source and run it through MacWhisper. The resulting quality is more than good enough to follow, even with the small or medium models.

I'm very much looking forward to this change in iOS 17.4, and it will probably get me to actually install and use the Podcasts app.

> > I carefully consider every word I write down. Podcasts are a freer medium in part because they are not screen-shottable.

> I don’t think that will be the case long-term, regardless of what Apple does here.

I've never really cared for what I've seen of Ben Thompson, but for someone who runs a subscription website about the technology industry, that is an unbelievably head-in-the-sand quote.

This has been possible for years with any random "AI" transcription app, not to mention the more costly option of having a transcription service do it.
