Wednesday, May 17, 2023

Apple’s 2023 Accessibility Feature Preview

Apple (Hacker News):

Coming later this year, users with cognitive disabilities can use iPhone and iPad with greater ease and independence with Assistive Access; nonspeaking individuals can type to speak during calls and conversations with Live Speech; and those at risk of losing their ability to speak can use Personal Voice to create a synthesized voice that sounds like them for connecting with family and friends. For users who are blind or have low vision, Detection Mode in Magnifier offers Point and Speak, which identifies text users point toward and reads it out loud to help them interact with physical objects such as household appliances.

[…]

With Live Speech on iPhone, iPad, and Mac, users can type what they want to say to have it be spoken out loud during phone and FaceTime calls as well as in-person conversations. Users can also save commonly used phrases to chime in quickly during lively conversation with family, friends, and colleagues. Live Speech has been designed to support millions of people globally who are unable to speak or who have lost their speech over time.

For users at risk of losing their ability to speak — such as those with a recent diagnosis of ALS (amyotrophic lateral sclerosis) or other conditions that can progressively impact speaking ability — Personal Voice is a simple and secure way to create a voice that sounds like them.

Will there be a way to export your Personal Voice so that you aren’t totally reliant on iCloud to preserve it? Many of these users will not be able to just re-record new prompts if something goes wrong or if they need to switch to a different Apple ID.

Emma Roth (via Hacker News):

Additionally, Apple is introducing streamlined versions of its core apps as part of a feature called Assistive Access meant to support users with cognitive disabilities. The feature is designed to “distill apps and experiences to their essential features in order to lighten cognitive load.” That includes a combined version of Phone and FaceTime as well as modified versions of the Messages, Camera, Photos, and Music apps that feature high contrast buttons, large text labels, and additional accessibility tools.

[…]

As an example, Apple says a user can aim their device’s camera at a label, such as a microwave keypad, which the iPhone or iPad will then read aloud as the user moves their finger across each number or setting on the appliance.
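Apple hasn’t described how Point and Speak is implemented, but the basic text-recognition-plus-speech pipeline is already available to developers through Vision and AVFoundation. Here’s a minimal sketch of that general approach, not Apple’s actual code; the LabelReader type is hypothetical, and the real feature presumably layers camera, LiDAR, and finger tracking on top inside Magnifier’s Detection Mode:

```swift
import AVFoundation
import CoreGraphics
import Vision

// Illustrative only: recognize printed text in a still image and read it aloud.
// Apple's Point and Speak presumably combines something like this with live
// camera input and finger tracking in Magnifier's Detection Mode.
final class LabelReader {
    private let synthesizer = AVSpeechSynthesizer()

    func readText(in image: CGImage) {
        let request = VNRecognizeTextRequest { [weak self] request, error in
            guard error == nil,
                  let observations = request.results as? [VNRecognizedTextObservation]
            else { return }
            // Take the top candidate string from each detected text region.
            let lines = observations.compactMap { $0.topCandidates(1).first?.string }
            self?.speak(lines.joined(separator: ", "))
        }
        request.recognitionLevel = .accurate

        let handler = VNImageRequestHandler(cgImage: image, options: [:])
        try? handler.perform([request])
    }

    private func speak(_ text: String) {
        guard !text.isEmpty else { return }
        let utterance = AVSpeechUtterance(string: text)
        utterance.rate = AVSpeechUtteranceDefaultSpeechRate
        synthesizer.speak(utterance)
    }
}
```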

Shelly Brisbin:

Photos and Music each display their contents in a grid that’s “flatter” in structure than the hierarchical interfaces the standard versions of those apps offer.

Assistive Access is the closest Apple has come to an interface designed specifically for people with disabilities or elders—an option that Android has offered via its support for alternative launchers. It will be interesting to see if it’s full-featured enough to not only support users with cognitive disabilities, but also offer a “grandparent-friendly” experience for those trying to choose between an iPhone and an Android phone.

[…]

Last year’s accessibility preview featured a handful of enhancements for hearing aid owners who use an iPhone. This year, Apple says support for Made for iPhone Hearing Aids is coming to the Mac. That’s been a long time coming. You’ll need an M1 or better Mac to make the connection, though.


Update (2023-05-19): Mr. Macintosh:

In a few of the images, System Settings has a forward button?

Joe Rossignol:

Those with an iPhone, iPad, or newer Mac will be able to create a Personal Voice by reading a randomized set of text prompts aloud until 15 minutes of audio has been recorded on the device. Apple said the feature will be available in English only at launch, and uses on-device machine learning to ensure privacy and security.
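Recording and training happen on the device; what’s not yet clear is whether third-party apps, such as AAC software, will be able to speak with a Personal Voice at all. If Apple were to expose it through the existing AVSpeechSynthesizer stack, usage might look something like the sketch below. The personal-voice authorization and voice-trait calls shown here are assumptions, not anything Apple has announced for developers:

```swift
import AVFoundation

// Hypothetical sketch: speak a phrase with the user's Personal Voice, assuming
// Apple surfaces personal voices via AVSpeechSynthesisVoice behind an explicit
// per-app authorization prompt. These personal-voice APIs are not confirmed.
final class PersonalVoiceSpeaker {
    private let synthesizer = AVSpeechSynthesizer()

    func speak(_ text: String) {
        AVSpeechSynthesizer.requestPersonalVoiceAuthorization { [weak self] status in
            guard status == .authorized else { return }  // the user must opt in
            // Prefer a voice flagged as a Personal Voice; fall back to a system voice.
            let personalVoice = AVSpeechSynthesisVoice.speechVoices()
                .first { $0.voiceTraits.contains(.isPersonalVoice) }
            let utterance = AVSpeechUtterance(string: text)
            utterance.voice = personalVoice ?? AVSpeechSynthesisVoice(language: "en-US")
            self?.synthesizer.speak(utterance)
        }
    }
}
```

Even if something like this materializes, the voice would still live inside the system speech stack rather than as a portable asset, so the export question above would remain open.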

Harry McCracken:

People who create Personal Voices will get to judge for themselves how well the company met this goal. With audio samples I heard, reproducing a variety of distinct voices, the intonation and pacing could be a bit flat, as computerized speech tends to be. Overall, though, they were impressive, entirely distinct from each other, and certainly worlds apart from the one-voice-fits-all feel of most of the synthesized speech in our lives.

[…]

Those creating voices for later use will presumably want to sync them to their iCloud account for eventual access on devices they may not yet own. But that process only happens at their express instruction, and the data is encrypted on Apple’s servers.

So backup is opt-in, and there’s no mention of exporting.

John Voorhees:

To get a better sense of what some of this week’s announcements mean, I spoke to David Niemeijer, the founder and CEO of AssistiveWare, an Amsterdam-based company that makes augmentative and alternative communication (AAC) apps for the iPhone and iPad, including Proloquo, Proloquo2Go, and Proloquo4Text. Each app addresses different needs, but what they all have in common is helping people who have difficulty expressing themselves verbally.

3 Comments

All I want for my parents is the ability to significantly delay or disable long-press gestures.

iPads used to be the bee's knees for seniors. Now my mother can't spend more than ten seconds with her iPad before inadvertently bringing up context menus that offer options she will never, ever use. Her iPad is clogged with hundreds of Mail.app windows, which breaks e-mailing photos via the Share sheet (you can tap the Mail icon but nothing will happen).

Scrolling through websites often brings up Safari's stupid preview box.

Yes, I know about the Accessibility settings. Yes, you can delay the time it takes to register a tap. No, it doesn't help with long presses. Yes, I've sent feedback to Apple multiple times a year for the last two years. No, they don't care.

First they destroyed MacOS, now they've destroyed iOS for seniors.

My mother has the same problem. I went looking for a way to disable long presses to no avail. The setting to delay activation doesn't help. I've sent feedback as well.

Yeah, I just want them to fscking fix it. I know blind people aren't the only group Apple should care about, and of course I'm happy they're working on opening up the inclusive umbrella (though I do have concerns about the ethics of voice cloning, that's for another day). But VoiceOver is a marquee feature, and it's broken, particularly on macOS, to the point that I'm having trouble recommending Macs to would-be users. All the shiny in the world doesn't mean jack if it doesn't work properly. Case in point, I've used Door Detection ... once, maybe? Granted it probably looks different on glasses than on a phone. But the most obvious problem with it was that I couldn't control the rate of speech. This is the problem -- it seems more useful for collecting PR brownie points than for actually improving lives. Please Apple, stop adding shit, and fix what's already there!

See also a recent thread on Applevis.
