Archive for July 18, 2022

Monday, July 18, 2022

OmniFocus 3.13 and Voice Control

Ken Case:

OmniFocus 3.13 provides a wide range of improvements to Omni Automation—perhaps most notably adding support for Speech Synthesis, but also a number of other improvements.

With these automation enhancements, OmniFocus 3.13 can now take full advantage of the new Voice Control features offered in the latest iOS, iPadOS, and macOS releases, delivering an incredible level of voice-driven productivity.
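The Speech Synthesis support is exposed through Omni Automation’s Speech classes. Here’s a minimal sketch of what that enables—the spoken text is my own made-up example, and I’m going by Omni’s published automation documentation, so details may differ slightly:

// Have OmniFocus speak a status summary aloud.
const utterance = new Speech.Utterance("You have three tasks due today.");
utterance.voice = Speech.Voice.withLanguage("en-US"); // pick a system voice
const synthesizer = new Speech.Synthesizer();
synthesizer.speakUtterance(utterance);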

If you’re new to Apple’s Voice Control feature, it empowers control of a Mac, iPhone, and iPad entirely with one’s voice. It isn’t Siri; it’s control. Voice Control offers an enhanced command and dictation experience, giving full access to every major function of the operating system. For someone with motor limitations, Voice Control is transformative, but one doesn’t need motor limitations for it to enhance the experience of using OmniFocus.

As someone without such limitations, I’m most interested in the potential for voice interactions on my iPhone, where the lack of a physical keyboard makes many tasks feel slow and plodding. I love using my iPhone—and now my Apple Watch—to create new OmniFocus actions, and I’ve long done this by using Siri to add reminders. But I postpone as much of the other stuff as possible until I’m back at my Mac because I know it will be so much easier there. That’s not ideal because a major benefit of OmniFocus is that it lets me get stuff out of my head; having to remember which changes to apply later works against that. I find myself using Siri to make new actions to remind myself to adjust other actions—because that’s easier than making the changes directly right then.

So, my hope is that I can use these new Voice Control features as a sort of voice equivalent of keyboard shortcuts. In theory, voice can offer quick random access to commands without my having to first locate them with my eyes and then my fingers. It can also work hands-free, when my fingers are otherwise occupied or in gloves.

It’s important to note the differences between VoiceOver, Voice Control, and Siri:

Voice Control lets users control the entire device with spoken commands and specialized tools, while Siri is an intelligent assistant that lets users ask for information and complete everyday tasks using natural language.

Voice Control (iOS, Mac) happens on device, and my experience is that it’s faster and more accurate than Siri, since it’s working with a much more restricted domain of commands and doesn’t need to talk to a server. You can also freely mix it with dictation so that you can navigate within an app and enter text into fields without switching between separate listening modes, though there are separate Dictation and Command modes if you prefer not to rely on this. Voice Control itself is also a mode, which is great because you don’t have to prefix every command with “Hey Siri”. You can turn on Voice Control either in Settings or by asking Siri. Once enabled, you can toggle it by saying “Wake up” or “Go to sleep.”

The catch is that, out of the box, Voice Control only has a system-level vocabulary. You can tell it to tap a button by name or by number and dictate text into fields. But it doesn’t know about OmniFocus-specific terms such as actions, projects, or defer dates.

iOS and macOS do, however, let you add your own custom Voice Control commands, which are akin to the old Speakable Items. With this announcement, Omni has added a library of Voice Control commands that are specific to OmniFocus. And you can make your own using OmniFocus’s JavaScript API.
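For a sense of what that looks like, here’s a rough sketch of a plug-in action: a hypothetical “Defer One Day” command for the selected tasks. The metadata header and PlugIn.Action pattern follow Omni’s documented plug-in format; the names and the defer logic are my own illustration:

/*{
    "type": "action",
    "targets": ["omnifocus"],
    "author": "Example",
    "identifier": "com.example.deferOneDay",
    "version": "1.0",
    "description": "Defers the selected tasks by one day.",
    "label": "Defer One Day"
}*/
(() => {
    const action = new PlugIn.Action(function(selection) {
        // Push each selected task's defer date forward by 24 hours,
        // starting from now if it has no defer date yet.
        selection.tasks.forEach(task => {
            const base = task.deferDate || new Date();
            task.deferDate = new Date(base.getTime() + 24 * 60 * 60 * 1000);
        });
    });
    // Only enable the action when at least one task is selected.
    action.validate = function(selection) {
        return selection.tasks.length > 0;
    };
    return action;
})();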

Installing these commands on iOS is kind of awkward. For example, if I want to defer an action for 1–7 days or until a particular day of the week, I have to click 14 links to add those individual shortcuts. Then, in Settings, I have to add a custom command for each, select the shortcut, set it to only be active in OmniFocus, and type the voice phrase to trigger it. Then, the first time I invoke the shortcut, I have to confirm that, yes, I want to allow it to access OmniFocus.

Fortunately, this setup only has to be done once, and only for the commands that you plan to use. I found it easier to add the shortcuts from my Mac and then have them sync to my iPhone via iCloud. The Voice Control setup has to be done on the iPhone itself, though. True, you can skip creating Voice Control commands because the shortcuts are automatically accessible via Siri, but I think Voice Control just works better. (In trying to group the OmniFocus shortcuts into a folder, I realized that drag and drop from the list view in Shortcuts for Mac still doesn’t work. Neither does dragging a folder to the bottom of the sidebar. And you can’t sort the shortcuts alphabetically until Ventura.)

I’d like to see Apple move Voice Control in the direction of the new App Shortcuts, so that apps could simply tell the system which custom commands they offer. It’s great that users can add their own custom commands based on shortcuts, but commands provided by the app vendor should be built into the app, and I should be able to just tap a bunch of items in a list to enable or disable them. If there’s a bug to fix, this could be done once in the app instead of requiring each customer to download an updated shortcut.

Voice Control setup works much better on macOS. There, you can import and export XML files which contain lists of commands. So, instead of installing one shortcut for each day of the week and creating a Voice Control command for each, I can just import a single file that adds all 7 commands. There’s also a giant file that adds commands for all of the menu items.

Why import the commands when I already have the shortcuts that I installed for iOS? Those shortcuts do work on macOS, but to use them from Voice Control you would need to set up each one in System Preferences. (That part doesn’t sync from iOS.) So you might as well set them up with the XML file on the Mac, as that’s much easier. What’s more, custom Voice Control commands on macOS can send the command’s JavaScript directly to OmniFocus via a URL scheme. This is much faster than trampolining through the Omni Automation shortcut that passes the JavaScript along to OmniFocus. Again, it would be nice if iOS could catch up to macOS here.
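To illustrate, the script URL that a macOS Voice Control command can open looks like this. The one-line task-creating script is a toy example of mine; the omnijs-run URL format is the one Omni documents:

// A toy script to run inside OmniFocus:
const script = 'new Task("Call the plumber")';

// Percent-encode it into Omni Automation's script URL; opening this
// URL hands the JavaScript directly to OmniFocus, no Shortcuts trampoline:
const url = "omnifocus://localhost/omnijs-run?script=" + encodeURIComponent(script);
// => omnifocus://localhost/omnijs-run?script=new%20Task(%22Call%20the%20plumber%22)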

How well does it all work? Sometimes the voice command does what I expect quickly, and I feel like I’m saving time, even vs. tapping the Share button to invoke a plug-in for adjusting the defer date. It’s generally faster and more reliable than Siri. I’m excited for the possibilities of mixing commands with dictation, though it’s too early to tell whether this will become a core part of my workflow. I’ve also run into a few glitches. Sometimes Shortcuts spins for a while and then reports a timeout error, though OmniFocus does perform the command. Also, when I change the defer date from the action editor, this doesn’t get reflected in the interface until I close the editor, so it appears as though nothing happened.

Omni says the editor updates promptly in OmniFocus 4 (currently in beta). The new version also features a new list interface where you can select actions without having to enter a separate edit mode. This also opens up more possibilities for voice interactions, as you can tell it to Select Next Item and then make changes directly from the list. To me, this really shows the potential for Voice Control because it goes beyond what I could do with my finger. Not only is it hands-free, but I can also (as I’m used to on the Mac) do stuff without having to open and close the action editor. In that case, Voice Control can help overcome the limitations of both the iPhone’s software keyboard and its small screen.

Previously:

Update (2022-08-02): Automators:

Sal Soghoian takes David and Rosemary on an epic automation adventure. Starting with a look back at Automator on the Mac, and looking at the star Shortcuts developers now—before diving into Sal's latest project of custom voice control with OmniFocus and beyond.

Facebook Encrypting Links to Prevent URL Stripping

Bruce Schneier:

Some sites, including Facebook, add parameters to the web address for tracking purposes. These parameters have no functionality that is relevant to the user, but sites rely on them to track users across pages and properties.

Martin Brinkmann:

Mozilla introduced support for URL stripping in Firefox 102, which it launched in June 2022. Firefox removes tracking parameters from web addresses automatically, but only in private browsing mode or when the browser’s Tracking Protection feature is set to strict. Firefox users may enable URL stripping in all Firefox modes, but this requires manual configuration. Brave Browser strips known tracking parameters from web addresses as well.

[…]

Facebook could have changed the scheme that it is using, but this would have given Facebook only a temporary reprieve. It appears that Facebook is now using encryption to track users.

[…]

The main issue here is that it is no longer possible to remove the tracking part of the URL, as Facebook has merged it with part of the required web address.
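To make the mechanics concrete, here’s a minimal sketch of what URL stripping does, and why it fails once the tracking identifier is folded into a part of the address the site requires. The parameter list and URLs are illustrative:

// Known tracking parameters that browsers strip (fbclid is Facebook's).
const TRACKING_PARAMS = ["fbclid", "gclid", "utm_source", "utm_medium", "utm_campaign"];

function stripTracking(address) {
    const url = new URL(address);
    for (const param of TRACKING_PARAMS) {
        url.searchParams.delete(param);
    }
    return url.toString();
}

// Stripping works when the tracker is a separable query parameter:
stripTracking("https://example.com/story?id=7&fbclid=AbC123");
// => "https://example.com/story?id=7"

// But if the identifier is encrypted into a value the site requires to
// serve the page at all, there is nothing safe to delete:
stripTracking("https://example.com/story?key=ENCRYPTED_ID_AND_TRACKER");
// => unchanged; removing "key" would break the link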

Previously:

Invasive Spotlight Indexing

Lloyd Chambers:

The thoughtless design of providing no facility to defer/delay Spotlight indexing is bad enough. But to perform intensive Spotlight indexing when the user needs the machine to perform well—that is design incompetence to the point of being offensive. Ditto for when a CPU- and disk-intensive program is running, one that the user wants done as soon as possible.

Spotlight destabilizes the performance of your Mac. You just never know when you can count on things running as they should.

What I’m asking Apple to do is to add something akin to the “Stop this backup” menu item that Time Machine offers. That works great—it will defer the backup for an hour. A “Defer Indexing For…” menu command would surprise and delight me.

It’s great to have tools like TimeMachineEditor, but in my opinion there should be a built-in way to restrict both Time Machine and Spotlight to only run during certain hours and to postpone them for a specified amount of time.

Sometimes I want the Mac to be quiet, but I don’t want to turn it off because it’s still doing something important like uploading to a cloud backup. That by itself wouldn’t cause fan noise or much hard drive grinding. But sometimes a Spotlight process decides to go crazy, and then the Mac is loud for hours or even days unless I’m in a position where I can unmount the drive that it happens to be indexing.

I’ve mostly worked around this by disabling Spotlight indexing on all my spinning hard drives. However, it no longer seems to be possible (since APFS?) to exclude Time Machine drives. And, of course, my Time Machine drive has more indexable content than any single drive that I have, so there is a lot of work for Spotlight to do.

As best I can tell, the most invasive indexing is actually caused by something going wrong with the index files. Sampling the processes shows threads like com.apple.metadata.spotlightindex.Compaction that seemingly use lots of CPU and I/O forever. In such cases, I use sudo mdutil -E to delete the Spotlight index. That, of course, triggers many hours of legitimate work for Spotlight to build a new index, but then it’s eventually quiet—until the next time compaction gets stuck.