Thursday, June 15, 2023

Privacy and Security in macOS 14

WWDC 2023 session 10053:

Even though the photos look like they are part of your app, they are rendered by the system and only shared when selected, so the user’s photos always remain in their control.

[…]

Prior to macOS Sonoma, when a user wants to screen share their presentation in a virtual video conference, they need to grant the conferencing app permission to record the full screen via the Settings app, resulting in a poor experience and risk of oversharing.

With the new SCContentSharingPicker API, macOS Sonoma shows a window picker on your behalf where people can pick the screen content that they want to share.
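Adoption looks roughly like the sketch below (Swift, macOS 14+). The observer callbacks follow the ScreenCaptureKit API as documented; stream configuration and error handling are omitted for brevity, so treat this as a sketch rather than a complete implementation.

```swift
import ScreenCaptureKit

// Minimal sketch: let the system picker choose what to share,
// instead of requesting full screen-recording permission.
final class PickerCoordinator: NSObject, SCContentSharingPickerObserver {
    func showPicker() {
        let picker = SCContentSharingPicker.shared
        picker.add(self)          // observe the user's selection
        picker.isActive = true    // enable the picker for this app
        picker.present()          // system-rendered UI; content is only shared once picked
    }

    // The user picked (or changed) the content to share.
    func contentSharingPicker(_ picker: SCContentSharingPicker,
                              didUpdateWith filter: SCContentFilter,
                              for stream: SCStream?) {
        // Start or update an SCStream with `filter` here.
    }

    func contentSharingPicker(_ picker: SCContentSharingPicker,
                              didCancelFor stream: SCStream?) {
        // The user dismissed the picker without sharing.
    }

    func contentSharingPickerStartDidFailWithError(_ error: Error) {
        // The picker could not be presented.
    }
}
```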

[…]

Second, if you prefer to provide your own UI for creating events, there is a new add-only calendar permission, allowing your app to add events without access to other events on the calendar.
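In practice, the add-only flow is a single EventKit call (macOS 14/iOS 17+). The sketch below assumes an NSCalendarsWriteOnlyAccessUsageDescription string in the app's Info.plist; the event details are placeholders.

```swift
import EventKit

let store = EKEventStore()

func addLunchEvent() async throws {
    // Prompts once for add-only access; the app never sees existing events.
    guard try await store.requestWriteOnlyAccessToEvents() else { return }

    let event = EKEvent(eventStore: store)
    event.title = "Lunch"
    event.startDate = .now
    event.endDate = .now.addingTimeInterval(3600)
    event.calendar = store.defaultCalendarForNewEvents
    try store.save(event, span: .thisEvent)
}
```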

[…]

Without any changes on your side, macOS Sonoma will ask for permission when your app accesses a file in another app’s data container.

[…]

To do so, you can specify an NSDataAccessSecurityPolicy in your app’s Info.plist, to replace the default same-team policy with an explicit AllowList.
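As an Info.plist fragment, the opt-in looks something like this. The session names only the key and the AllowList value; how the allowed code-signing identities are then enumerated is not shown here, so check Apple's documentation for the full shape.

```xml
<!-- Info.plist fragment (sketch): replace the default same-team
     policy with an explicit allow list. -->
<key>NSDataAccessSecurityPolicy</key>
<string>AllowList</string>
```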

Apple:

In macOS 14 and later, the operating system uses your app’s code signature to associate it with its sandbox container. If your app tries to access the sandbox container owned by another app, the system asks the person using your app whether to grant access. If the person denies access and your app is already running, then it can’t read or write the files in the other app’s sandbox container. If the person denies access while your app is launching and trying to enter the other app’s sandbox container, your app fails to launch.

The operating system also tracks the association between an app’s code signing identity and its sandbox container for helper tools, including launch agents. If a person denies permission for a launch agent to enter its sandbox container and the app fails to start, launchd starts the launch agent again and the operating system re-requests access.

Jeff Johnson:

I didn’t see this [alert] the first time I ran the app, but I saw it every time I modified and re-ran the app. The reason, I discovered eventually—by remembering what I read yesterday (the above quotes)—is that the app was both sandboxed and ad hoc code signed. Ad hoc code signing is indicated by “Sign to Run Locally” in Xcode.

You’ll frequently see ad hoc signing in open source Xcode projects that are distributed on the internet, because otherwise the project would depend on the developer’s personal team and code signing certificates.

[…]

Every time I modified the app, it got a different ad hoc code signature, which is why Sonoma is complaining on subsequent launches. These cancel-or-allow style dialogs do not appear on launch for ad hoc signed apps that aren’t sandboxed, because they don’t have containers.

[…]

I said, “I don’t know yet whether there’s a way for a non-sandboxed app to preserve the granted file access across launches.” The answer appears to be no.

Previously:

Update (2023-06-19): Howard Oakley:

Until Apple tells us otherwise, I think it’s clear that nothing is changing significantly in sandboxing and notarization that would prevent hobbyists and others who aren’t developers from continuing what they do currently, nor should this hinder the distribution and use of source code.

Update (2023-06-23): macOS 14 Beta 2 Release Notes:

/usr/bin/syspolicy_check is a new command line tool to help determine if the provided macOS application will pass the current running configuration’s system policy. This includes the same checks performed by the Apple notary service and other macOS Trusted Execution layers such as codesign, Gatekeeper, XProtect, and more.

[…]

/usr/bin/gktool is a new command line tool to assess Gatekeeper Policy on applications. gktool can be called to pre-warm the system cache so users do not see the ‘Verifying…’ dialog on first launch of an application.

spctl is still there, too.
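For reference, invocations of the three tools look roughly like this (macOS 14+ only; the app path is a placeholder, and the syspolicy_check subcommand is as shown in the release notes):

```
/usr/bin/syspolicy_check distribution "/Applications/Example.app"  # notary-style policy check
/usr/bin/gktool scan "/Applications/Example.app"                   # pre-warm Gatekeeper's cache
spctl --assess --type execute "/Applications/Example.app"          # the older assessment tool
```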

Update (2023-06-29): Brian Webster:

  1. The top message in the prompt doesn’t tell the user which app’s data is being read. In this case, my NSAppDataUsageDescription string explains what PowerPhotos is reading, but there’s no actual relation between that message and what files are actually being read. In other words, a malicious app can easily lie to the user here.
  2. Unlike most of macOS’ security prompts, this one is displayed every single time the user launches your app, regardless of whether they’ve granted permission in the past. Not only is this just annoying, but this will quickly train the user to dismiss these prompts without reading them, which undercuts the whole purpose of the feature in the first place.

I’ve filed a Feedback (FB12473837) with Apple that basically suggests that these prompts should work more along the lines of the existing “Automation” prompts, which are triggered when one app sends an Apple event to another app.

Update (2023-07-26): I am getting reports that sometimes Sonoma doesn’t prompt for App Management access. It just fails the operation with a permissions error.

ODNI Report on Commercially Available Information

Byron Tau and Dustin Volz:

The vast amount of Americans’ personal data available for sale has provided a rich stream of intelligence for the U.S. government but created significant threats to privacy, according to a newly released report by the U.S.’s top spy agency.

Commercially available information, or CAI, has grown in such scale that it has begun to replicate the results of intrusive surveillance techniques once used on a more targeted and limited basis, the report found.

[…]

It represents the first known attempt by the U.S. government to examine comprehensively how federal agencies acquire, share and use commercially available data sets that are often compiled with minimal awareness by the public that its data is being collected and resold.

The report is available here.

Zack Whittaker:

The Office of the Director of National Intelligence (ODNI) declassified and released the January 2022-dated report on Friday, following a request by Sen. Ron Wyden (D-OR) to disclose how the intelligence community uses commercially available data. This kind of data is generated from internet-connected devices and made available by data brokers for purchase, such as phone apps and vehicles that collect granular location data and web browsing data that tracks users as they browse the internet.

Dell Cameron (Hacker News):

The advisers decry existing policies that automatically conflate being able to buy information with it being considered “public.” The information being commercially sold about Americans today is “more revealing, available on more people (in bulk), less possible to avoid, and less well understood” than that which is traditionally thought of as being “publicly available.”

Perhaps most controversially, the report states that the government believes it can “persistently” track the phones of “millions of Americans” without a warrant, so long as it pays for the information.

[…]

It is no secret, the report adds, that it is often trivial “to deanonymize and identify individuals” from data that was packaged as ethically fine for commercial use because it had been “anonymized” first. Such data may be useful, it says, to “identify every person who attended a protest or rally based on their smartphone location or ad-tracking records.” Such civil liberties concerns are prime examples of how “large quantities of nominally ‘public’ information can result in sensitive aggregations.”

Nick Heer:

Regulations have been slowly taking effect around the world which more accurately reflect these views. But there remains little national control in the U.S. over the collection and use of private data, either commercially or by law enforcement and intelligence agencies; and, because of the U.S.’ central location in the way many of us use the internet, it represents the biggest privacy risk. Even state-level policies — like California’s data broker law — are ineffectual because the onus continues to be placed on individual users to find and remove themselves from brokers’ collections, which is impractical at best.

Previously:

Update (2023-12-19): Joseph Cox:

A section of the Navy bought access to a tool that gave the Pentagon “global” surveillance data via an adtech company that is owned by a U.S. military contractor, according to a Navy contract obtained by 404 Media. Beyond its global scale, the document does not explicitly say what specific sort of data was included in the sale. But previous reporting from the Wall Street Journal has shown that the marketing agency and government contractor responsible are part of a supply chain of location data harvested from devices, funneled through the advertising industry, onto contractors, which then ends with U.S. government clients.

Snowden Ten Years Later

Matthew Green (in 2019, Hacker News):

Edward Snowden recently released his memoirs. In some parts of the Internet, this has rekindled an ancient debate: namely, was it all worth it? Did Snowden’s leaks make us better off, or did Snowden just embarrass us and set back U.S. security by decades? Most of the arguments are so familiar that they’re boring at this point. But no matter how many times I read them, I still feel that there’s something important missing.

[…]

And while the leaks themselves have receded into the past a bit — and the world has continued to get more complicated — the technical concerns that Snowden alerted us to are only getting more salient.

[…]

What’s harder to present in a chart is how different attitudes were towards surveillance back before Snowden. The idea that governments would conduct large-scale interception of our communications traffic was a point of view that relatively few “normal people” spent time thinking about — it was mostly confined to security mailing lists and X-Files scripts. Sure, everyone understood that government surveillance was a thing, in the abstract. But actually talking about this was bound to make you look a little silly, even in paranoid circles.

That these concerns have been granted respectability is one of the most important things Snowden did for us.

Barton Gellman (in 2020):

Someone had taken control of my iPad, blasting through Apple’s security restrictions and acquiring the power to rewrite anything that the operating system could touch. I dropped the tablet on the seat next to me as if it were contagious. I had an impulse to toss it out the window. I must have been mumbling exclamations out loud, because the driver asked me what was wrong. I ignored him and mashed the power button. Watching my iPad turn against me was remarkably unsettling. This sleek little slab of glass and aluminum featured a microphone, cameras on the front and back, and a whole array of internal sensors. An exemplary spy device.

[…]

On the Gmail page, a pink alert bar appeared at the top, reading, “Warning: We believe state-sponsored attackers may be attempting to compromise your account or computer. Protect yourself now.”

[…]

A dozen foreign countries had to have greater motive and wherewithal to go after the NSA documents Snowden had shared with me—Russia, China, Israel, North Korea, and Iran, for starters. If Turkey was trying to hack me too, the threat landscape was more crowded than I’d feared.

[…]

The MacBook Air I used for everyday computing seemed another likely target. I sent a forensic image of its working memory to a leading expert on the security of the Macintosh operating system. He found unexpected daemons running on my machine, serving functions he could not ascertain.

Via Bruce Schneier:

It’s an interesting read, mostly about the government surveillance of him and other journalists. He speaks about an NSA program called FIRSTFRUITS that specifically spies on US journalists. (This isn’t news; we learned about this in 2006. But there are lots of new details.)

Jessica Lyons Hardcastle (Hacker News):

The world got a first glimpse into the US government’s far-reaching surveillance of American citizens’ communications – namely, their Verizon telephone calls – 10 years ago this week when Edward Snowden’s initial leaks hit the press.

[…]

In the decade since then, “reformers have made real progress advancing the bipartisan notion that Americans’ liberty and security are not mutually exclusive,” Wyden said. “That has delivered tangible results: in 2015 Congress ended bulk collection of Americans’ phone records by passing the USA Freedom Act.”

[…]

Wyden also pointed to the sunsetting of the “deeply flawed surveillance law,” Section 215 of the Patriot Act, as another win for privacy and civil liberties.

That law expired in March 2020 after Congress did not reauthorize it.

[…]

One thing we do know about Section 702 is that it has been widely misused: more than 278,000 times by the FBI between 2020 and early 2021 to conduct warrantless searches on George Floyd protesters, January 6 rioters who stormed the Capitol, and donors to a Congressional campaign.

[…]

As EFF noted: “There are serious issues raised by this tool and by 12333 more broadly. Despite consistent calls for reform, however, very little has occurred and 12333 mass surveillance, using XKeyscore and otherwise, appears to continue unabated.”

Bruce Schneier:

Now, ten years later, I offer this as a time capsule of what those early months of Snowden were like.

Nick Heer:

I remember the week when articles based on these disclosures began showing up. I remember being surprised not by the NSA’s espionage capabilities — that much was hinted at — but by its brazen carelessness about operating at a scale which would ensure illegal collection. Snowden’s heroic whistleblowing gave the world a peek into this world, but it was ever so brief. There is little public knowledge of the current capabilities of the world’s most intrusive surveillance agencies — by design, of course — and even the programmes exposed by Snowden continue to be treated with extreme secrecy. My FOIA requests from that week remain open.

Previously:

Update (2023-07-05): Robert at Objective Development:

Ten years after Snowden, ten years of activism and data protection laws have not made things better – rather the opposite. We leave digital traces everywhere and they can be exploited using methods that are legal even under today’s laws. With the advent of apps, digital services, and the IoT, more and more of our lives are taking place online. AI makes it all the easier to exploit these traces. And some players don’t even care about legality.

[…]

Use tools to protect your data. Choose browsers focusing on privacy, not on features. Choose your search engine carefully — after all, you share many of your thoughts with it. Use application firewalls like Little Snitch to visualize all those data connections which normally occur under the hood and to block those connections that undermine your privacy.

Apple Execs on Facebook (2011)

Scott Forstall (Hacker News):

I just discussed with Mark [Zuckerberg] how they should not include embedded apps in the Facebook iPad app--neither in an embedded web view nor as a directory of links that would redirect to Safari.

Not surprisingly, he wasn’t happy with this as he considers these apps part of the “whole Facebook experience” and isn’t sure they should do an iPad app without them. Everything works in Safari, so he is hesitant to push people to a native app with less functionality, even if the native app is better for non-third party app features.

[…]

I had thought that it would be relatively clear which links in Facebook are app-links and which are non app-links. App-links would be things like a poker game. Web links (non app-links) would be things like the NYT. But according to Mark, there is no obvious way to distinguish between a poker game and the NYT. Both are Facebook developers and provide Facebook integration. This is also true of many bloggers. He claims they have over 100,000 developers/points of web integration.

It would be unfortunate to disallow any web post in the Facebook app, including blog posts.

Phil Schiller:

I understand why FaceBook wants to create a market of 3rd party HTML 5 apps that users run from a native FaceBook app on the iPad, and realize that they can always run FaceBook on the iPad in Safari and have these apps in Safari as well anyway, but if we approve of this (regardless of the credits issue) we would then need to allow all developers to do the same thing.

So, for example, if Adobe comes in with an app that links to new web apps that they promote, we need to allow that “app store” in; even worse, Google could come up with an app that runs all their 3rd party Chrome web apps and we would need to allow that in too! I don’t see why we want to do that.

Scott Forstall:

I agree we don’t want to open up a slippery slope for Google, Adobe, or even Amazon to start linking out to a Kindle store in Safari.

Francisco Tolmasky:

Amazing to read pages of AppStore rule minutia (FB Credits vs. what links “look like apps”) with basically zero regard for user experience (Scott comes closest by acknowledging he wouldn’t use a gutted Facebook app). Everything boils down to their position vs. other players like Adobe. I would have imagined they convinced themselves that “We can’t allow this because all those apps are junk and we want a pristine experience on the iPad!” But nope, it’s all just business.

Previously: