Swift Atomics 1.0
The public API of Swift Atomics is now considered source stable.
[…]
The low-level (pointer-based) atomic boolean operations are now public.
Previously:
Announcement for Mac developers:
utis.cc - where shared UTIs find a home
It’s also the home of the “.paths” file spec, which is meant to store a list of files and might be supported by any Mac app that can show a file browser or operate on a set of files.
Previously:
Zoë Schiffer (tweet, Hacker News):
Jacob Preston was sitting down with his manager during his first week at Apple when he was told, with little fanfare, that he needed to link his personal Apple ID and work account.
[…]
Three years later, when Preston handed in his resignation, the choice came back to haunt him. His manager told him to return his work laptop, and — per Apple protocol — said he shouldn’t wipe the computer’s hard drive. His initial worry had come to pass: his personal messages were on this work laptop, as were private documents concerning his taxes and a recent home loan.
[…]
Employees have been asked to install software builds on their phones to test out new features prior to launch — only to find the builds expose their personal messages. Others have found that when testing new products like Apple’s Face ID, images are recorded every time they open their phones. “If they did this to a customer, people would lose their goddamn minds,” says Ashley Gjøvik, a senior engineering program manager.
Dogfooding leads to better products, but Apple’s systems aren’t designed to do this in a privacy-preserving way. Apple work e-mail addresses can’t be used to sign up for iCloud or AppleConnect, so employees are required to use personal accounts. iOS and macOS only let you sign into one Apple ID at a time, so it’s not practical to maintain separate Apple IDs for work and personal use. Having separate devices for work and personal use is also discouraged because it gets in the way of “live-on” dogfooding.
The blurring of personal and work accounts has resulted in some unusual situations, including Gjøvik allegedly being forced to hand compromising photos of herself to Apple lawyers when her team became involved in an unrelated legal dispute.
Underpinning all of this is a stringent employment agreement that gives Apple the right to conduct extensive employee surveillance, including “physical, video, or electronic surveillance” as well as the ability to “search your workspace such as file cabinets, desks, and offices (even if locked), review phone records, or search any non-Apple property (such as backpacks, purses) on company premises.”
[…]
It might seem like a company obsessed with secrecy would be sympathetic to its employees’ wishes to have confidential information of their own. But at Apple, secrecy requires the opposite: extensive knowledge, and control, over its workforce.
It’s not clear to what extent the policies are standard corporate ones that elevate the company’s interests over the employee’s—because they can—or whether they date to Tim Cook’s 2012 doubling down on secrecy, or before. But however much Apple cares about privacy for customers, that doesn’t seem to extend to employees. Developers, too, are encouraged to attach privacy invading sysdiagnose logs to each bug report, where they live in Radar “forever.”
The legal dispute that led to turning over Gjøvik’s private data did not even involve her personally, though she is separately fighting the company over discrimination and harassment.
Wrote about the twin challenges hitting Apple simultaneously: regulators and lawmakers forcing it to release its grip on the App Store, and employees organizing and effectively demanding internal change.
Apple has never seen anything like it.
Zoe Schiffer (tweet, Lorenzo Franceschi-Bicchierai, Hacker News):
Apple employee organizing took another step this morning with the launch of a website called AppleToo. The goal is to collect stories from workers at all levels of the organization who’ve experienced harassment or discrimination.
Previously:
Update (2021-09-07): Neil Jhaveri:
The article implies coercion that I didn’t experience. For a while, I used a separate iCloud.com account for work (I did eventually go personal). The tradeoffs were made clear to me, and many coworkers didn’t enable iMessage on their work systems.
In a video broadcast to staffers days before Labor Day, Apple’s retail and people chief Deirdre O’Brien addressed the growing number of Apple employees voicing their opinions about workplace issues like pay inequality.
[…]
In the video, which was seen by MacRumors, Deirdre O’Brien tells staff who are experiencing workplace issues to talk to their managers and “business relations partner.” She says that Apple has a “confidential process to thoroughly investigate, in a way that treats everyone with dignity and respect.”
See also: Decoder.
Update (2021-09-10): Zoe Schiffer (tweet, Dell Cameron, tweet, Hacker News):
Apple has fired senior engineering program manager Ashley Gjøvik for allegedly violating the company’s rules against leaking confidential information.
Update (2021-10-15): Zoe Schiffer (Hacker News):
Apple has fired Janneke Parrish, a leader of the #AppleToo movement, amid a broad crackdown on leaks and worker organizing. Parrish, a program manager on Apple Maps, was terminated for deleting files off of her work devices during an internal investigation — an action Apple categorized as “non-compliance,” according to people familiar with the situation. The files included apps like Robinhood, Pokémon GO, and Google Drive.
Ashley M. Gjøvik (via Scott):
I think the hearing today went very well. Best case scenario was I keep all my claims; worst case is I only keep the 4 claims Apple’s not fighting - but sounds like outcome will be in the middle ground, which is great news. When they’re released I’ll share transcripts & decision!
Sarah Perez (August 2020, via Hacker News):
Apple is expanding its program that provides parts, resources and training to independent repair shops to now include support for Mac computers.
[…]
To date, however, the program was only focused on iPhone repairs — not Macs. Going forward, these repair shops and others that qualify will be able to access Apple-genuine tools, repair manuals, diagnostics, official parts and other resources they need to perform common out-of-warranty repairs on Macs, too.
[…]
The news of the program’s expansion is timely, given that Apple’s stance on consumers’ “right to repair” their own devices is one of the many topics under investigation by the U.S. House Antitrust Subcommittee.
The big question is whether independent shops can maintain a stock of parts on hand. If you have to order from the depot for each repair as it comes in, there’s not much advantage over just sending the machine to the depot.
💸 $999 to fix a MacBook Pro at Apple.
💸 $325 to fix it at an independent shop.
I spent weeks running around with broken 💻s to figure out what the “Right to Repair” is about.
It’s about giving us choice—and saving us ⏱ and 💰.
Previously:
Apple today announced a number of changes coming to the App Store that, pending court approval, will resolve a class-action suit from US developers. The terms of the agreement will help make the App Store an even better business opportunity for developers, while maintaining the safe and trusted marketplace users love. Apple appreciates the developer feedback and ideas that helped inform the agreement, and respects the ongoing judicial review process.
[…]
To give developers even more flexibility to reach their customers, Apple is also clarifying that developers can use communications, such as email, to share information about payment methods outside of their iOS app.
[…]
Apple will also expand the number of price points available to developers for subscriptions, in-app purchases, and paid apps from fewer than 100 to more than 500.
First, read this great thread from Ryan Jones.
Apple claims they are ‘clarifying’ this rule. That doesn’t look like a ‘clarification’ to me.
Indeed, the settlement says:
Permit all U.S. Developers to communicate with their customers via email and other communication services outside their app about purchasing methods other than in-app purchase, provided that the customer consents to the communication and has the right to opt out. In-app communications, including via Apple Push Notification service, are outside the scope of this provision. Apple will revise its App Store Guidelines to permit the foregoing for all app categories, including by deleting from Guideline 3.1.3 the following language: “Developers cannot use information obtained within the app to target individual users outside of the app to use purchasing methods other than in-app purchase (such as sending an individual user an email about other purchasing methods after that individual signs up for an account within the app).”
Deleting that sentence is more than a “clarification.” But it’s not much. The customer still has to opt into giving their e-mail address. The app still can’t mention payment methods or even link to its Web site from within the app. And there’s still the guideline that says other payment methods are forbidden except for in apps that “operate across multiple platforms.”
Apple’s draconian anti-steering provisions remain in place just as before.
And Apple deceitfully labelling the restitution they’ve agreed to pay as an “assistance” fund?
Apple also says that it will maintain the Small Business Program for at least three years, publish an annual report on App Review, and allow for more pricing tiers. It sounds like Apple’s concessions are pretty minor, especially since developers are still not allowed to mention alternative purchase avenues within apps.
Hagens Berman, the law firm representing the class of iOS developers affected by this suit, clarifies that even the weak commitments Apple made only apply to U.S. developers. This settlement is a walkover for Apple and a sweet payday for the lawyers involved, but gives developers next to nothing.
It also only affects iOS, not macOS.
Jay Peters, Sean Hollister, and Richard Lawler (via Nilay Patel):
Apple’s press release spins the entire settlement as a generous offer to developers, including the anti-steering change[…]
[…]
A website for eligible developers isn’t fully operational yet (and may not be until the settlement is approved by a judge), but court filings show how the money could be distributed. A proceeds tier ranging from one penny to over $1 million sorts out the potential return, with developers in line for a minimum payment of between $250 and $30,000. That will vary depending on how many people submit approved claims, with leftover funds going to the Girls Who Code nonprofit.
Best way to look at it is they are retroactively applying the small business program, but at a 30->27% rate vs the 30->15% rate today.
So you get a 3% refund on your total revenue as long as you qualified for the program retroactively.
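With hypothetical figures (the $100,000 revenue number is made up for illustration), the arithmetic of the retroactive refund versus today’s Small Business Program works out to:

```javascript
// Hypothetical developer: $100,000 in qualifying App Store revenue.
const revenue = 100000;

const standardRate = 30;      // percent Apple kept at the time
const settlementRate = 27;    // effective percent after the retroactive 3% refund
const smallBusinessRate = 15; // percent under today's Small Business Program

// The settlement refund: 3% of total qualifying revenue.
const refund = revenue * (standardRate - settlementRate) / 100;   // $3,000

// What the same developer saves going forward at the small-business rate.
const todaysSavings = revenue * (standardRate - smallBusinessRate) / 100; // $15,000
```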
How much must your class action lawyers suck if the only thing you could get out of this was “Apple will let you charge some new prices not ending in .99”
It’s so funny watching Apple PR’s song and dance, embarrassingly patting itself on the back, after yielding basically zero ground on the App Store’s most anticompetitive rules.
You know this changes nothing. We know this changes nothing. Who are you performing for?
This is a massive win for Apple - especially if it sways the judge (the same one as in this class action suit) its way in the Epic trial. They’re basically changing nothing of importance and paying “only” $100 million.
Through the Settlement, Apple acknowledges that this litigation was a factor in Apple’s January 1, 2021 launch of the Small Business Program.
Coalition for App Fairness (tweet):
Apple’s sham settlement offer is nothing more than a desperate attempt to avoid the judgment of courts, regulators, and legislators worldwide. This offer does nothing to address the structural, foundational problems facing all developers, large and small, undermining innovation and competition in the app ecosystem.
Apple’s recent “changes” do not address any of the core concerns Spotify and many others have with Apple. @horaciog explains why.
This new Apple settlement with a group of class-action legal vultures follows the arc of that iconic scene in Fight Club, in which both parties have assumed their ceremonial positions in a pointless settlement that reaffirms existing provisions and then asks for a couple of cornflower blue icons.
[…]
The plaintiffs – here being the actual lawyers in the class-action proceeding, not the interests of the developers they supposedly represent – have justified their pursuit for loot mainly by getting Apple to reaffirm existing policies. Like having them affirm that “at the request of developers, Apple has agreed that its Search results will continue to be based on objective characteristics like downloads, star ratings, …”. So part of this settlement is that Apple says it’ll continue to do search like it’s done so far, and that it won’t make it worse for users and developers by corrupting it with self-dealings and sold preferences, but only for the next three years?
I didn’t previously think the Small Business thing had an expiration date, and now it feels a bit like “nice discounted rate you’ve got there, would be a shame if something happened to it”.
Breaking: Apple has settled the butterfly keyboard lawsuit!
Key Concessions:
- Apple will stop making the 2016-2019 MBPs
- All class members may buy a new Apple laptop
- Apple will recycle old laptops
- All class members will continue to receive 5GB of free iCloud storage
Apple would like to think people are just upset about their tax rate, but it’s not just that — it’s about interfering with perfectly reasonable apps for self-serving reasons and pretending they’re protecting customer interests, and channeling ‘innovation’ down pre-approved paths
So now we have a situation where Apple has given a blessing to collect email in-app. Sure, your app has a good reason doing it, but remember all those other developers that don’t.
We’re likely to see a lot more prompts for email (or Sign In with Apple) just to harvest info.
And customers aren't stupid. They will draw a direct correlation between providing email to apps and an increase in Inbox crap.
Guess who gets hurt?
The folks who have a legitimate reason to contact customers by email.
What’s especially bizarre about Apple’s PR spin is that this settlement is supposed to be about appeasing developers, but we’re the ones who see right through the spin. Which further erodes developer sentiment.
I don’t know what happened yesterday. 🤯 Apple published this press release, and every news outlet I saw ran with a story of “big changes” to App Store rules. What’s strange is that nothing has changed, and the press release says so explicitly.
I thought I had experienced the full possible range of feelings regarding Apple and the App Store, but settling a lawsuit, promising absolutely nothing new as part of the settlement, and spinning that as some kind of big developer win, has brought me to exciting new mental places
Apple was gonna smoke them. So:
- Make a list of “changes” that looks big (but don’t actually change anything)
- And pay a “big settlement” (but really as small as possible)
- And let’s GTFO
Basically dev had nada, and Apple let them save face.
Previously:
Update (2021-09-07): Patrick McKenzie (tweet):
This is likely to be important for app developers. I think people underappreciate the magnitude of the likely impact due to the incrementality of it. It is not what developers have asked for, which is the ability to pitch users within the app itself on consummating payments out-of-band, but it also is not simply a return to the status quo ante to this summer, when Apple announced it would rigorously enforce guidance against steering transactions off-platform.
To appreciate why, remember that a large share (60% or so) of revenue on the App Store is from games, and game developers are in the business of incentivizing users to take small incremental actions within a play session. Sometimes those incremental actions involve e.g. learning the UI of your princess saving application, sometimes they advance the plot, and sometimes they directly achieve business goals for the developer.
[…]
It’s trivial to imagine game companies incentivizing different purchasing choices because this is already a core competence.
The email thing that was the one real “concession” in the Apple settlement is really something else. Basically they just rolled back a change they made a couple months ago, so it’s the same as it had been for years before that 🙃
So many glowing headlines/tweets over basically nothing 🤷🏻♂️
Personally, I don’t actually care about payment methods as a developer. I’m never going to be above the $1m threshold per year, and I think 15% is an acceptable cut.
What happens with App Review and allowing alternative stores is a more interesting question, but even then, taking everything into account, I think I still prefer my garden walled.
The problem is, as a user, you have no idea what you are missing out on having because of the walled garden, and never will while it lasts. I know I dropped my iOS app, and would never develop one that was even vaguely outside the norm because the risk factor is too high.
At this point even I’m less bothered by the 15% than by the fact that I can’t issue refunds, can’t look up previous orders, and have no way (not even via anonymized forwarding as with Sign In With Apple) to contact customers, limitations that apply to no other payment provider.
Update (2021-09-08): M.G. Siegler:
To be clear, it’s obviously the goal of every PR person around every single announcement to accentuate the positive as much as possible. No one can fault Apple for that, of course. But if this bit of reporting by Jack Nicas of The New York Times is to be believed — and I believe it is to be believed given my own experiences in such matters — Apple’s positioning and tactics were decidedly more slippery than just your standard PR spin.
MG Siegler returns to the show to talk about last week’s surprise announcement from Apple settling a class action lawsuit filed on behalf of U.S. App Store developers, and the various reactions to it.
A couple weeks ago, @jeiting and I spoke with @benthompson about all things App Store. One of the many things Ben put so well: regulation is a sledgehammer, not a scalpel. We both shared our concerns that regulation will bring about unintended consequences.
We submitted a bug fix update to Hopscotch this weekend. We wanted to get it out quickly to get ahead of the school year--schools don’t update their apps very often after downloading them.
The app was rejected because “our promoted in-app purchases had identical titles and descriptions which could be confusing to users.”
Makes sense as a guideline, except our titles and descriptions were different! Nonetheless, I changed the descriptions to be even more different than the titles. I replied to the message and resubmitted the app.
A day later, the app was rejected again. At this point, I didn’t know what to do. I was in a Kafkaesque universe where I had to blindly guess at what could be wrong and randomly change things until the bureaucrats let me through (with a one-day delay).
[…]
There’s a lot of talk about the 30% tax that Apple takes from every app on the App Store. The time tax on their developers to deal with this unfriendly behemoth of a system is just as bad if not worse.
[…]
I don’t know what’s worse: an automated system with zero human oversight continually telling me (falsely) that our app is out of compliance.
Or some person named Leo continually pressing the reject button without ever bothering to read my message because the automated system said that I was wrong.
The amount of friction we have as Apple platform developers (both in terms of developer relations and resources) far surpasses what the most profitable and mature platforms should have. It really is a shame Apple doesn’t realize the amount of built-up goodwill it erodes daily.
App Tracking Transparency rules seem a mess. We had an app rejected for claiming its business model was enticing the user into tracking (it wasn’t). We had 3 calls with the App Review team who eventually agreed and then expedited the review. That was 2 months back. And today, we submitted a minor bug fix. Guess what? Rejected for the same reason as a couple of months back.
I appealed.
“A representative from the app review team will call you in 3 to 5 business days”
A year ago, Apple announced that “bug fixes will no longer be delayed over guideline violations except for those related to legal issues,” but in practice it seems like nothing has changed.
The thing that really gets me about this story is that Hopscotch is a great, well-established, award-winning app.
I currently have a bug-fix update held up because of a profoundly incorrect rejection. The reviewer didn’t understand a feature that’s been in @bbedit since 2002. The behavior isn’t what the reviewer claimed.
With the exception of maybe Uber and Airbnb, App Review isn’t kidding when they say they treat all developers the same, as every good app in the App Store, no matter how beloved, has at least five horror stories just like this
We have more of them than we can count at the factory. Maybe we should start documenting them cause honestly I can’t keep them all straight in my head.
Update (2021-08-27): Steve Troughton-Smith:
I had that with a recent update. “If you’d like to avail of <the new policy>, just reply to this message”. They wanted me to add a new menu item into my Mac app in a bug fix update.
I had a recent experience that went mostly well. My app contains a frowned-upon case not in the usual test flow. The app was eventually rejected. I appealed. I had a nice call with App Review. They allowed the build to go through with a promise a future build would have a fix.
Update (2021-09-07): Jonathan Deutsch:
My spam call software also blocked an important App Store call in the past. For this reason I disabled it and now deal with the interruptions of 1-2 robo calls from 408 numbers each day as a Just In Case™.
This too is the Apple Tax.
Had a similar experience as an app developer in the past. We couldn’t fix a critical issue affecting our users for almost a month because Apple was rejecting every app submission with vague reasoning.
A couple months later, Apple released a direct competitor to our product.
Update: to Apple’s credit, they called me last night to apologize and ask for feedback. Who knows if they’ll actually implement it, but at least they are listening. What would *you* change about the App Store review process?
[…]
Measure success by the time for an app to get approved, not just the time for the developer to get a response.
People keep tweeting this thread with “Apple needs to fix this” and…. no, they don’t need to do anything. That’s what lock-in and monopolies allow for! If anyone could compete for these developers Apple would have an actual incentive to change.
Craig Hockenberry (Hacker News):
In the JavaScript framework used by Safari and other parts of Apple’s products, there is a tool called jsc. It’s a command line interface for JavaScript that uses the same code as the rest of the system. You can find the binary at /System/Library/Frameworks/JavaScriptCore.framework/Versions/Current/Helpers/jsc. That path is unwieldy, so I have an alias set up that lets me just type jsc in the Terminal. So what can you do with jsc? Pretty much anything you can do with JavaScript in a browser, with the caveat that there aren’t document and window instances.
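As an illustration, here’s a minimal script one might run with jsc (the file name and invocation shape are assumptions). Core JavaScript works as in a browser, but the DOM globals are absent, so a script has to feature-detect rather than assume:

```javascript
// sketch.js — run as `jsc sketch.js` (with an alias set up as described above)

// Browser globals like document and window don't exist under jsc:
const hasDOM = typeof document !== "undefined" && typeof window !== "undefined";

// Core language features all work normally:
const fib = (n) => (n < 2 ? n : fib(n - 1) + fib(n - 2));

const result = { hasDOM, fib10: fib(10) };
```

For output, jsc historically exposes a built-in print() helper that writes to stdout, rather than the browser’s console machinery.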
Previously:
The News Partner Program is designed for subscription news publications that provide their content to Apple News in Apple News Format (ANF). ANF enables an exceptional reading experience on Apple News and unlocks the full benefit of the platform for publishers, and empowers publishers to create brand-forward stories, immersive issues, and audio stories, with designs that scale seamlessly across Apple devices. ANF also supports advertising, and publishers keep 100 percent of the revenue from advertising they sell within Apple News. To support publishers who optimize more of their content in ANF, Apple News is offering a commission rate of 15 percent on qualifying in-app purchase subscriptions from day one.
[…]
Participants must maintain a robust Apple News channel in Australia, Canada, the United States, and the United Kingdom, and publish all content to that channel in ANF.
[…]
The primary function of a publisher app must be to deliver original, professionally authored news content.
Apple already offers all developers the chance to keep 85% of revenue on subscriptions that last more than one year. Developer proceeds are also set at 85% for apps that qualify for the Small Business Program, which covers developers taking in less than $1 million in annual revenue.
[…]
At a high level, the News Partner Program is similar to the terms for the Video Partner Program, which was established last year. However, in the latter case, premium video apps are allowed to use their own existing payment methods on file. For news, it seems Apple is still requiring use of In-App Purchase unilaterally.
Bribing developers to prop up a failing arm of Apple’s services division after decimating web ad revenue so they have no other choice
Previously:
Why would Apple ask for the password or passcode for one of your other devices? Could it be some sort of scam? What exactly is going on here?
[…]
Apple has chosen to protect some data that it views as highly secure or very private with end-to-end encryption that prevents Apple from knowing anything about the contents of the synced data. Apple doesn’t possess any of the keys required to decrypt this data passing through its servers. Instead, those keys reside only on individual iPhones, iPads, and Macs.
[…]
For iCloud Keychain and similar sensitive data, Apple has your devices generate and maintain a set of public and private keys that enable interaction with the information synced across iCloud. The devices never reveal their private keys and have the public keys of all the other devices connected to an iCloud account.
[…]
The hard part isn’t syncing data privately. Rather, it comes when you want to add a new device to this set.
[…]
On at least one of the devices in the iCloud sync set, Apple adds an encrypted version of that device’s passcode or password to the set of shared information.
[…]
Apple syncs this information to iCloud, and the setup process on the new device then pulls it down, prompting you to enter the passcode or password.
This seems reasonable, although I guess it creates a slight risk in that now your device’s password has been stored in the cloud. It’s encrypted, but someone with access to the cloud could apply a lot of computing power over a long period of time in order to brute-force it. This would make it possible to break into your device during only a brief window of physical access.
Apple made it more confusing by not documenting the procedure anywhere on its site. So if you Google or search to make sure it’s safe and not phishing, you cannot find any additional information about it!
He wrote that in 2019. I was not able to find this described in the 2021 Apple Platform Security document, though it’s possible I just didn’t know where to look. I also don’t know if it has a name. Apple does describe a similar “syncing circle” system for iCloud Keychain, but that seems to be different. (And the system Fleishman describes works even if you are not using iCloud Keychain.)
Update (2021-09-08): alanzeino:
I’ve always wondered why this works like this, especially since it randomly sometimes requires every device to re-login, and the last device that logs in supplies the password used to encrypt
It does this because it dropped an old device from your circle and needs to generate a new key that the old device doesn’t know.
He thinks that what Fleishman describes is the “syncing circle” mentioned in the Apple document. In other words, iCloud Keychain is running at some level even if you haven’t chosen to store your own passwords in it.
Due to changes in local regulations, the bank account holder’s address is now required if you have bank account information in App Store Connect. Account Holders, Admins, and Finance roles can now provide a valid address in the Agreements, Tax, and Banking section. Please provide this information by October 22, 2021, in order to avoid a potential interruption of your payments.
I’ve seen reports that the site is unreliable at saving, although it worked fine for me. However, it is not obvious that in order to edit the info you need to click on the Paid Apps agreement, which just looks like regular black text with no underline or button border.
Update (2021-09-08): Jesse Squires:
wow. this took me ~30 minutes to figure out.
click on the thing that totally looks like a clickable link/button.
So many people are having problems with the new Banking Information requirements in App Store Connect.
Here is what you need to do[…]
Also, don’t have exotic characters like å, ä, and ö in your address info, you dirty you. You get this super clear validation error and have to replace the characters. Then, somehow, after the save goes through, those characters get normalized and saved.
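A guess at what the form may be doing behind the scenes (the address string is made up): a common, lossy way to “normalize” accented characters is to decompose them and drop the combining marks, which would explain both the rejection and the silent rewrite after saving:

```javascript
// Hypothetical address containing the characters the form rejects:
const addr = "Åkergränd 3";

// NFKD splits "Å" into "A" + a combining ring and "ä" into "a" + a
// combining diaeresis; stripping the combining-mark range (U+0300–U+036F)
// then leaves plain ASCII.
const asciiAddr = addr.normalize("NFKD").replace(/[\u0300-\u036f]/g, "");
// asciiAddr is now "Akergrand 3"
```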
Did you also get the reminder emails today even though you filled everything in correctly? I don’t understand why there is no deeplink from the alert on the front page directly to the banking page 🤷🏻♂️
Yes.
I got 2 emails today after I entered the address a week or two ago. So I contacted their finance team. Hopefully they’ll be as annoyed as I was when I got the emails and freaked out and spent 15 minutes on AppStoreConnect this morning.
Today, we’re announcing the all-new Club MacStories featuring two additional tiers: Club MacStories+ and Club Premier. The new plans offer extra content, a brand new, powerful web app to read Club articles on the web with advanced search and RSS features, exclusive discounts, and a new Discord community.
Club Premier is the ultimate plan that includes all of Club MacStories, Club MacStories+, and the new extended, ad-free AppStories+ podcast in a single, $12/month package.
[…]
Calliope is a fully responsive, modern web app, so for the first time you’ll be able to properly read MacStories Weekly and the Monthly Log in a web browser.
[…]
This is one of the major changes enabled by Calliope: each article exists both as part of a bigger entity – the newsletter – as well as on its own, with its own unique URL. This means each article can be linked and shared with other people on the Internet.
The original Asheville concept was for a device that pretty faithfully recreated Game & Watch-style games, which are composed of LCD segments, which Neven explains are, “…kind of like an old-school clock, where it just has shapes that it can turn on or off. It doesn’t really have pixels all over the screen. That’s not how our screen works, but we thought, well, maybe it’ll look and work like that. And we played with that idea for awhile.”
[…]
“Everything we do is to create kind of like an alternative to, to the, what did I say? Touch-screen psychosis. To us, it’s very unsatisfying to use a touch device. I get it, and I think it’s a great interface for creating a lot of different applications without, you know, having hundreds of buttons and knobs and stuff. So it’s very effective for like a smartphone, but for a gaming device… A gaming device to me is almost the same as a musical instrument. It’s about zero latency, muscle memory, and you need to feel that you are in instant control of everything that happens.[…]”
[…]
“The components are packed in quite tightly. And so even for the screw, we needed the head of the screw to be a thinner dimension than what we could find off the shelf, we needed it not to be quite as long because then it would go and hit the LCD, right? So every single component is being made on a custom basis,” says Steven N.
[…]
“All of the quirks of Playdate, I think, helped tremendously in attracting developers to want to make something for Playdate,” says Cabel. “If Playdate had a full-color OLED screen and a powerful 3D chip, it would take a very long time for one person to say, ‘Yeah, I’ll make a game for that.’ By going in the opposite direction of where gaming has gone lately, we return to a scale in which one person, two people, three people can make an awesome, entertaining, you know, lengthy, meaningful title, and the constraints enable that.”
Previously:
Update (2021-09-07): John Carmack:
I have often thought that the presence of limits in the parameters of a design may, counterintuitively, result in better designs than the absence of limits. My working theory is that when you get to design whatever you want, you are “satisficing”, while limited resources force you to critically evaluate aspects and compete them against each other. Competition brings improvement, but many people shy away from competition if they aren’t forced into it. Parameters can be memory, speed, time, funding, or other factors.
Richard Blumenthal (Hacker News):
New bipartisan antitrust legislation that targets Apple’s App Store and Google’s Play Store was today introduced by U.S. Senators Richard Blumenthal, Marsha Blackburn, and Amy Klobuchar.
The Open App Markets Act [PDF] is meant to create “fair, clear, and enforceable rules” that will protect competition and strengthen consumer protections. According to the three senators, Apple and Google have “gatekeeper control” of the two main mobile operating systems and their app stores, allowing them to dictate the terms of the app market.
Under the terms of the bill, which applies to companies that own or control an App Store with more than 50,000,000 users, Apple would not be able to require developers to use its own in-app purchase system, and it would be required to allow developers to distribute apps through alternative app stores.
The Chamber of Progress’ website lists 20 “corporate partners,” with Apple and Google being the most relevant ones in this case. Amazon, Facebook, and Twitter are also funders.
[…]
But the group’s lobbying against the new app-store legislation neatly matches the positions of Apple and Google, which have been fighting attempts to make their mobile operating systems more open. Apple issued a statement yesterday that conveyed the same basic message in a less combative way. “At Apple, our focus is on maintaining an App Store where people can have confidence that every app must meet our rigorous guidelines and their privacy and security is protected,” the company said, according to CNBC.
Apple’s aggressive lobbying efforts in Georgia, the extent of which were previously unreported, highlight a pattern that has played out with little national attention across the country this year: State lawmakers introduce bills that would force Apple and its fellow tech giant Google to give up some control over their mobile phone app stores. Then Apple, in particular, exerts intense pressure on lawmakers with promises of economic investment or threats to pull its money, and the legislation stalls.
[…]
When Georgia legislators introduced a pair of app store bills in early February, Apple immediately hired five new lobbyists to advocate against the legislation in the state. And during the frenzied debate following the bill’s introduction, Apple lobbyists told legislators that the company could pull out of two important economic development projects in Georgia — a $25 million investment in a historically Black college in Atlanta and a potential multibillion-dollar partnership with Kia to build autonomous vehicles in the city of West Point — if the legislation went through, according to two people familiar with the conversations.
[…]
Apple denied making statements about pulling back on investments. The threats in Georgia came from third-party Apple lobbyists, said one of the people familiar with the conversations.
Previously:
mthoms (via Kosta Eleftheriou):
Most people don’t know this but the “Parent” (read:credit card holder) of a family sharing account can’t see, let alone cancel, any subscriptions a minor on the account has purchased.
Let that sink in for a second. Apple will happily charge the parents’ credit card a weekly recurring fee but there is nowhere in their interface (on device nor on the web) where the parent can even see that subscription.
Apple expects the child to go into their interface and cancel the recurring subscription. Something many (most?) adults find confusing. In my case, the child just deleted the App when the trial was over. Which is of course perfectly logical thinking. No bueno.
Apple says this is for privacy reasons, which doesn’t make much sense since they do e-mail the parent a receipt. So this seems like either a poorly thought out feature or a dark pattern.
That truly is egregious that they’re hiding that information, and you have to notice it via your bank statement. At the very least the primary account holder should be able to see that an unspecified member of their family account has a subscription to an unspecified service, and have the ability to summarily cancel it.
The statement doesn’t even tell you which child is the subscriber.
Previously:
Update (2021-10-08): Boris Yurkevich:
This was true; however, I recently got an email about an IAP I made from my son’s iPad. Clicking through the “Report a problem” link in the email and logging in with my family organiser Apple ID shows this new message.
Google (via Hacker News):
In light of spam causing potential issues with SMS forwarding, text message forwarding to linked numbers will stop on or after August 1.
If you’re using a Google Voice number for 2FA, presumably you can still get the text messages in the app or on the Web.
It’s with a heavy heart today that we’re announcing the discontinuation of our award-winning iPhone keyboard for blind users.
Apple has thrown us obstacle after obstacle for years while we try to provide an app to improve people’s lives, and we can no longer endure their abuse.
[…]
Last week, we submitted an update that fixes various iOS 15-related issues & improves the app for VoiceOver users. No new features, just improvements. No changes to our App Store page either.
But Apple rejected it. They incorrectly argue again that our keyboard extension doesn’t work without “full access”, something they rejected us for THREE years ago. Back then we successfully appealed and overturned their decision, and this hadn’t been a problem since. Until now.
We tried reaching out to Apple a total of 9 times last week, with no success. At this point they seem to be ignoring our attempts to contact them directly, despite previously explicitly telling us to “feel free” to contact them if we need “further clarification”.
Our rejection history already spans more than FORTY pages filled with repeated, unwarranted, & unreasonable rejections that serve to frustrate & delay rather than benefit end-users.
Eleftheriou previously levied a lawsuit against Apple in March over Apple’s failure to get rid of copycat apps, and he today highlighted Apple’s “terrible” third-party keyboard APIs as another reason for the App’s discontinuation. Apple’s keyboard APIs have reportedly been “buggy, inconsistent, ever-changing, and broken” since 2014.
Clearly, he has a varying relationship with Apple that has wavered across the spectrum, from at one point being in talks with the company to get acquired[…]
Suing Apple may seem over-the-top on a casual glance but… who else is he going to sue? The makers of those scam apps undoubtedly operate through phony accounts from countries that couldn’t care less about scamming people out of money. Apple is the only one that can do anything about it. And, oh, it happens to run the store it tells everyone is so safe and great.
Previously:
Jeff Johnson (tweet, 2):
I’ve heard from several other people who started noticing the same issue yesterday too, one of whom helpfully referred me to this reddit thread with even more reports. On investigation, I found that the nsurlsessiond process was connecting to the server valid.apple.com, and immediately afterward trustd CPU jumped from 0% to 100%. It seems that the issue can be temporarily solved by preventing nsurlsessiond from connecting to valid.apple.com. You may have to reboot or force quit trustd to get its CPU usage back to normal. It’s important to note that this is only a temporary workaround to the CPU usage problem; trustd is an important macOS system process that checks certificate validity and revocation status, so you probably don’t want to block valid.apple.com forever.
I’ve been seeing long periods of high CPU use from trustd since Catalina.
My postmortem theory is that at some point Apple had some bad data in their Certificate Revocation List online, everyone downloaded the bad data, and then trustd got stuck. Anyway, it looks like blocking valid.apple.com is no longer necessary after removing and regenerating the valid.sqlite3 database.
To delete the database, you need to reboot into macOS Recovery and then find the SQLite file at /private/var/protected/trustd/ on the main boot drive, not on the recovery volume.
The problem is also seen on iOS, though it’s not clear how to fix it unless your device is jailbroken.
Previously:
Every time I do a screen recording, I have to go through the hassle of converting the video to MP4 on my Apple computer.
Even my iPhone can export to MP4, so there is no reason for it not to work on my computer.
In my experience, at least, recent versions of QuickTime Player name the file with .mov but use the H.264 format, so you can just rename the file with .mp4 without having to re-encode it. But the .mov default creates all manner of problems for people who don’t know this, e.g. when using apps and Web sites that require MP4 files and only know how to look at the filename.
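Since the video data is already H.264 in both cases, the “conversion” amounts to giving the same bytes a new name. A minimal sketch, using a stand-in file rather than a real capture (apps that inspect the container itself rather than the extension would instead need a lossless remux, e.g. ffmpeg’s -c copy):

```shell
# Stand-in bytes for a QuickTime screen recording (a real capture
# would be H.264 video in a .mov container).
printf 'h264-stand-in' > recording.mov

# "Convert" to MP4 by copying under the new extension -- no re-encoding.
cp recording.mov recording.mp4

# The two files are byte-for-byte identical.
cmp -s recording.mov recording.mp4 && echo identical
```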
Apple has dropped its lawsuit against Corellium, which was scheduled to go to trial and was being closely watched by a security industry that feared it would have a chilling effect on research tools used to make software safer.
Citing court documents, The Washington Post reports that Apple and Corellium have agreed on a confidential settlement to bring the lawsuit to an end. Despite Apple’s grievances with Corellium, however, the settlement does not include Corellium suspending the sale and distribution of tools used by security researchers.
Right after Corellium defended Apple’s photo scanning 🤔
Mr. Federighi pointed to another benefit of placing the matching process on the phone directly. “Because it’s on the [phone], security researchers are constantly able to introspect what’s happening in Apple’s [phone] software,” he said. “So if any changes were made that were to expand the scope of this in some way — in a way that we had committed to not doing — there’s verifiability, they can spot that that’s happening.”
Patrick Howell O’Neill (tweet, MacRumors):
On Monday, Corellium announced a $15,000 grant for a program it is specifically promoting as a way to look at iPhones under a microscope and hold Apple accountable. On Tuesday, Apple filed an appeal continuing the lawsuit.
In an interview with MIT Technology Review, Corellium’s chief operating officer, Matt Tait, said that Federighi’s comments do not match reality.
Joseph Menn (via MacRumors):
The appeal came as a surprise because Apple had just settled other claims with Corellium relating to the Digital Millennium Copyright Act, avoiding a trial.
Experts said they were also surprised that Apple revived a fight against a major research tool provider just after arguing that researchers would provide a check on its controversial plan to scan customer devices.
Previously:
Mike Isaac (via Hacker News):
Reddit, the virtual town square of the consumer internet, has raised a fresh $410 million in funding, valuing it at more than $10 billion, the company said on Thursday.
The financing, which was led by Fidelity Investments, increases Reddit’s valuation from the $6 billion it achieved six months ago, when it raised $250 million. Reddit said it expected existing investors to participate in the latest financing as well, so the round is likely to grow and close out at around $700 million.
[…]
Reddit surpassed $100 million in quarterly revenue for the first time in the second quarter this year, up 192 percent from the same period in 2020.
More than 50 million people now visit Reddit daily, and the site has more than 100,000 active subreddits.
From a visitor perspective, this seems undervalued. This is a 50% or more discount to the market-cap/unique-visitor ratio of Twitter, Snap, and Facebook. And all the stats I’ve seen indicate much higher user engagement and time on site for Reddit compared to the other social platforms.
The biggest issue is of course that their monetization is horrible. Like 95% lower per user than the other socials.
Previously:
There was never any question what Epic Games wanted when it took Apple to court: the 48-second “Nineteen Eighty-Fortnite” made it clear App Store hypocrisy was the agenda. But the justification for a parallel case against Google wasn’t as clear-cut until today — it’s only now we’re learning about the most damning accusations against the Android giant.
On Thursday, Judge James Donato unsealed a fully unredacted version of Epic’s original complaint against Google (via Leah Nylen), and it alleges the company was so worried about Epic setting a precedent by abandoning the Play Store that it unleashed a broad effort to keep developers from following the company’s lead. That included straight-up paying top game developers, including Activision Blizzard to stick around, and sharing additional chunks of its revenue with phone makers if they agreed not to preinstall any other app stores.
[…]
And that’s on top of the dealings Google had with Epic directly in July 2018, when Alphabet’s CFO and other senior Google executives reportedly offered up to $208 million in “special benefits” over three years to bring Fortnite to Google Play — in what would effectively be Google taking 25 percent of the game’s revenue instead of the standard 30 percent. Google allegedly tried to convince Epic to take the deal by pointing out the “frankly abysmal” 15+ step process gamers would have to endure to sideload Fortnite on Android.
Nick Statt (tweet):
Google executives discussed approaching Chinese gaming giant Tencent about purchasing shares in Epic Games or potentially orchestrating a hostile takeover of the company, according to court documents in the ongoing Epic v. Google antitrust case that are no longer redacted as per a court order issued on Wednesday.
[…]
Other unredacted sections of Epic’s complaint reveal more new details, including those from a meeting between Apple and Google that took place in 2018 to discuss increasing search revenue growth; Google pays Apple large sums of money to make Google Search the default search engine on the mobile Safari for the iPhone. Following the meeting, an Apple representative suggested to a Google senior member that the two companies team up and “work as if we are one company” to combat efforts like Epic’s to undermine mobile app store commission rates and restrictions against alternative app stores.
[…]
Other now-unsealed portions of the complaint deal with new information surrounding Google’s so-called “Project Hug,” an initiative designed to sway top Android app makers and game developers not to leave the Play Store using kickbacks, commission reductions and other financial incentives.
The unredacted details, highlighted in a separate redlined filing [PDF] and incorporated into an amended complaint filed on Friday [PDF], suggest Google has gone to great lengths to discourage competing app stores and to keep developers from making waves.
[…]
These agreements allegedly included the Premiere Device Program, launched in 2019, to give OEMs financial incentives like 4 per cent, or more, of Google Search revenues and 3-6 per cent of Google Play spending on their devices in return for ensuring Google exclusivity and the lack of apps with APK install rights.
[…]
The unidentified senior Google executive is quoted as acknowledging that such a move would be “difficult move in the face of the EC [European Commission] decision but we have good privacy/security arguments about why sideloading is dangerous to the user).”
When Epic refused the deal, the complaint contends that Google did indeed try to generate fear of sideloading by releasing data about insecure sideloaded Android apps during Fortnite’s Android debut.
In October 2010, Jobs declared at a corporate strategy presentation that a key company aim would be to use the cloud to “tie all of our products together, so we further lock customers into our ecosystem.”
In 2013, Apple’s senior vice president of software and services, Eddy Cue, lauded the potential of bundling iTunes gift cards with new Apple devices instead of putting them on sale to lock customers further into the company’s ecosystem and dissuade them from switching to a different brand. He also raged at the Apple Retail team for its disinterest in selling iTunes Store gift cards.
[…]
In 2016, Apple’s Elizabeth Lee said that “Although they may be our best and the brightest apps, Matt feels extremely strong about not featuring our competitors on the App Store,” when asked why the company does not want to highlight apps from Google and Amazon. The email thread suggested that this was standard App Store practice, with some competing apps being seen “through a slightly different lens than most.”
[…]
Apple’s Tom Reyburn seemingly admitted that “LinkedIn has been rejected for using the same language on their subscription call to action button that Apple uses in our own apps.”
[…]
Apple also realized that it had erroneously allowed two separate games that featured school shootings on the App Store, seven months after they were approved. Discussions put the error down to the fact that “it took a total of 32 seconds to review both apps.”
Previously:
Update (2021-09-08): See also: Hacker News.
White claims that “most Android devices ship with two or more app stores preloaded”, and cites Samsung’s Galaxy Store as an example. But Epic’s lawyers claim, beginning on page 45, that Google pressured Samsung into only allowing the Google Play and Samsung Galaxy Store on its phones, thereby scuttling a distribution deal with Epic for its own app store. This was a shift away from what the suit describes as Google’s intent since 2011 to altogether prevent Samsung from running its own app marketplace.
The results of Google’s deals with Samsung allegedly became part of “Project Agave”, thus forming the basis for the suit filed by the attorneys general this year.
To our partners: on Aug 1st we are removing all revenue share on your first $1M. That means you will keep 100% of your first $1M when you sell on @Shopify’s app store.
The best part? At the end of every year, the numbers reset. Every single year, your 1st million is all yours.
After the antitrust bills and the Epic case:
- Microsoft has reworked Windows store terms to take 0
- Facebook announced 0 percent on creator payments
- Twitter announced lower rate for Super Follows
- Shopify lowering rates
Kind of amazing to see
Abner Li (via James O’Leary):
In exchange for building apps on all those Google platforms and integrating with specific features/APIs, these media companies see Google’s cut on user purchases drop from 30% to 15%. This is independent of the upcoming change where Google is reducing its commission to 15% on the first $1 million in revenue for all – but primarily to the benefit of small/medium – Android developers. Apps in the Play Media Experience Program have to continue using the Play Store’s in-app billing system.
Honestly this $1M business feels like it’s almost turning into a subsidy for cross platform development; you want to do exactly $1M in revenue (taking care not to go even $1 over that on iOS) on as many platforms as possible rather than focusing on just one.
Seriously, if you’re an iOS developer with a successful app, Apple is effectively threatening you with financial penalties if you reinvest your profits in growing that business instead of investing them in an Android port.
So, a “small” app maker is planning to go out of business. He has plenty of apps that others could adopt and continue to develop. However, any established small dev would be punished for doing that because (s)he’d lose the 15% Apple commission and go back to 30% :(
So, any product that is not highly profitable will thereby die out, even if it were to be offered for free, just because of how Apple treats us small businesses. So wrong.
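To make the cliff these tweets describe concrete, here’s a toy model. This is my own simplification of the quoted claims, not Apple’s actual program rules, which handle crossing the threshold mid-year more gradually:

```python
# Toy model of the commission cliff described above: a developer in the
# small-business program keeps 85% of revenue; one who no longer
# qualifies keeps only 70% of *all* revenue.

def proceeds(revenue, in_program):
    """Developer proceeds after Apple's commission (simplified model)."""
    commission = 0.15 if in_program else 0.30
    return revenue * (1 - commission)

under = proceeds(1_000_000, True)    # ~$850,000 kept right at the threshold
over = proceeds(1_000_001, False)    # ~$700,000 kept: $1 more revenue,
                                     # roughly $150,000 less in proceeds
print(f"${under:,.0f} vs ${over:,.0f}")
```

Under this simplified reading, earning one extra dollar costs the developer about $150,000, which is why the quoted tweets treat the threshold as a penalty on growth.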
Sean Hollister (via Damien Petrilli):
If Apple pursued its original idea, the App Store Small Business Program might have only given developers credit to spend on Apple’s own App Store search ads — not 15 percent back in their pockets.
Plus there are no search ads in the Mac App Store.
Previously:
Jeffrey Knockel and Lotus Ruan (tweet, Hacker News):
We analyzed Apple’s filtering of product engravings in six regions, discovering 1,105 keyword filtering rules used to moderate their content.
Across all six regions we analyzed, we found that Apple’s content moderation practices pertaining to derogatory, racist, or sexual content are inconsistently applied and that Apple’s public-facing documents failed to explain how it derives their keyword lists.
Within mainland China, we found that Apple censors political content including broad references to Chinese leadership and China’s political system, names of dissidents and independent news organizations, and general terms relating to religions, democracy, and human rights.
We found that part of Apple’s mainland China political censorship bleeds into both Hong Kong and Taiwan. Much of this censorship exceeds Apple’s legal obligations in Hong Kong, and we are aware of no legal justification for the political censorship of content in Taiwan.
We present evidence that Apple does not fully understand what content they censor and that, rather than each censored keyword being born of careful consideration, many seem to have been thoughtlessly reappropriated from other sources.
Apple, which says it will refuse government demands to expand its on-device image scanning, currently blocks people from getting the phrase “Human Rights” or “Freedom of the Press” engraved on their iPhone because China doesn’t like it
Previously:
Joseph Cox et al. (Slashdot, Hacker News, Reddit):
On Wednesday, GitHub user AsuharietYgvar published details of what they claim is an implementation of NeuralHash, a hashing technology in the anti-CSAM system announced by Apple at the beginning of August. Hours later, someone else claimed to have been able to create a collision, meaning he tricked the system into giving two different images the same hash.
In a statement to Motherboard, Apple said that the version of the NeuralHash that Yvgar reverse-engineered is not the same as the final implementation that will be used with the CSAM system.
[…]
Matthew Green, who teaches cryptography at Johns Hopkins University and who has been a vocal critic of Apple’s CSAM system, told Motherboard that if collisions “exist for this function,” then he expects “they’ll exist in the system Apple eventually activates.”
“Of course, it’s possible that they will re-spin the hash function before they deploy,” he said. “But as a proof of concept, this is definitely valid,” he said of the information shared on GitHub.
“Early tests show that it can tolerate image resizing and compression, but not cropping or rotations.”
Like every other perceptual image hash. It’ll also have collisions. Keep in mind that the matching is fuzzy (you have to allow some wrong bits).
It’s not hard at all to attack such a hash to make it produce false positives.
Say I am law enforcement and I want access to your photos. I send you >30 messages with non-CSAM but colliding images. Your phone now thinks you have CSAM and grants Apple access to your data.
Then I just have to subpoena Apple for the data they already have, and I have your photos.
Meanwhile the people who actually have CSAM just have to add a frame to their images to completely neuter the system.
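The fuzzy matching described above — comparing perceptual hashes within a tolerance of “wrong bits” — can be sketched with a toy average hash. This is purely my own illustration: NeuralHash itself is a neural network, and these pixel lists are invented.

```python
# Toy perceptual "average hash": threshold each pixel against the image
# mean, then compare hashes by Hamming distance. Small edits (compression,
# resizing) barely move the bits; cropping shifts content and breaks the match.

def average_hash(pixels):
    """Hash a flat list of grayscale values into a list of bits."""
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def matches(a, b, max_wrong_bits=2):
    # Fuzzy match: tolerate a few differing bits, as the quote notes.
    return hamming(average_hash(a), average_hash(b)) <= max_wrong_bits

original = [10, 200, 30, 220, 15, 210, 25, 230]
compressed = [12, 198, 33, 219, 14, 212, 27, 228]  # slight pixel noise
cropped = [220, 15, 210, 25, 230, 0, 0, 0]         # content shifted

print(matches(original, compressed))  # True: hashes within tolerance
print(matches(original, cropped))     # False: too many bits differ
```

The same tolerance that makes the hash robust to compression is what gives an attacker room to construct a different-looking image whose hash lands within the threshold.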
A lot rests on how much we can trust Apple’s human reviewers.
Also, apparently Apple’s neural network, by virtue of having 200+ (!) layers and due to floating point rounding issues, actually produces wildly different hashes on different hardware (9 bits difference between iPad and M1 Mac!). That’s... garbage. That’s 9 bits of match noise.
[…]
Actually, how does this even work at all? You have to do fuzzy matching of perceptual image hashes like NeuralHash. But they’re doing some PSI crypto stuff after that that would seem to be incompatible with it, and at no point do they talk about this.
This is not a thing. This cannot mathematically be a thing. There is no way to design a perceptual image hash to always result in the same hash when the image is altered in small ways. This is trivial to prove.
This was a bad idea from the start, and Apple never seemed to consider the adversarial context of the system as a whole, and not just the cryptography.
In a call with reporters regarding the new findings, Apple said its CSAM-scanning system had been built with collisions in mind, given the known limitations of perceptual hashing algorithms. In particular, the company emphasized a secondary server-side hashing algorithm, separate from NeuralHash, the specifics of which are not public. If an image that produced a NeuralHash collision were flagged by the system, it would be checked against the secondary system and identified as an error before reaching human moderators.
[…]
But actually generating that alert would require access to the NCMEC hash database, generating more than 30 colliding images, and then smuggling all of them onto the target’s phone.
Previously:
Update (2021-08-21): See also: Hacker News.
I’m not convinced that this secondary system was originally part of the design, since it wasn’t discussed in the original specification.
The Apple system dedupes photos, but burst shots are semantically different photos with the same subject - and an unlucky match on a burst shot could lead to multiple match events on the back end if the system isn’t implemented to defend against that.
We wrote the only peer-reviewed publication on how to build a system like Apple’s — and we concluded the technology was dangerous. We’re not concerned because we misunderstand how Apple’s system works. The problem is, we understand exactly how it works.
Brad Dwyer (via Hacker News):
In order to test things, I decided to search the publicly available ImageNet dataset for collisions between semantically different images.
[…]
There were 2 examples of actual collisions between semantically different images in the ImageNet dataset.
Update (2021-09-08): thishashcollisionisnotporn.com (via Hacker News):
Given that it’s possible to generate a false positive, it is also possible to deliberately create images that match a given hash. So, for example, someone who wants to get another person in trouble can send them innocent-looking images (like images of kittens) and manipulate those images to match a hash of known CSAM.
This site is a proof of concept for collision attacks. The images of the kittens are manipulated to match the hash of the image of the dog (59a34eabe31910abfb06f308). As a result, all images shown on this page share the same hash. When these images are both hashed with the Apple NeuralHash algorithm, they return the same hash.
Over the years the system’s support for Touch Bar changed in ways that made Touché slowly become less reliable and, finally, to more or less not work on any Macs at all.
I recently had some insights about how the Touch Bar support on the system has changed, and was able to put together an update to Touché that restores functionality.
Previously:
Juli Clover (tweet):
Throughout the beta testing period, Apple has been tweaking the design of the Safari browser on the iPhone and in beta 6, there are further refinements. The bottom tab bar has been redesigned to appear below page content, and Apple has also added a toggle to show the address bar at the top of the iPhone rather than the bottom.
[…]
With the bottom view option toggled on, Safari offers a dedicated toolbar with buttons at the bottom of the interface, which is also an improvement over the prior floating design.
Apple has also introduced new setting options to remove the website tinting and to enable a Tab Bar while in landscape mode. There was previously a “Show Color in Tab Bar” accessibility setting, which appears to be the same as the new “Allow Website Tinting” toggle.
It takes 5 seconds to see that this new Safari design is much better. Instantly clicked with me.
Bringing back a toolbar allows easier access to controls. More views are using the half-sheet style. Putting the URL bar at the top reverts to the old Safari. All of this is great.
Wow. That…looks like crap.
Either do a new feature or don’t. I feel like this option to leave everything the way it was is more confusing.
I’ve been trying to tell y’all that fast tab switching is clearly a design requirement for iOS 15 Safari!
I like this design – looks at home and works well.
Very strong “fine, whatever” vibes from the new ‘below’ mode. And the new ‘option’ is basically “I’m sorry, I’m sorry, I’m trying to remove it”
I think the way to read this is ‘intermediate step to where we want to go eventually and we’ll see you in iOS 16’. My bet is that the ‘url bar at top’ option stays for years though.
Hopefully the one lesson learned that sticks though is not to bury high traffic actions under additional layers for very little win aside from aesthetics.
These are incremental changes to a big redesign, and I think they create the most successful iteration yet. Bringing the toolbar to the bottom is undeniably a muscle memory breaker, but I think it is worth the cost because it keeps a user’s hands in the same position more often. You can go from scrolling through a webpage to entering a URL without once shuffling the device in your hands.
This new version still has some rough edges. The huge drop shadow around the address bar is a nonstandard effect that confuses me. I guess it is supposed to indicate that the element is floating and interactive, but it creates a kind of blurry grey mess. The drop shadow also visually disconnects the address bar from the page or tab it represents.
You can tell this is the mostly final Safari design for iOS because the animations are all polished AF.
Safari Technology Preview release 130 includes bug fixes and performance improvements for Web Inspector, CSS, JavaScript, Media, Web API, and IndexedDB.
Previously:
Update (2021-08-21): John Gruber:
In a very real sense, the system worked. It’s good that Apple tried something ambitious and original with the layout for Safari on iPhone. The reason for the trend toward moving more navigation controls to the bottom of the screen is obvious: our phones are bigger than ever (iPhone 12 Mini aside), and our hands aren’t growing. It’s also good that Apple was receptive to the feedback from those using the developer and public betas. They listened, they fixed the design to address the problems, and here we are, with a layout for Mobile Safari that I think is better than ever. (I hedge with “I think” only because it just shipped — my opinions aren’t fully formed.)
The unusual part is that we got to see Apple’s design process play out in public.
Update (2021-09-08): Jason Snell:
The design of Safari 15 on the iPhone has gone to a better place, but Stephen Hackett reminds us that trouble on the Mac and iPad remain[…]
sudo xcodebuild -runFirstLaunch

This works fine locally, but when updating remote CI machines, entering the password can be troublesome. Furthermore, if you want to support having CI machines automatically switch between Xcode versions when testing upcoming changes, you may not have the opportunity to be prompted at all. Lucky for us, the sudoers file format, which configures the sudo command, allows us to skip password entry for specific commands with a bit of configuration.[…]

- We specify just the xcode-select binary, using the absolute path. This allows all subcommands handled by xcode-select to be run without a password.
- The xcodebuild command also contains the one subcommand we want to be able to run without a password. Limiting this is important because otherwise you could run sudo xcodebuild build without a password, which could execute malicious run scripts or do other terrible things.
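A sudoers rule along these lines would implement the approach described above (the group name and drop-in path are illustrative, not from the post; the exact rules in the original may differ):

```
# /etc/sudoers.d/xcode — allow password-less sudo for these exact commands only
%admin ALL = NOPASSWD: /usr/bin/xcode-select, /usr/bin/xcodebuild -runFirstLaunch
```

Because a syntax error in sudoers can lock you out of sudo entirely, it's worth validating any such file with visudo -c -f before installing it.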
25 years ago Microsoft released Internet Explorer 3.0, its first real salvo in the “Browser Wars”. This launch taught me how a giant corporation could move at the speed of a startup.
[…]
To motivate us more, I plastered the hallways with quotes from Netscape’s founder, Marc Andreessen: “Netscape will soon reduce Windows to a poorly debugged set of device drivers.” It reminded us that this new startup threatened to destroy all of Microsoft.
[…]
The Internet Explorer team was the hardest-working team I’ve ever been on. And I’ve worked at multiple start-ups. It was a sprint, not a marathon. We ate every meal at the office. We often held foosball tournaments at 2 am, just to get the team energy back up to continue working!
Sadly, there were divorces and broken families and bad things that came out of that. But I also learned that even at a 20,000-person company, you can get a team of 100 people to work like their lives depend on it.
[…]
This wasn’t a toxic pressure cooker of working against one’s will. The leadership worked hardest of all. Most of us were in our early twenties and it was a launch point for many careers.
Every member of this team considered it a highlight of their career.
Chrome was delivered without any sprints at all. The team came in at 9 and left at 5 (figuratively, people actually kept their own ~8h schedules) every workday for a couple years like clockwork. No drama. No broken marriages, no broken families.
[…]
How did chrome-team manage to deliver high quality software without death marches?
Funny you ask... Turns out that software projects actually benefit strongly from having senior technical leadership deeply involved.
[…]
Software engineering is engineering. Like other kinds of engineering, it’s a skill you develop over a lifetime, not a decade.
When I joined chrome-team I was in my early 30s. And I was on the junior side.
Most of the core team had already worked on one or two browsers before!
Having strong technical leadership has lots of advantages, but one of them is it naturally leads to a healthier cadence. These folks typically have to be home for dinner, and they’re old enough to know that death marches don’t work.
Update (2021-08-18): See also: Hacker News.
Today, we released a few changes to the way Twitter looks on the web and on your phone. While it might feel weird at first, these updates make us more accessible, unique, and focused on you and what you’re talking about
In the history of the company we’ve relied on someone else’s typefaces, from SF Pro and Roboto to Helvetica Neue in our brand.
[…]
So, that brings us to “Chirp”, our first ever proprietary typeface.
[…]
Rounded tittles and punctuation introduce a humanist character. The result is a versatile, contemporary family (82 styles across Standard and Display!) with international sensibilities. It accomplishes exactly what we need and it has made itself the hero of our refresh.
Gael Fashingbauer Cooper (via Hacker News):
Almost immediately, users began to complain -- with many saying the new font gave them headaches. (This writer is getting them too.)
mcc:
Like seriously though how do you look at this and not see it in sPoNgEbOb tExT? It’s just bouncing a pixel up and down at random with every letter. I recognize some of this may be some kind of bad interaction with the Android font renderer but: I use Android!
After spending some time in the official Twitter app today, I think I like Chirp in use. It reminds me of Franklin Gothic — a good version — and, at the weight and size I have set it to, engenders a feeling of precision and clarity that Twitter frankly does not deserve.
Update (2021-08-18): Jeffrey Jose (via Ashley Bischoff):
Twitter’s new font “Chirp” appears to be white-labeled GT America.
Update (2021-08-21): Jeff Johnson:
OMG I just discovered that “Reduce motion” in Accessibility System Preferences disables the Twitter Chirp font.
This also works on iOS!
Say hello to iOS Dev Jobs version 2.0! 🚀 It’s entirely new, and it now has native apps! 🎉
[…]
Whether a position is remote or on-site is by far the most important factor affecting whether you’ll consider it, so you can now filter by that. Then, you can set preferences on the time zones you’re available for remote work and what countries/states are convenient for on-site work. The complete set of filters looks like this. You’ll only receive an email when jobs match where you’re able to work.
[…]
For companies, pricing for listing your job opportunities remains the same. You can post standard job listings for free and featured job listings for a reasonable fee.
Apple’s desire to move to a single jack that could do double-duty for power and communications was the beginning of the end for MagSafe. USB-C offers those capabilities with a generally well-designed connector that is both slim and bidirectional. The only thing USB-C is not is magnetic.
[…]
Magnetic charging nubbins, which are readily available on Amazon from a variety of random Chinese manufacturers, have two parts. A tiny USB-C nubbin sticks out slightly from the side of the laptop, and an L-shaped magnetic connector connects to your existing USB-C charging cable on one side and grabs onto the nubbin with the other.
[…]
In all honesty, the user experience with the magnetic nubbin isn’t as good as Apple’s MagSafe. Either the magnets aren’t quite as strong, or the “outie” design of the magnetic nubbin means that it’s more readily subjected to shearing forces that break the connection. The old MagSafe ports were “innies,” which made their connections a bit more secure. The other problem is that the standard Apple USB-C charging cable is thicker and less flexible than the old MagSafe charging cable. That makes it a little harder to connect successfully since the magnetic connector has to align perfectly with the nubbin, and it’s more likely to be disconnected by movement.
But, overall, he recommends them.
Apple has released a new macOS Big Sur 11.5.2 update, delivering unspecified bug fixes for Mac users running the latest major operating system version. The update comes a little over two weeks after Apple released macOS 11.5.1.
See also: Mr. Macintosh, Howard Oakley.
Previously:
Update (2021-08-13): Mr. Macintosh:
Apple quickly responded.
No further details on the Big Sur 11.5.2 update will be released.
See also: Sami Fathi.
Update (2021-08-18): Howard Oakley:
So far, I have been comparing hashes between 11.5.1 and 11.5.2 across a limited range of directories, but here are the changes in 11.5.2 that this has already revealed[…]
[…]
I think the evidence points to Apple having made significant security fixes in 11.5.2, as well as fixing bugs across important frameworks.
Dave Teare (tweet, MacRumors, Reddit):
Categories now sit atop your item list as a simple dropdown filter, giving the sidebar plenty of room to show all your vaults and their accounts.
You’ll also notice an indicator next to each shared vault, making it easier to see which vaults are private and which are shared. No guesswork. And items show who they’re being shared with.
Throughout the app you’re in more control, with more contextual information available at all times. Try dragging-and-dropping an item from a personal vault to a shared vault. When you do, 1Password will show you who will gain access to the item so there’s no doubt about what’s happening.
[…]
I personally use Collections to hide family vaults that I only need access to in case of emergency and don’t want to see every day. It’s also great for hiding production work accounts until I explicitly require them.
[…]
[The] next generation of 1Password gives you more power to recover data, starting with item drafts, the ability to restore recently deleted items, as well as being able to revert to previous versions of an item.
Dave Teare (tweet):
What makes this [Linux] release even more amazing is it was created from scratch and developed using new languages and techniques most of our team never used before.
[…]
The backend is written in Rust, a true systems programming language known for its safety and performance. Rust compiles directly to native code and avoids the overhead associated with runtimes or garbage collection.
On the frontend side of things we used web technologies to allow us to create an entirely new design language for 1Password.
The new Mac app uses Electron, too, as you can immediately see from how the fonts and controls look. A two-person team can write a native AppKit app, but a team of 473 starting with a mature AppKit codebase has other priorities.
bgentry (also: Miguel de Icaza):
The most disturbing part about this is that their support team has been misleading people on Twitter all morning, not truthfully answering straightforward questions about whether the app is Electron
The language and compilation status of the backend are not relevant to whether the frontend is native.
The blog post screenshot had me all “yay, looks like it matches the new sidebar style in macOS, wonder if it is Catalyst or SwiftUI?”, then I opened the preference “window” … which is an Electron-style modal inside the main window.
Ben:
To minimize file size and maximize performance, we’re offering separate Apple silicon and Intel builds.
A hallmark of Electron.
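One quick, unofficial way to spot an Electron app on the Mac is to look inside its bundle: Electron apps ship an “Electron Framework.framework” under Contents/Frameworks. A minimal sketch (the helper name and the example path are illustrative):

```shell
#!/bin/sh
# Heuristic: Electron-based .app bundles contain the Electron framework
# in Contents/Frameworks.
is_electron_app() {
  [ -d "$1/Contents/Frameworks/Electron Framework.framework" ]
}

# Example usage (path is illustrative; adjust to an app you have installed):
if is_electron_app "/Applications/1Password.app"; then
  echo "Electron-based"
else
  echo "not Electron (or not installed)"
fi
```

This is only a heuristic — an app could bundle the framework under another name — but it works for the common case.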
Of course, you can see why a company would want a cross-platform solution to reduce the number of codebases that need to be developed and kept in sync. It’s interesting that, even though there’s already an iOS version, they decided not to go with Catalyst. As to Apple’s other cross-platform technology…
We have a large Apple dev team and had a parallel SwiftUI codebase being developed for about 6 months. It had some advantages but overall it underperformed on macOS and the UX was worse.
What’s more concerning is the shortcut change. ⌘\ is deep in my muscle memory, but more importantly it’s in my (less tech savvy) family’s muscle memory. I strongly urge you to consider retaining the default shortcut bindings in the final release.
1Password 8 has a new Quick Access feature that’s activated by ⌘⇧Space and supports Go & Fill.
Even though memberships won by a long shot, our existing apps already supported both so we continued to offer standalone licenses. This included support as well as new features and updates for license holders.
In our new apps, however, we needed to revisit this approach…
[…]
We’d like to thank you for supporting us all these years and provide a special trade-in discount for your license. Simply email us your license and enjoy 50% off your first 3 years.
The new version drops non-subscription licenses, standalone vaults, and support for Dropbox, iCloud, 1Password mini, and 1PasswordAnywhere.
I’m not sure what I’ll do from here. I’ve been using PasswordWallet myself since the writing was on the wall for standalone vaults and my favorite feature in 2017. But the rest of my family is still on 1Password/Dropbox. Much as I don’t like these changes, I’m not sure there’s a multi-user product that’s better.
No matter what anyone else does with their offerings, iOS and macOS have a built-in, free password manager. I love our new, Mac-native interface in macOS Monterey, which has clear, helpful security recommendations (including breach warnings!) and a verification code generator. :)
This is a tempting option because it’s fully integrated and built on iCloud Keychain. Although it’s not inherently multi-user, you can configure a single Mac with separate accounts for different users.
Previously:
Update (2021-08-13): Kristoffer Forsgren:
Please stay native on macOS! 😬 (no electron)
1Password (in June):
Don’t worry, we’re all about the native apps. ❤️
Having used 1Password since its very beginning, I grew increasingly distrustful of their product management and roadmap (the key point for me being that I will not subscribe to their cloud syncing service), so this is an attempt at putting together a systematic list of decent alternatives for my own use.
[Someone] pointed out that Discord, League of Legends, Docker for Desktop, Epic Games client, and many bits of Steam are “basically Electron”, and we don’t hate them.
But all of those have /awful/ UX on a Mac (except LoL, I don’t know LoL).
I get why Electron is winning (won?). Signal Desktop is Electron too. But macOS had an actual design language and accessibility and interoperability between apps basically for free, and Electron apps lose 90% of that because it’s not how web pages work. So I’m gonna resent it.
(I also blame Apple for not placing value on this in their own apps. I miss Mac-assed Mac apps.)
This is a big step backwards. No native UI, no local vaults, no wifi sync, shortcuts not modifiable, no (fast) vault switching, quick access is much more restricted and shows less information.
[…]
Oh and the UI is not faster. Only a mess of different font sizes, too much whitespace and indistinguishable buttons :(
I find it really strange that people are way more concerned with @1password using Electron than dropping local vaults and going subscription-only. I care way more about owning my passwords that connect me to every service in my life than what framework the app is written in.
VC isn’t why 1Password switched to Electron, though. That’s a simplistic conclusion, and I’m someone who is deeply unimpressed by “Tech” VC.
It’s almost always the little indie devs who have native apps on Mac, iOS, and maybe even Windows and Android, almost always the BigCos who use ugly cross-platform frameworks.
Apple had the chance to solidify their dev-lead by unifying their situation. E.g. make catalyst really first class and evolve the APIs nicely.
Instead they tried inventing their own language (swift) and make the future dev landscape so confusing people go electron and x-platform.
I don’t understand a company with a mature codebase built on AppKit since forever, turning to a cross-platform framework like Electron. I mean, it’s not like if they didn’t have a Mac app, right? They do, it’s already supporting Apple silicon too. I guess they didn’t like SwiftUI or Catalyst.
SwiftUI is too little, too late, if prime Mac apps are dropping it in favor of Electron. All the years it took to mature Swift(aka adding fancy language-geek features) might have been spent better elsewhere. Not sure if Apple cares, or what they should have done differently.
All these years where we waited for a common base in AppKit and UIKit. And then we got this abysmal Marzipan thing. With SwiftUI it’s getting better.
Downside to Catalyst is the lack of backwards compatibility - it only got really good in 10.15. 1PW probably needs to deploy much further back.
And SwiftUI itself is even rougher prior to macOS 11.
Seems like a good time to post a link to the presentation that @mitchchn did this year at @NorthSec_io conference about all the security problems with Electron apps[…]
See also: 1Password Community Discussions, Reddit.
Michael Fey (tweet):
However, with four full stacks of client implementations of our server APIs, any changes needed to be coordinated across four teams. Four teams that were still operating independently. Each time our server team lead would come to the client leads and ask us how long until we could support some new feature, each of us said the same thing: “Now’s not a good time, we’re busy. Maybe in a few weeks?” And that estimate of a few weeks was different for each team. We kept advancing our apps with cool new features, but we weren’t advancing our service-based features. We were paralyzed.
[…]
A small team, using existing pieces of various apps and projects, put together a proof of concept of a brand new 1Password app running on top of what we now call the 1Password Core.
[…]
On April 1st, 2020 we officially put our existing 1Password apps into maintenance mode, opened up our source code editors, and clicked File > New Project… on five new 1Password apps.
[…]
We could support as many versions of macOS as we wanted using Apple’s AppKit framework, but that meant adding another frontend toolkit to the mix. We could go all in on SwiftUI, but that meant reducing the number of operating system versions we could support. We could go all in on the same approach we were using for Linux and Windows, but that made it very difficult to create an app that looked and felt at home on macOS.
Ultimately we decided for a two-prong approach. We would build two Mac apps. One written in SwiftUI that targeted the latest operating systems and another using web UI that allowed us to cover older OSes.
[…]
However with a self-imposed ship date of September 2021, our timeline to bring these apps to stable was starting to look a bit tight.
[…]
Despite the fact that SwiftUI allowed us to share more code than ever between iOS and macOS, we still found ourselves building separate implementations of certain components and sometimes whole features to have them feel at home on their target OS.
Ultimately we made the painful decision to stop work on the SwiftUI Mac app and focus our SwiftUI efforts on iOS, allowing the Electron app to cover all of our supported Mac operating systems.
This is exactly what I mean when I’ve said going SwiftUI-native leaves you little better off than before, still having to write distinct iOS & Mac apps and dividing your time & resources.
Jason Snell (tweet, Hacker News):
What’s really causing all this consternation, I think, isn’t 1Password moving to Electron. Electron is a bit of a bogeyman. The root problem is this: 1Password, originally a Mac-forward software developer, has simply decided that the Mac isn’t important enough.
[…]
Fey’s post clearly spells out AgileBits’s priorities. Android and iOS apps are built with native platform frameworks in order to create the best app experience possible on mobile. For iOS, AgileBits decided to use Apple’s new SwiftUI framework rather than the venerable UIKit, in order to skate “to where the puck was going.” Their plan was to use SwiftUI on the Mac, too. In doing so, AgileBits was buying into the vision Apple has for SwiftUI as a tool to build interfaces across all of Apple’s platforms. Unfortunately, it seems that SwiftUI didn’t measure up on the Mac[…]
[…]
I find AgileBits’s decision-making process incredibly sad. Because as Fey’s post makes clear, at no point did the company consider keeping the Mac-only version of 1Password alive. AgileBits, once a major Mac developer, decided (for legitimate business reasons, of course) that the Mac’s not a platform that deserves its own bespoke app.
Update (2021-08-18): Roustem Karimov:
I don’t agree with the title. Mac is super important to us, both @dteare and I rely on it. I just refuse to believe there is only One True Way of making great apps and everything else should be burned with 🔥
[…]
When we made the decision it was between having a subpar SwiftUI experience (many reasons) or an amazing but (scary!) Electron experience.
AppKit can’t be in the picture, sadly. We don’t want to go back developing 5 different frontends.
Apple marketed itself as the privacy company and is confused when customers are mad that they’re scanning photos.
Agilebits marketed itself as making quality native apps and is confused when customers are mad that they’re no longer making native apps.
If they think Electron is best, don’t have the support team lie to us that they’re native. Don’t say this is good for users. It’s neither. Own the decision.
I was going to take a screenshot to show that @1Password 8 beta doesn’t respect my “Double click an app’s titlebar to [minimize]” when I realized IT HIJACKS Shift+Cmd+4+Spacebar!
1Password is still gonna be the best, despite going non-native on Mac.
The product is immensely strong - top-notch integration on all platforms, team accounts, shared vaults with access control, file storage, etc. I can even see the technical issues.
I’m just a liddle sad 😢
Have to confess, I didn’t really understand the 1Password VC investment at the time. In hindsight, I missed that it isn’t really a password app anymore, but has pivoted to an account management proxy. In that context it makes much more sense. Clever pivot.
Apple’s in a transitional period where it’s not at all obvious which of their THREE competing Mac frameworks any serious developer should choose.
It’s super-obvious to me as a long-time Mac developer that sticking with AppKit is the safe bet right now. But for everybody else, the people who should be flooding the ever-growing Mac customer base with new apps? It’s a shit show.
And yes, having any of your most passionate and experienced developers (i.e. me) choosing the most antiquated of your proposed frameworks is not a great place to be in.
Apple sort of “Osborned” AppKit, and it was probably a mistake.
I’d add that the 1Password devs are competent, well-known devs who have participated in the native Apple developer community for over a decade. SwiftUI’s inability to back-deploy, something I think Apple should clearly reconsider, left them writing two apps for Mac — one too many
Also proves what I’ve suspected for a while: SwiftUI is still writing cross platform cheques that the toolkit can’t quite cash. Especially on macOS.
[…]
The weirdest part of all this is that a stock SwiftUI app (where you don’t write custom UIKit or AppKit bits) feels non native to me
I found that part the most weird one, because “Write Once Run Anywhere” was openly communicated as a non-goal for SwiftUI. It was “Learn Once Apply Anywhere”. SwiftUI is not a cross platform framework and Apple was upfront about this part.
Whether that meant speed, or compactness, or both. Because it mattered. It still does[…] So, when I download an app that I and a lot of my peers use in the ordinary course of getting work done, and I get info on it, and I see that it’s 180MB (Discord), or 400MB (Dropbox), or 200MB (Slack), or 280MB (Skype), and I know it’s a front-end to a web service, well honestly, I cringe, every time.
[…]
I am firmly of the opinion that “cross platform” means that users of each (desktop) platform that the product supports can enjoy a platform-optimized experience which is tuned in both performance/efficiency and UI behavior for the desktop platform on which it is running.
[…]
To me, a successful cross-platform product consists of three layers:
- The high-level UI/UX: platform-tuned, natively implemented using the platform-provided native framework.
- The “business logic” which is platform-portable by design and intent, and which implements the non-user-facing parts of the product’s feature set;
- The bare-metal interface between the business logic and the platform services (file I/O, networking, etc). As with #1 this is platform-tuned and natively implemented using code that is optimized to be as fast and as efficient as possible.
[…]
There’s no reason to buy a Mac if you don’t get to use the unique advantages of the platform (because Electron apps don’t expose them).
As someone who hasn’t forgotten Mac Word 6, I wholly endorse this thread. There are a couple of things I would say differently. His “three layers,” for example. I talk about sandwiches. The OS is two slices of bread; your app is the meat, lettuce and tomato in between.
Something like 80% of the size of the Office suite is due to the fact that we can no longer share common elements between related apps. All frameworks, fonts, proofing tools, must be duplicated N times for N apps. Not to mention being required to ship fat^H^H^Huniversal binaries.
And all that is due to policy.
So, since I dabble in a lot of cross platform stuff, here’s my thread on all these cross platform framework things going on.
[…]
Management thinks these things look great. In a few weeks you go from nothing to a minimum-viable-product that runs everywhere! Amazing! Why would anyone ever do native development! These frameworks are so fast and economical! It’s amazing!
[…]
So the problem really is that teams can make amazing initial progress, and then have the cross platform tooling work against them more and more as the product grows. This cost isn’t obvious, and typically goes unmeasured by stakeholders who pushed cross platform to begin with.
Without a public platform roadmap that articulates where they’re going, UI toolkit decisions and development time become large risks for businesses.
On Apple platforms, new toolkit features are almost never back-ported. You concede platform capabilities by choosing to support older OSes. When you choose newer toolkits, target audience is limited, and you’re beholden to the grinding annual cadence of surprise changes.
If you choose Electron, you get support for old OSes, modern features, and a clear roadmap of what’s going to be supported when. Businesses need that predictability.
Electron is effectively a means to mitigate development risk (a service quite a few vendors have made good money on for decades).
One might argue that Apple is effectively forcing businesses to use Electron because of their secrecy.
I think the history of tabs serves as a fascinating case study of how Apple’s neglect for its own UI frameworks assisted the rise and acceptance of cross-platform frameworks like @electronjs and the corresponding decline in the importance of “nativeness” and “the HIG”.
[…]
Although Apple had shipped tabs in Safari, developers were left to their own devices to figure out tabs for the individual apps. Critically, this was true both in terms of implementation and behavior. With no AppKit component, there was no strong Apple direction for tabs.
[…]
Apple did eventually end up adding an API for tabs to AppKit, but not until 2017, 12 years after they shipped their first tabs in Safari! By that point, it was more of a pain to convert legacy code than anything, and it wasn’t helped by a woeful lack of documentation.
For a long time, the best resource on how to use AppKit’s tabs was a single WWDC video. And of course, being new, it lacked many of the features existing apps had already implemented. It was too late, now it was a “chore,” not a “feature”, and with little perceived benefit.
What’s important here is that there are two critical components in achieving a consistent OS: the vendor must offer both a strong vision of how things should work, and also provide an easy avenue for making things work that way. Apple has been failing at both on the Mac.
Developers end up choosing Electron partially because the Mac’s vendor isn’t offering up leadership on how or why you should build native apps for the platform.
Years ago, when I was a 1Password user, I remember it being among my favourite apps to use. Who knew that something as boring as a password manager could be fun and beautiful? If a company like 1Password feels like the Mac can share an Electron app with Windows and Linux, that seems like a concerning state of affairs.
[…]
The fact of the matter is that there has never been a good cross-platform framework — not when developers only had to worry about Windows and Mac OS X, and not now when they are trying to cover at least twice as many operating systems. Apple’s attempts — SwiftUI and Catalyst, the latter of which 1Password’s Fey does not mention — have not corrected that problem, and they only cover half of the platforms developers commonly support. When even premiere Mac developers think Electron is the best option they have, it makes me worried.
I just realized while others are moving to Electron, Apple released a native password manager for Windows [iCloud Keychain Password Manager]
See also:
Update (2021-08-21): Matt Birchler:
All of those are great native Mac apps, but they’re using custom UI elements all over the place. Things has custom everything, Reeder has an iPad-style interface, Craft’s preferences window does not follow macOS conventions, and iStat Menus has some native-ish things with plenty of custom stuff too.
On this week’s ATP episode, Casey Liss referred to 1Password 7 as a “Mac-assed Mac app,” but what part of this UI is using Mac conventions or stock UI? It’s all custom (something Marco did mention).
There are really two different levels here: using a native API and using native (vs. custom) controls and behaviors. Personally, I dislike a lot of the custom stuff, even in apps like 1Password 6 and iStat Menus that are overall good apps. Reeder, to me, looks and feels alien on macOS.
See also: TidBITS Talk.
Update (2021-09-07): Core Intuition (tweet):
Daniel and Manton talk about 1Password’s decision to use Electron for its next major update, and what that says about Apple’s confusing and conflicting platform frameworks. They talk about the continuing role of AppKit as the only way to achieve many expected macOS behaviors, and wonder if the next-generation standard for desktop apps might actually be a “web technology.”
Steve Tibbett (includes video):
Flipping tabs in the new Safari is terrible. The flipping of the chrome colour makes it even harder to see what tab is selected. This isn’t an artificial test, these were the tabs I had open.
Mind-boggling that anyone thinks this looks good, or isn’t bothered by the fact that it’s very hard to see which tab is selected.
Remember the auto-color playlist headers in iTunes? This is like that. They’ll leave it enabled by default for a while because they can, and because some designer feels strongly about it.
Then in a year or two, they’ll disable it by default. Eventually, it will silently go away.
Vivaldi may not look as fancy as Safari, but I think it’s hard to argue it’s any less usable.
- The active tab is always the same color, so it always stands out.
- Most UI elements live in the always-white part of the app so they’re always equally visible.
- The active tab is always white so the black text is always maximally contrasty.
Plus, it gets the hierarchy right by putting the URL—which is tab-specific—inside the tab rather than above all the tabs.
Previously:
This, from Bryan Braun, is great (via Hacker News).
Younger readers may not be familiar, but After Dark was a piece of software you could get for your Mac that had a bunch of screensavers you could enjoy. The most iconic, as far as I can tell, are the flying toasters.
The earlier days of computers and the internet were really bad in some ways (matters of inclusion come to mind as something we didn’t even think of back then), but there are so many incredible things about that time as well. Flying toasters perfectly symbolize those days for me; it’s weird, it’s kinda stupid, and it’s iconic.
Previously:
jskidpix (via Hacker News):
JS Kid Pix / Kid Pix 1.0 was released in to the public domain and this is an HTML/JS reimplementation.
[…]
Just like the original Kid Pix, there’s no guide—have fun! Most of the tools support Shift (⇧) to enlarge. There are a handful of hidden tool features behind various modifier keys (⌘, ⌥, ⇧). The modifier keys can also be combined.
Previously:
Google considered buying Epic Games as the companies sparred over Epic’s Fortnite Android app, according to newly unsealed court filings.
[…]
Epic claims Google was threatened by its plans to sidestep Google’s official Play Store commission by distributing Fortnite through other channels, and in an unredacted segment, it quotes an internal Google document calling Epic’s plans a “contagion” threatening Google.
[…]
In another unsealed section, the complaint describes a Google Play manager reaching out to Epic about its plans to sideload Fortnite — and apparently admitting that sideloading is a “frankly abysmal” experience in the process.
[…]
Another section says that “staff members have acknowledged internally that the difficulty Google imposes on consumers who wish to direct download leads to a ‘[p]oor user experience,’ in that there are ‘15+ steps to get app [via sideloading] vs 2 steps with Play or on iOS.’”
This was unbeknownst to us at the time, and because of the court’s protective order we’re just finding out now about Google’s consideration of buying Epic to shut down our efforts to compete with Google Play.
Whether this would have been a negotiation to buy Epic or some sort of hostile takeover attempt is unclear.
Google really giving away the game here; they a) admit that sideloading UX sucks, b) admit that the effect of that is to drive people to Play, c) recognize efforts by Epic et al to break up Play are bad for business, and d) are willing to spend lots of money to shut those down.
Previously:
The Child Sexual Abuse Material (CSAM) Scanning Tool allows website owners to proactively identify and take action on CSAM located on their website. By enabling this tool, Cloudflare will compare content served for your website through the Cloudflare cache to known lists of CSAM. These lists are provided to Cloudflare by leading child safety advocacy groups such as the National Center for Missing and Exploited Children (NCMEC).
Financial Times (via Hacker News, reprint):
Apple plans to scan US iPhones for child abuse imagery
Matthew Green (via Hacker News):
I’ve had independent confirmation from multiple people that Apple is releasing a client-side tool for CSAM scanning tomorrow. This is a really bad idea.
These tools will allow Apple to scan your iPhone photos for photos that match a specific perceptual hash, and report them to Apple servers if too many appear.
[…]
This sort of tool can be a boon for finding child pornography in people’s phones. But imagine what it could do in the hands of an authoritarian government?
[…]
The way Apple is doing this launch, they’re going to start with non-E2E photos that people have already shared with the cloud. So it doesn’t “hurt” anyone’s privacy.
It’s implied but not specifically stated that they are not scanning the contents of iCloud Backup (which is not E2E), only iCloud Photo Library.
But you have to ask why anyone would develop a system like this if scanning E2E photos wasn’t the goal.
[…]
Hashes using a new and proprietary neural hashing algorithm Apple has developed, and gotten NCMEC to agree to use.
We don’t know much about this algorithm. What if someone can make collisions?
Or what if the AI simply makes mistakes?
Chance Miller (Apple, Hacker News, MacRumors):
Apple is today announcing a trio of new efforts it’s undertaking to bring new protection for children to iPhone, iPad, and Mac. This includes new communications safety features in Messages, enhanced detection of Child Sexual Abuse Material (CSAM) content in iCloud, and updated knowledge information for Siri and Search.
[…]
If there is an on-device match, the device then creates a cryptographic safety voucher that encodes the match result. A technology called threshold secret sharing is then employed. This ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content.
[…]
Apple isn’t disclosing the specific threshold it will use — that is, the number of CSAM matches required before it is able to interpret the contents of the safety vouchers. Once that threshold is reached, however, Apple will manually review the report to confirm the match, then disable the user’s account, and send a report to the National Center for Missing and Exploited Children.
There’s a technical summary here.
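To make the “threshold secret sharing” idea above concrete, here is a generic Shamir-style sketch — not Apple’s actual construction, which is more involved, but it shows the core property: a secret (in Apple’s design, the decryption material for the safety vouchers) is unrecoverable until at least a threshold number of shares (one per matching photo) exist.

```python
# Illustrative Shamir-style threshold secret sharing (NOT Apple's actual
# protocol). Any `threshold` shares reconstruct the secret; fewer reveal
# nothing, because they underdetermine the random polynomial.
import random

PRIME = 2**61 - 1  # a Mersenne prime; all arithmetic is mod this

def make_shares(secret, threshold, num_shares):
    """Split `secret` into points on a random polynomial of degree
    threshold-1 whose constant term is the secret."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def poly(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, num_shares + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x=0 recovers the constant term."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        total = (total + yi * num * pow(den, -1, PRIME)) % PRIME
    return total

shares = make_shares(secret=123456789, threshold=3, num_shares=5)
assert reconstruct(shares[:3]) == 123456789   # any 3 shares suffice
assert reconstruct(shares[1:4]) == 123456789
# reconstruct(shares[:2]) yields garbage: below threshold, no information
```

This is why Apple can claim it cannot “peek” at individual vouchers below the threshold; the open question critics raise is not the math but who sets the threshold and the hash list.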
Many other cloud storage services are already doing that, in a much less privacy-preserving way. In a way, it’s their responsibility given that they’re storing the data and it is illegal to possess such content in many parts of the world.
I have always been concerned that this system could be weaponized as a way to gain access to someone’s account. For example:
- Add the hash of a non-pornographic image to the database
- Using a burner email address, email the non-pornographic image to the target’s Gmail address. The target wouldn’t think anything of it.
- The innocent image would trigger a CP alert, giving law enforcement the pretense it needs to access the account
I wonder how easy it is to add a photo to someone’s iCloud Photo Library.
What they say: “This algorithm will scan your images for potential child abuse”
What it will actually do: Looks at your nudes without your consent and sends them to a team who will of course have people who save them and share them when they see it’s not child abuse.
That would never happen, of course. Apple would probably argue that you don’t really have to trust their team because threshold secret sharing will prevent them from needing to review the images, anyway. But who knows what threshold they’re using or how reliable the perceptual hashing actually is.
One takeaway is that, CSAM detection aside, Apple already has access to these photos. You shouldn’t upload anything to the cloud that you want to keep private. But Apple isn’t giving users much choice. It doesn’t let you choose a truly private cloud backup or photo syncing provider. If you don’t use iCloud Photo Library, you have to use Image Capture, which is buggy. And you can’t use iCloud to sync some photos but not others. Would you rather give Apple all your photos or risk losing them?
And, now that the capability is built into Apple’s products, it’s hard to believe that they won’t eventually choose to or be compelled to use it for other purposes. They no longer have the excuse that they would have to “make a new version of the iPhone operating system.” It probably doesn’t even require Apple’s cooperation to add photo hashes to the database.
Previously:
Update (2021-08-06): Nick Heer, regarding my question about adding a photo to someone else’s iCloud Photo Library:
AirDropped images are automatically added to the photo library, aren’t they?
Because Apple is scanning iCloud Photos for the CSAM flags, it makes sense that the feature does not work with iCloud Photos disabled. Apple has also confirmed that it cannot detect known CSAM images in iCloud Backups if iCloud Photos is disabled on a user’s device.
I think a fair counterargument is that Apple’s more proactive approach to child safety takes away one of law enforcement’s favourite complaints about commonplace encryption.
But it represents a similar trade-off to the aforementioned iCloud backups example. Outside of the privacy absolutist’s fictional world, all of privacy is a series of compromises. Today’s announcements raise questions about whether these are the right compromises to be making. What Apple has built here is a local surveillance system that all users are supposed to trust. We must believe that it will not interfere with our use of our devices, that it will flag the accounts of abusers and criminals, and that none of us innocent users will find ourselves falsely implicated. And we must trust it because it is something Apple will be shipping in a future iOS update, and it will not have an “off” switch.
Perhaps this is the only way to make a meaningful dent in this atrocious abuse, especially since the New York Times and the NCMEC shamed Apple for its underwhelming reporting of CSAM on its platforms. But are we prepared for the likely expansion of its capabilities as Apple and other tech companies are increasingly pressured to shoulder more responsibility for the use of their products? I do not think so. This is a laudable effort, but enough academics and experts in this field have raised red flags for me to have some early concerns and many questions.
Andrew Orr (in 2019, MacRumors):
Occasionally I like to check up on Apple’s security pages and privacy policies. I noticed something new in the privacy policy, which was last updated May 9, 2019. Under the “How we use your personal information” header, one of the paragraphs now reads (emphasis added):
We may also use your personal information for account and network security purposes, including in order to protect our services for the benefit of all our users, and pre-screening or scanning uploaded content for potentially illegal content, including child sexual exploitation material.
Apple may have even been doing this for years, but this is the first time this has appeared in its privacy policy. And I checked earlier versions using the Wayback Machine.
[…]
Speaking at CES 2020, Apple’s chief privacy officer Jane Horvath mentioned photos backed up to iCloud in terms of scanning.
[…]
A search warrant revealed that Apple scans emails for this content.
Apple’s scanning does not detect photos of child abuse. It detects a list of known banned images added to a database, which are initially child abuse imagery found circulating elsewhere. What images are added over time is arbitrary. It doesn’t know what a child is.
Apple thinks photo scanning is non-negotiable — that for legal and PR reasons, you can’t be a major consumer tech company and not scan users’ photos — so the only way to encrypt photos on-device was to develop & implement client-side scanning.
My read is that the FBI keeps harping about CSAM and “going dark”. It’s the hardest thing to defend, so now they can say “no one can use iCloud to store CSAM and I won’t build a backdoor into iCloud encryption”
They are if they are moving server-side scanning to “client-side hashing then matching on the server-side”. If this is a pre-req for encrypted iCloud data, then this is potentially a win. But, this is all negated by absence of auditability of the hash DB.
If it came out that Apple was adding anything other than CSAM fingerprints to the database, it’d be ruinous to the company’s reputation. As bad as if they were pilfering from Apple Cash accounts.
It sounds like Apple is not adding anything to the database, so it’s not in a position to make any guarantees. It’s just using an opaque list of hashes supplied by a third party.
The hash databases used by CSAM scanning methods have little oversight.
[…]
In any case, all of this requires us to place trust in automated systems using unproven machine learning magic, run by technology companies, and given little third-party oversight. I am not surprised to see people worried by even this limited scope, never mind the possibilities of its expansion.
Government: <adds images known to be from target to database>
Apple: <matches, uploads contents of target’s phone to government server for further inspection>
Government: thanku appl
Whoever controls this list can search for whatever content they want on your phone, and you don’t really have any way to know what’s on that list because it’s invisible to you (and just a bunch of opaque numbers, even if you hack into your phone to get the list.)
The theory is that you will trust Apple to only include really bad images. Say, images curated by the National Center for Missing and Exploited Children (NCMEC). You’d better trust them, because trust is all you have.
[…]
This means that, depending on how they work, it might be possible for someone to make problematic images that “match” entirely harmless images. Like political images shared by persecuted groups. These harmless images would be reported to the provider. […] And the problem is that none of this technology was designed to stop this sort of malicious behavior. In the past it was always used to scan unencrypted content. If deployed in encrypted systems (and that is the goal) then it provides an entirely new class of attacks.
[…]
Regardless of what Apple’s long term plans are, they’ve sent a very clear signal. In their (very influential) opinion, it is safe to build systems that scan users’ phones for prohibited content.
That’s the message they’re sending to governments, competing services, China, you.
EFF (tweet, Hacker News, MacRumors):
All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts. That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change.
[…]
Apple and its proponents may argue that scanning before or after a message is encrypted or decrypted keeps the “end-to-end” promise intact, but that would be semantic maneuvering to cover up a tectonic shift in the company’s stance toward strong encryption.
But knowing this uses a neural net raises all kinds of concerns about adversarial ML, concerns that will need to be evaluated.
Apple should commit to publishing its algorithms so that researchers can try to develop “adversarial” images that trigger the matching function, and see how resilient the tech is.
I am vehemently opposed to scanning of personal information, be it in the cloud (under end-to-end encryption), or on our local devices. The long term risk for misuse of such technology far outweighs any short term benefit.
[…]
There are world governments of all kinds, and they all have questionable policies of varying degrees. As soon as they tell a corporation to implement their dubious dragnet or suffer the consequences, the corporation will promptly give them access to your photos, emails, and any other data.
The reason Apple’s approach is going far too far comes down to one thing: the difference between law enforcement, where an agency needs good reason to access private data, and surveillance. Apple’s approach is surveillance. (And from the company that made the 1984 ad.)
A narrowly defined backdoor is still a backdoor. “Partial” digital privacy isn’t a thing — you either have it or you don’t.
If you think you can design a system that violates privacy only for some people, you can’t. I don’t care who you are.
Apple has won enormous amounts of goodwill by declaring that privacy is a human right, and is about to destroy all of it at once by building a technology to have your phone scan your pictures and turn you over to law enforcement if they’re the wrong sort of pictures.
It doesn’t matter what sort of pictures motivated this feature; eventually governments will force its use for all sorts of things, and many governments do not respect human rights. I’m completely aghast that this is being contemplated.
Here’s the thing about “slippery slope” arguments: a slope is rarely slippery, but it still goes downhill.
It took 12 years to go from “your Mac app needs to be code signed for the keychain and firewall” to “you need to upload every build of your Mac app to Apple for approval”.
It is difficult for me to reconcile the Apple that makes ostensibly clever machine learning stuff that can match child abuse imagery, even after it has been manipulated, with the Apple that makes software that will fail to sync my iPhone for twenty minutes before I give up.
Same with the iMessage scanning feature and iMessage itself.
Now that Apple has willingly built spyware into iOS and macOS, within 10 years this tech will:
(1) be mandated by government in all end-to-end encrypted apps; and
(2) expand to scan for terrorism, disinformation, "misinformation", then eventually political images and memes.
This is not a drill.
Police are already misusing location data gathered for COVID contact tracing even though everyone SWORE it wouldn’t be used for anything but health purposes.
Clearly a rubicon moment for privacy and end-to-end encryption.
I worry if Apple faces anything other than existential annihilation for proposing continual surveillance of private messages then it won’t be long before other providers feel the pressure to do the same.
[…]
If Apple are successful in introducing this, how long do you think it will be before the same is expected of other providers? Before walled-garden prohibit apps that don’t do it? Before it is enshrined in law?
Really seems like Apple tried to protect customer data in the cloud by scanning for illegal material locally on the phone, thereby creating a new kind of risk for customer data on the phone.
To address these concerns, Apple provided additional commentary about its plans today.
Apple’s known CSAM detection system will be limited to the United States at launch, and to address the potential for some governments to try to abuse the system, Apple confirmed to MacRumors that the company will consider any potential global expansion of the system on a country-by-country basis after conducting a legal evaluation.
[…]
Even if the threshold is exceeded, Apple said its manual review process would serve as an additional barrier and confirm the absence of known CSAM imagery. Apple said it would ultimately not report the flagged user to NCMEC or law enforcement agencies and that the system would still be working exactly as designed.
I wonder how much manual review Apple is planning to do, given that it says there’s only a 1 in 1 trillion probability of incorrectly flagging an account.
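The “1 in 1 trillion” figure is an account-level claim, and it depends on two numbers Apple has not published: the per-image false-match rate and the match threshold. A back-of-the-envelope binomial-tail calculation — with assumed, purely illustrative values for both — shows how a modest per-image rate can compound into a tiny account-level one:

```python
# Back-of-the-envelope check of an account-level false-flag probability.
# The per-image false-match rate and threshold below are ASSUMED
# illustrative values -- Apple has published neither.
from math import comb

def prob_account_flagged(n_photos, p_false_match, threshold):
    """Probability that `threshold` or more of `n_photos` innocent photos
    each independently false-match (binomial upper tail). Terms shrink
    very fast past the threshold, so the tail is truncated."""
    return sum(
        comb(n_photos, k) * p_false_match**k * (1 - p_false_match)**(n_photos - k)
        for k in range(threshold, min(threshold + 50, n_photos) + 1)
    )

# e.g. a library of 10,000 photos, a one-in-a-million per-image rate,
# and a threshold of 6 matches gives a probability on the order of 1e-15
print(prob_account_flagged(10_000, 1e-6, 6))
```

The catch, as several quotes above note, is the independence assumption: adversarially crafted collisions or systematic NeuralHash mistakes would not behave like independent coin flips, and the account-level guarantee would not follow.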
In an internal memo distributed to the teams that worked on this project and obtained by 9to5Mac, Apple acknowledges the “misunderstandings” around the new features, but doubles down on its belief that these features are part of an “important mission” for keeping children safe.
It’s hard not to feel that a bait and switch is being presented. Apple announced that disabling iCloud Photos bypasses CSAM detection. This practically ensures failure, as anyone involved in child exploitation will of course disable iCloud Photos. So then what? Set up to fail...
So we already have the on-device detection, and limiting it to iCloud Photos will fail. This means that further measures will be required, i.e., scanning regardless of whether iCloud Photos is enabled.
Seems like Apple’s idea of doing iCloud abuse detection with this partially-on-device check only makes sense in two scenarios: 1) Apple is going to expand it to non-iCloud data stored on your devices or 2) Apple is going to finally E2E encrypt iCloud?
But if it is to enable end-to-end iCloud encryption and it is not applied to purely local files, that seems like an overall privacy benefit.
If we follow that line of speculation further, it makes me wonder why Apple would create so much confusion in its communication of this change. Why drop this news at the beginning of August, disconnected from any other product or service launch? Why not announce it and end-to-end iCloud encryption at the same time, perhaps later this year?
Update (2021-08-09): John Gruber:
The database will be part of iOS 15, and is a database of fingerprints, not images. Apple does not have the images in NCMEC’s library of known CSAM, and in fact cannot — NCMEC is the only organization in the U.S. that is legally permitted to possess these photos.
[…]
All of these features are fairly grouped together under a “child safety” umbrella, but I can’t help but wonder if it was a mistake to announce them together. Many people are clearly conflating them, including those reporting on the initiative for the news media.
[…]
In short, if these features work as described and only as described, there’s almost no cause for concern. […] But the “if” in “if these features work as described and only as described” is the rub. That “if” is the whole ballgame. If you discard alarmism from critics of this initiative who clearly do not understand how the features work, you’re still left with completely legitimate concerns from trustworthy experts about how the features could be abused or misused in the future.
Glenn Fleishman and Rich Mogull:
The problem is that exploitation of children is a highly asymmetric problem in two different ways. First, a relatively small number of people in the world engage in a fairly massive amount of CSAM trading and direct online predation.
[…]
The other form of asymmetry is adult recognition of the problem. Most adults are aware that exploitation happens—both through distribution of images and direct contact—but few have personal experience or exposure themselves or through their children or family. That leads some to view the situation somewhat abstractly and academically. On the other end, those who are closer to the problem—personally or professionally—may see it as a horror that must be stamped out, no matter the means. Where any person comes down on how far tech companies can and should go to prevent exploitation of children likely depends on where they are on that spectrum.
[…]
(Spare some sympathy for the poor sods who perform the “manual” job of looking over potential CSAM. It’s horrible work, and many companies outsource the work to contractors, who have few protections and may develop PTSD, among other problems. We hope Apple will do better. Setting a high threshold, as Apple says it’s doing, should dramatically reduce the need for human review of false positives.)
[…]
Apple’s head of privacy, Erik Neuenschwander, told the New York Times, “If you’re storing a collection of C.S.A.M. material, yes, this is bad for you. But for the rest of you, this is no different.”
Given that only a very small number of people engage in downloading or sending CSAM (and only the really stupid ones would use a cloud-based service; most use peer-to-peer networks), this is a specious remark, akin to saying, “If you’re not guilty of possessing stolen goods, you should welcome an Apple camera in your home that lets us prove you own everything.” Weighing privacy and civil rights against protecting children from further exploitation is a balancing act. All-or-nothing statements like Neuenschwander’s are designed to overcome objections instead of acknowledging their legitimacy.
What happens when China announces its version of the NCMEC, which not only includes the horrific imagery Apple’s system is meant to capture, but also images and memes the government deems illegal?
The fundamental issue — and the first reason why I think Apple made a mistake here — is that there is a meaningful difference between capability and policy. One of the most powerful arguments in Apple’s favor in the 2016 San Bernardino case is that the company didn’t even have the means to break into the iPhone in question, and that to build the capability would open the company up to a multitude of requests that were far less pressing in nature, and weaken the company’s ability to stand up to foreign governments. In this case, though, Apple is building the capability, and the only thing holding the company back is policy.
[…]
Apple is compromising the phone that you and I own-and-operate, without any of us having a say in the matter. Yes, you can turn off iCloud Photos to disable Apple’s scanning, but that is a policy decision; the capability to reach into a user’s phone now exists, and there is nothing an iPhone user can do to get rid of it.
@Apple now circulating a propaganda letter describing the internet-wide opposition to their decision to start checking the private files on every iPhone against a secret government blacklist as “the screeching voices of the minority.”
The NCMEC database […] contains countless non-CSAM pictures that are entirely legal not only in the U.S. but globally. […] Increasing the scope of scanning is barely a slippery slope, they’re already beyond the stated scope of the database.
This is where the human reviewers come in. In theory, it doesn’t matter if the database contains non-CSAM pictures—either because they were collected along with CSAM ones or because a government deliberately added them to the database—because the reviewers will see that the user did not actually have CSAM and so will decline to make a report. However, this assumes (1) a quality of review that Apple has not previously demonstrated, and (2) that Apple will not be pressured or tricked into hiring reviewers that are working towards another purpose.
What would you say if Apple announced that Siri will always listen and report private conversations (not just those triggered by “Hey Siri”) but only if a really good neural network recognizes them as criminal, and there’s PSI to protect you?
RE: Apple’s plan to scan every photo in iMessage with machine learning and alert parents to nudity. […] Let me share so you can imagine how it will be misused.
Steve Troughton-Smith (also Paul Haddad):
I feel like Apple could easily have built these new features to outright prevent explicit/illegal material from being viewed or saved on its platforms, while sidestepping the slippery slope outcry entirely. […] I mean why are they letting this stuff onto iCloud Photos in the first place?
Perhaps the thinking is that the matching needs to remain hidden so that people can’t learn how to evade it.
We’re past the point where giving Apple the benefit of the doubt can be interpreted as anything other than willful ignorance from a place of Western privilege. These aren’t hypotheticals, we already have examples of Apple’s policies failing people in other countries.
So end-to-end encryption means nothing?
Device maker can log/view/save your content right before it gets sent (encrypted) or right after it’s received (unencrypted), but your content was still E2E encrypted!
In my opinion, there are no easy answers here. I find myself constantly torn between wanting everybody to have access to cryptographic privacy and the reality of the scale and depth of harm that has been enabled by modern comms technologies.
[…]
I have friends at both the EFF and NCMEC, and I am disappointed with both NGOs at the moment. Their public/leaked statements leave very little room for conversation, and Apple’s public move has pushed them to advocate for their equities to the extreme.
[…]
Likewise, the leaked message from NCMEC to Apple’s employees calling legitimate questions about the privacy impacts of this move “the screeching voices of the minority” was both harmful and unfair.
[…]
One of the basic problems with Apple’s approach is that they seem desperate to avoid building a real trust and safety function for their communications products. There is no mechanism to report spam, death threats, hate speech, NCII, or any other kinds of abuse on iMessage.
As a result, their options for preventing abuse are limited.
Say you’re a big Apple fan who is really upset with the photo scanning announcement. In order to send a market signal by switching phones, you would also have to buy a new watch, give up AirDrop / iMessage with your friends, not watch Ted Lasso on your new phone, etc etc etc
At some point ecosystem lock-in creates too many different switching costs that the market can no longer send meaningful signals about what’s important, leaving only public opinion and government regulation to shape a company’s behavior. That feels real icky to me!
Apple’s dark patterns that turn iCloud uploads on by default, and flip it back on when moving to a new phone or switching accounts, exacerbate the problem.
More specifically, the concern involves where this type of technology could lead if Apple is compelled by authorities to expand detection to other data that a government may find objectionable. And I’m not talking about data that is morally wrong and reprehensible. What if Apple were ordered by a government to start scanning for the hashes of protest memes stored on a user’s phone? Here in the U.S., that’s unlikely to happen. But what if Apple had no choice but to comply with some dystopian law in China or Russia? Even in Western democracies, many governments are increasingly exploring legal means to weaken privacy and privacy-preserving features such as end-to-end encryption, including the possibility of passing legislation to create backdoor access into messaging and other apps that officials can use to bypass end-to-end encryption.
So these worries people are expressing today on Twitter and in tech forums around the web are understandable. They are valid. The goal may be noble and the ends just—for now—but that slope can also get slippery really fast.
While child exploitation is a serious problem, and while efforts to combat it are almost unquestionably well-intentioned, Apple’s proposal introduces a backdoor that threatens to undermine fundamental privacy protections for all users of Apple products.
[…]
Apple’s current path threatens to undermine decades of work by technologists, academics and policy advocates towards strong privacy-preserving measures being the norm across a majority of consumer electronic devices and use cases. We ask that Apple reconsider its technology rollout, lest it undo that important work.
Most of the heat RE: neuralMatch is rooted in ignorance of what it does. I’m not here to educate.
But there’s a valid worry that hostile governments could use it to rat out their citizens for non-CSAM offenses.
Some concrete actions Apple could take to fix that[…]
[…]
Guarantee the database is global, not a localized resource.
[…]
Publish neuralMatch as an all-purpose image matching API, so third parties can audit it on a technical level.
[…]
Allow third parties to test the neuralMatch API specifically against the CSAM hashes, so they can audit it for the kinds of politically-motivated matches people are worried about.
Looks like the NeuralHash is included in the current beta in the Vision framework.
Oliver Kuederle (via Hacker News):
At my company, we use “perceptual hashes” to find copies of an image where each copy has been slightly altered. This is in the context of stock photography, where each stock agency (e.g. Getty Images, Adobe Stock, Shutterstock) adds their own watermark, the image file ID, or sharpens the image or alters the colours slightly, for example by adding contrast.
[…]
It shouldn’t come as a surprise that these algorithms will fail sometimes. But in the context of 100 million photos, they do fail quite often. And they don’t fail in acceptable ways[…]
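A minimal “average hash” sketch illustrates the mechanism Kuederle describes — it is far cruder than the pHash-style algorithms stock agencies use, and nothing like NeuralHash, but the failure mode is the same: matching is done by Hamming distance against a threshold, so a slightly altered copy hashes identically while occasional unrelated images can land inside the threshold too.

```python
# Toy "average hash": bit i is 1 iff pixel i is above the image's mean
# brightness. Small edits (watermarks, contrast tweaks) barely move the
# bits; matching compares hashes by Hamming distance, not equality.

def average_hash(pixels):
    """pixels: flat list of grayscale values, standing in for a
    downscaled image."""
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(h1, h2):
    return sum(a != b for a, b in zip(h1, h2))

original    = [10, 200, 30, 220, 40, 210, 20, 230]
watermarked = [12, 198, 33, 219, 41, 214, 25, 229]  # slightly altered copy
unrelated   = [200, 10, 220, 30, 210, 40, 230, 20]  # inverted pattern

assert hamming(average_hash(original), average_hash(watermarked)) == 0
assert hamming(average_hash(original), average_hash(unrelated)) == 8
```

Because the match criterion is “distance below a threshold” rather than exact equality, false positives are a statistical inevitability at scale — which is Kuederle’s point about running such algorithms over 100 million photos.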
The laws related to CSAM are very explicit. 18 U.S. Code § 2252 states that knowingly transferring CSAM material is a felony. (The only exception, in 2258A, is when it is reported to NCMEC.) In this case, Apple has a very strong reason to believe they are transferring CSAM material, and they are sending it to Apple — not NCMEC.
It does not matter that Apple will then check it and forward it to NCMEC. 18 U.S.C. § 2258A is specific: the data can only be sent to NCMEC. (With 2258A, it is illegal for a service provider to turn over CP photos to the police or the FBI; you can only send it to NCMEC. Then NCMEC will contact the police or FBI.) What Apple has detailed is the intentional distribution (to Apple), collection (at Apple), and access (viewing at Apple) of material that they strongly have reason to believe is CSAM. As it was explained to me by my attorney, that is a felony.
The problem with any take on the Apple/CSAM stuff is that there are so many horrible people in the world that do horrible things to people, and so many governments that do horrible things to people, and pretty much any tech that thwarts one of them enables the other one.
There’s an argument, with support from Game Theory, that says that Apple can set a high threshold for the number of matches, and only detect and report a few cases of CSAM. Indeed, even that may be unnecessary to drive anyone currently sharing CSAM to abandon the use of iCloud Photos altogether.
That would be a win for Apple but not really help solve the problem as a whole.
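Apple’s technical summary says the match threshold is enforced cryptographically with threshold secret sharing: the server accumulates encrypted vouchers it cannot open until an account produces at least t matches. A toy Shamir scheme illustrates the idea (an illustrative sketch with hypothetical parameters, not Apple’s actual construction):

```python
import random

P = 2**61 - 1  # prime modulus for the finite field

def make_shares(secret, t, n):
    # A random degree-(t-1) polynomial with the secret as its
    # constant term; each share is one point on the polynomial.
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 recovers the constant term --
    # but only when at least t shares are combined.
    secret = 0
    for j, (xj, yj) in enumerate(shares):
        num, den = 1, 1
        for m, (xm, _) in enumerate(shares):
            if m != j:
                num = num * (-xm) % P
                den = den * (xj - xm) % P
        secret = (secret + yj * num * pow(den, P - 2, P)) % P
    return secret

key = 123456789                     # stands in for a decryption key
shares = make_shares(key, t=30, n=50)
assert reconstruct(random.sample(shares, 30)) == key  # 30 shares suffice
```

Below the threshold, any subset of fewer than t shares is consistent with every possible secret, which is the mathematical sense in which the server “learns nothing” until the threshold is crossed.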
The worst case scenario for the initial implementation isn’t necessarily false positives, though those would certainly be awful.
Worst case scenario is child abusers don’t use iCloud Photos, and Apple’s NCMEC report #s don’t increase much.
CyberTipline is the nation’s centralized reporting system for the online exploitation of children, including child sexual abuse material, child sex trafficking and online enticement. In 2020, the CyberTipline received more than 21.7 million reports.
Only 265 were from Apple. I’m not sure how to square this with Apple’s chief privacy officer stating in January 2020 that it was already scanning photos server-side. Are the criminals already avoiding iCloud, or is Apple’s matching not very effective?
Stefano Quintarelli (via Hacker News):
The point I try to make is that it will do little to protect children (while weakening users’ privacy and pushing criminals to hide better) but it will be used as an excuse to justify a tight control of the devices in order to perpetuate their apparent monopolistic power through the app store in a time when such behavior is under the fire of competition authorities.
The whole point of end-to-end encryption is to prevent the provider of the service from itself being coerced into giving up information about its users. Apple is building exactly the opposite of that.
Will you even know when the system is abused? The US government has already coerced companies into cooperating while preventing them from telling their users that this is happening.
This is about an infrastructure which can be put to use for any and all of your data. It doesn’t matter what Apple claims it is limited to doing now. What matters is that this is a general purpose capability.
[…]
And what is incredibly stupid about this approach is that only technology-ignorant child-abusers will fail to turn off iCloud photo syncing, which at the moment is what the Apple system counts on. Everyone else gets spied on.
Aral Balkan (via Hacker News):
If Apple goes ahead with its plans to have your devices violate your trust and work against your interests, I will not write another line of code for their platforms ever again.
[…]
When I wrote The Universal Declaration of Cyborg Rights, I wanted to get people thinking about the kind of constitutional protections we would need to protect personhood in the digital network age.
This document serves to address these questions and provide more clarity and transparency in the process.
Apple’s FAQ is really disingenuous.
Why is Apple doing this now?
One of the significant challenges in this space is protecting children while also preserving the privacy of users. With this new technology, Apple will learn about known CSAM photos being stored in iCloud Photos where the account is storing a collection of known CSAM. Apple will not learn anything about other data stored solely on device.
Existing techniques as implemented by other companies scan all user photos stored in the cloud. This creates privacy risk for all users. CSAM detection in iCloud Photos provides significant privacy benefits over those techniques by preventing Apple from learning about photos unless they both match to known CSAM images and are included in an iCloud Photos account that includes a collection of known CSAM.
This answer makes no sense in light of the facts that Apple was already doing server-side scanning and that the photos to now be scanned on device are ones that Apple would have access to via the cloud, anyway. [Update (2021-08-10): See the update below.]
Can the CSAM detection system in iCloud Photos be used to detect things other than CSAM?
Our process is designed to prevent that from happening.
The answer is clearly “yes,” because it relies on hashes, which Apple has not vetted; and depends on human review, which may not work as intended.
Could governments force Apple to add non-CSAM images to the hash list?
Apple will refuse any such demands.
This is not the right question. We don’t really care whether Apple is the one adding the hashes, but simply whether they can be added. And the answer to that is clearly “yes.” There are already non-CSAM hashes in the NCMEC database. Apple has no ability to “refuse” because it never even sees the images. It trusts the hashes that it’s been given by the government.
Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it.
Apple has already compromised user privacy in response to Chinese law. If, say, US law compelled them to scan non-iCloud photos, what choice would they have but to accede? Would they stop selling iPhones? Have every single engineer resign? I don’t see how this is a promise any company could keep, even if it wanted to.
Yes, I fully believe that Apple will refuse when asked, and I don’t question their motives for why this feature should exist. The problem is that I don’t believe it’s remotely enough. Some states do not have a record of taking no for an answer, and when recent history shows impactful decisions, going against those same values and morals, that are the result of either successful pressure or regulatory capture, the situation recalls the words of a quite different Marx: “Those are my principles, and if you don’t like them… well, I have others.”
Apple isn’t “throwing a bone” to law enforcement. Apple is giving them an appetizer. When the biggest computer vendor in the US says it’s ok to put spyware on their own devices, this gives the green light to all legislators and agencies to start demanding everything they want.
Apple said that while it does not have anything to share today in terms of an announcement, expanding the child safety features to third parties so that users are even more broadly protected would be a desirable goal. Apple did not provide any specific examples, but one possibility could be the Communication Safety feature being made available to apps like Snapchat, Instagram, or WhatsApp so that sexually explicit photos received by a child are blurred.
Another possibility is that Apple’s known CSAM detection system could be expanded to third-party apps that upload photos elsewhere than iCloud Photos.
Update (2021-08-10): John Gruber and Rene Ritchie say that, actually, Apple’s servers have never scanned iCloud photo libraries for CSAM, only photos attached to certain messages stored on iCloud’s mail servers. Many sources reported Apple’s chief privacy officer saying at CES 2020 that photos uploaded to iCloud were scanned. However, some of these seem to be based on an article that has since been updated:
This story originally said Apple screens photos when they are uploaded to iCloud, Apple’s cloud storage service. Ms Horvath and Apple’s disclaimer did not mention iCloud, and the company has not specified how it screens material, saying this information could help criminals.
I have not found any official Apple statements saying what was scanned before.
In any case, this changes how I interpret Apple’s FAQ, as well as speculation for the future. If photo library scanning is new, Apple is not reimplementing a previously working system in a way that is potentially less private (since it could be easily tweaked to scan non-cloud photos). It also seems less likely to imply a switch to making iCloud Photos E2EE. It could simply be that Apple wanted to implement the fingerprinting in a way that took advantage of distributed CPU power. Or that it wanted to avoid having a server scanner that it could be compelled to use. This also explains why Apple only made 265 reports in 2020.
Apple’s Chief Privacy Officer seemed to say CSAM scanning of iCloud servers was already happening back in January 2020 and Apple’s Privacy Policy has allowed it since May 2019. However, it is now unclear whether iCloud server CSAM scanning has actually been happening.
Apple now seems to be telling media that server-based CSAM scanning will start when on-device scanning starts.
Or maybe it’s all done on-device when the old photos sync down from the cloud?
John Gruber (tweet):
I do wonder though, how prepared Apple is for manually reviewing a potentially staggering number of accounts being correctly flagged. Because Apple doesn’t examine the contents of iCloud Photo Library (or local on-device libraries), I don’t think anyone knows how prevalent CSAM is on iCloud Photos.
[…]
If the number is large, it seems like one innocent needle in a veritable haystack of actual CSAM collections might be harder for Apple’s human reviewers to notice.
Notice Apple changing the definition of “end-to-end encryption.” No longer is the message a private communication between sender and receiver.
Perhaps feeling left out by the constant communication own-goals by Facebook, Apple set up the mother of all self-owns. It’s hard to think of a more massive communication fuck up, honestly. Again, because this topic is so big, so important, and so sensitive. Apple probably should have had an event, or at the very least a large-scale pre-brief with journalists and bloggers to talk through these issues.
[…]
Second, this is all more than a little ironic given the whole “backdoor” debate Apple forcefully stood up against when government agencies sought to force Apple to build in a way to get into iPhones. Tim Cook was adamant that Apple had no way to do this, and should not build it. If they didn’t exactly just create a way, they created a huge loophole that officials are going to test like velociraptors against an electric fence. Until they find the weakness… That’s what Apple set up here. The thing they stood up against! Apple can say all the right things. They also have to abide by laws. And laws are man-made things. Which change.
Apple commit to challenging requests to expand their CSAM detection to other material. So did UK ISPs, but they lost in court and did it anyway. Will Apple leave a market if put in the same position?
How would Apple not be able to add things to the hash list or change which list they use? NCMEC would need to publish some root hash of their list, and Apple would have to bind it into their client software in a way even they couldn’t change. That’s a tall order.
It is also deeply disappointing to see so many tech journalists make inferences for Apple when all of the pressure should be on Apple to answer the questions directly and on the record, instead of collecting concerns on background.
Matthew Panzarino (tweet, TechCrunch, MacRumors):
I spoke to Erik Neuenschwander, head of Privacy at Apple, about the new features launching for its devices.
[…]
The voucher generation is actually exactly what enables us not to have to begin processing all users’ content on our servers, which we’ve never done for iCloud Photos.
[…]
Well first, that is launching only for U.S. iCloud accounts, and so the hypotheticals seem to bring up generic countries or other countries that aren’t the U.S. when they speak in that way, and therefore it seems to be the case that people agree U.S. law doesn’t offer these kinds of capabilities to our government.
But even in the case where we’re talking about some attempt to change the system, it has a number of protections built in that make it not very useful for trying to identify individuals holding specifically objectionable images. The hash list is built into the operating system, we have one global operating system and don’t have the ability to target updates to individual users and so hash lists will be shared by all users when the system is enabled.
He does not address Apple’s lack of ability to audit the hashes that it receives.
Update (2021-08-13): Nick Heer:
This note was appended one day after the Telegraph published its original report — that is, one day after it was cited by numerous other outlets. Unfortunately, none of those reports reflected the Telegraph’s correction and, because the Telegraph has a soft paywall and the title of the article remained “Apple scans photos to check for child abuse”, it is not obvious that there were any material changes to correct. Robinson’s Law strikes again.
Matthew Green (also: Edward Snowden):
People are telling me that Apple are “shocked” that they’re getting so much pushback from this proposal. They thought they could dump it last Friday and everyone would have accepted it by the end of the weekend.
Apple spent years educating the public on privacy for use as a marketing pitch and is now shocked that people care about privacy.
In a sense, it’s already too late. Apple hasn’t shipped the spyware yet, but Apple has already told the governments of the world that they will ship spyware in the operating system.
This is in stark contrast to what Apple said in the San Bernardino case.
Jokes aside, though, as engineers we regularly deal with complex systems that can be difficult for our users to understand. Having a hard time explaining how they work is one thing, but regardless of your position on this technology @Apple’s messaging has been unacceptable.
Their reluctance to clearly describe how the software works, their seeming inability to be straightforward with the fact that it fundamentally detects CSAM using filters that they control and uploads it to them, is very concerning. This isn’t how you inspire trust.
“Encrypted” and “on device” and “hashed” are not magic words that magically grant privacy. You can’t say “nothing is learned about the content on the device” if you can take the vouchers it sends you and decrypt them–even if you are “sure” they are CSAM. That’s just incorrect.
Being better “compared to the industry standard way” does not mean the technology is automatically “private”. And when you say you’re better than the industry standard from the perspective of being auditable, don’t be in a place where you can’t verify you are doing any better.
You may be wondering why Apple includes this manual step of reviewing images before they are reported; the answer is U.S. v Ackerman. In this case, it was found that NCMEC is effectively a government actor due to the power that Congress has granted them. As a result, if NCMEC reviews a file, it is considered a 4th Amendment search; however, if Apple views the file and informs NCMEC of the content (conducting a private search that isn’t covered by the 4th Amendment), then NCMEC is free to view the file to confirm the accuracy of the report.
By manually reviewing the content prior to reporting, the search isn’t considered to be a violation of constitutional rights in the U.S., and thus can be used as evidence in court.
[…]
Based on how the system is designed, there doesn’t appear to be any need for the full image to be uploaded, only the Safety Voucher. Based on this design choice, it’s logical to conclude that the intention is to move beyond just iCloud into other areas.
[…]
Scanning images uploaded to iCloud for known CSAM is unlikely to have a notable impact. In a memo (discussed further below) to Apple employees, Marita Rodriguez, the Executive Director of Strategic Partnerships at NCMEC, said, “…I hope you take solace in knowing that because of you many thousands of sexually exploited victimized children will be rescued…” — which sounds great, but is entirely unrealistic. This scanning system only looks for known CSAM that has been reported and added to the hash database; it targets those collecting and trading CSAM, not those producing new CSAM. While putting the criminals that traffic in this awful material in prison is a laudable goal, the impact is unlikely to resemble the goals NCMEC has expressed.
[…]
The fact that NCMEC hasn’t issued an apology and clarification is telling; they are doing little to work with privacy advocates to find solutions that meet these complex challenges, and instead attack and demean.
One cannot reconcile these two things: 1) Apple rolling out an automated, warrantless, opt-out surveillance tool to all US iCloud customers — and 2) iPhone owners around the world having arbitrary data pushed to their devices by powerful nation-state adversaries who want them ruined.
The Pegasus story does not have a bookend. As it stands, it is very reasonable to assume that a hacker could push arbitrary data to your phone, including pictures. We have proof (and acknowledgement from Apple) that this is still happening. Because of the broken security of Apple devices, it is irresponsible to be rolling out an automated surveillance system, and frankly – exceedingly arrogant.
[…]
Apple’s CEO Tim Cook said at a Fortune event in 2017, when asked about its compliance with China’s censorship and problematic laws: “Each country in the world decides their laws and their regulations. And so your choice is: Do you participate, or do you stand on the sideline and yell at how things should be? You get in the arena, because nothing ever changes from the sideline.” Apple has been “in the arena” for well over a decade now, time for a scorecard.
But just because Apple has done its due diligence and made some careful choices in order to implement a tool to stop the spread of heinous material doesn’t mean that it’s off the hook. By making our phones run an algorithm that isn’t meant to serve us, but surveils us, it has crossed a line. Perhaps it was inevitable that the line would be crossed. Perhaps it’s inevitable that technology is leading us to a world where everything we say, do and see is being scanned by a machine-learning algorithm that will be as benevolent or malevolent as the society that implemented it.
Even if Apple’s heart is in the right place, my confidence that its philosophy will be able to withstand the future desires of law enforcement agencies and authoritarian governments is not as high as I want it to be. We can all be against CSAM and admire the clever way Apple has tried to balance these two conflicting needs, while still being worried about what it means for the future.
EFF (via Hacker News):
For example, the Five Eyes—an alliance of the intelligence services of Canada, New Zealand, Australia, the United Kingdom, and the United States—warned in 2018 that they will “pursue technological, enforcement, legislative or other measures to achieve lawful access solutions” if the companies didn’t voluntarily provide access to encrypted messages. More recently, the Five Eyes have pivoted from terrorism to the prevention of CSAM as the justification, but the demand for unencrypted access remains the same, and the Five Eyes are unlikely to be satisfied without changes to assist terrorism and criminal investigations too.
[…]
All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, the adoption of the iPhoto hash matching to iMessage, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts. Apple has a fully built system just waiting for external pressure to make the necessary changes.
You wouldn’t think a US company could be forced to scan all of its customers’ data, but Yahoo was. Don’t make the same mistake, Apple.
Been there, didn’t do that, got the t-shirt.
Here’s an op-ed @alexstamos and I co-authored about the risks of Apple’s content scanning plan. It’s short and easy to read, and I’m hoping it makes the issues digestible to non-technical people.
[…]
My personal proposal to Apple is to limit this tech to photo sharing rather than whole libraries, and release their hash function design. And ideally wait until researchers have time to vet it before launching to 1bn users.
There’s a crucial difference between possessing photos and sharing photos. The former is expected to be private, the latter not. This is why iCloud and Facebook are not comparable.
This issue is nuanced and Apple’s decisions involve concessions. Personally, I think Apple have done well here. They probably could have handled the communication surrounding the announcement better, but the actual functionality and policy decisions are reasonable.
[…]
You have to assume that privacy issues are a key reason why Apple has historically been so lax in this department. It’s not that Apple has sympathy for the people spreading child pornography. Why right now? That is still unclear. Perhaps, behind closed doors, someone was threatening lawsuits or similar action if Apple didn’t step up to par soon. Either way, it’s crunch time.
[…]
The weakest link in the chain on the technical side of this infrastructure is the opaqueness of the hashed content database. By design, Apple doesn’t know what the hashes represent, as Apple is not allowed to knowingly traffic in illicit child abuse material. Effectively, the system works on third-party trust. Apple has to trust that the database provided by NCMEC — or whatever partner Apple works with in the future when this feature rolls out internationally — includes only hashes of known CSAM content.
All the conversations the community has been having are mirrored inside Apple; I think it’s an understandable worry that Apple is prepared to sell out all of its users despite knowing — and informing them — predators can avoid the system by turning off iCloud Photos. No wins here
Joseph Menn and Julia Love (Hacker News, MacRumors):
A backlash over Apple’s move to scan U.S. customer phones and computers for child sex abuse images has grown to include employees speaking out internally, a notable turn in a company famed for its secretive culture, as well as provoking intensified protests from leading technology policy groups.
Apple’s senior vice president of software engineering, Craig Federighi, has today defended the company’s controversial planned child safety features in a significant interview with The Wall Street Journal, revealing a number of new details about the safeguards built into Apple’s system for scanning users’ photos libraries for Child Sexual Abuse Material (CSAM).
I see the Apple PR line on photo scanning is that you don’t understand what’s going on. Your tiny brain cannot comprehend the splendor of this technology.
Apple Inc. has warned retail and online sales staff to be ready to field questions from consumers about the company’s upcoming features for limiting the spread of child pornography.
In a memo to employees this week, the company asked staff to review a frequently asked questions document about the new safeguards, which are meant to detect sexually explicit images of children. The tech giant also said it will address privacy concerns by having an independent auditor review the system.
Apple today shared a document that provides a more detailed overview of the child safety features that it first announced last week, including design principles, security and privacy requirements, and threat model considerations.
[…]
The document aims to address these concerns and reiterates some details that surfaced earlier in an interview with Apple’s software engineering chief Craig Federighi, including that Apple expects to set an initial match threshold of 30 known CSAM images before an iCloud account is flagged for manual review by the company.
[…]
Apple also said that the on-device database of known CSAM images contains only entries that were independently submitted by two or more child safety organizations operating in separate sovereign jurisdictions and not under the control of the same government.
[…]
Apple added that it will publish a support document on its website containing a root hash of the encrypted CSAM hash database included with each version of every Apple operating system that supports the feature.
This bit about multiple organizations is interesting, but it raises additional questions. Apple previously said that the feature will start out as US-only. So they’re only going to report images to NCMEC and only images that are in the intersection of NCMEC’s database and some other foreign database? That would seem to drastically reduce the chances of finding legitimate matches, unless the organizations are all working together to exchange data, which of course raises more questions. And, if you’re in the US, does that mean Apple could be reporting images to NCMEC that are not even in the US database, but rather in two separate foreign ones?
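The root hash mentioned in the document would let outsiders check that every device ships the identical database: hash the entries pairwise up a Merkle tree and compare the single root value against the one Apple publishes. A minimal sketch of that general technique (Apple has not published its exact construction, and the entries here are hypothetical placeholders):

```python
import hashlib

def merkle_root(leaves):
    # Hash each entry, then repeatedly hash adjacent pairs until one
    # digest remains, duplicating the last node on odd-length levels.
    if not leaves:
        return hashlib.sha256(b"").hexdigest()
    level = [hashlib.sha256(leaf).digest() for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [hashlib.sha256(level[i] + level[i + 1]).digest()
                 for i in range(0, len(level), 2)]
    return level[0].hex()

# Hypothetical database entries (real ones would be perceptual hashes).
db = [b"entry-1", b"entry-2", b"entry-3"]
root = merkle_root(db)

# Any change to any entry changes the published root.
assert merkle_root([b"entry-1", b"entry-2", b"tampered"]) != root
```

Note what this does and does not prove: it verifies that all users received the same list, but it cannot tell anyone what the hashes represent — the auditability gap several of the quotes above point out.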
See also:
Previously:
Update (2021-08-18): Joseph Menn and Stephen Nellis:
Apple Inc said on Friday that it will hunt only for pictures that have been flagged by clearinghouses in multiple countries.
That shift and others intended to reassure privacy advocates were detailed to reporters in an unprecedented fourth background briefing since the initial announcement eight days prior of a plan to monitor customer devices.
I’m glad that Apple is feeling the heat and changing their policy. But this illustrates something important: in building this system, the only limiting principle is how much heat Apple can tolerate before it changes its policies.
I’m not sure this is actually a shift, as it was hinted at in the original documents Apple released.
Wait, so Apple wanted to ensure that the Child Sexual Abuse Material (CSAM) tech is understood to be totally separate from the iMessage photo scanning feature, and yet they’re calling it “Communication, Safety, And Messages”? 🥴👏
This whole thing would not be happening if Katie Cotton were still in charge of corporate communications.
I’m not even kidding here: Apple screwed up the messaging on this so completely that I wonder if a certain key person or two is on an extended vacation or personal leave and wasn’t around to oversee this.
It is also striking how difficult it is for even a media-trained executive to clearly articulate these features. In Stern’s interview, there are several moments when she has to pause the interview to explain, in layperson terms, what is happening with the assistance of some effective graphics. I appreciate Stern’s clarifications and I understand them to be accurate, but I wish those words came from Apple’s own representative without needing interpretation. I think Apple’s representatives are still using too much jargon.
Issues with the scope of things that CAN be done with some power cannot be resolved by voluntary choices made by the holder of that power. So long as they hold that power, they can revise their choices at any time, and they can be compelled to do things at any time.
Eva:
I’d like to take this moment to make it clear to poor Craig that no, I don’t misunderstand Apple’s plans to check photos in iCloud against NCMEC’s database of CSAM. It’s well-meaning but it’s also creating a mechanism that Apple will be forced to use for other things.
The company who makes my CPU, RAM, and hard drive don’t have any right or privilege to see my information, nor does the company who provides the locks on the door of my home. A smartphone is no different. This is not a radical position.
While I agree that this is a major privacy issue and that alone should be sufficient to call for a halt to this, I am surprised I’m not hearing more property & property rights arguments: this is Apple assigning work to user owned devices for jobs which do not benefit the user.
Apple truly screwed this up in a way that is almost beyond comprehension. All their effort on establishing an image of respecting privacy out the window.
I seriously hope they reconsider the entire thing, but knowing Apple there’s no chance at all.
Eva:
In their new FAQ, Apple says they will refuse govt requests to use their CSAM-scanning tech to scan for other forms of content. How exactly will they refuse? Will they fight it in court? Will they pull out of the country entirely? This is not a time to get vague.
Central to its case is for us to trust Apple not to use this same mechanism for other purposes. When we can’t even trust Apple to tell us what it has changed on our own Macs, we should be rightly suspicious. If it is to work at all, trust must work both ways: if Apple wants our trust, it has to trust us with the knowledge of what’s in a macOS update.
All I want to do here is convey what I think is a strong case against co-opting personal devices for law enforcement purposes, so that people who have done nothing wrong and don’t have anything to hide can see where we’re coming from when as a tech community we push back on these things.
[…]
Apple will certainly comply rather than withdraw from the markets, as they have done so far in China. It is likely that no more powerful tool for surveillance authoritarianism has ever been conceived by humans.
Member of the German parliament Manuel Höferlin, who serves as the chairman of the Digital Agenda committee in Germany, has penned a letter to Apple CEO Tim Cook, pleading with Apple to abandon its plan to scan iPhone users' photo libraries for CSAM (child sexual abuse material) images later this year.
Sign the petition and email Apple leadership to tell them to drop these plans and recommit to never opening any sort of backdoor to monitor our communications.
Update (2021-08-21): Malcolm Owen (via Kosta Eleftheriou, MacRumors):
“It’s the reality. If you put back doors in a system, anybody can use a back door. And so you have to make sure the system itself is robust and durable; otherwise you can see what happens in the security world,” said Cook.
Update (2021-09-08): Ben Lovejoy (Hacker News):
Apple confirmed to me that it has been scanning outgoing and incoming iCloud Mail for CSAM attachments since 2019. Email is not encrypted, so scanning attachments as mail passes through Apple servers would be a trivial task.
Apple also indicated that it was doing some limited scanning of other data, but would not tell me what that was, except to suggest that it was on a tiny scale. It did tell me that the “other data” does not include iCloud backups.
Christina Warren returns to the show to discuss Apple’s controversial child safety initiatives, the tumultuous summer of Safari 15 beta UI designs, and a bit more on MagSafe battery packs.
Gordon Kelly (via Hacker News):
iPhone users have put up with a lot in recent months but the company’s new CSAM detection system has proved to be a lightning rod of controversy that stands out from all the rest. And if you were thinking of quitting your iPhone over it, a shocking new report might just push you over the edge.
John Koetsier (via Hacker News):
Apple fraud executive Eric Friedman told colleague Herve Sibert that Apple is the greatest platform for distributing child pornography. The comment sheds light on why Apple is now pursuing a controversial program and automating checks for child porn on customers’ phones and in their messages.
In preceding messages, Friedman writes about a presentation the two managers have been working on to be shown to Eddy Cue later that morning. Friedman shows a slide describing features within iOS that have revealed fraud and safety issues. The two relevant concerns are reports of child grooming in social features — like iMessages and in-app chat — and in App Store reviews, of all places. Subsequent messages indicate that this is partly what Friedman was referring to.
Edward Snowden (via Hacker News):
You might have noticed that I haven’t mentioned which problem it is that Apple is purporting to solve. Why? Because it doesn’t matter.
Having read thousands upon thousands of remarks on this growing scandal, it has become clear to me that many understand it doesn’t matter, but few if any have been willing to actually say it. Speaking candidly, if that’s still allowed, that’s the way it always goes when someone of institutional significance launches a campaign to defend an indefensible intrusion into our private spaces. They make a mad dash to the supposed high ground, from which they speak in low, solemn tones about their moral mission before fervently invoking the dread spectre of the Four Horsemen of the Infopocalypse, warning that only a dubious amulet—or suspicious software update—can save us from the most threatening members of our species.
[…]
Apple’s new system, regardless of how anyone tries to justify it, will permanently redefine what belongs to you, and what belongs to them.
[…]
I can’t think of any other company that has so proudly, and so publicly, distributed spyware to its own devices—and I can’t think of a threat more dangerous to a product’s security than the mischief of its own maker. There is no fundamental technological limit to how far the precedent Apple is establishing can be pushed, meaning the only restraint is Apple’s all-too-flexible company policy, something governments understand all too well.
Previously:
The App Store feature on the Australian App Store, first highlighted by Beau Nouvelle on Twitter, is called “Slime relaxations” and reportedly features apps that are non-functional and seek to charge disproportionately costly in-app purchase subscriptions.
If Apple is going to promote an app like this, why would we trust them when they say the App Store leads to a safer software environment for users?
Assuming that Apple employees didn’t select these apps on purpose, they must have slipped through App Review and then been featured because they were selling well. Either no human review was required for featuring, or that second round of curation also failed.
I’ve recently written about legitimate developers being blocked from development and having their accounts terminated, while the privacy labels that Apple touts as a benefit of the store have been shown to be a facade. We’re 13 years in, and it seems like the gap between the promise of the App Store and the reality is only increasing.
Of course the real victim here is the user who is being scammed by these apps. I’m just saying that’s not all there is to it.
I’ve seen indie developers putting blood and sweat into their apps hoping to get featured. Seeing scam apps getting featured instead is frustrating.
Previously:
Update (2021-08-06): Jeff Johnson:
One problem with App Store is that consumers trust it too much. They’re told Apple curates the store, but that’s just false.
For example, #23 (iBuy from Amazon) and #29 (Open With) in the Mac App Store’s top paid chart have abysmal ratings and reviews. Do consumers even look before buying?
We developers can complain about review fraud in the App Store, but “i should’ve read the other reviews” is telling. A lot of consumers are not even paying attention to the reviews. They fail to exercise “caveat emptor” because Apple has told them App Store is safe.
Make no mistake, Apple is promoting these apps. #4 and #5 in the Utilities category. The top charts are visibility.
The Store page, launched yesterday, seems to fix the clunkiness of that previous redesign. You can tell it is important because it is the first menu item after the Apple logo. It also, unfortunately, makes liberal use of horizontal scrolling. It feels like a page that was laid out before the designer knew what would be in it. At least it again exists.
I like having a separate store page, but I wish it worked more like a Web page. And I wish Apple’s store apps didn’t rely on horizontal scrolling, either. I like lists and column views, not grids of rounded rectangles.
Previously:
Update (2021-08-09): Ken Segall:
Imagine if the physical Apple Stores replicated the “improved” online buying experience.
Every Apple product would be in a separate room with its own private entrance. Visitors to each room would be effectively shielded from what Apple has spent decades building—a rich ecosystem of products and services all designed to work together.
This “streamlining” of the online buying experience was a classic case of overthink. Thankfully, that’s all behind us now.
[…]
Why did adding Buy buttons on product pages require blowing up the entire online Apple Store in the first place?
Why did this voyage to the Land of the Blatantly Obvious take six long years?
[…]
Not “Apple Store”? Not “ Store”? Just ………… Store?
Update (2021-08-10): See also: Upgrade.
Jon Brodkin (MacRumors, Bruce Schneier):
Zoom has agreed to pay $85 million to settle claims that it lied about offering end-to-end encryption and gave user data to Facebook and Google without the consent of users. The settlement between Zoom and the filers of a class-action lawsuit also covers security problems that led to rampant “Zoombombings.”
The proposed settlement would generally give Zoom users $15 or $25 each and was filed Saturday at US District Court for the Northern District of California. It came nine months after Zoom agreed to security improvements and a “prohibition on privacy and security misrepresentations” in a settlement with the Federal Trade Commission, but the FTC settlement didn’t include compensation for users.
Previously:
Arkadiy Tetelman (via Hacker News):
As part of the investigation, Amnesty International wrote a blog post with their forensic analysis of several compromised phones, as well as an open source tool, Mobile Verification Toolkit, for scanning your mobile device for these indicators. MVT supports both iOS and Android, and in this blog post we’ll install and run the scanner against my iOS device.
After studying the Mobile Verification Toolkit’s Python code, my colleagues and I quickly realized how uniquely positioned we were to facilitate the process even further. iMazing is built on a toolkit which was developed and refined over a decade for the purpose of simplifying iOS backups, file transfers and local device management tasks. It would therefore be possible to relatively quickly re-implement MVT’s methodology in our toolkit, and integrate a user-friendly ‘wizard’ in iMazing’s user interface. And because iMazing can already perform iOS backups and decrypt backup files, the tool we envisaged had the potential to dramatically reduce the technical barrier of entry whilst enhancing performance and promoting backup encryption.
At the same time, we started getting Pegasus-related requests from current iMazing users, and noticed increasing interest in MVT from a public not always tech-savvy enough to successfully run its command-line tools. We took the plunge, shifting most of our Windows and macOS development resources to the realisation of a fully integrated equivalent in iMazing. Today, we are releasing the result of that work as a free feature in iMazing 2.14. No setup or prior backup is required – all it takes to get started is to launch iMazing, connect an iPhone and select the Detect Spyware action[…]
Previously:
As of macOS Big Sur, instead of shipping the system libraries with macOS, Apple ships a generated cache of all built in dynamic libraries and excludes the originals. This tool allows you to extract these libraries from the cache for reverse engineering.
[…]
This tool loads the private dsc_extractor.bundle from Xcode, meaning it should always be able to extract the newest versions of the file for beta OS versions. This logic is based on the function at the bottom of dyld3/shared-cache/dsc_extractor.cpp from the dyld source dump.
Previously:
There are two kinds of attributes in Swift—those that apply to declarations and those that apply to types. An attribute provides additional information about the declaration or type. For example, the discardableResult attribute on a function declaration indicates that, although the function returns a value, the compiler shouldn’t generate a warning if the return value is unused.
Underscored Attributes Reference (via Slava Pestov):
This document is intended to serve as a counterpart describing underscored attributes, whose semantics are subject to change and most likely need to go through the Swift evolution process before being stabilized.
There are also @inline(__always) and @inline(never), which are not documented above, but which are discussed here and here.
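As a minimal illustration of the documented @discardableResult behavior (the log function is a made-up example, not from the reference):

```swift
// @discardableResult suppresses the "result of call is unused"
// warning when the caller ignores the return value.
@discardableResult
func log(_ message: String) -> Int {
    print(message)
    return message.count
}

log("hello")            // no warning, even though the Int is discarded
let length = log("hi")  // the result can still be captured normally
print(length)
```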
Previously:
guard captures behave like weak captures (e.g. guard captures do not retain the captured value), but the closure body is only executed if the captured objects still exist.

[…]

As of SE-0269, strong and unowned captures of self enable implicit self calls within the body of escaping closures. This is not straightforward to support for weak closures in the general case, and was intentionally excluded from SE-0269.

[…]

guard let value = value else { return } is quite a bit of boilerplate in this context.
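For context, the weak-capture pattern the pitch wants to streamline looks like this minimal sketch (Downloader and makeHandler are hypothetical names, not from the pitch):

```swift
final class Downloader {
    var completedCount = 0

    func makeHandler() -> () -> Void {
        // Today's pattern: a weak capture plus the guard boilerplate.
        // A guard capture would instead skip the closure body entirely
        // once self has been deallocated.
        return { [weak self] in
            guard let self = self else { return }
            self.completedCount += 1
        }
    }
}

let downloader = Downloader()
let handler = downloader.makeHandler()
handler()
print(downloader.completedCount)  // 1; after deallocation, the guard would bail out
```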
Previously:
Griffin Jones (via John Gruber):
The Apple Desktop Bus Keyboard is the first to include a power button, the Snow White design, and the ADB port, three welcome additions. The lower key travel makes sustained typing a little easier. The mechanism has a very cleanly defined click, although it feels more brittle than premium. I rate it 4⁄5 stars.
This was a terrific keyboard that I first used with an Apple IIGS. It arguably feels better than the Apple Extended Keyboard II, but it’s missing the page navigation keys and function keys, and it has the arrow keys arranged in a line. So I ended up using the latter with my Macs, even into the USB era.
The AppleDesign Keyboard is a cheap cost-cutting imitation of the Extended Keyboard. It doesn’t even have an embedded Apple logo, just its silhouette punched into the mold of plastic. The symbolism that Apple was only a shadow of its former self in the mid-90s could not be any clearer. I rate it 2⁄5 stars.
This one just felt bad. The keys sprung up slowly like it was sticky inside, and they didn’t click enough when pressed down.
From 2007 to 2016, this [Aluminum Keyboard] keyboard design reigned supreme across all Macs. The flat black keycaps are more attractive and higher contrast, for sure, but at the expense of usability. I rate it 4⁄5 stars.
This is what I’ve been using since encountering Bluetooth flakiness with the wireless version and macOS 10.12, along with missed keystrokes when logging in even on later releases. (These problems seem to affect all Bluetooth keyboards, not just Apple’s.) I’m not sure why he says it has black keycaps. I still like this keyboard. The only flaw has been that the letters completely wear off.
For notebook keyboards, I still think the generation before the butterfly (e.g. on the 2012 Retina MacBook Pro) was better than the post-butterfly scissor design (e.g. on the 2019 MacBook Pro).
Previously:
When you track a friend or a family member using the Find My app, it now shows continuous streaming updates on their location rather than updating with a new location every few minutes.
[…]
Devices that have been turned off can still be tracked by the Find My network in iOS 15.
[…]
If someone steals your iPhone and then erases it, in iOS 15, it’s still going to show up in the Find My app, and it will be trackable even after it’s been wiped.
[…]
With Separation Alerts, the Find My app can let you know if an iPhone or iPad is left behind by alerting you on one of the other devices with you.
[…]
AirPods have always shown up in the Find My app, but until now, functionality has been limited.
[…]
There’s now a Find My widget that you can add to the Home screen or the Today View to track items at a glance without having to open up the Find My app.
I’ve missed having the widget on my Mac since it was removed a few versions ago.
Previously:
Once we had a list of the flaky tests, we tried to go through each one and determine why they were failing. We found that some UI elements such as menus and popovers were particularly prone to flakiness — they would sometimes be dismissed by the system for no discernable reason!
[…]
Since we already had the JUnit parsing code, we decided to build on top of that and rerun only the failed tests. By using the xcodebuild command’s -only-testing flag, we ran only the failed tests again. Another optimization we made was to build the project only once, even when testing multiple times. We accomplished that by using the xcodebuild build-for-testing and xcodebuild test-without-building commands.

[…]
Flaky tests still exist, but they no longer slow down the workflow of our developers. CI automatically retries any failing tests, and almost all flaky tests pass when run again. If a test actually fails three times in a row, only then is it considered an actual failure and the build is marked as failed.
Xcode 13 has a built-in option to do this. But why are these tests flaky?
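The retry policy described above—rerun failures, treat three consecutive failures as real—can be sketched in shell. The flaky_test function here is an invented stand-in for an xcodebuild test-without-building invocation, not the actual CI code:

```shell
#!/bin/bash
# Sketch of a CI retry loop. flaky_test simulates a flaky suite that
# fails its first two runs and passes on the third (state kept in a file).

STATE=$(mktemp)

flaky_test() {
  runs=$(cat "$STATE" 2>/dev/null || echo 0)
  runs=$((runs + 1))
  echo "$runs" > "$STATE"
  [ "$runs" -ge 3 ]   # stand-in for: xcodebuild test-without-building ...
}

run_with_retries() {
  for attempt in 1 2 3; do
    if flaky_test; then
      echo "passed on attempt $attempt"
      return 0
    fi
  done
  echo "failed 3 times in a row: real failure"
  return 1
}

run_with_retries
```

A real setup would pass the names of the failed tests to -only-testing on each retry instead of rerunning a fixed command.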
Ian Carlos Campbell (via Hacker News):
Amazon’s Kindle e-readers with built-in 3G will begin to lose the ability to connect to the internet on their own in the US in December, according to an email sent to customers on Wednesday. The change is due to mobile carriers transitioning from older 2G and 3G networking technology to newer 4G and 5G networks. For older Kindles without Wi-Fi, this change could mean not connecting to the internet at all.
Jim Salter (via Hacker News):
The MuseScore app itself is licensed GPLv3, which gives developers the right to fork its source and modify it. One such developer, Wenzheng Tang (“Xmader” on GitHub) went considerably further than modifying the app—he also created separate apps designed to bypass MuseScore Pro subscription fees.
[…]
It’s important to note that the application itself and the sheet music to which it provides access are not the same thing, and they are not provided under the same license. The application itself is GPLv3, but the musical works it enables access to via musescore.com have a wide variety of licenses, including public domain, Creative Commons, and fully commercial.
In the case of commercial all-rights-reserved scores, Muse Group is not generally the rightsholder for the copyrighted work—Muse Group is an intermediary that has secured the rights to distribute that work via the MuseScore app.
[…]
Bypassing those controls leaves Muse Group on the hook either for costs it has no way to monetize (e.g., by ads for free users) or for violating its own distribution agreements with rightsholders (by failing to properly track downloads).
[…]
[While] musescore-downloader facilitates unlicensed downloads of DMCA-protected works, it does not itself contain those works, which means GitHub itself can ignore DMCA takedown requests.
Previously:
In my defense, you really cannot tell normal packages from distribution packages in the default configuration of Suspicious Package, but if I had bothered to read the manual and/or explore the Preferences window, I would have found this option[…]
This will show the Distribution xml file at the top of the list of the ‘All Scripts’ pane for distribution packages. When you see no Distribution file there, the package is a component package.

The second checkmark in that preference window is also very useful. With “Component package and bundle info” enabled you can see which component contains the selected file in the info pane[…]
Previously:
Despite activating Apple’s App Tracking Transparency feature (launched in 2021 with iOS 14.5), along with our review explicitly asking Yelp to “Do Not Track”, the app still attempted to reach out to multiple known third-party trackers. From our experiments, we found that Apple’s App Tracking Transparency neither stops tracking, nor provides any real transparency, and instead gives users a false sense of privacy.
Previously: