Uber Used Private API to Access iPhone Serial Number
Mike Isaac (Hacker News, MacRumors):
For months, Mr. Kalanick had pulled a fast one on Apple by directing his employees to help camouflage the ride-hailing app from Apple’s engineers. The reason? So Apple would not find out that Uber had been secretly identifying and tagging iPhones even after its app had been deleted and the devices erased — a fraud detection maneuver that violated Apple’s privacy guidelines.
But Apple was onto the deception, and when Mr. Kalanick arrived at the midafternoon meeting sporting his favorite pair of bright red sneakers and hot-pink socks, Mr. Cook was prepared. “So, I’ve heard you’ve been breaking some of our rules,” Mr. Cook said in his calm, Southern tone. Stop the trickery, Mr. Cook then demanded, or Uber’s app would be kicked out of Apple’s App Store.
Why did they do this?
At the time, Uber was dealing with widespread account fraud in places like China, where tricksters bought stolen iPhones that were erased and resold. Some Uber drivers there would then create dozens of fake email addresses to sign up for new Uber rider accounts attached to each phone, and request rides from those phones, which they would then accept. Since Uber was handing out incentives to drivers to take more rides, the drivers could earn more money this way.
[…]
Mr. Kalanick told his engineers to “geofence” Apple’s headquarters in Cupertino, Calif., a way to digitally identify people reviewing Uber’s software in a specific location. Uber would then obfuscate its code for people within that geofenced area, essentially drawing a digital lasso around those it wanted to keep in the dark. Apple employees at its headquarters were unable to see Uber’s fingerprinting.
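The geofencing described is technically trivial. Here is a minimal sketch of what such a check could look like with CoreLocation — not Uber’s actual code; the coordinates and radius are assumptions for illustration:

```swift
import CoreLocation

// Rough sketch of a geofence check like the one described. The coordinates
// (approximately 1 Infinite Loop) and the radius are assumptions.
let appleCampus = CLLocation(latitude: 37.3318, longitude: -122.0312)
let geofenceRadius: CLLocationDistance = 10_000 // meters

func isNearAppleCampus(_ current: CLLocation) -> Bool {
    return current.distance(from: appleCampus) < geofenceRadius
}

// The app could then branch on this result, e.g. skipping or disguising
// the fingerprinting code path when the check returns true.
```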
Some thoughts:
If Uber was just transmitting the device serial number (obtained via private API) to their servers, and only using it for this type of fraud prevention, that doesn’t seem very harmful. It’s a far cry from what people were initially speculating about—personal location tracking and messing up the phone’s software in a way that persisted after it was erased. Some sloppy wording in the original version of Isaac’s article has since been revised, so it no longer sounds like Uber hacked anything or that iOS has a major security flaw.
I can sympathize with Uber here. They had a business problem with a simple technical solution that was unavailable due to Apple’s rules. What they wanted to do was not really against the spirit of the rules. That is a frustrating situation to be in.
But this wasn’t an innocent mistake. Uber knew it was violating the rules—IOKit is definitely private API—and went to great lengths to try to prevent Apple from figuring out what they were doing.
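For context, this is roughly the kind of IOKit lookup at issue. The sketch below is the standard way to read the platform serial number on macOS, where IOKit is public; on iOS the same framework is private, which is exactly the violation:

```swift
import IOKit

// Read the platform serial number from the IORegistry.
// Public API on macOS; private (and therefore forbidden) on iOS.
func platformSerialNumber() -> String? {
    let service = IOServiceGetMatchingService(kIOMasterPortDefault,
                                              IOServiceMatching("IOPlatformExpertDevice"))
    guard service != 0 else { return nil }
    defer { IOObjectRelease(service) }
    let value = IORegistryEntryCreateCFProperty(service,
                                                "IOPlatformSerialNumber" as CFString,
                                                kCFAllocatorDefault, 0)
    return value?.takeRetainedValue() as? String
}
```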
Despite the blatant violation and deceit, there was no punishment (that we know of). Uber’s app remained in the App Store the entire time, even though the guidelines say that “If you attempt to cheat the system[…] for example, by trying to trick the review process[…] your apps will be removed from the store and you will be expelled from the Developer Program.” There’s little incentive for any (big) developer to follow the rules if the worst-case scenario is that you temporarily gain a competitive advantage until caught, and then nothing further happens.
App Review was not able to detect the private API usage, even though the IOKit calls seem to be directly visible in the compiled code.
I wonder how well Uber is currently addressing the fraud problem, without access to the serial numbers.
It seems like there could be multiple legitimate reasons to want to know whether an app has previously been installed on a given device. Apple likes to talk about how privacy is not at odds with features. Perhaps that could be the case here, too, if iOS had an API to answer this question without the app needing to access or store a device identifier.
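Purely as a sketch of the idea — the name and shape here are invented, nothing like this existed at the time — the OS, rather than the app, could remember whether the app had ever been on the device and hand back only a yes/no answer, so no identifier would ever reach the developer:

```swift
// Hypothetical API shape, invented for illustration. The point is that the
// system answers the question itself and the app never sees a device identifier.
protocol PriorInstallChecking {
    /// Calls back with true if this app has previously been installed
    /// on this device (tracked by the OS, keyed off the bundle ID).
    func hasBeenInstalledBefore(completion: @escaping (Bool) -> Void)
}
```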
Apple didn’t say anything publicly, so even though this happened in 2015 (with the offending behavior likely in 2014), customers didn’t find out about it until now.
There definitely seem to be different rules for different developers. Smaller developers get their apps pulled from the store, with an automated e-mail giving a day or week’s notice (or even none). Without proof of wrongdoing, they get slimed in the press and banned from the store even if they write a blog post absolving Apple. But if you’re Uber, you get a one-on-one meeting with Tim Cook, your app stays in the store, and your customers are kept in the dark. Uber probably needs iPhone users more than Apple needs Uber, but this may not be universally true. What would happen if Facebook, which has also been accused of various hijinks, got into a dispute like this with Apple?
That said, it’s got to be a tough situation for Apple to be in. They’re trying to protect their customers, but denying those customers access to an important transportation service would harm them far more than what Uber did. And what if this were an app that provided an essential medical function? The store is full of apps that flout the rules, but I don’t think Apple could ignore the geofencing. It looks like Apple tried to thread the needle by getting Uber to comply with the rules while otherwise being lenient.
I, too, wonder what Uber’s Android app does.
Update (2017-04-30): See also: Accidental Tech Podcast.