Covert Web-to-App Tracking via Localhost on Android
Local Mess (via Dan Goodin):
We disclose a novel tracking method by Meta and Yandex potentially affecting billions of Android users. We found that native Android apps—including Facebook, Instagram, and several Yandex apps including Maps and Browser—silently listen on fixed local ports for tracking purposes.
These native Android apps receive browsers’ metadata, cookies and commands from the Meta Pixel and Yandex Metrica scripts embedded on thousands of web sites. These JavaScripts load on users’ mobile browsers and silently connect with native apps running on the same device through localhost sockets. As native apps programmatically access device identifiers like the Android Advertising ID (AAID) or handle user identities as in the case of Meta apps, this method effectively allows these organizations to link mobile browsing sessions and web cookies to user identities, hence de-anonymizing users visiting sites that embed their scripts.
This web-to-app ID sharing method bypasses typical privacy protections such as clearing cookies, Incognito Mode and Android’s permission controls. Worse, it opens the door for potentially malicious apps eavesdropping on users’ web activity.
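The core mechanism the researchers describe is simple: a native app binds a plain TCP socket on a fixed localhost port, and in-page JavaScript connects to it to hand over identifiers. A minimal sketch of that rendezvous, with both sides played by Python for illustration (the port number and cookie value are made up; the real sender is a tracking script running in the mobile browser):

```python
import socket
import threading

# Hypothetical fixed port; the real apps listened on several specific ports.
PORT = 12387
ready = threading.Event()
received = []

def native_app_listener():
    """Plays the role of the native app: listen on localhost, record payloads."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("127.0.0.1", PORT))  # binding to localhost needs no special permission
    srv.listen(1)
    ready.set()
    conn, _ = srv.accept()
    received.append(conn.recv(1024).decode())
    conn.close()
    srv.close()

def browser_script_send(payload: str) -> None:
    """Plays the role of the in-page tracking script: connect out to localhost."""
    ready.wait()
    cli = socket.create_connection(("127.0.0.1", PORT))
    cli.sendall(payload.encode())
    cli.close()

t = threading.Thread(target=native_app_listener)
t.start()
browser_script_send("_fbp=fb.1.1700000000000.1234567890")  # made-up cookie value
t.join()
print(received[0])  # the "app" now holds the browser's cookie identifier
```

Because both endpoints are on the same device, nothing crosses the network, so clearing cookies or using Incognito Mode in the browser does nothing to the identifier the app has already captured.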
Jorge García Herrero (via Hacker News):
Meta faces simultaneous liability under the following regulations, listed from least to most severe: GDPR, DSA, and DMA (I’m not even including the ePrivacy Directive because it’s laughable).
[…]
The Pixel script in your browser tries to send information to the Facebook/Instagram app that’s “listening” in the background.
It uses a technique called WebRTC, normally used for voice or video calls (like Zoom or Google Meet), but here it’s being used to secretly transmit data between the browser and the app.
Additionally, a technical trick called “SDP Munging” allows the browser to insert data (like the _fbp cookie identifier) into the WebRTC “initial handshake” message.
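The munging step amounts to editing the offer’s SDP text after the browser generates it but before it is applied, so the identifier rides along in the handshake. A rough Python sketch of the string manipulation involved (the sample SDP and the choice of the `ice-ufrag` field as the carrier are illustrative assumptions, not Meta’s exact payload format):

```python
# Sketch of "SDP munging": rewriting a WebRTC offer before it is applied,
# so that extra data rides along in the handshake messages.

SAMPLE_OFFER = """v=0
o=- 4611731400430051336 2 IN IP4 127.0.0.1
s=-
t=0 0
m=application 9 UDP/DTLS/SCTP webrtc-datachannel
a=ice-ufrag:abcd
a=ice-pwd:secretpassword1234567890
"""

def munge_sdp(sdp: str, fbp_cookie: str) -> str:
    """Rewrite the ice-ufrag line so it carries the _fbp identifier instead."""
    lines = []
    for line in sdp.splitlines():
        if line.startswith("a=ice-ufrag:"):
            # Overwrite the browser-chosen value with the tracking identifier.
            line = "a=ice-ufrag:" + fbp_cookie
        lines.append(line)
    return "\n".join(lines) + "\n"

munged = munge_sdp(SAMPLE_OFFER, "fb.1.1700000000000.1234567890")
print(munged)
```

In the browser, the equivalent edit happens between `createOffer()` and `setLocalDescription()`; once the munged description is applied, the identifier ends up in traffic sent toward the listening app on 127.0.0.1.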
What they’ve done here may not have broken any laws, but there certainly should be laws against it. And in terms of simple common sense, the entire elaborate scheme only exists to circumvent features in Android meant to prevent native apps from tracking you while you use your web browser.
The difference between targeted advertising and spyware is there is no difference.
After Girish et al. disclosed this behavior, Meta’s apps stopped tracking users with this method, and Goodin says Yandex will also stop.
I’ll note that among the so-called “interoperability” requirements the European Commission is demanding of iOS is for third-party apps to run, unfettered, in the background, because some of Apple’s own first-party software obviously runs in the background.
I think the problem is the IPC, not the running in the background. The user should have control over whether apps can open up ports for listening and whether Web sites can connect to 127.0.0.1.
Every one of the sites that includes these tracking scripts is complicit to some extent in the theft of hundreds of millions of Android users’ web browsing privacy.
This sort of bullshit is why I use the web instead of native apps from Meta/Facebook/Instagram.
As per norm, Gruber with the most idiotic take.
This is already not possible on macOS by default by way of TCC and sandboxing. But don't let technical facts color your takes, Gruber.
@Léo How does TCC play into this? Sandboxing does have entitlements for being a network client (not that useful since most apps use the network) and server (which should help with this).
Both iOS and macOS display an alert to the user, I assume by TCC, asking if they would like to enable access to local network, which includes creating a server on 127.0.0.1/localhost, in addition to Bonjour discovery, Wi-Fi discovery and other scenarios. One could argue that perhaps this "Local Network" category is way too broad, but it's there. If you disable SIP, I don't think it's enforced, but that's beside the point.
I think sandboxing prevents an app from exposing these ports altogether, but there might be exceptions or Info.plist keys to break through.
@Léo Local Network Privacy is not part of TCC. But I agree with your overall point that Mac/iOS users are overall pretty well protected by default, though there is room for improvement.
Ah interesting, thank you for that link! But yeah, my "TCC" comment was shorthand for the system's privacy machinery in general. Gruber's takes, like "mainstream" Apple media in general, always conveniently forget that sideloading, which already exists on iOS, does not forgo all of Apple's security, which is often implemented at the kernel level. iOS has had "Enterprise Deployment" for decades now, which is pretty much the sideloading dream. At most, you get to call private APIs unchallenged, and even then, since most security happens in the kernel or in XPC daemons (like the TCC check for contact access, for example), the worst damage from private API use is your app crashing after an OS update changes or removes said API. That's on developers; users' security and privacy remain protected. And private API use can be hidden so easily that it is already littered across the App Store by anyone who is not a complete beginner. See, for example, what I currently use to hide API calls very conveniently:
https://x.com/LeoNatan/status/1830377457603493930
https://gist.github.com/LeoNatan/999282bfc53084f6c70e43b08dddc281
It wasn't like that 15 years ago, but I'd say 9-10 years ago, when Apple started taking their security measures into the kernel, access to sensitive data became very difficult, if not impossible, unless Apple forgot to add hooks for specific endpoints.
Also, on iOS, you do not have the concept of a root user, a root daemon, etc., so this simplifies everything by a lot. Even if Apple is "forced" by the "communist" EU to allow background daemons, those would be limited just as much as a sandboxed XPC daemon already is, so the user is still protected, though daemon misbehavior might impact battery life. So just show me this info in the UI and let me decide. You know, like a grown-up.
Of course, there is always the caveat of "there could be a sandbox escape exploit," but I would consider that malware, which Apple is supposed to catch with notarization.
So basically, the user is covered. There is no technical reason not to allow sideloading, which Apple has effectively allowed for decades on iOS via enterprise deployment.
@Léo Yeah, I think they should add sideloading. I think the best argument against it is that there would be no real mechanism to prevent apps asking for entitlements that they don’t actually need. But I’m not sure that App Review does a good job of policing this, and most of the interesting capabilities are gated by TCC or other user switches, anyway.
Which "entitlements" do you mean? Do you mean simply asking for contacts, location, etc. permission, or do you mean actual entitlements embedded in the binary signature to allow certain functionality?
Apple splits entitlements into several categories, some of which require an Apple-signed provisioning profile, or else the binary is rejected by the loader at the kernel level. Other entitlements are validated against team ID, bundle ID, etc., and are easier. Some entitlements require explicit Apple approval before they are added to a specific provisioning profile. Other entitlements are private and are granted by Apple if your business is large enough and/or you know people in Apple. 🤡 (Back in the day, when Cisco was the only "official" VPN provider, we at Check Point used to have a private VPN entitlement and private vpnd plugin headers; the need for these went away once Network Extensions were introduced.)
With enterprise deployment, you basically go through Apple's website, so you have access to the full suite of tools to request entitlements. There is no reason why it couldn't still work that way, but that could also be seen as gatekeeping, and a sufficiently motivated EU could strike down the use of entitlements as illegal too, which would have a significant impact on security but would also allow much broader categories of software to run on iOS. I run my macOS with AMFI disabled so that I am not artificially limited by entitlements. This cannot currently be disabled on iOS.
With regard to simple permissions, apps on the App Store already spuriously ask for many unnecessary permissions, and even block functionality if permissions are not granted, something that used to be disallowed on the Store, but it seems nobody cares anymore. So every silly Apple or pundit rationale either doesn't apply or is already so broken on the current Store that users would see no actual difference.
@Léo I mean the actual ones in the binary signature. There are some that you can just claim yourself and you get them so long as App Review doesn’t reject you. But, yes, the more serious ones need the provisioning profile so there is already another mechanism for gatekeeping. I agree that the “best argument” kind of falls down in practice because Apple isn’t doing its job very well. But they’ll tell the courts that every app is very carefully reviewed, and so far I don’t think they’ve been challenged much on that in front of a judge.
Yeah, not challenged in court, and not challenged by the media either. Can you imagine Stern doing an exposé at the WSJ over scams on the store, or technically shooting down inaccurate claims by Apple? If she did, she wouldn’t get that Federighi interview next year!
With the courts, it’s slightly more understandable, but ugh. I would like to see some unsuspecting Apple exec try to explain how a sideloaded binary would get around the entitlement requirements or escape kernel checks. It’s not even that difficult to explain to a judge.