The Potential Advantages of a JavaScript Whitelist
What I want is two related and similar things:
- The ability to turn off JavaScript by default, and turn it on only for selected sites. (For me that would be sites like GitHub.)
- The ability to turn off cookies by default, and, again, turn them on only for selected sites.
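For the script half of this wish, Safari's existing content-blocker extension format can already express something close to "block all scripts except on chosen sites." The sketch below is a minimal, hypothetical rule list (the `github.com` entry is just an example domain, and note this covers scripts only, not cookies):

```json
[
  {
    "trigger": { "url-filter": ".*", "resource-type": ["script"] },
    "action": { "type": "block" }
  },
  {
    "trigger": {
      "url-filter": ".*",
      "if-domain": ["*github.com"],
      "resource-type": ["script"]
    },
    "action": { "type": "ignore-previous-rules" }
  }
]
```

The first rule blocks every script load; the second exempts pages on whitelisted domains. The catch is that a user has to install and maintain such a blocker themselves, whereas the request here is for this to be a first-class, per-site browser setting.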
If it’s the opposite — if I have to blacklist instead of whitelist — then I’d be constantly blacklisting. And, the first time I go to a site, it gets to run code before I decide to allow it.
When you think about it, it’s pretty nuts that we allow the automatic execution of whatever code a web developer wrote. We don’t do that for anything else, really — certainly not to the same extent of possibly hundreds of webpages visited daily, each carrying a dozen or more scripts.
[…]
It’s baffling to me that trackers, ad networks, cryptocurrency miners, and image lightboxes are all written for the web in the same language and that there is little granularity in how they’re treated. You can either turn all scripts off and lose key functionality on some websites, or you can turn everything on and accept the risk that your CPU will be monopolized in the background.
What if pages were allowed a certain amount of JavaScript CPU time, beyond which they had to request more from the user?
I would also like to see a report of what the JavaScript is doing, i.e. which information it’s reading and which servers it’s contacting. Part of the reason things have gotten so out of hand is that users can’t see what’s happening. I like how the iCab browser would always report whether a page had valid HTML, and how the macOS battery menu shows which apps are using significant energy.
My number one feature request for Safari: a whitelist for JavaScript use, with scripts disabled by default when the whitelist is enabled. Battery life doubled in one feature!
Previously: Intelligent Tracking Prevention 2.2.
Update (2019-05-21): It looks like Chrome already implements what Simmons is suggesting.
3 Comments
One of the fairly terrible results of 'Web is now JavaScript' is that we're now relying on the client (browser) to render literally all of the view, and the original HTML delivered to the client is simply a fairly empty HTML file that calls a craptonne of webpack'd JS. This has huge implications for accessibility, and even SEO.
The NoScript add-on for Firefox supports a whitelist model. The problem is nowadays a lot of JS isn’t served from the website’s domain, but rather from [long-UUID].cloudfront.net or similar. So in practice, tending to the whitelist and making accurate decisions can be very tedious and error-prone.
I wouldn't hold my breath waiting for Apple to add something like that to Safari. Use an open source browser like Firefox and install some good plugins.