Summary
- Google’s proposal, Web Environment Integrity (WEI), aims to send tamper-proof information about a user’s operating system and software to websites.
- The information sent would help reduce ad fraud and enhance security, but it also raises concerns about user autonomy and control over devices.
- The authors argue that implementing WEI could lead to websites blocking access for users not on approved systems and browsers.
- They express worries about companies gaining more control over users’ devices and the potential for abuse.
- The authors emphasize that users should have the final say over what information their devices share.
- Remote attestation tools, like WEI, might have their place in specific contexts but should not be implemented on the open web due to potential negative consequences.
- The authors advocate for preserving user autonomy and the openness of the web, emphasizing that users should be the ultimate decision-makers about their devices.
Joke:
Two pieces of string walk into a bar. The first piece of string asks for a drink. The bartender says, “Get lost. We don’t serve pieces of string.”
The second string ties a knot in his middle and messes up his ends. Then he orders a drink.
The bartender says, “Hey, you aren’t a piece of string, are you?” The piece of string says, “Not me! I’m a frayed knot.”
The problem is that Google has such a monopoly over web browsers that Firefox will most probably have to follow and implement this shit as well.
Smells like the “this website is only compatible with Internet Explorer 7 or higher” kind of stuff. That was bad back then; it will be a lot worse now.
> it will be a lot worse now
On the other hand: A website implementing such a functionality does not want me as a user. That’s fine. I’ll find the information elsewhere or give them useless data from within a VM. Starting and stopping minimalist single-purpose VMs isn’t hard nowadays.
It’s easy for us as we are tech literate, but I mostly think of the average person that “doesn’t care about privacy and personal data”. We’re also not Google’s main demographic. When most websites use this kind of shit, it will be extremely hard for everyone to get away from it.
> but I mostly think of the average person that “doesn’t care about privacy and personal data”
I stopped thinking of them. Yes, those people will have their data stolen by Google, as usual, but they also don’t care one single bit about that.
To be fair, those people are my girlfriend, her parents, mine, my friends and such. When you see the damage a company like Facebook has done to the world, I would definitely try not to continue giving them any more power to fuck shit up. Giving a DRM like tool to Google could be absolutely devastating for the free web and the open internet.
According to this comment, the changes Google is making will tell websites if you’re in a VM or not.
Comment text if there are linking problems:
The idea is that it would be similar to hardware attestation in Android. In fact, that’s where Google got the idea from.
Basically, this is the way it works:
You download a web browser or another program (possibly even one baked into the OS, e.g. working alongside/relying on the TPM stuff from the BIOS). This is the “attester”. Attesters have a private key that they sign things with. This private key is baked into the binary of the attester (so you can’t patch the binary).
A web page sends some data to the attester. Every request the web page sends will vary slightly, so an attestation can only be used for one request - you cannot intercept a “good” attestation and reuse it elsewhere. The ways attesters can respond may vary, so you can’t just extract the signing key and sign your own stuff - it wouldn’t work when you get a different request.
The attester takes that data and verifies that the device is running stuff that corresponds to the specs published by the attester - “this browser, this OS, not a VM, not Wine, is not running this program, no ad blocker, subject to these rate limits,” etc.
If it meets the requirements, the attester uses their private key to sign. (Remember that you can’t patch out the requirements check without changing the private key and thus invalidating everything.)
The signed data is sent back to the web page, along with as much information as the attester wants to provide. This information will match the signature, and can be verified using a public key.
The web page looks at the data and decides whether to trust the verdict or not. If something looks sketchy, the web page has the right to refuse to send any further data.
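The steps above can be sketched in a few lines. This is a hypothetical illustration, not Google’s actual protocol: the environment checks, field names, and verdict format are made up, and a real attester would sign with an asymmetric private key (websites verifying with the public key) rather than the stand-in HMAC used here so the sketch runs with only the standard library.

```python
import hashlib
import hmac
import json
import os

# Stand-in for the private key baked into the attester binary.
# A real scheme would use an asymmetric signature (e.g. Ed25519).
ATTESTER_KEY = os.urandom(32)

def attest(challenge, environment):
    """Attester side: check the environment, then sign challenge + verdict."""
    # Hypothetical requirements check ("this browser, not a VM, no ad blocker").
    requirements_ok = (
        environment.get("browser") == "approved-browser"
        and not environment.get("is_vm", False)
        and not environment.get("ad_blocker", False)
    )
    if not requirements_ok:
        return None  # refuse to produce an attestation at all
    verdict = {"challenge": challenge.hex(), "verdict": "trustworthy"}
    payload = json.dumps(verdict, sort_keys=True).encode()
    signature = hmac.new(ATTESTER_KEY, payload, hashlib.sha256).hexdigest()
    return {"verdict": verdict, "signature": signature}

def verify(challenge, response):
    """Website side: check the signature AND that our fresh challenge is inside."""
    payload = json.dumps(response["verdict"], sort_keys=True).encode()
    expected = hmac.new(ATTESTER_KEY, payload, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected, response["signature"])
            and response["verdict"]["challenge"] == challenge.hex())

# The website sends a fresh random challenge with every request, which is
# why a captured "good" attestation cannot be replayed elsewhere.
challenge = os.urandom(16)
resp = attest(challenge, {"browser": "approved-browser", "is_vm": False})
print(verify(challenge, resp))       # True: signature and challenge match
print(verify(os.urandom(16), resp))  # False: stale challenge is rejected
```

The per-request challenge is the piece doing the anti-replay work: the signature binds the verdict to that one request, so tampering with either invalidates it.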
They also say they want to err towards having fewer checks, rather than many (“low entropy”). There are concerns about this being used for fingerprinting/tracking, and high entropy would allow for that. (Note that this does explicitly contradict the point the authors made earlier, that “Including more information in the verdict will cover a wider range of use cases without locking out older devices.”)
That said - we all know where this will go. If Edge is made an attester, it will not be low entropy. Low entropy makes it harder to track, which benefits Google as they have their own ways of tracking users due to a near-monopoly over the web. Google doesn’t want to give rivals a good way to compete with user tracking, which is why they’re pushing “low-entropy” under the guise of privacy. Microsoft is incentivized to go high-entropy as it gives a better fingerprint. If the attestation server is built into Windows, we have the same thing.
Won’t a user-agent switcher be enough? Firefox has an extension like this, and they even recommend it.