It depends how websites choose to implement it, and how other browsers choose to implement it.
If Firefox et al. choose not to implement browser environment integrity, then any website that chooses to require strict integrity would completely cease to work on Firefox, as it would not be able to respond to a trust check. It is simply dead. However, if they do implement it, which I imagine they would if this API actually becomes widespread, Firefox should continue to work fine, even if it's stuck with the limitations on environment modification inherent to the DRM (aka rip adblockers).
Websites will vary though. Some may not implement it at all; others may implement a non-strict integrity check and happily serve browsers that do not pass it. Third parties can also run their own attestation servers that report varying levels of environment data. Most likely you will see all Google sites and a majority of "big" websites that depend on ad revenue implement strict integrity through Google attestation servers so that their precious ads don't get blocked, and the internet will become an absolutely horrid place.
Frankly I'll just stop using anything and everything that chooses to implement this, since we all know Google is going to go full steam ahead with implementation regardless of how many users complain. Protecting their ad revenue is priority 1 through 12,000 and fuck everybody else.
I feel like I fully lack the words to describe what I mean here, although I’m confident in my understanding of the idea. (Which is to say, please give me charity when untangling my rambling.)
I share your sentiment and I’ve been thinking about this the past few days.
I’ve read in a few places that Musk is trying to turn Twitter into a ‘one-app’ in the same way that WeChat is. The common pushback against that is that we already have that - it’s the web browser. The web browser isn’t going anywhere.
But turning the browser into a closed ecosystem that Google gets to set the standard for, harvest the data for, advertise through, and ensure that users are locked in to their version of the experience/data that they collect essentially makes Chrome the one-app.
In much the same way that Google killed XMPP and Microsoft used its weight to hamstring open document formats, this seems like an effort to thread a rope around the neck of the open internet and use Google’s considerable market share to close it off.
Somewhat ironically, we may find ourselves in search of a ‘new, open internet’ if corporations continue to define our current internet.
Maybe we’ll call it “Web 1.0.”
This. Like for real. I might be in a minority here, but I'm not going to just accept this crap and deal with it. If you implement these changes and your site is not absolutely essential for me, then I'm going elsewhere. If 90% of big websites become unusable with my browser, then I'm going to hang out in the remaining 10% with my like-minded folks. I don't care that it's quiet and much more slow-paced; it's still better than the shit everyone else is serving, and frankly better for my mental health as well.
I spent like 2 to 3 hours on reddit every single day for 10 years. Then they killed my favourite app and I just quit then and there and haven't looked back. I have no problem doing that again.
I have a weak grasp of this, but a developer working on this responded to some criticism.
If the developers working to implement this are to be believed, they are intentionally setting it up so that websites would have an incentive to still allow untrusted (for lack of a better term) clients to access their sites. They do this by intentionally omitting the attestation 5-10% of the time, so the client looks untrusted even when it isn't. This means that if a website decides to only allow trusted clients, it will also be refusing trusted clients 5-10% of the time.
The relevant part of the response is quoted here:
WEI prevents ecosystem lock-in through hold-backs
We had proposed a hold-back to prevent lock-in at the platform level. Essentially, some percentage of the time, say 5% or 10%, the WEI attestation would intentionally be omitted, and would look the same as if the user opted-out of WEI or the device is not supported.
This is designed to prevent WEI from becoming “DRM for the web”. Any sites that attempted to restrict browser access based on WEI signals alone would have also restricted access to a significant enough proportion of attestable devices to disincentivize this behavior.
Additionally, and this could be clarified in the explainer more, WEI is an opportunity for developers to use hardware-backed attestation as alternatives to captchas and other privacy-invasive integrity checks.
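The hold-back described in the quote can be sketched in a few lines. This is a hypothetical illustration, not the actual WEI implementation; the 10% rate, function name, and token value are all assumptions for the sake of the example.

```python
import random
from typing import Optional

# Illustrative rate; the explainer says "some percentage of the time, say 5% or 10%".
HOLDBACK_RATE = 0.10

def attestation_response(client_is_trusted: bool, rng=random.random) -> Optional[str]:
    """Return an attestation token, or None when the verdict is withheld.

    Even a fully trusted client gets no token HOLDBACK_RATE of the time,
    making it indistinguishable from an opted-out or unsupported client.
    """
    if not client_is_trusted:
        return None  # untrusted, opted out, or unsupported device
    if rng() < HOLDBACK_RATE:
        return None  # hold-back: deliberately look like the cases above
    return "attestation-token"  # placeholder for a signed verdict
```

A site that hard-blocks whenever the token is missing would therefore also turn away roughly one in ten trusted visitors, which is exactly the disincentive the quote describes.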
That's such a weird clause to include, and it's likely just a honeypot. Why even bother allowing unverified browsers to connect, since it invalidates the entire purpose of the trust system? If any bad actor can simply choose not to use the trust system while still having full access, then the system is less than useless for its stated purpose (catching bots/bad-faith traffic, ensuring no malware) and serves only to degrade the speed and experience of legitimate users.
That opt-out clause won't last a year once it's mandatory in Chromium.
An attestation method that randomly fails in 5-10% of cases makes no sense. It's not attestation anymore, it's a game of dice. This is blatant rhetoric in response to the DRM criticism. Nobody sane would ever use such a method.
That sounds nice but there's no guarantee they'll implement it, or if they do, that they won't just remove it someday down the road. This could just be a way for them to avoid criticism for now, and when criticism has died down a bit, they can just remove it.
Maybe this thing will evolve into two webs. One where the majority using Chrome will be, mostly busy watching ads and reading the shitty sites Google has picked for them.
But another where browsers who don't support this can be. Stuff like Lemmy and forums and other things run by individuals with an interest and passion.
We would still need to use Chrome for the official stuff like our banks' or office websites, but there would be another world out there for people who refuse to accept being subjected to this shit. Alternative websites would shoot up and become popular.
However, if they do implement it, which I imagine they would if this API actually becomes widespread,
The problem is, it's not really possible to implement it in a truly open-source browser, since anyone compiling it themselves (including distro maintainers) would fail the check unless they perfectly match a build approved by the attestor. And if a build differs from the approved version, that's specifically what WEI is intended to prevent.
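To illustrate why self-compiled builds would fail, here's a simplified sketch of an attestor checking a binary's digest against an allow-list of official builds. The measurement scheme, names, and digests are assumptions for illustration; the real WEI design doesn't specify this level of detail.

```python
import hashlib

# Hypothetical allow-list of digests for attestor-approved official builds.
APPROVED_BUILD_DIGESTS = {
    hashlib.sha256(b"official-browser-build").hexdigest(),
}

def attest_build(binary_bytes: bytes) -> bool:
    """Pass only if this exact binary matches an approved build."""
    return hashlib.sha256(binary_bytes).hexdigest() in APPROVED_BUILD_DIGESTS
```

A distro-compiled binary differs byte-for-byte from the official one (different compiler, flags, or patches), so its digest never matches and the check fails by design.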
Sure, they're against it, but if it gets implemented by Chrome and by many major websites, they won't have a choice but to implement it as well. Otherwise, their browser just won't work and people will have to use Chromium browsers or nothing at all.
Honestly, they could have good grounds for an antitrust lawsuit if this API comes to pass and everyone uses Google attestation servers. It's walling off the browser space just like Microsoft did.
Firefox will be in a tight corner assuming every other browser vendor picks this up. They can decide to go against it but Firefox does not live in isolation.
What everybody seems to be forgetting is that there is a ton of web-content fetching being done right now which is not done by browsers.
For example, all the webcrawlers doing indexing for search engines.
Consider the possibility that any major website that does this either becomes inaccessible to any webcrawler which does not implement it (say, those indexing sites for search engines other than Google's) or has exceptions for webcrawlers, which are one big backdoor for browsers to also come in (in fact, a number of paywall-bypassing solutions rely on sending the right HTTP headers to exploit exactly such existing exceptions for webcrawlers).
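As a sketch of why crawler exceptions double as a backdoor (the function and whitelist here are illustrative, not any real site's code): a paywall that whitelists crawlers by User-Agent header cannot tell a real crawler from a browser sending the same header.

```python
def is_whitelisted_crawler(headers: dict) -> bool:
    """Naive crawler exception keyed on the User-Agent request header."""
    ua = headers.get("User-Agent", "")
    return "Googlebot" in ua or "bingbot" in ua
```

Any client that puts "Googlebot" in its User-Agent passes, which is exactly the mechanism the header-based paywall bypasses mentioned above exploit.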
Even webcrawlers implementing this would be relying on "integrity validation" servers from a third party (I bet that's going to be Google), so think about how Google could interfere with third-party webcrawlers by merely throttling integrity-validation responses for them.
Oh, and open source webcrawler implementations can forget all about being validated.
(By the way, this would also impact any AI-data gathering webcrawlers that don't use APIs to get the data but rather go in via the web interface)
This is quite possibly a far bigger play by Google than just for browser and Ad dominance.
Only if they proceed AND websites enforce it. The last reply I read from the Googler that was part of the draft spec said they were building in a guardrail that prevents sites from outright blocking non-compliant clients without also blocking a not insignificant portion of their desired userbase.
To me, it sounded like they'd just randomly not send the DRM information sometimes. So, the fix for web sites would be to tell the user to reload until the information is passed along.
That's pretty terrible UX, though. I think it's more likely that websites will continue integrating a CAPTCHA service and that service will simply try to short-circuit its decision by asking for attestation. If none is given the user gets to click on pictures of street lights.
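That flow can be sketched as a short decision function. This is a hypothetical server-side design, not anything from the WEI spec: names, return values, and the verification stub are all assumptions.

```python
from typing import Optional

def verify(token: str) -> bool:
    # Placeholder: a real check would validate the attester's signature.
    return token == "valid-token"

def access_decision(attestation_token: Optional[str],
                    captcha_solved: bool = False) -> str:
    """Short-circuit on a valid attestation; otherwise fall back to a
    CAPTCHA challenge instead of hard-blocking the client."""
    if attestation_token is not None and verify(attestation_token):
        return "allow"
    if captcha_solved:
        return "allow"
    return "challenge"  # time to click on pictures of street lights
```

The point of the design is that a missing or held-back attestation degrades the experience (an extra challenge) rather than blocking access outright.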
The devs responsible for this say their goal is to detect bots while making sure it doesn't harm people not using this tech. I'm actually inclined to believe them. The problem is that those guardrails could turn out to be ineffective, or Google could decide to just disable them at some point.
Sites that implement it will lose traffic, which will lower their ad revenue.
If browsers that do not implement it gain a higher market share, sites might be missing out on so much money that they don't block those browsers, or don't implement the DRM at all.
It's a battle of browsers now, and I am happy to stand on the side of Firefox.
You know those movies where aliens attack earth and we always win? I think these outcomes are mostly true because I've said it before and I'll say it again, there's nothing humans can't ruin. Whether it's meeting your family at the arrival gate or alien societies we'll destroy it. The internet is just the next thing.
It will affect some sites, not others. You'll no longer be able to bypass paywalls to read news, for example, because those sites will most likely adopt the DRM. Some streaming services may do the same, maybe even some social networks. But places like Lemmy will still be generally unaffected.
Why would you not be able to bypass news paywalls? As long as one user pays for the service they can then crawl the site and host the content on a separate site.