Why do I need to know all of this stuff, why isn’t the web safe by default?
The answer to questions like this is often that there was no need for such safety features when the underlying technology was introduced (more examples here) and adding it later required consensus from many people and organizations who wouldn't accept something that broke their already-running systems. It's easy to criticize something when you don't understand the needs and constraints that led to it.
(The good news is that gradual changes, over the course of years, can further improve things without being too disruptive to survive.)
He's not wrong in principle, though: building safe web sites is far more complicated than it should be, and relies far too much on the site behaving in the user's best interests, especially when client-side scripts are used.
> It's easy to criticize something when you don't understand the needs and constraints that led to it.
And that assumption is exactly what led us to the current situation.
It doesn't matter why the present is garbage; it's garbage, and we should address that. Statements like this are the engineering equivalent of "it is what it is" followed by a shrug emoji.
Take a step back and look at the pile of overengineered yet underthought, inefficient, insecure and complicated crap that we call the modern web. And it's not only the browser, but also the backend stack.
Think about how many indirections and half-baked abstraction layers are between your code and what actually gets executed.
> It doesn't matter why the present is garbage; it's garbage, and we should address that. Statements like this are the engineering equivalent of "it is what it is" followed by a shrug emoji.
I don't think your opinion is grounded in reality. The "it is what it is" actually reflects the fact that there is no way to fix the issue in a backwards-compatible way, and it's unrealistic to believe that vulnerable frameworks/websites/webservices can be updated at a moment's notice, or even at all. This fact is mentioned in the article. Those that could be updated have already moved to a proper authentication scheme; those that couldn't still have to keep working after users upgrade their browsers.
Okay, and how would you address it? The limitation is easy to criticize when you think about it in a vacuum. But in the real world, we'd need to find a way to change things that can actually be implemented by everyone.
> Take a step back and look at the pile of overengineered yet underthought, inefficient, insecure and complicated crap that we call the modern web....
> Think about how many indirections and half-baked abstraction layers are between your code and what actually gets executed.
Think about that, and then...what, exactly? As a website author, you don't control the browser. You don't control the web standards.
I'm extremely sympathetic to this way of thinking, because I completely agree. The web is crap, and we shouldn't be complacent about that. But if you are actually in the position of building or maintaining a website (or any other piece of software), then you need to build on what already exists, unless you're in the exceedingly rare position of being able to near-unilaterally make changes to an existing platform (as Google does with Chrome, or Microsoft and Apple do with their OSes) or to throw out a huge amount of standard infrastructure and start as close to "scratch" as possible (e.g. GNU Hurd, Mill Computing, Oxide, Redox OS, etc; note that several of these are hobby projects not yet ready for "serious" use).
> It doesn't matter why the present is garbage; it's garbage, and we should address that.
The problem is fixing it without inadvertently breaking things for someone else. Changing the default behavior isn't easy.
There are probably critical systems that rely on old, outdated practices because that's the way things worked when they were written 20 years ago. Why should they go back and fix their code when it has worked perfectly fine for the past two decades?
Unless I'm missing something, the post is plain wrong in some parts. You can't POST to a cross-site API, because the browser will send a CORS preflight before sending the real request. The only way around that, IIRC, is form submits, and for those you need CSRF protection.
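For context, here's a minimal sketch of the kind of CSRF protection that comment refers to, using an HMAC token tied to the user's session. The function names (`make_token`, `check_token`) and the exact scheme are illustrative assumptions on my part, not anything from the post or a specific framework:

```python
# Sketch of session-bound CSRF token protection (illustrative, not from
# any particular framework). The server embeds the token in each form it
# renders; a cross-site form post cannot know the token, so forged
# submissions fail validation.
import hashlib
import hmac
import secrets

SECRET_KEY = secrets.token_bytes(32)  # per-deployment server secret

def make_token(session_id: str) -> str:
    """Derive a CSRF token bound to the user's session."""
    return hmac.new(SECRET_KEY, session_id.encode(), hashlib.sha256).hexdigest()

def check_token(session_id: str, submitted: str) -> bool:
    """Validate a submitted token with a constant-time comparison."""
    expected = make_token(session_id)
    return hmac.compare_digest(expected, submitted)
```

Because the token is derived from the server secret and the session, an attacker's page on another site can neither read it nor guess it, which is what makes plain form submits safe to accept.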
The CORS proxy statement is also wrong, if I don't misunderstand their point. Proxies don't break security, because they are obviously not the cookie's domain; they are the proxy's domain, so the browser will never send the original site's cookies to them.
Anyway, don't trust the post or me. Just read https://owasp.org/ for web security advice.
> Note: When I say "top-level" I am talking about the URL that you see in the address bar. So if you load fun-games.example in your URL bar and it makes a request to your-bank.example then fun-games.example is the top-level site.
Meaning explicit creds won't be sent. Even if fun-games knows how to send explicit creds, it can't, because fun-games does not have access to the creds stored for your-bank. Suppose your-bank's creds are stored in local storage: since the current URL is fun-games, a script there can only access fun-games's local storage, not your-bank's.
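To illustrate that isolation, here's a toy Python model of how a browser keys storage by origin. The class and method names are invented for this sketch; real browsers implement this internally:

```python
# Toy model of per-origin localStorage isolation: the browser keys
# storage areas by the page's origin, so a script running on
# fun-games.example can never see keys written under your-bank.example.
class ToyBrowserStorage:
    def __init__(self) -> None:
        self._by_origin: dict[str, dict[str, str]] = {}

    def for_origin(self, origin: str) -> dict[str, str]:
        """Return the storage area visible to a page at `origin`."""
        return self._by_origin.setdefault(origin, {})

storage = ToyBrowserStorage()
storage.for_origin("https://your-bank.example")["token"] = "s3cret"

# A page on fun-games.example only ever sees its own (empty) area:
fun_games_view = storage.for_origin("https://fun-games.example")
assert "token" not in fun_games_view
```

The point of the sketch is just that the lookup key is the origin itself, so there is no API call fun-games could make that would hand it your-bank's entries.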
Thank you! I was always wondering why the heck this (mostly) useless and broken mechanism exists. I had hesitations about disabling it but had doubts about my understanding. Now I know I’m right