Before Windows 10, Nvidia and others had a "detect what suits me best" button on their websites. Now many of them just read it out of your fingerprint without asking.
Fuckin' oath. If we cater too much to the stupid, the middling folks just get lazy. Make people think. It's important that we know how to use our brains.
Microsoft hides their links if they see you running Linux, so you need to manually set your OS in the browser settings to see the download link. Very convenient.
That's a fair perspective, but most people strive for as few clicks as possible between users and their targets. Forcing a user to become semi-tech-competent by sending them on a fetch quest to figure out their OS, while not an inherently bad thing, does work against this overall goal...
Idk, it's like education vs service industry goal setting, that's all I'm trying to get at here lol
Edit: plus, there's no guarantee that it will remain just the big 3 forever. There was a time before Linux; maybe we'll see a time after Windows... Unlikely, but one can dream lol
Ideally, to save bandwidth on both sides, the server would only want to serve you the JS and CSS you need. I'm not sure how frequently that optimization is made, however.
I’m a bit rusty on this, but I think you’d need to split your Sass/SCSS/etc. before Webpack will perform tree-shaking or allow lazy-loading. I don’t think many devs wrote it that way: personally, I like my mobile rules beside my desktop ones, since my styling is organized per component.
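For what it's worth, a minimal sketch of that split, assuming mini-css-extract-plugin plus the usual sass-loader/css-loader chain (none of this is from the comment above, it's just one way the config could look):

```javascript
// webpack.config.js -- minimal sketch. Assumes mini-css-extract-plugin,
// css-loader, and sass-loader are installed.
const MiniCssExtractPlugin = require('mini-css-extract-plugin');

module.exports = {
  module: {
    rules: [
      {
        test: /\.scss$/,
        // Extract compiled CSS into real .css files instead of inlining
        // it in the JS bundles, so chunks can be loaded independently.
        use: [MiniCssExtractPlugin.loader, 'css-loader', 'sass-loader'],
      },
    ],
  },
  plugins: [new MiniCssExtractPlugin()],
  // Let Webpack carve shared code (and the CSS it pulls in) into
  // separate chunks that can be lazy-loaded.
  optimization: { splitChunks: { chunks: 'all' } },
};
```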
Fair point, there could be reasons, and I'd say there are no privacy concerns if that's all they get, but I know it's part of fingerprinting. I said 99% so they don't even need to know that.
As a front-end web developer, I've found it useful to know which user agent is requesting a page in order to load conditional styling, for example to compensate for Safari's god-awful outline support (pre-version 16).
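As a sketch of what that UA check can look like (the version cutoff and the class name are just illustrative, not anyone's actual code):

```javascript
// Hypothetical helper: spot pre-16 Safari from a UA string so a fallback
// class can be applied. Safari's UA contains "Safari/" but not "Chrome"
// (Chrome's UA contains both), and "Version/NN" carries the major version.
function isLegacySafari(ua) {
  if (!/Safari\//.test(ua) || /Chrome|Chromium/.test(ua)) return false;
  const m = ua.match(/Version\/(\d+)/);
  return m !== null && parseInt(m[1], 10) < 16;
}

// In the browser you'd gate the conditional styles on it, e.g.:
// if (isLegacySafari(navigator.userAgent)) {
//   document.documentElement.classList.add('legacy-safari');
// }
```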
The biggest offender is, surprisingly, Cloudflare. They will straight up refuse to serve you any site if your user agent is not one of the mainstream ones. It's not even "find the traffic light to prove you're human", but a page basically saying "fuck you, go away".
What is more likely to be a bot: a unique and trackable user agent from a semi-niche browser engine, or a vanilla Chromium+Windows one that half of everyone uses?
What about malicious/unwanted bots? If Cloudflare is trying to block bots, the bots will want to not look like bots, and the easiest way to do that is to use a common user agent.
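And that spoofing is trivially cheap for the bot side, something like (the UA string is just a generic Chrome-on-Windows example):

```javascript
// Sketch: a scraper blending in as mainstream Chrome-on-Windows.
// A filter keyed on rare user agents never flags this request.
const COMMON_UA =
  'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 ' +
  '(KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36';

function buildHeaders(extra = {}) {
  // Any extra headers are merged over the spoofed base.
  return { 'User-Agent': COMMON_UA, ...extra };
}

// e.g. fetch(url, { headers: buildHeaders({ Accept: 'text/html' }) })
```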
If I were a Firefox dev I'd start looking into building user agent spoofing right into the browser.
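Funnily enough, a global version of this already exists as a hidden pref; it's just not per-site and not exposed in the UI (the UA string below is an example value, not a recommendation):

```javascript
// user.js -- Firefox reads this file at startup. The
// general.useragent.override pref replaces the UA the browser reports
// everywhere; per-site overrides still need an extension.
user_pref("general.useragent.override",
  "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36");
```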
It already opens Facebook pages in a special isolated tab. They could have apple.com open in its own special "Safari" tab. I wonder if there's anything preventing them from doing that. I guess it could be bad because it would make their market share appear even smaller.
Broken webpages might be a good thing. There are too many browsers that aren’t adhering to standards. Stop coding around it and start publicly shaming these megacorps.
JavaScript as it is today also needs to be thrown into the trash bin of history.
A website should not contain additional code. If someone wants to send me an app hacked on top of website rendering, there should be a popup asking me first whether I want to run it.
No problem with sending some JavaScript module extending the browser's capabilities. But the problem I see is sending whole sites this way, sometimes even rendering the HTML in the visitor's browser, yuck...
This is absolutely not true and just a myth.
Images, video playback, "show more", forms, tabbing, animations, custom icons, hover effects, popups, background images and videos, light/dark mode, hamburger menus...
It's hard to count the things you can do with the advanced format that is HTML+CSS. Saying JavaScript is necessary for anything other than a block of text is like saying that in Minecraft, command blocks are necessary for anything other than making voxel art.
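Two of those as a quick sketch, no JS anywhere:

```html
<!-- "Show more": <details>/<summary> is plain HTML, collapsible for free. -->
<details>
  <summary>Show more</summary>
  <p>The extra content, hidden until the reader expands it.</p>
</details>

<!-- Light/dark mode: pure CSS, following the OS-level setting. -->
<style>
  body { background: #fff; color: #111; }
  @media (prefers-color-scheme: dark) {
    body { background: #111; color: #eee; }
  }
</style>
```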
For basic things like interacting with your bank or government, running any additional code should be unnecessary. And I believe this needs to be a law targeting accessibility and compatibility.
For maps, dynamic updating, OK.
But look at the web now: most sites are apps requiring 99% of web standards to be implemented just to work. No wonder it's now impossible to actually make a new browser.
HTML was made to last. If a browser doesn't support some tag, it will still try to render its contents anyway. Meanwhile, with today's webapps, browsers in 2033 will be required to carry the kind of technical debt that until now was exclusive to operating systems.
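That forgiveness is easy to demo: unknown tags just fall through to their contents.

```html
<!-- A browser that has never heard of this element still renders the text
     inside it; unknown elements are treated as plain inline containers. -->
<made-up-tag>This sentence displays in any browser.</made-up-tag>
```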
I don't want them knowing desktop or mobile either. We all have good enough phones now to handle a proper website on mobile -- mobile sites are fucking garbage.
Steve Jobs, during the original iPhone keynote, did a whole segment on how you could load the full rich widescreen NYT website and zoom in and out and look at that rich text rendering. Apps are ass, mobile sites are ass.