1,200,000 / 30 days / 24 hours / 60 minutes / 60 seconds is 0.46 requests per second.
That is crazy low and is nothing to shout about. I notice people like to state this in months to inflate the number and make it look bigger, but calculating it down to RPS puts it in perspective.
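For anyone who wants to sanity-check the arithmetic, here's the back-of-the-envelope version as a tiny PHP snippet (assuming a uniform 30-day month; real traffic is obviously spikier):

```php
<?php
// Convert a monthly request count into an average requests-per-second figure,
// assuming a uniform 30-day month.
$requestsPerMonth = 1_200_000;
$secondsPerMonth  = 30 * 24 * 60 * 60;   // 2,592,000 seconds

printf("%.2f req/s on average\n", $requestsPerMonth / $secondsPerMonth);  // ~0.46
```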
So why not create a website out of really, really old technology?
PHP 8.0 is no longer supported so I hope they update the "really, really old technology" to at least PHP 8.1 today.
Either way, that VPS will cost $10-20 depending on CPU from a good provider. You can't get that cheap with a bunch of AWS services for that number of requests.
Also, if they were using a hyped tech stack (Node.js or whatever) they wouldn't be able to handle the request spikes like they do. 0.46 requests per second on its own means nothing, because I'm sure they have hours of full inactivity and other hours serving 10 requests per second, which would totally obliterate 2 GB of RAM if done on Node.js and a MySQL DB.
$10-20 is what that VPS costs at a cloud provider. You could also dockerize and use a container service like GCP Cloud Run combined with cloud storage within that budget.
I'm not a big Node guy, but I also kind of doubt Node.js would fail to handle 10 RPS on 2 GB of memory. I guess it all depends on what the requests are doing.
Not to mention, they have regular deals where you can get them at a permanent 50% off (during Black Friday and winter sales). I have been paying €17 per year for the 2 GB version.
[XHTML] never took off on the web, in part because in a website context so much HTML is generated by templates and libraries that it’s all too easy to introduce a syntax error somewhere along the line; and unlike HTML, where a syntax error would still render something, the tiniest syntax error in XHTML means the whole thing gets thrown out by the browser and you get the Yellow Screen of Death.
This confuses me; don't you want to make sure you are always generating a syntactically valid document, rather than hoping that the browser will make something suitable up to work around your mistake?
The thing with XHTML is that even a minor problem will make the page refuse to render and display a full page error message instead of any content. Having the browser guess how to handle the malformed HTML isn't ideal, but it's a lot better than showing nothing at all.
As an end result, maybe. But it also means that you get specific feedback on how to author it correctly, and you can fix it before pushing it live.
IDK, I lived through that whole era, and I’d attribute it more to the fact that HTML is easy enough for complete novices to author in any text editor. XHTML demands a hell of a lot more knowledge of how XML works and what is valid (and more keystrokes). The barrier to entry for XHTML is much, much higher.
I feel the idea was that anyone should be able to make a webpage by just copy-pasting snippets, and to help with that, HTML and JavaScript will attempt to continue as best they can, even if there are glaring issues.
Well, no, because broken HTML can still function sometimes. But most importantly, most of that HTML isn't even "broken", just not "adhering to the complete standards".
HTML is just formatting around the content; even completely devoid of HTML you can still see things. We're not writing LaTeX here, and no one cares if things are a little fucky.
As far as generated HTML goes, you're more likely to break it further if you fuck with it anyway.
Sure, but shouldn't you want your generated markup to adhere to the complete standards so that you know it will be interpreted correctly, rather than hoping that the browser will make the correct guess about what you really meant?
Finally, someone who knows how to do things properly.
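For what it's worth, you can get exactly that feedback on the server side instead of relying on the browser: buffer the generated page and run it through libxml before shipping it. A minimal sketch in PHP, assuming the output is meant to be well-formed XHTML and that you'd probably only run this in tests or staging:

```php
<?php
// Check that a generated page is well-formed XML/XHTML and log any parse errors.
function isWellFormed(string $html): bool
{
    libxml_use_internal_errors(true);   // collect errors instead of emitting warnings
    $doc = new DOMDocument();
    $ok  = $doc->loadXML($html);        // strict parse, like an XHTML-mode browser

    foreach (libxml_get_errors() as $error) {
        error_log(sprintf('Markup error at line %d: %s', $error->line, trim($error->message)));
    }
    libxml_clear_errors();

    return $ok !== false;
}

$page = '<html xmlns="http://www.w3.org/1999/xhtml"><body><p>Hello</p></body></html>';
var_dump(isWellFormed($page));  // bool(true); an unclosed tag would flip this to false
```

That way a template bug shows up as a log line or a failed test rather than a Yellow Screen of Death for visitors.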
Modern PHP isn’t half bad, and it has at least two major benefits over some of its competitors:
Each request is a totally independent request that rebuilds the world. There’s no shared state (unless you want there to be).
The second big benefit is that you’re not stuck with having to learn and maintain a huge bells-and-whistles 3rd-party framework in perpetuity. I think people really underestimate the burden of maintaining a 3rd-party framework even after development of the website is complete.
Starting on a cloud provider cedes one’s independence because it often leads to vendor lock-in.
The big benefit of running a basic Linux box on our own VPS is that everything is just files on a generic, well-understood platform (...) a VPS is a low-cost, simple, and lock-in-free way to go. Very classic-web.
At the end of the day...
All of this goes to show that you don’t need a whole lot to build a performant, useful website, capable of serving millions of requests a month, on a tiny server that also handles other resource-intensive tasks.
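To make the "rebuilds the world" point concrete, here's a small sketch of what that shared-nothing model looks like in practice. Nothing in the script survives the request; state only persists if you deliberately put it somewhere, e.g. APCu (assuming the extension is installed), a session, or a database:

```php
<?php
// Every variable below is created fresh for this request and discarded afterwards.
$visitsThisRequest = ($visitsThisRequest ?? 0) + 1;
echo $visitsThisRequest;   // always 1, no matter how many requests have been served

// Shared state is strictly opt-in, for example via APCu (per server / FPM pool):
apcu_add('visits', 0);     // create the counter if it doesn't exist yet
echo apcu_inc('visits');   // this one does survive across requests
```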
Modern PHP isn’t half bad, and it has at least two major benefits over some of its competitors: Each request is a totally independent request that rebuilds the world. There’s no shared state (unless you want there to be).
Anything JS/Node.js doesn't work like that, and that's precisely one of the issues with it. Node will also keep a process running in the background even if the website/app is never accessed, whereas PHP won't be running anything until a request comes in.
Isn't that the same as in modern frameworks? For example, in ASP.NET Core / C#, you can just register all your services with a lifetime scoped to the request, and then there's no shared state.
If you want there to be shared state, you'd just register your services with a longer lifetime, like a singleton scope.
You can still use CGI with Apache. Apache will execute your program on each request and return its stdout output as the webserver response. If you have a form, it'll get POSTed to stdin when Apache executes your program. You can write your program in whatever language you want as long as you can read stdin and write to stdout. It's just tedious af, so no one really uses it these days. PHP was basically born because people got tired of writing CGI programs in Perl or C and wanted something more convenient. But with modern programming languages, perhaps CGI is not too bad, except for the one-process-per-request model, which will absolutely kill your server the moment you get a visitor spike.
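For anyone who never wrote one, here's roughly what such a program looks like, sketched in PHP since that's the topic here (assuming mod_cgi and an executable script in cgi-bin): Apache puts the request metadata in environment variables, pipes any POST body to stdin, and sends whatever you print to stdout back to the client.

```php
#!/usr/bin/env php
<?php
// Minimal CGI program. Headers go first on stdout, then a blank line, then the body.
$method = getenv('REQUEST_METHOD') ?: 'GET';
$length = (int) (getenv('CONTENT_LENGTH') ?: 0);

$postBody = '';
if ($method === 'POST' && $length > 0) {
    $postBody = fread(STDIN, $length);   // raw form data, e.g. "name=Alice&msg=hi"
}

echo "Content-Type: text/plain\r\n";
echo "\r\n";                             // blank line ends the headers
echo "Method: $method\n";
echo "Body:   $postBody\n";
```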
And there is... it's called PHP. JS doesn't have this model because it is garbage-slow and would never run fine in that model.