I work as an embedded developer, and whenever a new board comes back half tested, everyone expects fully functional code right off the bat.
Motherfucker, you didn't even qualify your hardware, and you expect my code, which has never touched the new board, to be 100% functional based on your mind map? We will find hardware issues that will inevitably be blamed on the code, and we'll spend hours "debugging" the code only to eventually find out the hardware is shit.
I have this great idea for an app; we can go 70/30 on it! 70 for me, because the idea is the hardest part, after all. So basically it's Twitter plus Facebook plus Tinder with a built-in MMO. You can get that done in a couple of weeks, should be pretty easy, right?
Most projects seem to have settled on spaces, but tabs are so much better for accessibility. Programmers with poor vision can have trouble differentiating smaller indentation levels, while some bump the font size up so high that 4 spaces of indentation take up too much screen space. With tabs, each person can set a tab width that is comfortable for them. https://alexandersandberg.com/articles/default-to-tabs-instead-of-spaces-for-an-accessible-first-environment/ has some good arguments.
With an enforced formatter and a properly configured editor, there really isn't any argument left for spaces.
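For what it's worth, here's a minimal sketch of that setup, assuming Prettier as the enforced formatter (the thread doesn't name one): the repo standardizes on hard tabs, and tab display width stays a personal editor setting.

```js
// prettier.config.js (a sketch): commit this and run Prettier in CI so the
// indentation *style* is enforced repo-wide, while how wide a tab renders
// remains each developer's own choice (e.g. editor.tabSize in VS Code).
module.exports = {
  useTabs: true, // indent with hard tabs instead of spaces
};
```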
As someone who works on an age-diverse team where vision issues are a serious concern, hard tabs are much better. Four-space indentation may be optimal for you, but there are other lived experiences.
I've only ever heard raging between the two camps, but never why. I'm guessing there were competing languages with different standards, or maybe historic hardware with limited input sets, that kicked this debate off?
I think what people generally mean when they say "programming language" (other than just a language to write a program in) is that the language is Turing complete. Even by this more limited definition, JavaScript, Bash, and PowerShell are Turing complete and therefore programming languages.
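The informal test is whether the language gives you conditional branching, unbounded loops, and arbitrarily growable state. As a toy illustration (not a proof, and not from the original comment), here is a tiny Turing machine interpreter in TypeScript; the same few constructs exist in Bash and PowerShell, which is why all three qualify.

```typescript
// A minimal Turing machine interpreter: mutable tape, a head, and a
// transition table. Branching + looping + unbounded storage is the
// informal core of Turing completeness.
type Move = -1 | 1;
type Rule = { write: string; move: Move; next: string };
type Program = Record<string, Record<string, Rule>>; // state -> symbol -> rule

function run(program: Program, input: string, maxSteps = 10_000): string {
  const tape = new Map<number, string>();
  input.split("").forEach((sym, i) => tape.set(i, sym));
  let head = 0;
  let state = "start";
  for (let step = 0; step < maxSteps && state !== "halt"; step++) {
    const sym = tape.get(head) ?? "_"; // "_" is the blank symbol
    const rule = program[state]?.[sym];
    if (!rule) break; // no matching rule: halt
    tape.set(head, rule.write);
    head += rule.move;
    state = rule.next;
  }
  return [...tape.entries()]
    .sort((a, b) => a[0] - b[0])
    .map(([, sym]) => sym)
    .join("");
}

// Example machine: flip every bit, then halt on the first blank.
const flipper: Program = {
  start: {
    "0": { write: "1", move: 1, next: "start" },
    "1": { write: "0", move: 1, next: "start" },
    "_": { write: "_", move: 1, next: "halt" },
  },
};
console.log(run(flipper, "10110")); // -> "01001_"
```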
Replace a semicolon (;) with a Greek question mark (;, U+037E), provided they're working in a language that ends statements with semicolons, and their IDE doesn't highlight the difference (which some do now).
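If you're on the receiving end of this prank, a few lines of code will unmask it. A sketch in TypeScript (the helper name is mine, not a standard API):

```typescript
// Find every U+037E GREEK QUESTION MARK, which renders almost identically
// to the ASCII semicolon (U+003B) but is a different character entirely.
function findGreekQuestionMarks(source: string): number[] {
  const positions: number[] = [];
  for (let i = 0; i < source.length; i++) {
    if (source.charCodeAt(i) === 0x037e) positions.push(i);
  }
  return positions;
}

const pranked = "let a = 1;\nlet b = 2\u037E"; // the second "semicolon" is the impostor
console.log(findGreekQuestionMarks(pranked)); // -> [20]
```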
Correctly point out when a programmer is being as assumptive as a brick, even though assumptions are one of the biggest sins in programming. Done; you've triggered a lot of programmers.
Micromanagement to speed up the product release date. Daily meetings and status reports, work breakdown categories such as "code design", "code development", "code documentation", etc., etc. (flashback GIF of Apache helicopters flying over a jungle).
I inherited an old Japanese codebase. Tons of stuff was just single-letter variables; apparently that used to be at least somewhat common here. I spent a lot of time just updating the code to replace those variables with something meaningful (and found bonus bugs caused by improper scoping of identically named variables, as in the sketch below). I didn't have an IDE that would easily do the renaming for me at the time, and running something like sed felt too risky.
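Here's a sketch of that scoping hazard in TypeScript (the original codebase's language isn't stated, and these names are hypothetical): `var` is function-scoped, so reusing the same single-letter name silently clobbers the outer loop.

```typescript
// Buggy: the inner loop redeclares the *same* function-scoped `i`,
// so the outer loop's counter gets overwritten.
function countCellsBuggy(rows: string[][]): number {
  var n = 0;
  for (var i = 0; i < rows.length; i++) {
    for (var i = 0; i < rows[i].length; i++) { // same `i` as the outer loop!
      n++;
    }
  }
  return n; // e.g. [["a","b"],["c"]] yields 1, not 3
}

// Renamed and block-scoped: the bug disappears and the intent is readable.
function countCells(rows: string[][]): number {
  let cellCount = 0;
  for (let rowIndex = 0; rowIndex < rows.length; rowIndex++) {
    for (let colIndex = 0; colIndex < rows[rowIndex].length; colIndex++) {
      cellCount++;
    }
  }
  return cellCount;
}
```

A scope-aware rename refactoring (what IDEs and language servers do) is exactly what a blind text substitution like sed can't give you, which is why it felt risky.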
Compile your kernel with a different version of ld than the one shipped with your distro. I'm dealing with this right now on Debian testing and it's enraging. I'm not even sure if that's actually the source of my error.