That programming as a career means you're going to spend 80% of your time writing nice, clean code.
In reality it's more like debugging code or tooling problems 50% of the time, talking to other people (whether necessary or not) about 35% of the time, and only the remainder actually spent doing the thing you enjoy.
In my experience, you're rather exaggerating. I'm not even 10 years into my career, and if I get to actually code for 2h a day, that's already a success. Most of my time nowadays is documentation, meetings, Jira, research, and calls with clients.
I think it heavily depends on the size and (management) culture of your employer. My most recent gig had me sit in way too many meetings that were way too long (1hr daily anyone?), dealing with a lot of tooling issues and touching legacy code as little as possible while still adding new features to our main product on a daily basis. Obviously "we don't need a clean solution. We're going to replace that codebase anyways, next year™".
The job before that had me actually code for about 80% of the time, but writing tests "is annoying and slows you down and we don't have time for that." Odd how there was always time for fixing the regressions later.
But programming is definitely more open to the idea of people just showing up and claiming to know stuff. You wouldn't trust Steve to build a bridge just because he watched a bunch of engineering videos on YouTube.
Eh, I'm naturally good at it. I got shoved into the programming UIL group in school with absolutely no background in programming and tied for 3rd place.
Exact same thing happened to me. Group project needed a programmer, I was a gamer with a nice computer so I volunteered. 15 years later and I’m a software engineer at a huge company.
That's the difference between a programmer and a computer scientist, but even I (a computer scientist) am not an expert in hardware, networking, or OS-level operations, because that's not my daily focus.
I call that the "nerd equivalency problem". I think it's the source of much (most? all?) of the problems with software that comes out of organizations that are not programming shops by nature.
"We're not moving fast enough (or, "I have this great idea!"), hire another nerd!"
The problem also exists within individual programmers ("sure, I can do that UX/UI thingy, just let me finish building this ray-tracing thingy"), but that's just an ordinary cognitive weakness that affects us all (thinking that being expert in one field makes one expert in all). It's the job of proper leadership to resist that, not act as though it's true.
That a "working" prototype with no tests is just as good as a carefully-designed and well-tested feature. I see this happen so often that a coder puts a prototype in front of a product manager or exec and they are like, "this is exactly what we need, now! Ship that!" And then misery ensues for all of the engineers that need to maintain this piece of garbage. As managers pressure the engineers to build new features on top, they inevitably break fundamental parts of it, and without a confident leader to demand that tech debt is paid off, that product will consume the souls of many desperate coders.
In contrast, if you do it right the first time, there will be significant parts of code that never need to change, and the parts that do need to change will be much easier, because it will be obvious if it breaks the tests.
Anyway, a prototype is not a bad thing, if the managers know the difference.
It's easier said than done to "do it right the first time" if you don't know how / what to build.
Prototypes can be built to validate hypotheses and generally figure out what works; then you build the real thing afterwards.
Yea I should have clarified. Prototypes are a great idea. The problem occurs when you say, "this is good enough we can improve on it as we go." Yea good luck balancing priorities when everything breaks from tapping your keyboard too hard. You MUST NOT MERGE the prototype.
I like putting my prototype code in namespaces like "garbage", "trash", "throwaway", etc. to emphasize how unfit for production it is. I've no concrete evidence of its success, but I like to think it dissuades other team members from using it where they shouldn't.
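For Python folks, a minimal sketch of the same trick (the package name and warning text are made up, not any standard):

# garbage/__init__.py -- hypothetical; the package name does the talking
import warnings

warnings.warn(
    "You just imported prototype code from the 'garbage' package. "
    "It is unfit for production. You have been warned.",
    stacklevel=2,
)

Anyone who imports from it gets both the shameful name in their import line and a warning in their logs.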
At my first job out of college (when I didn't know what I didn't know), I was hired to build a bespoke inventory system for a manufacturing company. My prototype became a production system the second I showed it to one of the engineers. The next three months of my life were a living hell as I frantically fixed bugs on a live system. Lesson learned.
Oh yeah, and the overt emphasis by suits on frontend development because it feels more tangible. Like, yeah, sure, we can add a follow button in a couple lines of code... granted you're fine with duplicate requests, follows from non-signed-in users or from users who block each other, no manual approval support, no protection against CSRF, and the followee never getting notified.
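A toy, in-memory sketch of what that "couple of lines" actually hides; every name here is invented for illustration:

# Checks the naive estimate quietly skips. All data structures are fake.
class FollowError(Exception):
    pass

users = {1: "alice", 2: "bob"}
blocked = {(2, 1)}            # bob blocks alice
follows = set()               # (follower_id, followee_id) pairs
needs_approval = {2}          # bob manually approves followers
pending = set()
notifications = []

def follow(current_user_id, target_id):
    if current_user_id not in users:
        raise FollowError("must be signed in")
    if target_id not in users:
        raise FollowError("no such user")
    if (current_user_id, target_id) in blocked or (target_id, current_user_id) in blocked:
        raise FollowError("one of you blocks the other")
    if (current_user_id, target_id) in follows:
        return  # silently ignore duplicate requests
    if target_id in needs_approval:
        pending.add((current_user_id, target_id))  # manual approval flow
    else:
        follows.add((current_user_id, target_id))
        notifications.append((target_id, f"{users[current_user_id]} followed you"))
    # CSRF protection would live in the web layer, not shown here.

And that's before persistence, rate limiting, or an actual UI.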
Programming != Computer Science. Programming is just a tool used in computer science. Computer Science is so much more and follows scientific theory and methodology.
CS is also what most problems on leetcode and the like are about. Programming is just the application of CS concepts, usually wrapped in several layers of abstraction, to domain-specific problems. But I've never seen a job posting for a computer scientist specifically, yet we all know what it often looks like.
Programming is not doing leetcode problems all day long. Those problems can be a good brain exercise or a good prep for a [misguided] technical interview but in a real programming job you have next to no chance of running into problems like those. Even if you do, you're an idiot if you spend hours toiling away at a problem that somebody else already solved much more efficiently than you will. Your boss doesn't give a crap if you pulled all of the code straight from your brain.
Programmers are not hackers. The reverse might be true but hacking is about finding problems (and exploiting them) while programming is about fixing problems.
A programmer can do anything that involves code. Maybe not quite this succinct but I think most will assume you can write a mobile app or a website just because you say you can code. Websites, games, apps, and so on are written in code but they all involve different technologies, toolsets, and standards. I'm sure I could fumble my way through any kind of software but don't expect it done quickly if it's not my area of expertise.
Especially regarding the first one: this seems like a very US-centric thing - or maybe a non-german thing. I've been in a bunch of interviews on both sides of the table here in Germany and I've literally never encountered a single leetcode question. At all.
I'm pretty sure that when programmers and other techies call themselves "hackers", they don't mean in the security-breaching sense. It means that you can "hack together" something.
"Programmers are not hackers. The reverse might be true but hacking is about finding problems (and exploiting them) while programming is about fixing problems."
You have to find a problem before you can fix it. All good programmers are hackers.
tbh the biggest upside of competitive programming sites was when I finally learned some Scala so that I can feel smug about my elegant one-line solutions *dabs in a very specific way that makes my arms resemble a lambda* /s
Programming is, first and foremost, understanding what the fuck you want/need the computer to do. That means that some programmers (mostly analysts) may understand workflows and processes better than the people whose job depends on their knowledge of said things.
People don't realize that as you get better at programming, the amount of code you write goes down. At least in my experience, my work day has shifted to 80% thinking about what I'm going to write and then about 20% actually writing it.
Requiring a candidate to know a specific programming language is stupid. Nearly all of the commonly used languages in industry are similar.
It's maybe more valuable to require knowledge of a specific framework, where knowledge is less transferable between popular frameworks. Nonetheless, I'd personally rather hire an engineer that solves problems and learns flexibly than one that happens to know the right tech.
I'd say this is pretty dependent on the language. For example, with C++, you need to micromanage (or at least benefit from micromanaging) a lot of things that you can get away without knowing about at all in other languages. That stuff takes time to pick up if you're self-teaching, as you can write stuff that looks like it works without knowing it's half as fast as it could be because you aren't making use of move semantics; and if a colleague is teaching you, then that's time they're not spending directly doing their own work. On the other hand, someone with TypeScript experience could write pretty decent JavaScript from the get-go.
C++ is unique in that it is wildly dominant in its niche. I am sure that any developer who has worked with another object oriented, manually memory managed, systems programming language (are there any other popular ones out there?) should have no trouble picking up C++.
I used to agree, but now I'm not so sure. There are huge time savings in having someone already familiar with a specific technology. They've run across an issue before and can quickly find the solution.
For example, I started learning Elixir a little over a year ago. I struggled with how to get it to change data in place, and the answer is that you don't. You work with data in an immutable way; you make a copy with the change made and throw away the original. Once you get used to it, this works very nicely, and Elixir has quickly become one of my favorite languages. However, few other languages force you to work immutably, and nobody does it voluntarily. It takes a bit to get your head around it, and you'll take a lot longer on any given task until you do.
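For anyone curious, the same copy-with-the-change style can be sketched in Python with frozen dataclasses (Elixir enforces it everywhere; in Python it's opt-in, and this example is mine, not from any Elixir docs):

from dataclasses import dataclass, replace

@dataclass(frozen=True)  # instances can't be mutated after creation
class Account:
    owner: str
    balance: int

def deposit(acct: Account, amount: int) -> Account:
    # No in-place change: build a copy with the new balance instead.
    return replace(acct, balance=acct.balance + amount)

a = Account("alice", 100)
b = deposit(a, 50)
print(a.balance, b.balance)  # 100 150 -- the original is untouched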
I generally agree with this; there are specific circumstances, but for the most part it's true.
I went from a C# position to PHP, to Python, to Perl, all with little or no experience with what I was jumping into. There are different nuances, and the syntax might take a bit to get used to, but as long as someone understands the how and why of what their code is doing, that can pretty easily transfer to most other languages. It's all about the fundamentals.
It's not a black and white issue. "Jack of all trades, master of none" vs "expert of one". Both have their place and I think it's better to have a mix than just one or the other.
I've seen Python newcomers writing code as if they were writing in another language. They don't know about dataclasses, operator overrides, __init__ vs __new__, metaclasses, __init__.py vs __main__.py, @property, match, the walrus operator, or-assignments, type hints, or the common pitfalls of Python like mutable defaults, and a bunch of other things.
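The mutable-defaults one in particular bites nearly every newcomer; a quick demonstration:

def append_bad(item, bucket=[]):      # the [] is created ONCE, at def time
    bucket.append(item)
    return bucket

print(append_bad(1))  # [1]
print(append_bad(2))  # [1, 2] -- surprise: same list as the first call

def append_good(item, bucket=None):   # the standard workaround
    if bucket is None:
        bucket = []
    bucket.append(item)
    return bucket

print(append_good(1))  # [1]
print(append_good(2))  # [2]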
Knowing a language in-depth helps you write DRY code, avoid its pitfalls, and handle things like debugging, profiling, and other tooling better. Newcomers have to learn all of that first, regardless of their experience with other languages.
A lot of stuff is transferable, for sure, but every language uses different idioms, covers different paradigms, and so on. It's good to have at least one expert on the team to teach others, and to have people flexible enough to switch or willing to learn. Having only experts can mean a static team unwilling to experiment with or use better programming languages or technologies. Having only beginners or mid-level users of a language can produce functional but sub-optimal code. YMMV.
It is better to find a developer that has experience with the language features you use rather than one that is experienced in the exact language you use. For example, I work on distributed systems in Java/GoLang/Python. We want candidates that understand how to write concurrent logic and stay away from people who are just Java web developers.
The big issue is doing a coding interview with candidates. We have a standard, straightforward problem that candidates need to solve by filling in a stubbed-out method. We have it in Java and have ported it to GoLang. If we have to interview a candidate who does not know either of those languages, we would need to find a language that the candidate knows and that we know well enough to port the problem to. We would also have some difficulty digging into design specifics like choice of concurrency primitives.
What's funny about this comic now is the second one has become very attainable in the years since it was released. The concept still applies though. Some things are a lot harder than they seem on the surface.
I've had to work with Python in three projects in the past five years and I consider it one of the hardest programming languages, for anything but very short scripts.
You don't get proper compiler assistance, unless you have 100% test coverage. You don't get a helpful text editor. You don't usually get helpful type hints in libraries you use, so you have to genuinely just study the documentation and/or code. You get tons of quirky behavior in the stdlib, build tools, async stack, imports. You get breaking changes in minor versions of the language.
I find writing code in Python extremely mentally taxing, because you get so little assistance that you have to think of everything yourself.
Yeah, we invested a lot of time into type hinting and checking, but mypy would never exit without warnings and errors, because many libraries we were using had no type hints.
It was also just exhausting/cumbersome, having to write type hints everywhere, as there's no type inference.
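To be concrete, a toy example of the kind of annotation we were writing everywhere (made up, not from the real codebase):

from decimal import Decimal

def apply_discount(price: Decimal, percent: float) -> Decimal:
    # Annotated by hand; mypy won't infer the parameter types for you.
    return price * (Decimal(1) - Decimal(str(percent)))

apply_discount(Decimal("9.99"), 0.15)   # fine
# apply_discount("9.99", 0.15)          # mypy: incompatible type "str"

Useful when it works, but you pay for it on every single signature.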
But yeah, we always joked that someone should create TypeScript for Python – Typhon.
I'm sorry to say this, but PyCharm is precisely what we were using. I do consider it the best Python editor, but it's several classes below IntelliJ for Java/Scala/Kotlin or even the extremely new RustRover for, well, Rust. And I'd say roughly at the level of KATE (a non-smart text editor) with just the rust-analyzer language server hooked up.
It is extremely impressive what PyCharm manages to analyze in Python, but other languages offer similarly good tooling out of the box, or make such analysis much easier by having static types.
Agree. Also, just in general, I find many things in Python very odd and syntactically isolated to some extent. Constructors, lambda, and dictionaries in particular are extremely whack.
I don't know if I qualify as a full programmer; I'm an actuary, but 90% of my work is in Python, 5% SQL, and 5% Excel. I love Python because it's flexible as fuck: I can connect to the SQL server, send the queries to a pd.DataFrame, process the information, scrape some webpage for additional information needed, and finally export to an Excel file that the accounting team can use. I don't write fully functional programs, but small specific scripts for different tasks. R is another popular programming language among actuaries and statisticians, but I haven't found anything that R can do that I can't do in Python.
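One of those scripts looks roughly like this (server, table, and file names are placeholders, not my real setup):

import pandas as pd
from sqlalchemy import create_engine

# Placeholder connection string; real one would include driver details.
engine = create_engine("mssql+pyodbc://user:pass@reporting-server/claims_db")

claims = pd.read_sql("SELECT policy_id, paid, reserve FROM claims_2023", engine)
summary = claims.groupby("policy_id")[["paid", "reserve"]].sum()

summary.to_excel("claims_summary.xlsx")  # straight to the accounting team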
"I don't write fully functional programs, but small specific scripts for different tasks."
This is exactly why your experience is different and you like Python better than many others. You are using Python as it was meant to be used and where it excels: small scripts.
When people say they don't like Python they mean that Python does a really, really bad job when it comes to larger systems. Static analysis becomes exponentially more important in larger systems and Python has basically 0 of that.
But as long as you stick to relatively small stuff (less than a few thousand lines), Python is pretty nice and fast to develop in.
I'd say if you program then you're a programmer. What you're thinking of is more of a software engineer, i.e. someone who architects and creates software.
I'm a scientist that has been coding almost exclusively in Python for the past decade and I strongly disagree.
Python is great at being the glue that holds everything together, and every crunchy part of the program is handled by a library anyways.
I code with two terminals, one for IPython and one for vim, and you don't need anything else. The beauty of Python is that it's not a language so full of boilerplate that you need an IDE to type it for you to be remotely productive.
Overall, Python is a language made to be used by people that need to make something that just works and don't need to spend years learning programming paradigms and industry practices. Fortran and C are so unwieldy in comparison and everything more modern lacks the expansive and diverse libraries of Python.
"Overall, Python is a language made to be used by people that need to make something that just works"
This is why you find it easy, and why the person you replied to finds it a big pain. The friction other languages would give you exists to provide structure on a larger scale that makes that guy's work easier. Like you implied, different languages for different jobs.
Technical leads are not rational beings, and lots of software is developed from an emotional standpoint.
Engineering is trade offs, every technical decision you make has a pro/con.
What you should do is write out the core requirements/constraints. Then you weigh the choices and select the option that best meets them.
What actually happens is someone really likes X framework, Y programming language or Z methodology and so decides the solution and then looks for reasons to justify it.
Currently the obvious tell is if they pitch Rust. I am not saying Rust is bad, but you'll notice they will extol the memory safety or performance and forget about the actual requirements of the project.
I would amend that to "if they pitch any language".
The best language is almost universally "whatever we already use" or for new projects "whatever the team is most familiar with". It should occasionally be reconsidered, and definitely try out new languages, but actually switching to the new language after trying it out? That should be very very rare.
The team's/organisation's knowledge is a huge factor, but it's easy to fall into a trap where, no matter what the problem is, the solution is X language.
If I have an organisation that knows C# and we need to build a web application, I would suggest we learn Node.js and TypeScript rather than invest in a solution that turns C# into web pages.
Especially if the other person uses some stupid bloat like MIUI. I assume it must be a real hot mess under the hood if, in the process of adding new features, they broke support for standard stuff: last time I needed to do something on two people's Xiaomis, these shitboxes didn't show the password below the wifi QR code. And it has this thing with accent colors derived from the wallpaper but absolutely no control over them, unlike in standard Android, despite the fact that the feature landed two major releases ago.
No programming language, development philosophy, or technology can save you from projects and businesses lacking clarity. Your ability to communicate and be understood is as important as, and perhaps more important than, the quality of your ideas. Consistency is better than perfection.
I keep trying to "better myself" by learning programming, but I'm just a fucking moron, I'm not capable. That and I really have 0 interest in it, but I can't make enough to survive as a single individual being a fucking moron...
The people who really succeed are the ones so obsessed with tech that they wrote their first app at the age of 10 and were in the high school robotics club.
Some programmers are software engineers. They solve problems, sometimes problems with great ambiguity or non-straightforward solutions.
And some programmers are... code technicians? They understand and write code, but their job seldom involves problem solving. Oftentimes they're asked to code an already solved problem, or a mostly solved one.
This is not a diss. I was in the second camp for a while. But it hurts your career to stay in that. So be careful.
Totally agree. I had the fortune to read Domain-Driven Design by Eric Evans early in my career. While the book may be outdated, it helped me understand that my job is to turn the unknown or ambiguous into code. I find that much more exciting than being a coder.
This one might be a bit controversial, but has rung true in my general experience. Probably a lot of exceptions to these rules, but here goes:
You don't really know a programming language until you understand a fair amount of the standard library and how packages/modules/dependencies work. Syntax is pretty easy, and any mainstream language will work just fine for solving basic leet-code style problems. But when you really spend a lot of time working with a language, you're going to spend more time learning about common libraries and how to manage dependencies. If you're working with a language like C++ or Java, this could also include build systems and how to use them.
Another precursor to being able to say that you know a language is that you should also be familiar with best practices (i.e. how to name modules, how to write documentation, etc.) and common pitfalls (undefined behavior, etc.). This is one of the hardest parts about learning a new language, in my opinion, because the language may not necessarily enforce these things, but doing them the wrong way can make your life very difficult.
JS has confusing behavior, not undefined behavior. Its specs are well defined and backwards compatible to a fault, making some things unintuitive and harder to learn if you don't learn the history of the language.
Problems with both should be avoided by learning and using standard practices. (Don't pretend C is object-oriented, always use === instead of == in JS, etc.)
In complete agreement:
Result types are awesome, all future languages should be designed around them.
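They're also easy to approximate in languages that don't ship one; a minimal sketch in Python 3.10+ (my own toy, not a stdlib type):

from dataclasses import dataclass
from typing import Generic, TypeVar, Union

T = TypeVar("T")
E = TypeVar("E")

@dataclass(frozen=True)
class Ok(Generic[T]):
    value: T

@dataclass(frozen=True)
class Err(Generic[E]):
    error: E

Result = Union[Ok[T], Err[E]]

def parse_port(raw: str) -> Result[int, str]:
    if not raw.isdigit():
        return Err(f"not a number: {raw!r}")
    port = int(raw)
    if not 0 < port < 65536:
        return Err(f"out of range: {port}")
    return Ok(port)

match parse_port("8080"):       # the caller has to consider both cases
    case Ok(value):
        print("listening on", value)
    case Err(error):
        print("bad config:", error)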
Counterpoint: knowing a programming language doesn't matter if you can solve problems. A competent programmer can pick up a new language and be productive within a few months. That is, a new language within the same paradigm: going from an imperative language to a functional language can be a drastic shift, but going from one imperative language to another is easy. If you can't do that as an intermediate to senior developer, you're not a competent programmer IMO.
The real skills of a good programmer are things like problem solving, debugging, understanding how to write readable and maintainable code, etc. Having deep knowledge of a specific programming language or languages is helpful and enables you to work faster, but if you're only a skilled developer in the languages you know - if you aren't capable of pivoting those skills to another language - you aren't a skilled developer IMO.
Agreed overall, you will still be competent switching from one language to another, but intricacies and nuance matter a lot here. You may have enough knowledge to solve problems, but will you have enough knowledge to avoid creating new ones too? Like performance issues, or memory leaks, or other unwanted behavior? C++ is a great example here: someone that's smart but inexperienced might just be dangerous enough to start writing classes with dumb pointers without overriding the copy constructors, and this is just a recipe for disaster.
I think it would take more than a few months to develop the kind of experience you need to be aware of these issues and avoid them. And while C++ is a very easy example to point out here, pretty much all languages have their share of footguns to be aware of, and it just takes time to learn them. A "deep knowledge" of a language is not just about being faster and more productive; it's also about not creating more issues than the ones you're solving.
OTOH, you need to be good at the same kinds of reasoning that lead one to be good at math. Not knowing much math isn't a problem, but not being able to learn math is probably a dealbreaker.
I'm bad at math and struggled heavily through calc 2 and barely passed with a D+ but had little issue with data structures and algorithms (except when the algorithms were written in math notation, but still got through it after being explained in a logical set of steps instead).
The reason programming curriculums are so math-heavy is to teach logic.
You're either right or wrong in math. There is ONE answer to the formula. You can sometimes get there different ways though. The logic on your path is the key.
I'd argue that you do need to be good at math to be an effective programmer, it's just that that doesn't mean what a lot of people think it means. You don't need to know all the ins and outs of quadratics, integrals, and advanced trigonometry, but I think you do need to have a really solid, gut-level understanding of basic algebra and a bit of set theory. If you're the sort of person whose head starts to swim when you see "y=3x+2", you're going to find programming difficult at best.
Myth: code can be ugly as long as it works, don't spend company time on making it look good or on minor optimizations.
The truth is that you can tell when effort has been put into a job. Even if it just works, the lack of discipline means that in the end it will be difficult to maintain and probably will fail in unexpected situations.
Every language has its conventions, but if I spot more than a line of separation between blocks of code, that's a common telltale sign of a noob. Run from that shit.
The idea is that often you could be using actual logical separations (functions etc.) instead of whitespace. IMO whitespace has its place though, including for this.
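A trivial made-up example of what I mean:

# Instead of this, blocks separated only by a blank line inside one function...
def process(path):
    with open(path) as f:
        orders = [line.split(",") for line in f]

    print(f"{len(orders)} orders loaded")

# ...name the chunks, so the separation IS the structure:
def load_orders(path):
    with open(path) as f:
        return [line.split(",") for line in f]

def report(orders):
    print(f"{len(orders)} orders loaded")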
"that 2013 game runs at a smooth 60 fps. This medern game running at quadruple the resolution with raytracing sometimes dips to 58 fps on the same hardware. Devs must be lazy, they just need to add OPTIMIZATION to the game
I don't know what it's called, but it's a common phenomenon: available room will be exploited. It's exactly why computers nowadays don't feel faster than computers from a decade or two ago: they do so much more because they can.
Stuff like Electron would've been impossible in 2000 or 2005: it's just a behemoth in terms of computational needs and power consumption. Earlier computers would've struggled endlessly with it. Current hardware, however, makes it seem as fast as previous tech.
When you release something, your work is not done. You have to maintain it, fix bugs, release patches, and probably the worst part, keeping it up to date.
For example, Apple decides to deprecate some API, or decides to switch CPU architecture, or for the millionth time changes how app signing works, or adds some new security feature that breaks your app. Now you need to make your app work properly on the new platform, switch APIs, all the fun. Or there's some critical vulnerability in a library you used, and customers are deleting your app from their computers (a lot of companies use automated scanners that check against published CVEs). It's most fun when you learn that the new version that fixes the vulnerability completely breaks compatibility with the old one, and now you have to rewrite all the code that used that library.
Also, maintaining open source projects is not fun. It's a lot of work, in most cases unpaid, thankless, and building a community around a project is really hard.
People think of computers as magic. It would be nice to make people understand that they're not. A computer is pretty much a dumb machine where we put our intelligence to work.
Truth: It's highly probable you are neurodivergent.
While accurate numbers are not available, I have seen people estimate that 20% of people working in FAANG are neurodivergent. If coding comes naturally to you but the laundry is your mortal enemy, it's worth learning about ADHD/ASD and other common disorders. Being a coder can be a sign; the immediate feedback helps a bunch of us, or as Russell Barkley says, "when you solve a problem on paper, NOTHING HAPPENS".
That Python is the most readable language. Forcing an indentation style solves only part of the problem. Python programmers have a tendency to write a bunch of named parameters all on one line, and it's a mess.
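Something like this (a signature I made up, but representative of what auto-generated docs spit out):

def read_table(filepath_or_buffer, sep=',', delimiter=None, header='infer', names=None, index_col=None, usecols=None, dtype=None, engine=None, converters=None, skiprows=None, nrows=None, na_values=None, parse_dates=False, compression='infer', encoding=None, **kwargs): ...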
This is not enlightening. It might as well go on the shelf next to the worst regex you've ever seen. The automated doc generator needs to break these up to put one arg on each line, or just omit it altogether and let the detailed docs handle it.
It's not just the doc generator, either. I see this kind of style all the time in Python code. It's unreadable and it also makes it harder to figure out diffs when a parameter in the middle is changed (though it's helped by color coding for pull requests on GitHub and the like).
It's almost like the language attracted a bunch of people who thought indentation was the only thing you needed to make readable code. No further thought put into it.
I don't disagree that this is hard to read, but I feel it's worth mentioning that Python has a pretty acceptable style guide. The problem is, it's far less common in Python to bundle parameters into some holding object. So here you have a massive function that has to accept a lot all at once. In use it's probably not as bad looking, however.
And at least it actually explains all the damn parameters. It's a lot nicer than seeing function parameters you don't understand, where all you have is the name. This is not limited to Python, either.
There are plenty of other languages that have named parameters but no holding object. You format it like this:
some_func(
    foo: 1,
    bar: 2,
    baz: 3,
)
And this works fine. Of course, not everyone does that, but I almost never see it done in Python.
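For reference, the direct Python equivalent is keyword arguments, one per line:

some_func(
    foo=1,
    bar=2,
    baz=3,
)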
This style comes into conflict with rules that functions shouldn't be longer than 20 lines or whatever. The solution to that is to be relaxed about the line-count rule. I'd rather see 40 trivial lines than 20 with everything crammed up.
There are no absolutes, and most of these “myths” are at least true to some extent. Much like any paradigm (worse is better, whitebox testing, lbyl vs eafp, etc), none are universally best. And all are helpful to know about.