Not until a self-driving car can safely handle all manner of edge cases thrown at it, and I don't see that happening any time soon. The cars would need to recognize situations that may not have been explicitly programmed into them and figure out a safe way to deal with them.
"Coding" was never the source of value, and people shouldn’t get overly attached to it. Problem solving is the core skill. The discipline and precision demanded by traditional programming will remain valuable transferable attributes, but they won’t be a barrier to entry. - John Carmack
Catching up on what Carmack's been up to for the last decade has revived the fan in me. I love that 2 years after leaving Oculus to focus on AGI, this is all the hype he's willing to put out there.
Agreed! Problem solving is core to any sort of success. Whether you're moving up or on for more pay, growing tomatoes or nurturing a relationship, you're problem solving. But I can see AI putting the screws to those of us in tech.
Haven't used it much so far, since my last job didn't afford much coding opportunity, but I wrote a Google Apps Script to populate my calendar given changes to an Excel sheet. Pretty neat!
With zero Apps Script experience, I tried going the usual way, searching web pages. Got it half-assed working, then got stuck. Asked ChatGPT to write it and boom, solved with an hour's additional work.
You could say, "Yeah, but you at least had a clue as to general scripting and still had to problem solve. Plus, you came up with the idea in the first place, not the AI!" Yes! But the point is, AI made the task shockingly easier. That was at a software outfit, so I had the opportunity to chat with my dev friends and see what they were up to. They were properly skeptical/realistic about what AI can do, but they still used it to great effect.
Another example: Struggled like hell to teach myself database scripting, so ignorant I didn't know the words to search and the solutions I found were more advanced answers than my beginner work required (or understood!). First script was 8 short lines, took 8 hours. Had AI been available to jump start me, I could have done that in an hour, maybe two. That's a wild productivity boost. So while AI will never make programmers obsolete, we'll surely need fewer of them.
"Guy who was fed a pay-to-win degree at a nepotism practicing school with a silver spoon shares fantasy, to his fan base that own large publications, about replacing hard working and intelligent employees with machines he is unable to comprehend the most basic features of"
They've been saying this kind of bullshit since the early 90s. Employers hate programmers because they are expensive employees with ideas of their own. The half-dozen elite lizard people running the world really don't like that kind of thing.
Unfortunately, I don't think any job is truly safe forever. For myriad reasons. Of course there will always be a need for programmers, engineers, designers, testers, and many other human-performed jobs. However, that will be a rapidly changing landscape and the number of positions will be reduced as much as the owning class can get away with. We currently have large teams of people creating digital content, websites, apps, etc. Those teams will get smaller and smaller as AI can do more and more of the tedious / repetitive / well-solved stuff.
In all my career I haven't seen a single customer who could tell me out of the box what they need. A big part of my job is talking to all the entities involved to get the big picture: gathering information about software and hardware interfaces, visiting sites to see PHYSICAL things like sub-processes or machines.
My focus may shift to less coding in an IDE and more generating code with prompts, using AI as what it is: a TOOL.
I'm annoyed by this get-rich-quick mentality: earn a lot of money with no work, develop software without earning the skills and experience. It's like using a library for every little problem you have to solve. Worst case, you land in dependency/debug hell and waste much more time debugging stuff other people wrote than you'd spend coding it yourself and understanding how things work under the hood.
And by that time, processors and open-source AI will be good enough that any noob can ask his phone to generate a new app from scratch. You'd only need big corpo for cloud storage, and then only when the distributed systems written by AI don't work.
"the number of positions will be reduced as much as the owning class can get away with"
Well, after all, you don't hire people to do nothing. It's simply a late-stage capitalism thing. Hopefully one day we can take the benefits of that extra productivity and share the wealth. The younger generations seem like they might move us that way in the coming decades.
I really hope so. Sometimes I think the kids are alright. Like the 12-year-old owning the My Pillow idiot. Then I hear the horror stories from my schoolteacher friends.
It's worth noting that the new CEO is one of the few people at Amazon to have worked their way up from PM and sales to CEO.
With that in mind, while it's a hilariously stupid comment to make, he's in the business of selling AWS and its role in AI. Take it with the same level of credibility as that crypto scammer you know telling you that Bitcoin is the future of banking.
I'm not entirely sold on the technology, especially since immutable ledgers have been around long before the blockchain, but also due to potential attack vectors and the natural push towards centralisation for many applications - but I'm just one man and if people find uses for it then good for them.
When I last tried to let some AI write actual code, it didn't even compile 🙂
And another time when it actually compiled it was trash anyway and I had to spend as much time fixing it, as I would have spent writing it myself in the first place.
So far I can only use AI as a glorified search engine 😅
Lol sure, and AI made human staff at grocery stores a thing of the....oops, oh yeah....y'all tried that for a while and it failed horribly....
So tired of the bullshit "AI" hype train. I can't wait for the market to crash hard once everybody realizes it's a bubble and AI won't magically make programmers obsolete.
Remember when everything was using machine learning and blockchain technology? Pepperidge Farm remembers...
In their worldview engineers are just material, and all that matters in the world is knowing how to do business. So it just makes sense to them that one can guide and use and direct engineers to replace themselves.
They don't think about fundamentals; they really believe it's some magic that happens all by itself, that you just have to direct energy at it and something will come out.
Lysenko vibes.
This wouldn't happen were the C-suite not mostly composed of bean counters. They really think they are to engineers what officers are to soldiers. The issue is, an officer must know perfectly everything a soldier knows plus their own specialty, and also bears responsibility. Bean counters in general have less education, experience and intelligence than the engineers they direct, and they also avoid responsibility at every turn.
So, putting themselves as some superior caste, they really think they can "direct progress" to replace everyone else the way factories with machines replaced artisans.
It's literally a whole layer of people who know how to get power, but not how to create it, and imagine weird magical stuff about things they don't know.
Just the other day, the Mixtral chatbot insisted that PostgreSQL v16 doesn't exist.
A few weeks ago, ChatGPT gave me a DAX measure for an Excel pivot table that used several DAX functions in ways they cannot be used.
The funny thing was, it knew and could explain why those functions couldn't be used when I corrected it.
But it wasn't able to correlate and use that information to generate a proper function. In fact, I had to correct it for the same mistakes multiple times and it never did get it quite right.
Generative AI is very good at confidently spitting out inaccurate information in ways that make it sound like it knows what it's talking about to the average person.
Basically, AI is currently functioning at the same level as the average tech CEO.
The job of CEO seems far easier to replace with AI. A fairly basic algorithm with weighted goals and parameters (chosen by the board) + an LLM + a character avatar would probably perform better than most CEOs. Leave out the LLM if you want it to spout nonsense like this Amazon Cloud CEO.
Worst case scenario, the AI fucking loses it and decides to do some wacky but weirdly effective shit. Like spamming out 1-width units en masse in Hearts of Iron 4.
I just want to remind everyone that capital won't wait until AI is "as good" as humans, just when it's minimally viable.
They didn't wait for self-checkout to be as good as a cashier; they didn't wait for chatbots to be as good as human support; and they won't wait for AI to be as good as programmers.
They'll try the opposite. It's what the movie producers did to the writers. They gave them AI generated junk and told them to fix it. It was basically rewriting the whole thing but because now it was "just touching up an existing script" it was half price.
They won't, and they'll suffer because of it and want to immediately hire back programmers (who can actually do problem solving for difficult issues). We've already seen this happen with customer service reps - some companies have resumed hiring customer service reps because they realized AI isn't able to do their jobs.
And because of all the theft and malfunctions, the nearby supermarkets replaced the self-checkouts with normal cashiers again.
If it's AI doing all the work, the responsibility falls on the remaining humans. There'll be interesting lawsuits when there's the inevitable bug that the AI itself can't figure out.
We saw this happen with Amazon's cashier-less stores. They were actively trying to use a computer-based AI system, but it didn't work without thousands of man-hours from real humans, which is why those stores are going away. Companies will try this repeatedly until they get something that works or they run out of money. The problem is, some companies have cash to burn.
I doubt the vast majority of tech workers will be replaced by AI any time soon. But they'll probably keep trying because they really really don't want to pay human beings a liveable wage.
It's really funny how AI "will perform X job in the near future," yet you barely, if ever, see articles saying that AI will replace CEOs in the near future.
Somewhere there is a dev team secretly programming an AI to take over bureaucratic and managerial jobs while disguising it as a code-writing AI to their CTO and CEO.
The latter are thieves who've inherited a state from the Soviet leadership. They have a layman's idea of what a state and a country are, what history itself is, plus whatever a taxi driver would say. For the last 20 years they've been trying to apply that weird idea to reality, as if playing Hearts of Iron, because they want to be great and to be in charge of everything that happens.
The former heard in school that there were industrial revolutions and such, and they too want to be great, so they believe every stupid hype about someone being replaced by a great new technology, and of course they want to be in charge of that process.
While in actuality, with today's P2P technologies, CEOs are the most likely to be replaced, if we use our common sense, and without "AI," of course. Just by decentralized systems allowing much bigger, more powerful and more competitive cooperatives than before, ones that form and disband very easily.
'Soon' is a questionable claim from a CEO who sells AI services and GPU instances. A single faulty update caused worldwide downtime recently. Now imagine all infrastructure written with today's LLMs, which sometimes hallucinate so badly they claim the 'C' in CRC-32C stands for 'Cool' (it stands for Castagnoli).
I wish we could also add a "Do not hallucinate" prompt to some CEOs.
I'm actually really impressed with the autocomplete IntelliJ is packaged with now. It's really good with golang (probably because golang has a ton of code duplication).
Yeah, there are people who can "in general" imagine how this will happen, but programming is exactly 99% not about "in general"; it's about specific "dumb" conflicts with objective reality.
People think that what they can generally imagine of the task is the most important part, and since they don't actually do programming or anything that requires dealing with those small details, they plainly ignore them, because those conversations and opinions exist in subjective, bendable reality.
But objective reality doesn't bend. Their general ideas without every little bloody detail simply won't work.
Not really; it's doable with ChatGPT right now for programs that have a relatively small scope. If you set very clear requirements and decompose the problem well, it can generate fairly high-quality solutions.
Sounds like he's just repeating a common meme. I don't see anything about higher level design that would make it more difficult for an AI (hypothetical future AI, not the stuff that's available now) compared to lower level tasks.
AI is terrible at solving real problems through programming. As soon as the problem is not technical in nature and needs a decision made based on experience, it falls flat on its face.
Well, that would be the 3rd or 4th thing during my career that was supposed to make my job a thing of the past or at least severely reduce the need for it.
(If I remember correctly, OO design was supposed to reduce the need for programmers, as were various languages; then there was outsourcing, visual programming, and on the server side I vaguely remember various frameworks being hailed as reducing the need for programmers because people would just be able to wire modules together with config or some shit like that. Additionally, many libraries and frameworks out there aim to reduce the need for coding.)
All of them, even outsourcing, have made my skills even more in demand. Even when they did reduce the amount of programming needed without actually increasing it elsewhere (a requirement most of them already failed), the market for software responded by expecting the software to do more things, in fancier ways, with data from more places, effectively wiping out the coding-time savings and then some.
Granted, junior developers sometimes did suffer because of those things, but anything more complicated than monkey-coder tasks has never been successfully replaced, fully outsourced, or had the need for it removed, at least not without the need popping up somewhere else or the expected feature set of software increasing to take up the slack.
In fact I expect that AI, like outsourcing before it, will in a decade or so have really screwed the market for Senior Software Engineers from the employers' point of view (but made it a golden age for employees with those skills) by removing the first part of the career path that builds that level of experience. And this time around they won't even be able to import the guys and gals in India who got to learn the job because the junior positions were outsourced there.
I didn't start my tech career after high school because every piece of career advice I got was "all the jobs are going to India." Could've had 10 more years' experience, but instead I joined the military. Ugh!
I'd believe AI will replace human programmers when I can tell it to produce the code for a whole entire video game in a single prompt that is able to stand up to the likes of New Vegas, has zero bugs, and is roughly hundreds of hours of content upon first play due to vast exploration.
In other words, I doubt we'll see human programmers going anywhere any time soon.
Edit:
Reading other replies reminded me how I once, for fun, tried using a jailbroken Copilot to do Python stuff slightly above my already basic coding skill, and it gave me code that tried importing something that absolutely doesn't exist. I don't remember what it was called, since I deleted the file while cleaning up my laptop the other day, but I sure as hell looked it up before deleting it and found nothing.
Honestly, GPT has strengthened my coding skills... for the simple reason that the handful of times I've asked it to do something the response I get back is so outlandish that I go "That CAN'T be right" and figure out how to do it myself...
Research with extra steps... I get it, but still..
I feel like it's whispering bad advice at me while I'm typing. It's good at autocompleting the most rudimentary stuff, but I have a hard time imagining it completing even one file without injecting dangerous bugs, let alone a large refactor.
The best Copilot can do is autofill lines that everyone's written a million times. That's not nothing, but it ain't replacing a human brain any time soon.
To be honest, this could be an example of where AI could do marginally better. I don't mean that because of code quality or functionality. I mean it in the sense of MS software getting absolutely fucked by the internal competition and stack-ranking fostered during the Ballmer years. The APIs are inconsistent and there is a ton of partially implemented stuff that will never be pushed to completion because everyone who worked on it was fired.
An AI might be able to implement things without intentionally sabotaging itself but, because LLMs are in the forefront of what would be used and do not have the capability of intention or understanding context, I'm a bit pessimistic.
I taught myself Python in part by using ChatGPT. Which is to say, I coaxed it through the process of building my first app, while studying from various resources, and using the process of correcting its many mistakes as a way of guiding my studies. And I was only able to do this because I already had a decent grasp of many of the basics of coding. It was honestly an interesting learning approach; looking at bad code and figuring out why it's bad really helps you to get those little "Aha" moments that make programming fun. But at the end of the day it only serves as a learning tool because it's an engine for generating incompetent results.
ChatGPT, as a tool for creating software, absolutely sucks. It produces garbage code, and when it fails to produce something usable you need a strong understanding of what it's doing to figure out where it went wrong. An experienced Python dev could have built in a day what took me and ChatGPT a couple of weeks. My excuse is that I was learning Python from scratch, and had never used an object oriented language before. It has no excuse.
ChatGPT only gives good answers if you ask the right questions, and to do that you have to be better than a novice. It's great as a rubber ducky that answers back, but its usefulness is a reflection of the user.
20 years ago at a trade show, a new module-based visual coding tool was introduced in my field with the claim "You'll never need another programmer."
Oddly enough, I still have a job.
The tools have gotten better, but I still write code every day because procedural programming is still the best way to do things.
It is just now reaching the point that we can do some small to medium scale projects with plug and play systems, but only with very specific equipment and configurations.
The pace of change is about every five years, and some elements are always in transition.
All-in-one turnkey solutions are always one to two cycles behind, so they may work great with the stuff I'm already replacing.
I think these are honest attempts to simplify, but by the time they have it sorted it's obsolete. If I have to build modules anyway to work with new equipment, I might as well just write all the code in my native language.
These also tend to be attempts at all-in-one devices, requiring you to use only devices compatible with those subsystems. I want to be able to use the best tech from whatever manufacturer. New and fancy almost always means a command-line interface, which again means coding.
That guy has never seen AI code before. It regularly gets even simple stuff wrong. What's especially good is when it gives made-up crap: it tells you about a method or function you can use but doesn't tell you where it got it. And then you're like "oh wow, I didn't realize that was available," and then you try it and realize it's not part of the standard library, and you ask it "where did you get that?" and it's like "oh yeah, sorry about that, I don't know."
My absolute favorite is when I asked copilot to code a UI button and it just pasted "// the UI element should do (...) but instead it is doing (...)" a dozen times.
Like, clearly someone on Stack Overflow asked for help, got used as training data, and confused Copilot.
I don't get why the story isn't that AI would help programmers build way better things. If it can actually replace a programmer, I think it's probably just as capable of replacing a CEO. I bet that's the better use case: replacing the CEO.
It's a bit nuts, actually, when I think about it. AI could really be useful for replacing CEOs. The more I think about it, the more it makes sense. It's one career that makes sense for it to replace.
Something I've always found funny about "AI will replace programmers soon" is that it means AIs can create AIs, and isn't this basically the end of the economy?
Every office worker is out of a job just like that, and labourers only have as long as it takes to sort out the robot bodies; then everyone is out of a job.
You thought the great recession was bad? You ain't seen nothing!
But I just keep going back to the idea that no invention in history has ever reduced our workload. It has always only shifted the work to the other end of what the invention can produce. I always expect a human is still needed to glue some process together, and in that niche little area whole industries are created and the rest of us have to learn them.
Let me weigh in with something. The hard part about programming is not the code. It is in understanding all the edge cases, making flexible solutions and so much more.
I have seen many organizations with tens of really capable programmers who can implement anything. Now, most management barely knows what they want or what the actual end goal is. Since managers aren't able to deliver perfect products every time even with really skilled programmers, if I subtract the programmers from the equation and substitute a magic box that delivers code whenever they ask for it, the managers won't do much better. The biggest problem is not knowing what to ask for, and even if you DO know what to ask for, they typically ignore all the fine details.
By the time there is an AI intelligent enough to coordinate a large technical operation, AIs will be capable of replacing attorneys, congressmen, patent examiners, middle managers, etc. It would really take a GENERAL artificial intelligence to be feasible here, and you'd be wildly optimistic to say we are anywhere close to having one of those available on the open market.
I agree with you completely, but he did say no need for 'human programmers', not 'human software engineers'. The skill set you're describing is, I would put forward, one of, if not the, biggest differences between the two.
This is really splitting hairs, but if you asked that cloud CEO whether he employed programmers or 'software engineers', he would almost certainly say the latter. The larger the company, the greater the chance they have what they consider an 'engineering' department. I would guess he employs zero "programmers," in the sense of 'engineeringless programmers'.
And anyone who believes that should be fired, because they don't understand the technology at all or what is involved in programming for that matter. At the very least it should make everyone question the company if its leadership doesn't understand their own product.
That is what happens when you mix a fucking CEO with tech: "How many workers can I fire to make more money and boast about my achievements at the annual conference of mega-yacht owners?" whereas the correct question should obviously always have been (unless you are a psychopath) "How can I use this tech to boost the productivity of my workers so they can produce the same amount of work in less time and have more personal time for themselves?"
Also, these idiots always forget the "problem solving" part of most programming tasks, which is still beyond the capability of LLMs. Sure, have LLMs do the mundane stuff so that programmers can spend time on stuff that is more rewarding? No, instead let's try to fire everyone.
To predict what jobs AI will replace, you need to know both of the following:
1. What's special about the human mind that makes people necessary for completing certain tasks
2. What AI can do to replicate or replace those special features
This guy has an MA in industrial engineering and an MBA, and has been in business his whole career. He has no knowledge of psychology, and only whatever knowledge of AI he's picked up on the side as part of his work.
He's not the guy to ask. And yet, I feel like this is the only kind of guy anyone asks.
When my job was outsourced a few years back, I was thinking there's probably a boatload of Indians coming out of management schools who would do a great job at C-level! For a fraction of the price.
The sentiment on AI over the span of 10 years went from "it's inevitable it will replace your job" to "nope, not gonna happen." The difference: back then, the jobs it was going to replace were not tech jobs. Just saying.
From the very beginning people were absolutely making connections between AI and tech jobs like programming.
The fuck are you talking about? Are you seriously trying to imply that now that it's threatening tech jobs (it's not), suddenly the narrative around how useful it will be has changed (it hasn't)?
When exactly do you have in mind? I'm talking about automation; roughly around 2010 the discourse was primarily centered on blue-collar jobs. The discussion was about those careers becoming obsolete if AI ever advanced to the point where it took little to no humans to perform the tasks.
Back then, AI with regard to white-collar jobs was nowhere near the primary focus of the discourse, much less programming.
Tech nerds back then were all gung-ho about it making entire careers obsolete in the near future. Truck drivers were supposed to be a dead career by now. They absolutely do not hold the same enthusiasm now that it's being said about their own careers.
I managed to get an AI to build Pong in assembly. They are pretty cool things, but not sci-fi level just yet. And I didn't just say "build Pong in assembly"; I had to hand-hold it a little. You need to be a programmer to understand how to guide the AI to do the task.
That was something very simple; I doubt you can get it to do more complex tasks without a lot more back and forth.
To give you an example, I had a hard time getting it to understand that the ball needed to bounce off at an angle if intercepted at an angle; it just kept snapping it to 90° increments. I couldn't fix it myself because I don't know assembly well enough to really get into the weeds, so I was stuck until I finally got the AI to do what I wanted. I sort of understood what the problem was: there was a number somewhere in the system that needed to be made negative, but it just kept setting the number to a fixed value. A non-programmer wouldn't understand that that was the problem, so they wouldn't be able to explain to the AI how to fix it.
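For what it's worth, the fix it kept missing is tiny once you can name it. Here's a minimal sketch in Python rather than assembly, with variable names I made up for illustration; the real thing lived in registers somewhere:

```python
def bounce_off_paddle(vx: float, vy: float) -> tuple[float, float]:
    """The actual fix: negate the horizontal component, keep the vertical.
    Preserving vy means the ball leaves at the same angle it arrived."""
    return -vx, vy

def bounce_snapped(vx: float, vy: float) -> tuple[float, float]:
    """Roughly what the AI kept producing: overwrite the velocity with
    constants, so every rebound snaps to a 90-degree direction."""
    return -2.0, 0.0
```

bounce_off_paddle(2.0, 1.5) returns (-2.0, 1.5), the same angle mirrored; the broken version returns (-2.0, 0.0) no matter how the ball came in, which is exactly the snapping I was fighting.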
I believe AI is going to become an unimaginably useful tool in the future and we probably don't really yet understand how useful it's going to be. But unless they actually make AGI it isn't going to replace programmers.
If they do make AGI all bets are off it will probably go build a Dyson Sphere or something at that point and we will have no way of understanding what it's doing.
I tried to get it to build a game of checkers, spent an entire day on it, and in the end I could have built the thing myself. Each iteration got slightly worse, and each fix broke more than it corrected.
AI can generate an "almost-checkers" game nearly perfectly every time, but once you start getting into more complex rules like double jumping, it just shits the bed.
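That tracks with what the rule actually demands: a single capture is a flat lookup, but double jumps make move generation recursive over intermediate board states. A rough Python sketch of just that part, using a toy board representation of my own (kings and forced captures omitted):

```python
def jump_chains(board, pos, player="w", path=None):
    """Return every jump sequence for player starting at pos.

    board is a dict {(row, col): "w" or "b"}; each chain is the list of
    squares visited, and multi-jumps fall out of the recursion."""
    path = path or [pos]
    row, col = pos
    chains = []
    for dr, dc in ((-1, -1), (-1, 1), (1, -1), (1, 1)):
        over = (row + dr, col + dc)          # square being jumped over
        land = (row + 2 * dr, col + 2 * dc)  # square being landed on
        if (board.get(over, player) != player      # enemy piece to capture
                and land not in board              # landing square is empty
                and 0 <= land[0] < 8 and 0 <= land[1] < 8):
            nxt = dict(board)         # copy, so sibling branches stay intact
            del nxt[over]             # remove the captured piece
            nxt[land] = nxt.pop(pos)  # move the jumping piece
            tails = jump_chains(nxt, land, player, path + [land])
            chains.extend(tails or [path + [land]])
    return chains
```

The "almost-checkers" part only needs the body of that if statement once; it's the recursion, and keeping one branch from clobbering another's board state, that the generated code chokes on.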
What these headlines fail to capture is that AI is exceptionally good at bite-sized, pre-defined tasks at scale, and that is the game changer. It's still very far from being capable of building an entire app on its own. That feels more like 5-10 years out.
I haven't tried to scaffold whole projects, but otherwise that lines up with my usage of AI copilots so far.
At this point, they're good at helping you interface with the built-in commands and maybe some specific APIs, but they won't do your thinking for you. They just remove the need for some specific base-level knowledge.
Yeah, I don't see AI replacing any developers working on an existing, moderately complex codebase. It can help speed up some tasks, but it's far from being able to take a requirement and turn it into code that edits the right places and doesn't break everything.
I admit I work faster with AI help, and if people get more done in less time there might be fewer billable hours for us in the future. But AI did not replace me; a 10-times-cheaper dude from India did.
Everyone was always joking about how AI should just replace CEOs, but it turns out CEOs are so easily led by the nose that AI companies practically already run the show.
Most companies can't even give decent requirements for humans to understand and implement. An AI will just write whatever it thinks they want, and they won't have any real way to know if it's right.
They would have more luck trying to create an AI that takes whimsical ideas and turns them into quantified requirements with acceptance criteria. Once they can do that, they may stand a chance of replacing developers, but it's gonna take far more than the simpleton code generators they have at the moment, which at best are like bad SO answers you copy and paste, then refactor.
This isn't even factoring in automation testers (who are programmers), build engineers, devops, etc. Can't wait for companies to cry even more about cloud costs when some AI is just lobbing everything into lambdas 😂
How much longer until cloud CEOs are a thing of the past? Wouldn't an AI sufficiently intelligent to solve technical problems at scale also be able to run a large corporate division? By the time this is actually viable, we are all fucked.
Don't worry guys. As long as project managers think "do the thing ... like the thing ... (waves hands around) ... you know ... (waves hands around some more) ... like the other thing ... but, um, ..., different" constitutes a detailed spec, we're safe.
The argument I see most is that AI is dumb and can't do it yet, so we don't need to worry about this.
To me, it's not about whether it can or not. If the people in charge think it can, they'll stop hiring. There is a lot of waste in some big companies, so they might not realize it's not working right away.
Source: I work for a big company that doesn’t do things efficiently.
It's not like jobs will disappear in a single day. Incremental improvements will render lower-level tasks obsolete; to a degree they already have.
Someone will still need to translate the business objectives into logical structure, via code, language, or whatever medium. Whether you call that a "coder" or not, is kind of irrelevant. The nerdy introverts will need to translate sales-douche into computer one way or another. Sales-douches are not going to be building enterprise apps from their techbro-hypespeak.
I worked at a different MAANG company and saw internal slides showing that they planned on being able to replace junior devs with AI by 2025. I don't think it's going according to plan.
At the end of the day, one thing people forget about with these things is that even once you hit a point where an AI is capable of writing a full piece of software, a lot of businesses will still pay money to have a human read through, validate it, and take ownership of it if something goes wrong. A lot of engineering is not just building something for a customer, but taking ownership of it and providing something they can trust.
I don't doubt that eventually AI will do almost all software writing, but the field of software companies isn't about to be replaced by non-software people just blindly trusting an AI to do it right (and in legally compliant ways) anytime soon.
I don't think AI will replace my job any time soon when its first thought for going through a 2D matrix was to traverse it 500 thousand times and run a CPU-intensive check on each element every time, bringing my PC to a halt until I force-stopped the script.
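For contrast, the whole job is a single pass. A minimal sketch in Python with hypothetical names (the actual script and check were different, obviously):

```python
def find_matches(matrix, expensive_check):
    """Visit each cell exactly once, so the costly check runs rows * cols times."""
    return [
        (r, c)
        for r, row in enumerate(matrix)
        for c, cell in enumerate(row)
        if expensive_check(cell)
    ]

# The generated script instead re-walked the entire matrix once per element,
# multiplying the expensive check by the cell count: hundreds of thousands
# of redundant calls on even a modest grid.
```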