I predict execs will never see this despite you being correct. We replaced most of our HR department with enterprise GPT-4, and now almost all HR inquiries where I work are handled through a bot. It daydreams HR policies and sometimes deletes your PTO days.
"Workforce" doesn't produce innovation, either. It does the labor. AI is great at doing the labor. It excels in mindless, repetitive tasks. AI won't be replacing the innovators, it will be replacing the desk jockeys that do nothing but update spreadsheets or write code. What I predict we'll see is the floor dropping out of technical schools that teach the things that AI will be replacing. We are looking at the last generation of code monkeys. People joke about how bad AI is at writing code, but give it the same length of time as a graduate program and see where it is. Hell, ChatGPT has only been around since June of 2020 and that was the beta (just 13 years after the first iPhone, and look how far smartphones have come). There won't be a huge demand for workforce in 5 years, there will be a huge portion of the population that suddenly won't have a job. It won't be like the agricultural or industrial revolution where it takes time to make it's way around the world, or where this is some demand for artisanal goods. No one wants artisanal spreadsheets, and we are too global now to not outsource our work to the lowest bidder with the highest thread count. It will happen nearly overnight, and if the world's governments aren't prepared, we'll see an unemployment crisis like never before. We're still in "Fuck around." "Find out" is just around the corner, though.
Even mindless and repetitive tasks require instances of problem solving far beyond what AI is capable of. In order to replace 41% of the workforce you'll need AGI, and we don't know if that's even possible.
I've worked with humans who have computer science degrees and 20 years of experience, and some of them have trouble writing good code, debugging issues, communicating properly, and integrating with other teams/components.
I don’t see “AI” doing this. At least not these LLM models everyone is calling AI today.
Once we get to Data from Star Trek levels, then I can see it. But this is not that. This is not even close to that.
You know what I like about Pareto law and all the "divide and conquer" algorithms? You should still know where the division is and which 10% are more important than the other 90%.
Anyway, my job is learning new stuff quickly and fixing that sort of thing. Same as many, many people, even some non-technical types really.
People who can be replaced with machines have already been for the most part, and where they can't, it's also a matter of social pressure. Mercantilism and protectionism and guilds historically were defending the interests of certain parties, with force too.
No, I don't think there'll be a sudden "find out" different from any other period of history.
just 13 years after the first iPhone, and look how far smartphones have come
I disagree.
As someone who has the first iPhone, it was amazing and basically did everything that a new one does. It went on all websites, had banking apps and everything.
I would actually argue phones have become worse: they are very bloated and spy on you. At first they actually made your life better, and there were no social media apps supercharged for addiction.
these are the same people who continue to use monetary incentives despite hard scientific evidence that they have the opposite effect from what is desired. they're not gonna realise shit.
In my experience, 100% of executives don't actually know what their workforce does day-to-day, so it doesn't really surprise me that they think they can lay people off because they started using ChatGPT to write their emails.
This was my immediate thought too. Even people 2-3 levels of management above me struggle to understand our job, let alone the person 5-6 levels up in the executive suite.
At my last job my direct manager had to explain to upper management multiple times that X role and Y role could not be combined because it would require someone to physically be in multiple places simultaneously. I think about that a lot when I hear about these corporate plans to automate the workforce.
However, people saying that C-suite can be replaced with GPTs don't understand that plenty of people not in C-suite could be replaced or not replaced just as well. Lots of office plankton around with such reasoning skills that I just don't know how their work can bring profit.
I can't decide whether those people are really needed or they are employed so that they wouldn't collectively lynch those of us who'd keep relevance, but wouldn't be social enough to defend from that doom.
The problem with building hierarchies of humans is with humans politicking and lying and scheming with each other, not even talking about usual stuff like friendship and sympathy and their opposites. It's just impossible to see what's really happening behind all that.
Some of that 59% might, but I guarantee at least some very strongly think it will change things, but think the change it brings will require as many people as before (if not more), but that they will be doing exponentially more with the people they have.
As soon as we've managed to make a computer that can simulate an entire brain in real time. Who knows how many decades or even centuries that will take.
I really want to see if worker-owned cooperatives plus AI could help democratize running companies (where appropriate). Not just LLMs, but a mix of techniques for different purposes (e.g., hierarchical task networks to help with operations and pipelining, LLMs for assembling/disseminating information to workers).
I've advised past clients to avoid reducing headcount and instead look at how they can scale up productivity.
It's honestly pretty bizarre to me that so many people think this is going to result in the same amount of work with fewer people. Maybe in the short term a number of companies will go that way, but not long after they'll be out of business.
Long term, the companies that are going to survive the coming tides of change are going to be the ones that aggressively do more and try to grow and expand what they do as much as possible.
Effective monopolies are going out the window, and the diminishing returns of large corporations are going to be going head to head with a legion of new entrants with orders of magnitude more efficiency and ambition.
This is definitely one of those periods in time where the focus on a quarterly return is going to turn out to be a cyanide pill.
Short term is all that matters. Business fails? Start another one, and now you have a bunch of people that you made unemployed creating downward pressure on labor prices.
Scaling up productivity is what tends to lead to layoffs. Having the exact same output but with fewer employees is pretty much guaranteed to lower cost and increase profit, so that's what most execs are likely to do. Short-sighted maybe, but businesses are explicitly short-sighted, only focusing on the next quarter.
Freeing humans from toil is a good idea, just like the industrial revolution was. We just need our system to adapt and change with this new reality. AGI and universal basic income mean we could live in something like the society in Star Trek.
Can't wait for AI to replace all those useless execs and CEOs. It's not like they even do much anyways, except fondling their stocks. They could probably be automated by a Markov chain.
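For anyone who hasn't seen one, a Markov chain really is about the simplest text generator there is, which is what makes the joke land. A toy sketch in Python (the exec-speak corpus is made up for illustration):

```python
import random

def build_chain(corpus):
    """Map each word to the list of words that follow it in the corpus."""
    chain = {}
    words = corpus.split()
    for a, b in zip(words, words[1:]):
        chain.setdefault(a, []).append(b)
    return chain

def generate(chain, start, length=8, seed=42):
    """Walk the chain from `start`, picking a random successor each step."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        successors = chain.get(out[-1])
        if not successors:
            break  # dead end: the last word never had a successor
        out.append(rng.choice(successors))
    return " ".join(out)

corpus = ("we will leverage synergies to drive shareholder value "
          "we will drive growth and leverage our core synergies")
chain = build_chain(corpus)
print(generate(chain, "we"))
```

Every output word is guaranteed to be something the "exec" has already said, which feels about right.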
If they could replace project managers, that would be nice. In theory it's an important job, but in practice it's just done by someone's mate, who was most productive when they didn't actually turn up.
The Paranoia RPG has a very realistic way of determining who gets to be the leader of a group. First, you pick who'll do what kind of job (electronics, brute force, etc). Whoever didn't get picked becomes the leader, as that person is too dumb to do anything useful.
I swear people don't know the difference between a good project manager, a bad one, and none at all.
Everyone on here is on about how the board has no idea what the bottom rungs of the ladder do, all "haha they are so stupid, they think we do nothing." Then in the next sentence they say they don't know what the board does and that it just does nothing.
Don't get a job in government contracting. Pretty much I do the work and around 5 people have suggestions. None of whom I can tell to fuck off directly.
Submit the drawing. Get asked to make a change to align with a spec. Point out that we took exception to the spec during bid. Get asked to make the change anyway. Make the change. Get asked to make another change by someone higher up the chain of five. Point out change will add delays and cost. Told to do it anyway. Make the next change....
Meanwhile, every social scientist: "we don't know what is causing cost disease."
Game's changed. Now we fire people, try to rehire them for less money and if that doesn't work we demand policy changes and less labour protection to counter the "labour shortage".
Labor shortage is such a funny term. It's like coming to a store and looking for 1kg of meat for 1$, not finding it and saying there's meat shortage. Or coming to a vegetarian store and looking for 1kg of any meat and saying the same.
When everybody is employed, but the economy needs more people - that's labor shortage. When there are people looking for jobs, but not satisfied with particular offerings - that's something else.
If Gartner comes out with a decent AI model, you could replace over half of your CIOs, CISOs, CTOs, etc. Most of them lack any real leadership qualities and simply parrot what they're told / what they've read. They're there through nepotism.
Also, most of them use AI as a crutch, so that's all they know. Meanwhile, the rest of us use it as a tool (what it's meant to be).
41% of execs think that a huge amount of class power will go from workers in general to AI specialists (and probably the companies they make or that hire them).
I personally can't wait for a lot of the people these businesses bet on replacing to turn around and form new competition, but with this new tech filling in the gaps of middle management, HR, execs, etc.
I mean, it's a fucking meme, but an AI-assisted workplace democracy seems alright to me on paper (the devil's in the details).
Execs don't give a shit. They simply double down on the false cause fallacy instead. They wouldn't ever admit they fucked up.
Last year the company I work for went through a run of redundancies, claiming AI and system improvements were the cause. Before this point we were growing (slowly) year on year. Just not growing fast enough for the shareholders.
They cut too deep, shit is falling apart, and we're losing bids to competitors. Now they've doubled down on AI, claiming blindness to the systems issues they created, and just made an employee's "can do" attitude a performance goal.
Same. I welcome our AI overlords as long as that means I can just stay at home and fully embrace my autism by not giving a fuck about the workforce while studying all of the thousands of subjects I enjoy learning about.
The autism is not required. No one cares about their jobs, especially people who work in jobs where "everyone is a family". People care about those jobs the least.
I will never care if AI takes mandatory work from me, but I want income replacement lol. Seriously though I hate working so much every job I've ever had has made me suicidal at some point. I'm glad there's a chance at least I won't have nothing but work and death ahead of me. If that's all that's left it's okay, a little disappointing but it is what it is.
Execs? The same people who make short sighted decisions and don't understand basic psychology? Let me go get a pen so I won't...give two fucks what this bogus survey says. Let AI run your business so I can have some excitement in my life
As someone scripting a lot for my department in the tech industry, yea AI and scripts have a lot of potential to reduce labor. However, given how chaotic this industry is, there will still need to be humans to take into account the variables that scripts and AI haven't been trained on (or are otherwise hard to predict). I know the managers don't wanna spend their time on these issues, as there's plenty more for them to deal with. When there's true AGI, that may be a different scenario, but time will tell.
Currently, we need to have some people in each department overseeing the automations in their area. This stuff mostly kills the super redundant data entry tasks that make me feel cross-eyed by the end of my shift. I don't wanna be the embodiment of VLOOKUP between PDFs and type the same number 4+ times.
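For what it's worth, the VLOOKUP-between-documents grind is exactly the kind of thing a short script kills. A minimal sketch in Python, with made-up invoice/payment data standing in for the real exports:

```python
import csv
import io

# Hypothetical data standing in for two exported spreadsheets.
invoices = io.StringIO("invoice_id,amount\nA1,100\nA2,250\nA3,75\n")
payments = io.StringIO("invoice_id,paid\nA1,yes\nA3,no\n")

# Index one table by key once -- what VLOOKUP re-does per cell.
paid_by_id = {row["invoice_id"]: row["paid"]
              for row in csv.DictReader(payments)}

# Join: annotate each invoice with payment status instead of retyping it.
merged = [{**row, "paid": paid_by_id.get(row["invoice_id"], "unknown")}
          for row in csv.DictReader(invoices)]

for row in merged:
    print(row["invoice_id"], row["amount"], row["paid"])
# → A1 100 yes
#   A2 250 unknown
#   A3 75 no
```

Ten lines, and nobody types the same number four times. The human's remaining job is the "unknown" rows, i.e. exactly the variables the script wasn't trained on.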
But the fact that this tech really kicked off just three years ago and is already threatening so many jobs is pretty telling. Not only will LLMs continue to get better, but they're a big step towards AGI, and that's always been an existential crisis we knew was coming. This is the time to start adapting, quick.
You'll get blindsided real quick. AIs are just getting better. OpenAI are already saying they've moved past GPT for their next models. It's not 5 years before it can fix code longer than 400 lines, and not 20 before it can digest a specification and spit out working software. Said software might not be optimized or pretty, but those are things people can work on separately. Where you needed 20 software engineers, you'll need 10, then 5, then 1-2.
You have more in common with the guy getting replaced today than you care to admit in your comment.
Edit: not sure why I’m getting downvoted instead of having a discussion, but good luck to you all in your careers.
I know it's getting boring. I am tired of people telling me how ChatGPT and friends are toys that just spit back website data, and in the same comment telling me how they are basically angry gods ready to end the human race.
Yeah, don’t smash the looms, seize them. The ability to make labor easier and more efficient is a positive if we don’t allow it to be a means to impoverish the workers
Is the intent here to preserve jobs even if it's less productive? That's solving the wrong problem. Instead of banning it, we should be adapting to it. If AI is more efficient than people, the jobs people take should change.
I think there's a solid case that if something would devolve into rent-seeking because competition is unproductive, it should be provided as a public service. Do you need a job if all of your basic needs are met by AI? At that point, any work you do would be optional, so people would follow their passions instead of working to make ends meet (see: Star Trek universe).
Think of it like Basic Income, but instead of cash, you'd get services at-cost. I think there's room for non-profits (or maybe the government) to provide these AI-services at-cost.
Outlawing it is a very dangerous aim, because outlawing it completely will enable other countries to out-compete us, and outlawing it completely is right next to "outlaw it for normal people, but allow companies to exploit it for profit" on the dart board of possibilities.
Better path all around is "allow everyone to use AI and establish strong social safety nets and move towards enabling people to work less".
Haven't I been hearing that since the rise of computing and the internet? And it's probably been around even longer. Seems like this sort of stuff only gets going when a lot of workers start putting up a fight.
But hey, maybe 41% of jobs lost might be the tipping point. Because people aren't just gonna sit on the sidewalk and starve.
Let's get rid of corporate profits for shareholders, if you actually want to fix the problem. Make it illegal for shareholders to profit more than employees.
There's no denying AI is going to replace or significantly reduce some jobs. But I predict it's going to happen mostly in bullshit jobs like marketing, advertisement, and the kind of journalism that repeats the same news from other, more reputable newspapers.
AI isn't going to replace the migrants laying bricks in front of me, and it's not going to replace their chief.
It’ll reduce the workforce from well-remunerated professionals who perform tasks to a larger number of disposable minimum-wage labourers who clean up botshit.
The jobs AI would be best at eliminating are HR and management. Instead, corpos give these shitters the power to eliminate other positions and then they still act like HR and management are the people producing value.
The AI won't do my job exactly, but managers mostly manage, i.e. deal with organisational overhead. That Excel you've been maintaining for the past decade was never as crucial to the business's success as you made it appear. It was something the higher-ups liked to talk about with pretty charts. An AI can generate other things to talk about from the same data.
I don't agree that those people don't have transferable skills, but I agree that's going to hurt. Like flattening hierarchies, self-organised teams, and outsourcing, previously cushy jobs will be replaced with more stressful ones.
You used to have a secretary to make calls for you and organise your calendar. Now you have Copilot and customers call you directly.
You don't need powerful AI or anything for this to happen. They just stopped hiring secretarial staff when managers learned how to use a computer.
I always ask myself who will buy the products these companies produce if all the workers have been fired. Maybe inflation is just the natural ramp-up to McDonald's charging 5,000 dollars for automated chicken nuggets when there are only billionaires left with money lol.
When it's cheaper to make the products because you don't have to pay anyone, people will look at that manufacturer and think... wow I can start a business like that and make an easy profit? Competition will drive down prices.
If they gain decent market share, they will be bought by one of the two or three companies that owns the entirety of that manufacturing category. If they don't, the incumbents will lower prices until the new thing is out of business. In either case, the prices bounce back, and even increase because of "inflation."
There's one missing piece here, and it's startup capital. You don't usually see new chemicals manufacturers for instance, because you need a lot of money to buy everything to start with.
I never had the impression that there were enough people for the amount of work anyway. I don't see jobs going away, just shifting. Most developers will be fine because the work never ends; AI is just a tool speeding things up. But not by that much: someone who is good with Google and Git is just a bit slower to find the same answers. And AI needs verification too, even if it links you directly to the issue at hand via a source URL.
AI will create new issues. Some of the low-requirement jobs will go, like first-level support, but only if you train the AI yourself; otherwise it's too generic. We're not at the point where companies train their own LLMs yet. Some outliers are trying.
We've got to understand that there's still a human layer, and a lot of people might prefer calling a human, even if the result is worse, simply because we're social beings. This can cost a lot of customers if companies believe they can just shove an AI in front.
No one really knows how good AI will get. As the technology advances, we find more and more hard-to-solve issues, for instance that AI will make things up or give wrong answers, despite knowing the real answer, if you pressure it hard enough.
Also for security reasons you can't add AI everywhere, unless you want to send all secrets directly to Microsoft, Google or Facebook.
AI won't so much replace labor as make it more fungible, and thus exploitable/abusable.
Except where it's used as an excuse to just... not. "Yes, we have customer service; it's just all ChatGPT with no permissions," so nobody can ever return shit that was delivered broken.
After reading this article that got posted on Lemmy a few days ago, I honestly think we're approaching the soft cap for how good LLMs can get. Improving on the current state of the art would require feeding it more data, but that's not really feasible. We've already scraped pretty much the entire internet to get to where we are now, and it's nigh-impossible to manually curate a higher-quality dataset because of the sheer scale of the task involved.
We also can't ask AI to curate its own dataset, because that runs into model collapse issues. Even if we don't have AI explicitly curate its own dataset, it's highly likely going to be a problem in the near future with the tide of AI-generated spam. I have a feeling that companies like Reddit signing licensing deals with AI companies are going to find that they mostly want data from 2022 and earlier, similar to manufacturers looking for low-background steel to make particle detectors.
We also can't just throw more processing power at it because current LLMs are already nearly cost-prohibitive in terms of processing power per query (it's just being masked by VC money subsidizing the cost). Even if cost wasn't an issue, we're also starting to approach hard limits in physics like waste heat in terms of how much faster we can run current technology.
So we already have a pretty good idea what the answer to "how good AI will get" is, and it's "not very." At best, it'll get a little more efficient with AI-specific chips, and some specially-trained models may provide some decent results. But as it stands, pretty much any organization that tries to use AI in any public-facing role (including merely using AI to write code that is exposed to the public) is just asking for bad publicity when the AI inevitably makes a glaringly obvious error. It's marginally better than the old memes about "I trained an AI on X episodes of this show and asked it to make a script," but not by much.
As it stands, I only see two outcomes: 1) OpenAI manages to come up with a breakthrough--something game-changing, like a technique that drastically increases the efficiency of current models so they can be run cheaply, or something entirely new that could feasibly be called AGI, 2) The AI companies hit a brick wall, and the flow of VC money gradually slows down, forcing the companies to raise prices and cut costs, resulting in a product that's even worse-performing and more expensive than what we have today. In the second case, the AI bubble will likely pop, and most people will abandon AI in general--the only people still using it at large will be the ones trying to push disinfo (either in politics or in Google rankings) along with the odd person playing with image generation.
In the meantime, what I'm most worried for are the people working for idiot CEOs who buy into the hype, but most of all I'm worried for artists doing professional graphic design or video production--they're going to have their lunch eaten by Stable Diffusion and Midjourney taking all the bread-and-butter logo design jobs that many artists rely on for their living. But hey, they can always do furry porn instead, I've heard that pays well~
Yeah, it's my second language. Sorry I wrote it a minute before bed, sometimes sentences become even weirder then. I went back and added some more commas. Haha
That only shows what they hope will happen. In reality, menial tasks can be automated and humans can be shifted to more creative roles, which in all honesty means execs could be reduced and replaced by AI. They do nothing other than follow the trail of money and waste company resources, something AI can be much more efficient at.
Imo, when you make an industry easier for the managers/CEOs using AI and have fewer workers, it will also be easier for people to create competition in that industry... driving down prices.
A survey of senior biz executives reveals that 41 percent expect to have a smaller workforce in five years due to the implementation of AI technologies.
The research from staffing provider and recruitment agency Adecco Group found a "buy mindset" around AI, which "could exacerbate skills scarcity and create a two-speed workforce."
The figure is highest in Germany and France, where 49 percent of respondents say their company will employ fewer people in five years because of AI.
Seventy-eight percent of respondents say GenAI will play a "critical role in providing upskilling and development opportunities."
"While there is no denying that commercial interest in AI has been driven by its ability to reduce headcounts, the disruption will be a positive one – these industries have been suffering from decades-long skills crises, short on talent due to the high barriers to entry.
"Robotic engineers, data governors, drug discovery analysts – these are the jobs tomorrow that rely on AI," she told us.
The original article contains 438 words, the summary contains 161 words. Saved 63%. I'm a bot and I'm open source!
ITT: bunch of people who have no idea what AI even means
This is kind of like the early days of computers or the internet all over again. LLMs are not what educated people mean when they're talking about AI. ChatGPT is not going to take your jobs; AGI will. Nobody knows when. Might be next year, or it might take two decades.
Who do these assholes think will buy their products and services when they put the entire workforce out of work? Do they plan to retreat to their bunkers and live out their days underground while the world burns above?
If it isn't already, AI will be calling the shots for the actual money owners (those big investment companies like BlackRock). Invest here, invest there, demand more from elsewhere. Said AI will then dictate who should be appointed CEO, director, etc., because it will be asked to name "a human," and little Timmy McMeritocracy, son of a higher-up elsewhere, needs his first job, never mind that putting an AI in his place would be more profitable.
But seriously, work will always expand to the available workforce. That's why there are so many stupid industries. They always tank during a recession, but other industries will expand to use the excess labor.