In fairness, the computing world has seen unfathomable efficiency gains, pushed further still by the sudden adoption of ARM. We are doing our damnedest to make computers faster and more efficient, and we're doing a really good job of it, but energy production hasn't seen anywhere near those gains in the same amount of time. The sudden widespread adoption of AI, a very power-hungry tool (because it's basically emulating a brain in a computer), has caused a spike in the energy needed by computers that are already getting more efficient as fast as we can manage. Meanwhile, energy production isn't keeping up at the same rate of innovation.
The problem there is the efficiency paradox (Jevons paradox): making something more efficient ends up using more of it, not less, because the increase in use stimulated by the greater efficiency outweighs the reduction in input per use.
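The rebound effect above can be sketched with a toy calculation (every number here is made up purely for illustration):

```python
# Toy illustration of the Jevons paradox: the energy cost per unit of
# compute improves, but demand grows faster, so total consumption rises.
# All numbers are made up for illustration.

def total_energy(energy_per_unit: float, units_demanded: float) -> float:
    """Total energy consumed = per-unit cost * demand."""
    return energy_per_unit * units_demanded

before = total_energy(1.0, 100)  # baseline: 1 J/unit, 100 units -> 100 J
after = total_energy(0.5, 400)   # 2x efficiency, but 4x demand -> 200 J

print(before, after)  # total use doubles despite halving the per-unit cost
```

Efficiency doubled, yet total energy use doubled too, because the cheaper compute stimulated four times the demand.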
It's not so much the hardware as it is the software and utilisation, and by software I don't necessarily mean any specific algorithm, because I know they give much thought to optimisation strategies when it comes to implementation and design of machine learning architectures. What I mean by software is the full stack considered as a whole, and by utilisation I mean the way services advertise and make use of ill-suited architectures.
The full stack consists of general purpose computing devices with an unreasonable number of layers of abstraction between the hardware and the languages used in implementations of machine learning. A lot of this stuff is written in Python! While algorithmic complexity is naturally a major factor, how it is compiled and executed matters a lot, too.
Once AI implementations stabilise, the theoretically most energy-efficient way to run them would be on custom hardware made to run only that code, with the code written at the lowest possible level of abstraction. The closer we get to the metal (or the closer the metal gets to our program), the more efficiently we can make it run. I don't think we take bespoke hardware seriously enough; we're stuck in the mindset of everything being general-purpose.
As for utilisation: LLMs are neither fit for nor even capable of dealing with logical problems or anything involving reasoning based on knowledge; they can't even reliably regurgitate knowledge. Yet, as far as I can tell, this constitutes a significant portion of their current use.
If the usage of LLMs was reserved for solving linguistic problems, then we wouldn't be wasting so much energy generating text and expecting it to contain wisdom. A language model should serve as a surface layer -- an interface -- on top of bespoke tools, including other domain-specific types of models. I know we're seeing this idea being iterated on, but I don't see this being pushed nearly enough.[^1]
When it comes to image generation models, I think it's wrong to focus on generating derivative art/remixes of existing works instead of on tools to help artists express themselves. All these image generation sites we have now consume so much power just so that artistically wanting people can generate 20 versions (give or take an order of magnitude) of the same generic thing. I would like to see AI technology made specifically for integration into professional workflows and tools, enabling creative people to enhance and iterate on their work through specific instructions.[^2] The AI we have now are made for people who can't tell (or don't care about) the difference between remixing and creating and just want to tell the computer to make something nice so they can use it to sell their products.
The end result in all these cases is that fewer people can live off of being creative and/or knowledgeable while energy consumption spikes as computers generate shitty substitutes. After all, capitalism is all about efficient allocation of resources. Just so happens that quality (of life; art; anything) is inefficient and exploiting the planet is cheap.
[^1]: For example, why does OpenAI gate external tool integration behind a payment plan while offering simple text generation for free? That just encourages people to rely on text generation for all kinds of tasks it's not suitable for. Other examples include companies offering AI "assistants" or even AI "teachers"(!), all of which are incapable of even remembering the topic being discussed 2 minutes into a conversation.
[^2]: I get incredibly frustrated when I try to use image generation tools because I go into it with a vision, but since the models are incapable of creating anything new based on actual concepts I only ever end up with something incredibly artistically compromised and derivative. I can generate hundreds of images based on various contortions of the same prompt, reference image, masking, etc and still not get what I want. THAT is inefficient use of resources, and it's all because the tools are just not made to help me do art.
It's emulating a ridiculously simplified brain. Real brains have orders of magnitude more neurons, but beyond that they evaluate those neurons completely asynchronously, have a much more complicated connective structure, and use multiple methods of communication between neurons, some of which are incredibly subtle and hard to detect.
To really take AI to the next level I think you'd need a completely bespoke processor that can replicate those attributes in hardware, but it would be a very expensive gamble because you'd have no idea if it would work until you built it.
Some of the smartest people on the planet are working to make this profitable. It's fucking hard.
You are dense and haven't taken even a look at simple shit like Hugging Face. Power consumption is about the biggest topic you'll find with anyone in the know.
And yet we have brains. This brute force approach to machine learning is quite effective but has problems scaling. So, new energy sources or new thinking?
Well we can, we had a "jumpstyle" wave going on in the Netherlands a couple of years ago. No clue if it ever got off the ground anywhere else seeing as it was a techno thing or something.
I think we've got a bit before we have to worry about another major jump in AI, and way longer for an Ultron. The ones we have now are effectively parsers for Google or other existing data. I personally still don't see how we feel we can get away with calling that AI.
Any AI that actually creates something 'new' that I've seen still requires a tremendous amount of oversight, tweaking and guidance to produce useful results. To me, they still feel like very fancy search engines.
So AI can't exist without stealing people's content
Using the word “steal” in a way that implies misconduct here is “You wouldn’t download a car” level reasoning. It’s not stealing to use the work of some other artist to inform your own work. If you copy it precisely then it’s plagiarism or infringement, but if you take the style of another artist and learn to use it yourself, that’s…exactly how art has advanced over the course of human history. “Great artists steal,” said Picasso famously.
Training your model on pirated copies, that’s shady. But training your model on purchased or freely available content that’s out there for anyone else to learn from? That’s…just how learning works.
Obviously there are differences, in that generative AI is not actually doing structured “thinking” about the creation of a work. That is, of course, the job of the human writing and tweaking the prompts. But training an AI to be able to write like someone else or paint like someone else isn’t theft unless the AI is, without HEAVY manipulation, spitting out copies that infringe on the intellectual property of the original author/artist/musician.
Generative AI, in its current form, is nothing more than a tool. And you can use any tool nefariously, but that doesn’t mean the tool is inherently nefarious. You can use Microsoft Word to copy Eat, Pray, Love but Elizabeth Gilbert shouldn’t sue Microsoft, she should sue you.
The models are getting more efficient and smaller very fast if you look just a year back. I bet we'll run some small LLMs locally on our phones (I don't really believe in the other form factors yet) sooner than we believe.
I’d say before 2030.
I can already locally host a pretty decent AI chatbot on my old M1 MacBook (Llama v2 7B) which writes at the same speed I can read; it's probably already possible on top-of-the-line phones.
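A back-of-envelope estimate (my own rough numbers, covering weights only and ignoring the KV cache and runtime overhead) shows why a quantised 7B model is plausible on phone-class hardware:

```python
# Rough weights-only memory footprint of a 7B-parameter model.
# Ignores the KV cache, activations, and runtime overhead.
PARAMS = 7e9  # Llama v2 7B

def model_size_gb(bits_per_param: int) -> float:
    """Weights size in (decimal) gigabytes at the given precision."""
    return PARAMS * bits_per_param / 8 / 1e9

print(f"fp16 : {model_size_gb(16):.1f} GB")  # ~14 GB: workstation territory
print(f"8-bit: {model_size_gb(8):.1f} GB")   # ~7 GB: fits a 16 GB laptop
print(f"4-bit: {model_size_gb(4):.1f} GB")   # ~3.5 GB: flagship-phone RAM
```

At 4-bit quantisation the weights drop to roughly a quarter of their fp16 size, which is why tools like llama.cpp can run these models on consumer devices at all.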
Because it's a miracle technology. Both of those things are also engineering problems, ones that have been massively mitigated already. You can run models almost as good as GPT-3.5 on a phone, and individuals are pushing the limits of how efficiently we can train every week.
It's not just making a chatbot or a new tool for art - it's also protein folding, coming up with unexpected materials, and being another pair of eyes that will assist a person do anything.
They literally promise the fountain of youth, autonomous robots, better materials, better batteries, better everything. It's a path for our species to break our limits, and become more.
The downside is we don't know how to handle it. We're making a mess of it, but it's not like we could stop... The AI alignment problem is dwarfed by the corporation alignment problem
I mean, we can only do that because our system was trained over hundreds of thousands, even millions of years into being able to recognise others of the same species.
Almost all of our training was done without requiring burning fossil fuels. So maybe ole Sammy can put the brakes on his shit until it’s as fuel efficient as a human brain.
I recall a study about kids under a certain age who don't get scared looking at pictures of demons and other horror stuff because they don't yet know what your everyday default person looks like.
So I'd argue that even people need to get accustomed to a thing before they could recognise or have an opinion about anything.
Whose? Can the human brain just know what someone looks like without prior experience?
Your ability to do anything is based on decades of "data sets" that you're constantly being fed. It's no different for an AI; they just get it all at once, while we have to learn through individual experience.
At least AI has the potential to do something useful, unlike coin mining. Although it's not doing much currently, so I'm not too wild about it. Maybe a real AI could actually find new energy sources.
General artificial intelligence has the potential to be actually useful. Generative AI, like ChatGPT (Generative Pre-trained Transformer), absolutely does not. It’s a glorified autocomplete.
In fact, the original script of The Matrix had the machines harvest humans to be used as ultra-efficient compute nodes. Executive meddling led to the dumb battery idea.
Not really. It's just energy in different forms. Calories are a unit of energy; so is a unit of heat, and so is a kilowatt-hour of electricity. Since they're all energy, you can compare them directly. That's how you compare things like the electrical production efficiency of a thermal-cycle generation process.
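As a quick sanity check that calories, heat, and electricity really are the same currency, here is the classic conversion of a daily diet into average power (standard conversion factors; the 2000 kcal diet is just the usual round number):

```python
# Calories, joules, and kilowatt-hours are all units of energy,
# so they convert with fixed factors.
KCAL_TO_J = 4184.0       # 1 food calorie (kcal) = 4184 joules
J_PER_KWH = 3.6e6        # 1 kilowatt-hour = 3.6 MJ
SECONDS_PER_DAY = 86400

daily_joules = 2000 * KCAL_TO_J            # a ~2000 kcal/day diet
avg_watts = daily_joules / SECONDS_PER_DAY

print(f"{avg_watts:.0f} W")                       # ~97 W of average power
print(f"{daily_joules / J_PER_KWH:.1f} kWh/day")  # ~2.3 kWh per day
```

That ~100 W figure is the usual basis for comparing a human brain-plus-body against the kilowatts a GPU rack draws.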
These comments often indicate a lack of understanding about AI.
ML algorithms have been in use for nearly 50 years. They've certainly become much more common since about 2012, particularly with the development of CUDA. It’s not just some new trend or buzzword.
Rather, what we're starting to see are the fruits of our labour. There are so many really hard problems that just cannot be solved with deductive reasoning.
It's simultaneously possible to realize that something is useful while also recognizing the damage that its trend is causing from a sustainability standpoint, and that neither realization particularly demonstrates a lack of understanding about AI.
Exactly. This is why the AI hype train is overblown. Companies keep shoving "AI" everywhere even though they know it'll cost a lot in electricity.
The real path forwards with AI will be specialized super advanced models costing hundreds per run (business use case) and/or locally run AI using NPUs, especially the latter.
In the end, as always, it will only benefit the companies. And all the people get is put out of a job because they have been replaced by some piece of software no one even understands anymore.
This is a silly take. People have benefitted hugely from all the big tech developments of the past and will do from AI too; just as you have a mobile phone that can save and improve your life in a myriad of ways, you'll have access to various forms of AI which will do similar. GPS is a good example: functionally free, and making navigation far safer, faster, and better.
Here's a genuine, already-happened use case for AI benefitting you: an open source developer was able to add a whole load of useful features to their free software by using AI to help code. I know because it was me, among many, many others.
I know people making open source AI tools too, and they're all using AI coding assistants, mostly the free ones. I've seen a lot of academic researchers using AI tools as well, generally built with open source libraries like PyTorch and with help from AI coding tools. Even if you don't use AI yourself you're already benefitting from it, just as, even if you don't use open source software, the services you rely on do.
Imagine being able to implement the most advanced and newest methodologies in your design process or get answers to complex and niche questions about new technology instantly. You buy a printer for example and say to your computer 'I've plugged in a printer make it work' and it says 'ok, there isn't a driver available that'll work with your pc but I've written one based on the spec in the datasheet, do you want me to print a test page?'
Imagine being able to say 'talk me through diagnosing a fault on my washing machine' and it guides you through locating and fixing the fault, possibly by designing a replacement part and giving you fabrication options.
Or being able to say 'this website is annoying, change it so that I only see the video window' or 'make a playlist in release order of all abba songs that charted' or 'check on currently available archives to see if there's a mirror of this deleted post' or 'check all the sites and see if anyone posted a sub version of the next episode of this anime' or 'Keep an eye on this lemmy community and add any popular memes involving fish to my feed but don't bother with any meta stuff or aquatic mammals' or 'this advert says I can make free money, is it legit?'
The use cases that will directly benefit your life are almost endless, natural language computing is a huge deal even without task based solvers and physical automation but we also have those too so the increased ability of people to make community projects and freely shared designs is huge.
It's called nuclear energy. It was discovered in 1932 and properly harnessed in the 1950s/1960s with an effective reactor design that consumes both radioactive material and waste (CANDU), and the newest CANDU reactors are some of the safest and most efficient energy generators in the world.
Pretending like there needs to be a larger investment into something like cold fusion in order to run these computers is incredibly dishonest or presenting a clear hole in education coverage. (The DoE should still work on researching cold fusion, but not because of this.)
I love nuclear but China is building them as fast as they can and they're still being massively outpaced by their own solar installations. If we hadn't shut down most of the research and construction in the 80's it would have been great, but it's not going to be a solution to the huge power requirement growth from EVs and shit like AI in the "short" term of 1-20 years.
Solar alone can't meet humanity's energy needs without breakthroughs in energy storage.
Most energy we use the grid for is generated on demand. That means only a few moments ago, the electricity powering your computer was just a lump of coal in a furnace.
If we don't have the means to store enough energy to meet demands when the sun isn't out or wind isn't blowing, then we need more sources of energy than just sun and wind.
There is a lot of misinformation being perpetuated by the solar industry to fool people like you into thinking all investments should be directed to it over other options.
Please educate yourself before parroting industry talking points that only exist to take people for a ride.
Yeah, nuclear has been available and in use over the period of the sharpest increase in co2 emissions. It’s not responsible for it, but it’s not the answer. The average person can’t harness nuclear energy. But all the renewable energies in the world can fit on a small house: wind, solar, hydro. Why bring radioactive materials into this?
I'm sorry but as an AI language processing model I am unable to discover alternative energy sources. My training data concludes on June 21, 2021 and I am unable to understand requests that would require knowledge after that date.
Not as efficiently or reliably as a nuclear reactor, though. It would if they built a space station in an orbit with minimal objects getting between it and the sun. Train the AI out between Sol and Venus and bring it back if it discovers anything useful, rather than having it make revenge porn and plagiarize artists.
1 GW of solar is much cheaper than 1 GW of nuclear. Solar is both cheaper to build and cheaper to run; it's the most cost-efficient energy source we currently have.
Might be because it's an LLM, not an AI, and requires massive amounts of data to be funneled into it to actually work. My admittedly limited understanding makes it seem like just another buzzword for things like neural networks and machine learning.
The positive thing there is that it probably paces our development. If we can't get to true AGI without way more energy than we can currently produce, then we don't have true AGI risk right now.
There's still risk because it might not be true or we might be able to get close enough to do damage. But slowing down AI is fine by me.
Obviously not. We’re being faced with an existential threat if we don’t secure alternative, sustainable forms of energy and even that threat isn’t enough to motivate our species.
Massively subsidized and where do you put all the nuclear waste? Nuclear energy is dumb even without thinking about possible disasters. You are just falling for grifters who don't want us to use renewable sources of energy. And before you say it: no, nuclear energy is not green. You would know that if you actually googled for like 5 seconds, but it's easier to believe grifters promising "the one easy solution to solve all our problems", right?
While more dangerous, the quantity of waste generated compared to all other forms of energy generation is very small. Storage is a solved problem, but you have probably read articles about a lack of storage in the U.S. This is entirely due to politicians' failure to agree on where to store waste. Despite the relative safety, no one wants nuclear waste stored in their "back yard."
And before you say it: no, nuclear energy is not green.
Nuclear energy generates zero CO2. Surely we can agree that this is the most pressing consideration in terms of climate change. If your concern is the nuclear waste, then I direct you to the growing problem of disposing of solar cells and wind turbines. Newer turbine blades, for example, are 40 meters long and weigh 2.5 tons. These cannot be recycled.
No matter how you cut the data, nuclear is an order of magnitude better than almost all other forms of energy generation. If our goal is to radically improve our environmental footprint while keeping the lights on even at night when it's not windy, then nuclear absolutely must be part of the mix.
You put the nuclear waste in a hole, deep underground, after burning most of it up. Modern Gen IV designs can burn the vast majority of existing waste products down to a much more reasonable timespan.
Nuclear energy is vastly greener than coal, gas, petro, etc. Currently it's arguably more sustainable than massive amounts of solar and wind energy. Wind in particular has a massive waste issue; solar is more complicated, but there are a lot of precious metals involved and heavy refining done, so it's not a zero-emissions industry either. The actual production of electricity IS net zero, unlike coal, petro, and gas, which still power the majority of our grids. Please continue to explain to me how fossil fuels are better than funny green rock.
You're also accusing me of knowing nothing about nuclear, which is funny, considering i have quite the autistic hyper-fixation on it. And know vastly more about it than the average person. Judging by your response, you're probably not in the field of nuclear energy either.
Nuclear is a technology we know how to build, understand how to operate safely, and are capable of doing correctly. The only thing we need, is more nuclear plants.
Wow, I fucking hate this guy the more he opens his mouth. He can seriously fuck off right now, if he thinks AI realistically needs him at this point he's sadly mistaken.
Aneutronic fusion isn't happening on this planet. We don't even have the fuel for it. It's a dumb thing to market when we can't even break even on D-T fusion and turning the neutrons into heat.
While I’m too much of an optimist to say that we’ll never figure out viable fusion power, I do think you’re more right than wrong.
Fission power is essentially us discharging a fusion battery, where the battery was charged by a supernova. We don’t get any free help with fusion, and we have to replicate input energies only seen in nature with stellar amounts of gravitational mass. It is (IMO) an important area of research, but I don’t expect it to power our cities in my lifetime.
The thorium fuel cycle has nearly the same downsides as the uranium fuel cycle. It just requires breeders, which you could use with uranium too. The only real benefit of thorium is that it's more plentiful, but the cost of nuclear power isn't in the uranium.
Toxic/radioactive waste is obviously toxic and radioactive, but how bad it really is gets kind of overblown, especially compared to the harm caused by popular existing methods like coal. When adjusted for energy produced, more than one study shows that nuclear is safer than coal by a very wide margin. Coal ash is also radioactive, and coal plants have very limited requirements to prevent it from escaping into the environment.
Even 'radioactive waste' really only feels scary because all of the bad stuff is condensed into a much smaller package, again when you adjust for energy produced.
The process is ludicrously energy intensive, with experts estimating that the industry could soon suck up as much electricity as an entire country.
Unperturbed, billionaires including Jeff Bezos, Peter Thiel and Bill Gates have poured substantial amounts of money into the idea.
However, while the emergent crop of startups like Helion has repeatedly claimed that fusion energy is right around the corner, we have yet to see any concrete results.
Of course, if Altman's rosy vision of the future of energy production were to turn into a reality, we'd have a considerably greener way to power these AI models.
According to an October paper published in the journal Joule, adding generative AI to Google Search alone balloons its energy uses by more than tenfold.
"Let’s not make a new model to improve only its accuracy and speed," University of Florence assistant professor Roberto Verdecchia told the New York Times.
The original article contains 525 words, the summary contains 149 words. Saved 72%. I'm a bot and I'm open source!
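To put the "more than tenfold" figure quoted above in rough perspective, here is a toy calculation using commonly cited ballpark numbers (my own assumptions, not figures from the article: ~0.3 Wh per standard search, ~3 Wh with an LLM, and ~8.5 billion searches a day):

```python
# Toy estimate of extra grid demand if every search ran through an LLM.
# All three inputs are ballpark assumptions, not measured figures.
STANDARD_WH = 0.3        # assumed energy per ordinary search
LLM_WH = 3.0             # assumed energy per LLM-augmented search (~10x)
QUERIES_PER_DAY = 8.5e9  # rough public estimate of daily searches

extra_wh = (LLM_WH - STANDARD_WH) * QUERIES_PER_DAY
print(f"~{extra_wh / 1e9:.0f} GWh of extra demand per day")
```

Even with these crude assumptions the result lands in the tens of gigawatt-hours per day, i.e. the continuous output of a couple of large power plants, which is why the "as much electricity as an entire country" comparison keeps coming up.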
Pocket nuke plants... have to be the stopgap between here and fusion. Are there still people working on those car-sized nuke plants for a more distributed system?