Or maybe artists should be able to not justify their existence monetarily, and also not have their art fucking stolen and murdered to generate terrible pseudo-art lmao.
The only reason you care is because you’ve been conditioned to attack anything that could harm your income-potential.
My ‘favorite’ is the argument that replacing jobs is what technology is meant to do.
This isn’t just a job. If I won the lotto tomorrow, if I had billions and billions of dollars and never had to make another cent in my life, I would still be writing. Art is not just a production, it is a form of communication, between artist and audience, even if you never see them.
Writing has always been something like tossing a message in a bottle into a sea of bottles and hoping someone reads it. Even if the argument that AI can never replace human writing in terms of quality is true, we’re still drowned out by the noise of it.
It's a good point but honestly the internet is mostly just noise and it's not a problem we're going to solve. It's something we have to learn to live with. If you take more than a passing interest in an art, you should be able to find an island in the ocean of noise with like-minded people.
The issue is, the goal of AI companies and AI believers is the replacement of artists. They see more traditional forms of art as obsolete. Especially the AI believers, who do everything they can (including faking art progress shots and entering art contests with AI-generated images) to "disrupt the art community".
You can't "defeat" AI. It's not an organization or a group to fight against; it's technological progress.
The weavers' uprisings didn't stop the Industrial Revolution either. AI is a tool, and those who learn to handle and adapt to it the fastest will be the ones who fare the best.
The weavers fought the capitalist system, not industrial progress; the weaving machines were simply the easiest target, as they were expensive to build but highly profitable.
Same here: The goal is the overcoming of capitalism but until then we can annoy them by messing with their new toys.
It's not even a matter of bullying: NFTs disappeared because they were fundamentally not viable, and there's a good chance that generative AI is also not viable.
Generating an output is extremely computationally expensive, which is a problem because you need several attempts to get an acceptable output (at least in terms of images). This service can't stay free or cheap forever, and once it starts being expensive, that's also a problem in itself since generative AI is most suited to generate large amounts of low-profit content.
For example, earlier this month, DeviantArt highlighted a creator they claimed was one of their highest earners; they made $25k "in less than a year", which is not much for the highest earner, and they did it by posting over NINE THOUSAND images in that time. They were selling exclusives for less than $10.
The only way this makes sense is if it's really cheap to generate that many images. Even a moderate price, multiplied by 9000, multiplied by the number of attempts each time, would have destroyed their already middling profit.
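To put rough numbers on that (only the $25k and the 9,000 images come from the post above; the attempts-per-image and per-attempt cost below are assumptions I'm making up for illustration), here's the back-of-the-envelope version:

```python
# Back-of-the-envelope check on the DeviantArt example.
# revenue and images_posted are from the post; the other two values are assumed.
revenue = 25_000            # reported earnings in USD, "in less than a year"
images_posted = 9_000       # reported number of images posted
attempts_per_image = 5      # assumed: several generations per usable image
cost_per_attempt = 0.05     # assumed: USD per generation attempt

revenue_per_image = revenue / images_posted             # ~$2.78
cost_per_image = attempts_per_image * cost_per_attempt  # $0.25 under these assumptions

print(f"revenue per image: ${revenue_per_image:.2f}")
print(f"cost per image:    ${cost_per_image:.2f}")
print(f"margin per image:  ${revenue_per_image - cost_per_image:.2f}")
```

Even with these toy numbers the margin is thin for a "highest earner"; nudge the per-attempt cost toward what the compute actually costs to provide and it evaporates.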
Like, reading this comic as a normal person, I see a ha-ha funny joke about the robot doing a hitler. Why does being rejected from art school make him do this? Uh... I don't know. It's just a reflection on an old story, don't think too much about it.
Viewing this comic through the lens of a nazi, however, doesn't it seem a little bit like a call to action? As if it's excusing the violence the nazi robot will engage in as a kind of justice for people's dismissal of AI art?
The only thing you have to do to reach that second conclusion is not believe that the nazi outcome is a bad one.
Anyway, I dunno. I'll never know what Stonetoss really meant, I just think it's interesting.
Nobody has been able to make a convincing argument in favour of generative AI. Sure, it's a tool for creating art. It abstracts the art making process away so that the barrier to entry is low enough that anyone can use it regardless of skill. A lot of people have used these arguments to argue for these tools, and some artists argue that because it takes no skill it is bad. I think that's beside the point. These models have been trained on data that is, in my opinion, both unethical and unlawful. They have not been able to conclusively demonstrate that the data was acquired and used in line with copyright law. That leads to the second, more powerful argument: they are using the labour of artists without any form of compensation, recognition, permission, or credit.
If, somehow, the tools could come up with their own styles and ideas then it should be perfectly fine to use them. But until that happens (it won't, nobody will see unintended changes in AI as anything other than mistakes because it has no demonstrable intent) use of a generative AI should be seen as plagiarism or copyright infringement.
Copyright gives the copyright holder exclusive rights to modify the work, to use the work for commercial purposes, and attribution rights. The use of a work as training data constitutes using a work for commercial purposes, since the companies building these models are distributing and licensing them for profit. I think it would be a marginal argument to say that the output of these models constitutes copyright infringement on the basis of modification, but worth arguing nonetheless. Copyright does only protect a work up to a certain, indefinable amount of modification, but some of the outputs would certainly constitute infringement in any other situation. And these AI companies would probably find it nigh impossible to disclose specifically who the data came from.
So, how do art students learn? They are doing the exact same things. Only they do a lot less, because natural neural networks (aka brains) are not capable of processing training data as quickly. It's not as if every artist has to reinvent the wheel and generative AIs don't and as such have an unfair advantage.
Look at inventions like the printing press! Did everybody like it? The Catholic Church certainly didn't! Is it a fantastic piece of technology anyway? Sure is!
Students learn techniques that they apply to their own personal style. The goal of art school isn't to create a legion of artists that can churn out identical art, it's to give young creatives the tools they need to realize the ideas in their head.
AI has no ideas in its head. Instead, it takes in a bunch of an artist's work and then produces something that does its best to match the plagiarized artist's style exactly.
Unions would probably work, as long as you get some people the company doesn't want to replace in there too
Maybe also federal regulations, although that would probably just slow it down, because models are being made all around the world, including places like Russia and China that the US and EU don't have legal influence over
Also, it might be just me, but it feels like generative AI progress has really slowed; it almost feels like we're approaching the point where we've squeezed the most out of the hardware we have and now we just have to wait for the hardware to get better
Well, current law is not written with AI in mind, so what current law says about the legality of AI doesn't reflect its morality or how we should regulate it in the future
EFF does some good stuff elsewhere, but I don't buy this. You can't just break this problem down into small steps and then show for each step how it's fine when considered in isolation, while ignoring the overall effects. Simple example from a different area to make the case (came up with this in 2 minutes so it's not perfect, but you can flesh it out better):
Step 1: Writing an exploit is not a problem, because it's necessary that e.g., security researchers can do that.
Step 2: Sending a database request is not a problem, because if we forbid it the whole internet will break.
Step 3: Receiving freely available data from a database is not a problem, because otherwise the internet will break.
Conclusion: We can't say that hacking into someone else's database is a problem.
What is especially telling about the "AI" "art" case: The major companies in the field are massively restrictive about copyright elsewhere, as long as it's the product of their own valuable time (or stuff they bought). But if it's someone else's work, apparently it's not so important to consider their take on copyright, because it's freely available online so "it's their own fault to upload it lol".
Another issue is the chilling effect: I for one have become more cautious sharing some of my work on the internet, specifically because I don't want it to be fed into "AI"s. I want to share it with other humans, but not with exploitative corporations. Do you know a way for me to achieve this goal (sharing with humans but not "AI") in today's internet? I don't see a solution currently. So the EFF's take on this prevents people (me) from freely sharing their stuff with everyone, which would otherwise be something they would encourage and I would like to do.
I don't see a problem with Generative AI, because it's just going to be a great tool for companies to add graphics real fast to their products. I don't see it replacing regular art, since "AI art" is just a natural progression for endless content that you can already scroll on social media when you are bored.