I just want to make funny pictures.
I’ve gotten arguments that it’s theft, because technically the AI is utilizing other artists’ work as resources for the images it produces. I’ve pointed out that that’s more like copying another artist’s style than theft, which real artists do all the time, but apparently it’s different when a computer algorithm does it?
Look, I understand people’s fears that AI image generation is going to put regular artists out of work, I just don’t agree with them. Did photography put painters out of work? Did the printing press stop the use of writing utensils? Did cinema cause theatre to go extinct?
No. People need to calm down and stop freaking out about technology moving forward. You’re not going to stop it; so you might as well learn to live with it. If history is a reliable teacher, it really won’t be that bad.
Except it isn’t copying a style. It’s taking the actual images and turning them into statistical arrays and then combining them into an algorithmic output based on your prompt. It’s basically a pixel by pixel collage of thousands of pictures. Copying a style implies an understanding of the artistic intent behind that style. The why and how the artist does what they do. Image generators can do that exactly as well as the Gaussian Blur tool can.
The difference between the two is that you can understand why an artist made a line and copy that intent, but you’ll never make exactly the same line. You’re not copying and pasting that one line into your own work, while that’s exactly what the generator is doing. It just doesn’t look like it because it’s buried under hundreds of other lines taken from hundreds of other images (though sometimes it just gives you straight-up Darth Vader in the image).
And just about any artist can draw Darth Vader as well. Almost all non-“ethics” or intent-based arguments can be applied to artists or sufficiently convoluted machine models.
But just about any artist isn’t reproducing a still from The Mandalorian in the middle of a picture like right-clicking and hitting “save as” on a picture you got from a Google search. Which these generators have done multiple times. A “sufficiently convoluted machine model” would be a sentient machine. At the level required for what you’re talking about, we’re getting into the philosophical area of what it means to be a sentient being, which is so far removed from these generators as to be irrelevant to the point. And at that point, you’re not creating anything anyway. You’ve hired a machine to create for you.
These models are tools that use an algorithm to collage pre-existing works into a derivative work. They cannot create. If you tell a generator to draw a cat, but it doesn’t have any pictures of cats in its data set, you won’t get anything. If you feed AI images back into these generators, they quickly degrade into garbage, because they don’t have a concept of anything. They don’t understand color theory or two-point perspective or anything. They are simply programmed to output their collection of vectorized arrays in an algorithmic format based upon certain keywords.
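The feedback-degradation point above can be illustrated with a toy sketch. This is emphatically not how image models actually work; it just stands in for the idea with the simplest possible “model” (a 1-D Gaussian fit), repeatedly retrained on its own samples. All names and parameters here are invented for the illustration.

```python
import random
import statistics

def recursive_fit(n_samples=50, generations=500, seed=0):
    """Toy 'model collapse' sketch: fit a Gaussian to data, then
    replace the data with samples drawn from the fitted model,
    and repeat. The fitted spread tends to shrink generation
    after generation until the 'model' has collapsed."""
    rng = random.Random(seed)
    # Start from 'real' data: standard normal samples.
    data = [rng.gauss(0.0, 1.0) for _ in range(n_samples)]
    history = []
    for _ in range(generations):
        mu = statistics.fmean(data)
        sigma = statistics.pstdev(data)  # ML estimate, slightly biased low
        history.append(sigma)
        # Next generation trains only on the model's own output.
        data = [rng.gauss(mu, sigma) for _ in range(n_samples)]
    return history

if __name__ == "__main__":
    hist = recursive_fit()
    print(f"fitted std, generation 1:   {hist[0]:.3f}")
    print(f"fitted std, generation 500: {hist[-1]:.6f}")
```

The same shrinking-variance effect has been reported for real generative models trained recursively on their own outputs, though the mechanisms there are far more complicated than this stand-in.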
It’s taking the actual images and turning them into statistical arrays and then combining them into an algorithmic output based on your prompt.
So looking at images to build a generalised understanding of them, and then reproducing based upon additional information, isn’t exactly what our brain does to copy someone’s style?
You are arguing against your own point here. You don’t need to “understand the artistic intent” to copy. Most artists don’t.
Well said.
I’d like to add that the biggest problem, imo, is the closed-source nature of the models. Corporations using our collective knowledge, without permission, to create AI to sell back to us is unethical at best. All AI models should be open source for public access, sort of like libraries. Corpos are thrilled we’re fighting about copyright pennies instead, I’m sure.
If only there was a way to make funny pictures without AI…
The problem with Generative Neural Networks is not generally the people using them so much as the people who are creating them for profit using unethical methods.
As far as I’m concerned, if you’re using AI it’s no worse than grabbing a random image from the internet, which is a common and accepted practice for many situations that don’t involve a profit motive.
The “profit motive” is just the tip of the iceberg.
I’ve seen people stop grabbing random images from the web and instead go full AI. A grabbed image at least doubled as an advertisement thanks to reverse image search; nowadays you’re getting even less of that.
Oooo an AI straw man
Hey, as long as you don’t try to
- Sell it
- Claim it’s yours
- Use it instead of hiring professionals if you’re a business
I’m not too fussed.
Why not sell it? Pet Rocks were sold.
Why not claim it’s yours? You wrote the prompt. See Pet Rocks above.
Not use it and instead hire a professional? That argument died with photography. Don’t take a photo, hire a painter!
So what if AI art is low quality? Not every product needs to be art.
Why not sell it? Pet Rocks were sold.
Why not claim it’s yours? You wrote the prompt. See Pet Rocks above.
Because, unlike pet rocks, AI generated art is often based on the work of real people without attribution or permission, let alone compensation.
Not use it and instead hire a professional? That argument died with photography. Don’t take a photo, hire a painter!
So what if AI art is low quality? Not every product needs to be art.
Do you know what solidarity is? Any clue at all?
Seems like the concept is completely alien to you, so here you go:
Copyright and intellectual property are a lie cooked up by capitalists to edge indie creators out of the market.
True solidarity is making AI tools and freely sharing them with the world. Not all AIs are locked down by corporations.
Do you know what solidarity is?
Do you know what a luddite is?
The simplest argument, supported by many painters and a section of the public, was that since photography was a mechanical device that involved physical and chemical procedures instead of human hand and spirit, it shouldn’t be considered an art form.
That a particular AI could have used copyrighted work is a completely different argument than what was first stated.
Solidarity with you bourgeoisie fucks is like the solidarity of the turtle with the scorpion
Why not sell it? Because chances are the things it was trained off of were stolen in the first place and you have no right to claim them
Why not claim it’s yours? Because it is not, it is using the work of others, primarily without permission, to generate derivative work.
Not use it and hire a professional? If you use AI instead of an artist, you will never make anything new or compelling; AI cannot generate images without a stream of information to train off of. If we don’t have artists and replace them with AI, like dumbass investors and CEOs want, we will reach a point where it is AI training off AI and the well will be poisoned. AI should be used simply as a tool to help with the creation of art, if anything; using it to generate “new” artwork is a fundamentally doomed concept.
If your AI was trained entirely off work you had the rights to, sure. But it was not.
Why is it valid for you to be trained off of art you didn’t have rights to but not for an open source program running locally on my PC?
It would not be a copyright violation if you created a completely original super hero in the art style of Jack Kirby.
Use it instead of hiring professionals if you’re a business
Why wouldn’t you though?
Remember when corporations tried to claim that money you didn’t spend on their product was theft? This way of thinking has been recycled by the anti-AI bros.
Turns out all the money you don’t spend on struggling artists is not only theft, but also class warfare. You stinking bougie you.
Because that’s a harm to society and economy.
It’s gutting entire swaths of middle-class careers, and funneling that income into the pockets of the wealthy.
If you’re a single-person startup using your own money and you can’t afford to hire someone else, sure. That’s ok until you can afford to hire someone else.
If you’re just using it for your personal hobbies and for fun, that’s probably ok.
But if you’re contributing to unemployment and suppressed wages just to avoid payroll expenses, there is a guillotine with your name on it.
Please don’t use the “but it creates jobs” argument.
Me shitting in the street also “creates jobs” because someone has to clean it.
I think what matters is whether you would’ve otherwise hired someone. Otherwise I can’t see it making any impact.
And in a lot of cases you would’ve paid a stock photo company anyway.
Because then artists aren’t getting paid but you’re still using their art. The AI isn’t making art for you just because you typed a prompt in. It got everything it needs to do that from artists.
So it’s more of an ethical “someone somewhere is probably being plagiarized and that’s bad” thing and not really a business or pragmatic decision. I guess I can get that but can’t see many people following through with that.
Some people got mad at a podcast I follow because they use AI generated episode covers. Which is funny because they absolutely wouldn’t be paying an artist for that work, it’d just be the same cover, so not like they switched from paying someone to not paying them.
So because I use chatgpt for help coding data analysis scripts, I am no longer a mechanical engineer?
No, you are a mechanical engineer that uses AI.
“Prompt Engineer” is a “real” job title
The issue has never been the tech itself. Image generators are basically just a more complicated Gaussian Blur tool.
The issue is, and always has been, the ethics involved in the creation of the tools. The companies steal the work they use to train these models without paying the artists for their efforts (wage theft). They’ve outright said that they couldn’t afford to make these tools if they had to pay copyright fees for the images that they scrape from the internet. They replace jobs with AI tools that aren’t fit for the task because it’s cheaper to fire people. They train these models on the works of those employees. When you pay for a subscription to these things, you’re paying a corporation to do all the things we hate about late stage capitalism.
I think that, in many ways, AI is just worsening the problems of excessive copyright terms. Copyright should last 20 years, maybe 40 if it can be proven that the work is actively in use.
Copyright is its own whole can of worms that could have entire essays just about how it and AI cause problems. But the issue at hand really comes down to one simple question:
Is a man not entitled to the sweat of his brow?
“No!” Says society. “It’s not worth anything.”
“No!” Says the prompter. “It belongs to the people.”
“No!” Says the corporation. “It belongs to me.”
Does it not belong to the people? That saying is a shitty analogy for this. You’re entitled to the sweat of your brow, but not to more from society; and if you use the free infrastructure of the commons to share your work, it belongs to the commons.
I think you are making the mistake of assuming disagreement with your stance means someone would say no to these questions. Simply put: it’s a strawman.
Most (yes, even corporations, albeit much less so for the larger ones) would say “Yes” to this question at face value, because they would want the same for their own “sweat of the brow”. But certain uses after the work is created no longer have a definitive “Yes” as their answer, which is why your “simple question” is not an accurate representation: it draws no distinction between those uses. You cannot stop your publicly posted work from being analyzed, by human or computer. This is firmly established. As others have put it in this thread, reducing protections over analysis would be detrimental to artists and everyone else alike. It would quite literally slow society’s ability to advance, if not halt it completely, since most research requires analysis of existing data, and most of that is computer-assisted.
Artists have always been undervalued, I will give you that. But to mitigate that, we should provide artists better protections that don’t rely on breaking down other freedoms. For example, UBI. And I wish people who are against AI would focus on that, since that is actually something you could get agreement on with most of society and actually help artists with. Fighting against technology that, besides its negatives, also provides great positives is a losing battle.
Agreed. The problem is that so many (including in this thread) argue that training AI models is no different than training humans—that a human brain inspired by what it sees is functionally the same thing.
My response to why there is still an ethical difference revolves around two arguments: scale, and profession.
Scale: AI models’ sheer image output makes them a threat to artists where other human artists are not. One artist clearly profiting off another’s style can still be inspiration, and even part of the former’s path toward their own style; however, the functional equivalent of ten thousand artists doing the same is something else entirely. The art is produced at a scale that could drown out the original artist’s work, without which such image generation wouldn’t be possible in the first place.
Profession: Those profiting from AI art, which relies on unpaid scraping of artists’ work for data sets, are not themselves artists. They are programmers, engineers, and the CEOs and stakeholders who can even afford the ridiculous capital necessary in the first place to utilize this technology at scale. The idea that this is just a “continuation of the chain of inspiration from which all artists benefit” is nonsense.
As the popular adage goes nowadays, “AI models allow wealth to access skill while forbidding skill to access wealth.”