Apparently, stealing other people's work to create a product for money is now "fair use" according to OpenAI, because they are "innovating" (stealing). Yeah. Move fast and break things, huh?
"Because copyright today covers virtually every sort of human expression - including blogposts, photographs, forum posts, scraps of software code, and government documents - it would be impossible to train today's leading AI models without using copyrighted materials," wrote OpenAI in the House of Lords submission.
OpenAI claimed that the authors in that lawsuit "misconceive[d] the scope of copyright, failing to take into account the limitations and exceptions (including fair use) that properly leave room for innovations like the large language models now at the forefront of artificial intelligence."
OpenAI are not going to make the source code for their model accessible to all to learn from. This is 100% about profiting from it themselves. And using copyrighted data to create open source models would seem to violate the very principles the open source community stands for - namely that everybody contributes what they agree to, and everything is published under a licence. If the basis of an open source model is a vast quantity of training data from a vast quantity of extremely pissed off artists, at least some of the people working on that model are going to have an "are we the baddies?" moment.
The AI models are also never going to produce a solution to climate change that humans will accept. We already know what the solution is, but nobody wants to hear it, and expecting anyone to listen to ChatGPT and suddenly change their minds about using fossil fuels is ludicrous. And an AI that is trained specifically on knowledge about the climate and the technologies that could improve it, with the purpose of innovating some hypothetical technology that will fix everything without humans changing any of their behaviour, categorically does not need the entire contents of ArtStation in its training data. AIs built for specific tasks, like the ones trained to identify new antibiotics, use a very limited set of data, most of which is not protected by copyright, and the little that is can be easily licenced because the quantity is so small - and you don't see anybody complaining about those models!
OpenAI are not going to make the source code for their model accessible to all to learn from
OpenAI isn't the only company doing this, nor is their specific model the knowledge that I'm referring to.
The AI models are also never going to produce a solution to climate change that humans will accept.
We already know what the solution is, but nobody wants to hear it
Then it's not a solution. That's like telling your therapist, "I know how to fix my relationship, my partner just won't do it!"
expecting anyone to listen to ChatGPT and suddenly change their minds about using fossil fuels is ludicrous
Lol. Yeah, I agree, that's never going to work.
categorically does not need the entire contents of ArtStation in its training data.
That's a strong claim to make. Regardless of the ethics involved, or of the problems AI can solve today, the fact is we're seeing rapid advances in AI research as a direct result of these ethically dubious models.
In general, I'm all for the capitalist method of artists being paid their fair share for the work they do, but on the flip side, I see a very possible mass extinction event on the horizon, which could cause suffering the likes of which humanity has never seen. If we assume that is the case, and we assume AI has a chance of preventing it, then I would prioritize that over people's profits today. And I think it's perfectly reasonable to say I'm wrong.
And then there's the problem of actually enforcing any sort of regulation, which would be so much more difficult than people here are willing to admit. There's basically nothing you can do even if you wanted to. Your Carlin example is exactly the defense a company would use: "I guess our AI just happened to create a movie that sounds just like Paul Blart, but we swear it's never seen the film. Great minds think alike, I guess, and we sell only the greatest of minds."
Personally I think the claim that training an AI on the entire contents of ArtStation will lead to working technology that fixes climate change is the bolder one - and if there were any merit to it, the corporations who want copyright to be disapplied to artists would be able to produce some evidence for it. And if we're saying that getting rid of copyright protections will save the planet, then perhaps Disney should give up theirs as well. Because that's the reality here: we're expecting humans to be obliterated by AI but are not expecting the rich and powerful to make any sacrifices at all. And art is part of who we are as a species, and has been for hundreds of thousands of years. Replacing artists with AI because somehow that will fix climate change is not only a massive stretch, but what would we even be saving humanity for at that point? So that everybody can slave away in insecure, meaningless work so the few can hoard everything for themselves? Because the Star Trek utopia where AI does all the work and humans can pursue self-enrichment is not an option on the table. The tech bros just want you to think it is.