AI Industry Struggles to Curb Misuse as Users Exploit Generative AI for Chaos::Artificial intelligence just can’t keep up with the human desire to see boobs and 9/11 memes, no matter how strong the guardrails are.
Serious question - why should anyone care about using AI to make 9/11 memes? Boobs I can at least see the potential argument against (deepfakes and whatnot), but bad-taste jokes?
Are these image generation companies actually concerned they'll be sued because someone used their platform to make an image in bad taste? Even if such a thing were possible, wouldn't the responsibility be on the person who made it? Or at worst the platform that distributed the images, as opposed to the one that privately made it?
I'd guess that they are worried the IP owners will sue them for using their IP.
So Sonic's creators will say: you're profiting by using Sonic and not paying us for the right to use him.
But I agree that deep fakes can be pretty bad.
Protect the brand. That’s it.
Microsoft doesn’t want non-PC stuff being associated with the Bing brand.
It’s what a ton of the ‘safety’ alignment work is about.
This generation of models doesn’t pose any actual threat of hostile actions. The “GPT-4 lied and said it was human to try to buy chemical weapons” in the safety paper at release was comical if you read the full transcript.
But they pose a great deal of risk to brand control.
Yet still apparently not enough to run results through additional passes, which fix 99% of these issues, just at 2-3x the cost.
It’s part of why articles like these are ridiculous. It’s broadly a solved problem, it’s just the cost/benefit of the solution isn’t enough to justify it because (a) these issues are low impact and don’t really matter for 98% of the audience, and (b) the robust fix is way more costly than the low hanging fruit chatbot applications can justify.
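The "additional pass" mentioned above can be sketched roughly like this: generate a draft, then have a separate review step judge it against policy before anything reaches the user. This is a minimal illustrative sketch, not any vendor's actual pipeline; both model calls are stubbed out as plain functions (in a real system each would be an LLM or classifier call, which is where the 2-3x cost comes from).

```python
# Hypothetical two-pass "generate then review" pipeline.
# `generate` and `review` are stand-ins for real model calls.

BLOCKLIST = {"blocked-topic"}  # stand-in for a real policy classifier

def generate(prompt: str) -> str:
    # Stub for the first (cheap) generation pass.
    return f"draft response to: {prompt}"

def review(text: str) -> bool:
    # Stub for the second pass: a separate model or classifier
    # judges whether the draft complies with policy.
    return not any(term in text for term in BLOCKLIST)

def guarded_generate(prompt: str, retries: int = 2) -> str:
    # Generate, then review; regenerate on failure, refuse if all fail.
    for _ in range(retries + 1):
        draft = generate(prompt)
        if review(draft):
            return draft
    return "Sorry, I can't help with that."  # fallback refusal
```

The extra cost is easy to see: every user request now triggers at least two model calls, and more if the review rejects the draft and forces a retry.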
I don't see Adobe trying to stop people from making 9/11 memes in Photoshop, nor have they been sued over anything like that. I don't get why AI should be different. It's just a tool.
That’s a great analogy, wish I’d thought of it
I guess it comes down to whether the courts decide to view AI as a tool like Photoshop, or a service - like an art commission. I think it should be the former, but I wouldn't be at all surprised if the dinosaurs in the US gov think it's the latter.