In this video I discuss how generative AI technology has grown far past the government's ability to effectively control it, and how the current legislative measures could lead to innocent people being jailed.
Didn’t watch the video, but I don’t care about AI CSAM. Even if it looks completely lifelike, it’s not real.
What data is it trained on? This isn’t meant to be a “gotcha” question, I’m wondering about it.
Prove it’s fake when some of it depicting your daughter is making its way around school.
You’ve missed the point. Fake or not, it does damage to people. And eventually it won’t be possible to determine if it’s real or not.
When that becomes widespread, photos will be generatable for literally everyone — not just minors, but every person with photos online. It will be a societal shift; images will be assumed to be AI generated, making any guilt or shame about a nude photo existing obsolete.
What a disgusting assumption. And the best argument against AI I’ve ever heard.
AI generated porn depicting real people seems like a different and much bigger issue.
AI generated CSAM in general, while disgusting, at least doesn’t directly harm people; fabricated nudes of real people most definitely do, regardless of the age of the victim.