TAKING OUR JOBS
HARASSING WOMEN AND CHILDREN
-
Boys are taking images of female classmates and using AI to deepfake nude photos - Fortune
-
Female Fox journalist harassed and chased by migrants while reporting outside shelter - Dailymail
A THREAT TO OUR WAY OF LIFE
-
A.I. Poses ‘Risk of Extinction,’ Industry Leaders Warn - NYTimes
-
Poll: Americans Fear their way of life is under threat - Fox News
THEY’RE SHITTING ON THE BEACHES
REWRITING HISTORY BY DOCTORING PHOTOS WITH NEVER-BEFORE-SEEN PHOTO MANIPULATIONS
Sorry everyone, I keep forgetting which zeitgeist the media is currently using to make us hate and fear something.
…did you just post 6 completely random articles as if there was some sort of point other than “news sites report lots of different news?”
did you just post 6 completely random articles
No, I mean there are headings and groupings to assist with the inference
as if there was some sort of point other than “news sites report lots of different news?”
There might be a point. I see an association. If others do as well that’s good. If others don’t that is also ok.
To spell it out directly: I think it’s weird that the media is recycling its AI headlines from Republican immigration headlines.
Often I cannot see the forest for the trees but sometimes I feel the presence of it even when I’m in it.
TL;DR: The new Reimage feature on the Google Pixel 9 phones is really good at AI manipulation, while being very easy to use. This is bad.
It’ll sink in for you when photographic evidence is no longer admissible in court
Photoshop has existed for a bit now. So incredibly shocking that it was only going to get better and easier to do. Move along with the times, old-timer.
I really don’t have much knowledge on it, but it sounds like it would be an actually good application of blockchain.
Couldn’t a blockchain be used to certify that pictures are original and have not been tampered with?
On the other hand, if it were possible, I’m certain someone would have already started it; it’s the perfect investor magnet: “Using blockchain to counter AI.”
How would that work?
I am being serious. I work in IT and can’t see how that would work in any realistic way.
And even if we had a working system to track all changes made to a photo, it would only work if the author submitted the original image before any change had been made. But how would you verify that the original copy of a photo submitted to the system has not been tampered with?
Sure, you could be required to submit the raw file from the camera, but it is only a matter of time until AI can perfectly simulate an optical sensor to produce a simulated raw of a simulated scene.
Nope, we simply have to fall back on building trust with photojournalists, and trust digital signatures to tell us when we are seeing a photograph modified outside of the journalist’s agency.
Yep, I think pictures are becoming only as valuable as text, and it is fine; we just need to get used to it.
Before photography became mainstream, the only source of information was written. It is extremely simple to make up a fake story, so people had to rely on trusted sources. Then, for a short period of history, photography became a (kinda) reliable source of information by itself, and this trust system lost its importance.
In most cases, seeing a photo meant we were seeing a true reflection of what happened, especially if we were seeing multiple photos of the same event.
Now we are arriving at the end of this period; we cannot trust a photo by itself anymore, and tampering with a photo is becoming as easy as writing a fake story. This is a great opportunity for journalists, I believe.
We’ve had fake photos for over 100 years at this point.
https://en.wikipedia.org/wiki/Cottingley_Fairies
Maybe it’s time to do something about confirming authenticity, rather than just accepting any old nonsense as evidence of anything.
At this point anything can be presented as evidence, and now can be equally refuted as an AI fabrication.
We need a new generation of secure cameras with internal signing of images and video (to prevent manipulation), built in LIDAR (to make sure they’re not filming a screen), periodic external timestamps of data (so nothing can be changed after the supposed date), etc.
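The “periodic external timestamps” part of that wishlist can be pictured as a hash chain: each capture commits to the hash of the previous entry, so nothing earlier in the chain can be altered after the fact without breaking verification. This is only a toy sketch under that assumption; the entry format, the genesis value, and the idea of a trusted timestamping service are all hypothetical, not any existing camera standard.

```python
import hashlib
import json

def timestamp_entry(image_bytes: bytes, prev_entry_hash: str, timestamp: str) -> dict:
    """Build one link of a hash chain: the image hash is bound to the
    previous entry, so neither can be altered without breaking the chain."""
    entry = {
        "image_sha256": hashlib.sha256(image_bytes).hexdigest(),
        "prev": prev_entry_hash,
        "ts": timestamp,
    }
    entry["entry_hash"] = hashlib.sha256(
        json.dumps({k: entry[k] for k in ("image_sha256", "prev", "ts")},
                   sort_keys=True).encode()
    ).hexdigest()
    return entry

def verify_chain(entries: list) -> bool:
    """Recompute every link; any edit to an image hash, timestamp, or
    ordering changes a hash and the chain no longer verifies."""
    prev = "0" * 64  # arbitrary genesis value
    for e in entries:
        recomputed = hashlib.sha256(
            json.dumps({"image_sha256": e["image_sha256"], "prev": e["prev"],
                        "ts": e["ts"]}, sort_keys=True).encode()
        ).hexdigest()
        if e["prev"] != prev or recomputed != e["entry_hash"]:
            return False
        prev = e["entry_hash"]
    return True

# Chain two "photos"; tampering with the first breaks verification.
e1 = timestamp_entry(b"photo-1-raw-bytes", "0" * 64, "2024-08-22T10:00:00Z")
e2 = timestamp_entry(b"photo-2-raw-bytes", e1["entry_hash"], "2024-08-22T10:05:00Z")
print(verify_chain([e1, e2]))   # True
e1["image_sha256"] = hashlib.sha256(b"doctored").hexdigest()
print(verify_chain([e1, e2]))   # False
```

In a real deployment the chain head would be published somewhere outside the photographer’s control (that is the “external” part), so the publication date bounds when the images could last have been changed.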
I am very opposed to this. It means surrendering all trust in pictures to Big Tech. If at some time only photos signed by Sony, Samsung, etc. are considered genuine, then photos taken with other equipment, e.g., independently manufactured cameras or image sensors, will be dismissed out of hand. If, however, you were to accept photos signed by the operating system on those devices regardless of who is the vendor, that would invalidate the entire purpose because everyone could just self-sign their pictures. This means that the only way to effectively enforce your approach is to surrender user freedom, and that runs contrary to the Free Software Movement and the many people around the world aligned with it. It would be a very dystopian world.
There’s no need to make these things Big Tech, so if that’s why you are opposed to it, reconsider what you are actually opposed to. This could be implemented in a FOSS way or an open standard.
So you don’t trust HTTPS because you’d have to trust Big Tech? Microsoft, Google, and others sign the certificates you use to trust that you are sending your password to your bank and not to a phisher. Just like any browser can see and validate certificates, any camera could have a validation or certificate system in place to prove that the data came straight from an unmodified, validated camera sensor.
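The shape of that idea can be sketched in a few lines: the camera firmware signs the raw sensor bytes at capture time, and anyone can later check the file against the signature. A real scheme would use an asymmetric key pair burned into the device with the public key in a vendor certificate (as in HTTPS); the stdlib HMAC below is only a symmetric stand-in for that signature, and the key name is made up for illustration.

```python
import hashlib
import hmac

# Hypothetical per-device key, standing in for the private key a real
# camera would hold in secure hardware.
CAMERA_SECRET_KEY = b"example-device-key-not-real"

def sign_capture(sensor_bytes: bytes) -> str:
    """'Camera firmware' signs the raw sensor data at capture time."""
    return hmac.new(CAMERA_SECRET_KEY, sensor_bytes, hashlib.sha256).hexdigest()

def verify_capture(sensor_bytes: bytes, signature: str) -> bool:
    """A browser-like verifier checks the data against the signature;
    any pixel edited after capture makes verification fail."""
    expected = hmac.new(CAMERA_SECRET_KEY, sensor_bytes, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

raw = b"unmodified sensor data"
sig = sign_capture(raw)
print(verify_capture(raw, sig))                  # True
print(verify_capture(b"AI-edited pixels", sig))  # False
```

The hard part isn’t the math, it’s the trust chain: who issues the device certificates and what happens when a key is extracted, which is exactly the objection in the previous comment.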
It would also involve trusting those corporations not to fudge evidence themselves.
I mean, not everything photo-related would have to be like this.
But if you wanted your photo to be able to document things, to provide evidence that could send people to prison or get them executed…
The other choice is that we no longer accept photographic, audio or video evidence in court at all. If it can no longer be trusted and even a complete novice can convincingly fake things, I don’t see how it can be used.
It’s a shitty toy that’ll make some people sorry when they realize they don’t have a single photo from their night out without a tiny Godzilla dancing on their table. It won’t have the staying power Google wishes it to, since it’s useless except for gags.
But, please, Verge,
It took specialized knowledge and specialized tools to sabotage the intuitive trust in a photograph.
get fucked
Not as easy and accessible as now.
Before, I didn’t even know how to erase a pimple on my selfies. Now I can easily generate a picture of a photorealistic catgirl riding a bike naked in Times Square that could fool any elder in my neighborhood.
There are some really subtle details experts can look at to detect Photoshop work, such as patterns in the JPEG artifacts that can indicate a photo was recompressed multiple times in some areas but not others.
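The core of that forensic trick is double quantization: JPEG rounds DCT coefficients to multiples of a step size set by the quality level, and requantizing already-rounded values with a different step leaves periodic gaps in the coefficient histogram. This toy sketch uses plain integers instead of a real JPEG decoder, purely to show why some histogram bins empty out in regions compressed twice.

```python
from collections import Counter

def quantize(values, step):
    """Round each value to the nearest multiple of `step` (roughly what
    JPEG does to DCT coefficients, with `step` set by the quality level)."""
    return [round(v / step) * step for v in values]

coeffs = list(range(0, 200))               # stand-in for raw DCT coefficients
once = quantize(coeffs, 7)                 # single compression at one "quality"
twice = quantize(quantize(coeffs, 10), 7)  # recompressed at a different one

# After one pass the multiples of 7 are filled in fairly evenly; after two
# passes whole bins go missing or double up -- the periodic pattern
# analysts look for in double-compressed regions of an image.
hist_once = Counter(once)
hist_twice = Counter(twice)
print(sorted(set(hist_once) - set(hist_twice)))  # bins emptied by the second pass
```

Real error-level analysis works on actual 8x8 DCT blocks, but the statistical fingerprint is the same idea: one region of the image shows single-compression statistics and a pasted-in region shows double-compression statistics.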