10 points

By safe I mean in terms of privacy: whether there’s a possibility that someone can “intercept” the photos of the child. Sorry if I didn’t explain it well.

5 points

No, but they will get the metadata. The image itself should be secure. But then your recipient can download it, upload it to Google’s cloud, and so on.


You can delete or alter the metadata with multiple available EXIF editor apps.
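For illustration, stripping EXIF can be as simple as dropping the JPEG APP1 segment, which is where EXIF (and usually XMP) data lives. This is only a toy sketch of what those EXIF editor apps do, not a replacement for a proper tool:

```python
def strip_exif(jpeg_bytes: bytes) -> bytes:
    """Return the JPEG byte stream with any APP1 (EXIF/XMP) segments removed."""
    if jpeg_bytes[:2] != b"\xff\xd8":  # SOI marker
        raise ValueError("not a JPEG")
    out = bytearray(b"\xff\xd8")
    i = 2
    while i + 1 < len(jpeg_bytes):
        if jpeg_bytes[i] != 0xFF:
            # Not a marker: copy whatever remains verbatim
            out += jpeg_bytes[i:]
            break
        marker = jpeg_bytes[i + 1]
        if marker == 0xDA:  # Start Of Scan: compressed image data follows
            out += jpeg_bytes[i:]
            break
        length = int.from_bytes(jpeg_bytes[i + 2:i + 4], "big")
        segment = jpeg_bytes[i:i + 2 + length]
        if marker != 0xE1:  # 0xFFE1 = APP1, where EXIF lives; drop it, keep the rest
            out += segment
        i += 2 + length
    return bytes(out)
```

Note this keeps the pixel data untouched; it only removes the metadata container.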

3 points

Yeah, the GrapheneOS camera does it automatically as well. I was thinking of the message metadata: size, time, contacts, etc.

4 points

What’s referred to here as “metadata” is metadata about the communication itself, which Meta collects and uses extensively for marketing, not the image metadata stored in the image file.

17 points

In computer security, it always depends on your threat model. WhatsApp is supposed to be end-to-end encrypted, so nobody can intercept your messages in transit. However, once someone flags a message as inappropriate, this gets circumvented and the messages are forwarded to Meta. That is only supposed to happen when a message is flagged, so it’s unlikely in a family group. I trust this actually works the way Meta tells us, though I can’t be sure, because I haven’t dissected the app, and this may change in the future. And there is lawful interception.
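As a conceptual sketch of why end-to-end encryption keeps the relay server out: both parties agree on a key that the server never sees. The toy Diffie-Hellman and keystream cipher below are illustrative only; real E2EE (the Signal protocol WhatsApp uses) relies on Curve25519, double ratcheting, and authenticated encryption:

```python
import hashlib
import secrets

# Toy parameters: far too weak for real use, shown only to make the idea concrete.
P = 2**127 - 1  # prime modulus
G = 5           # generator

def keypair():
    """Generate a private exponent and the public value to publish."""
    secret = secrets.randbelow(P - 2) + 2
    return secret, pow(G, secret, P)

def shared_key(my_secret, their_public):
    """Both sides compute the same shared secret, then hash it into a key."""
    shared = pow(their_public, my_secret, P)
    return hashlib.sha256(str(shared).encode()).digest()

def xor_cipher(key, data):
    """Toy keystream cipher; the same call encrypts and decrypts."""
    stream, block = b"", key
    while len(stream) < len(data):
        block = hashlib.sha256(block).digest()
        stream += block
    return bytes(a ^ b for a, b in zip(data, stream))

# Alice and Bob publish only A and B. A server relaying the messages sees
# A, B, and ciphertext, but cannot derive the key from those alone.
a_secret, A = keypair()
b_secret, B = keypair()
key = shared_key(a_secret, B)
ciphertext = xor_cipher(key, b"baby photo bytes")
```

The flagging mechanism described above works around this by having the recipient’s client re-upload the already-decrypted content, not by breaking the encryption.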

Mind that people can download or screenshot messages and forward them or do whatever they like with the pictures.

And another thing: if you have sync enabled, Google Photos will sync the pictures you take to their cloud servers, and they’ll end up there. Apple does the same with iCloud. As far as I know, both platforms automatically scan pictures to help fight crime and child exploitation. We aren’t allowed to know how those algorithms work in detail. I doubt a toddler in clothes or wrapped in a blanket will trigger the automatism. They claim a ‘high level of accuracy’. But people generally advise against taking pictures of children without clothes with a smartphone. Bad incidents have already happened.

Edit: Apple seems to have pushed for cloud scanning initially, but currently that doesn’t happen anymore. They have some on-device filters, as far as I understand.

5 points

As far as I know both platforms automatically scan pictures to help fight crime and child exploitation.

Apple doesn’t. They should but they don’t. They came up with a really clever system that would do the actual scanning on your device immediately before uploading to iCloud, so their servers would never need to analyze your photos, but people went insane after they announced the plan.
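The announced system matched perceptual hashes of photos against a blocklist of known-image hashes before upload. Apple’s actual design used a learned hash (NeuralHash) plus private set intersection; the 8×8 “average hash” below is only a conceptual stand-in for how near-duplicate matching works:

```python
def average_hash(pixels):
    """pixels: 8x8 grid of grayscale values 0-255 -> 64-bit integer hash.
    Each bit records whether a pixel is above the image's mean brightness."""
    avg = sum(sum(row) for row in pixels) / 64
    bits = 0
    for row in pixels:
        for p in row:
            bits = (bits << 1) | (1 if p >= avg else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_blocklist(pixels, blocklist, threshold=4):
    """Flag an image whose hash is within `threshold` bits of a known hash."""
    h = average_hash(pixels)
    return any(hamming(h, bad) <= threshold for bad in blocklist)
```

The point of hashing is that only compact fingerprints need to be compared, so a slightly recompressed or resized copy of a known image still matches, while unrelated photos should not.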

4 points

Oh. I didn’t know that. I don’t use Apple products and just read the news, I must have missed how the story turned out, so thanks for the info.

Technically, I suppose it doesn’t make a huge difference. It still gets scanned by Apple software, and sent to them if it’s deemed conspicuous. And the algorithm on a device is probably limited by processing power and energy budget, so it might even be less accurate. But this is just my speculation. I think all of that is more of a marketing stunt: the provider reduces cost, since they don’t need additional servers to filter the messages, and in the end it doesn’t really matter where exactly the content is processed if it’s a continuous chain like in the Apple ecosystem.

The last story I linked about the dad being incriminated for sending the doctor a picture would play out the same way, regardless.

Edit: I googled it, and it seems the story with Apple has changed multiple times. The last article I read says they don’t even do on-device scanning, just a ‘nude filter’, whatever that is. I’m cautious around cloud services anyway. And all of that might change and also affect old pictures. We just avoided mandatory content filtering in the EU, and upload filters and things like that are debated regularly. Also, the US has updated its laws regarding internet crime and prevention of child exploitation in recent years. I’m generally unsure where we’re headed with this.

2 points

You have to trust that Meta doesn’t do anything with your pictures before they are sent, and that the person you’re sending them to doesn’t back up their WhatsApp data to Google.

It’s more secure to use Signal.

16 points

Interception by a third party is highly unlikely, as the transport layer of basically everything is encrypted nowadays. What remains unknown is what Meta can do once the file is on their servers, as you’ll have to take Zuck’s word and trust Zuck’s encryption.

3 points

It’s end-to-end encrypted, so they can’t see it in transit. What they could potentially do is access it once it’s on your device, where it’s unencrypted.

1 point

Or through the backup, which is unencrypted by default. It goes to Google Drive, and there’s no guarantee that it doesn’t go to Meta.

5 points

But the Signal people also say WhatsApp’s e2e encryption is trustworthy, no?

3 points

If Meta really didn’t know your messages and encryption keys, they wouldn’t be able to recover every single one of your messages even if you forgot your password.

4 points

If someone is able to intercept WhatsApp messages, they aren’t using it to look at photos of your baby; they’re using it to spy on government officials.
