209 points
Deleted by creator
43 points

Other than their asinine charging cable/accessory situations, I consistently find myself agreeing with Apple pretty much any time a government body or group is mad that they won’t do something.

46 points

They’re generally on the wrong side of the battle for right to repair and removable batteries too.

But yeah, on privacy they almost always have the right of it.

21 points

Requiring USB-C was something I agreed with. But indeed, many times Apple has rightly fought for its user base.

18 points

How do you reckon?

The only time they’ve been on the consumer’s side was with regard to privacy: refusing to comply with the FBI, and now this.

Everything else they’re pretty anti-consumer. Off the top of my head:

  • first to remove the 3.5 mm headphone jack (even though I don’t really care about this, others do)
  • sticking to the shitty Lightning cable so they could sell overpriced cables
  • the charger thing with the EU
  • worst of all, entirely against right to repair
1 point
Deleted by creator
0 points

To be fair, those first three points fall squarely under that “charging cable/accessory situations” exception. With Apple, it turns out that’s a pretty broad exception.

-28 points
Deleted by creator
24 points

Remember how everyone kicked up a giant stink about Apple adding “on-device CSAM scanning when uploading photos to iCloud”?

They did that precisely because it would allow them to search for CSAM without giving up any privacy. As I said back when all that rage was happening: if Apple doesn’t get to implement it this way, you can be damn sure the government will force them to implement CSAM scanning in a much more privacy-destroying way. And, well, here we are.

21 points

CSAM without giving up any privacy.

Hmmm, funny, because security researchers said the opposite; I kinda believe them more?

2 points

Who said it was giving up privacy? The worst I heard was the slippery-slope argument: if they do this, they might add more to it later. And how was it privacy-compromising?

1 point

How did they say it’s giving up privacy?

15 points

Like the politicians would have cared. This is just a convenient excuse. Either they would have found another one, or they would have said, “We can’t trust Apple to scan for this material. The police have to do these scans!”

We were right to oppose it then and we are right to oppose it now.

-9 points

We were right to oppose it then and we are right to oppose it now.

You were right to oppose doing it in the most privacy-conscious way? Or were you against CSAM scanning at all?

7 points

CSAM, as defined by Apple (spoiler: that could be anything), including, and I could rattle off names, anything that threatens the government or those who have their tendrils in it; if, for example, authoritarians turn us fascist, or re-introduce slavery or segregation. A mere picture of your bedroom or face could contain something that allows you to be put into a cohort for later use (legal or not).

4 points

No, that’s not at all how it was defined or what it could be. CSAM is Child Sexual Abuse Material. It wasn’t going to be memes of Winnie the Pooh, like people argued.

That’s also not how CSAM matching works. It simply compares hashes of images. If you take a photo of yourself in your bedroom with a sign saying “fuck the government”, it will not match any CSAM database hashes, no matter how authoritarian or fascist the government is, because that same photo is not in their CSAM databases.

You’re doing what the outraged did back then: assuming CSAM scanning is some sort of AI-powered image recognition that scans images for specific things. It’s not that at all. It is a database of known CSAM images that have been hashed and confirmed by multiple different governments (multiple, so one government can’t just put a photo of a president they don’t like in theirs and then find out who has uploaded it; if an image appears in only one government’s CSAM database, it will not be checked). The system takes your photo, hashes it, and checks whether that hash is in the CSAM database. It won’t be, ever.

You know what will be in there and matched? Child porn that is already out there on the web, if you download it.
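The matching idea described above can be sketched in a few lines. This is a toy illustration only: it uses an exact SHA-256 hash, whereas Apple’s actual proposal used NeuralHash (a perceptual hash that tolerates resizing and re-encoding) plus a private set intersection protocol, and the image bytes and database contents here are entirely made up.

```python
import hashlib

# Hypothetical database of hashes of known, confirmed flagged images.
known_hashes = {
    hashlib.sha256(b"known-flagged-image-bytes").hexdigest(),
}

def hash_matches(image_bytes: bytes) -> bool:
    """Return True only if this image's hash is already in the database."""
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes

# A brand-new photo cannot match, whatever its content shows:
print(hash_matches(b"photo of my bedroom with a protest sign"))  # False
# Only a copy of an already-known image in the database matches:
print(hash_matches(b"known-flagged-image-bytes"))  # True
```

The key property the comment relies on: matching is against a fixed set of previously catalogued images, so nothing that isn’t already in the database can ever produce a hit.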

5 points

Anything scanning messages or media on my device is an absolute NO if I don’t control it.

1 point

You did control it though. It only scanned what you were uploading to iCloud, and only during the upload process.

If you turned off iCloud upload it never scanned anything.

20 points

So basically, Apple doesn’t want government spyware on their phones.

3 points

Exactly! Apple wants to make sure the personal data they hand out comes directly from them.

