Nah, it’s exactly the same. Arguably in some aspects more suspect, in that it doesn’t seem to have an opt-out at all and it IS sending some data over the Internet for remote processing.
Presumably better local security than the first version MS announced, but we’ll have to see when compared to the shipping version. Definitely obscuring what they’re actually doing a lot more. It’s Apple magic, not just letting some AI look at your screen and stuff.
But hey, ultimately, that’s my point. The fact that they went on that stage, sold the exact same thing and multiple people are out here, of all places going “no, but this time it’s fine” shows just how much better at selling stuff Apple is. I’m not particularly excited or intend to use either of these, but come on, Apple’s messaging was so far ahead of MS’s on this one.
Oh, did I miss that? Did they explain how that works and what AI features are still functional if you don’t turn it on?
EDIT: I’m not being passive aggressive here, BTW. I genuinely don’t know if they’ve explained this either way. If somebody can source it, I’m genuinely interested.
Apple’s solution doesn’t require 200 GB of screenshots with most personal info visible in plain text… Apple wins here because they have a clear structure in their OS and all the important data already lives in Apple’s own apps. And they’ve been analyzing this stuff heavily for a while, as you can see with all the Siri suggestions everywhere for, I don’t know, five years now? Microsoft’s chaos approach in Windows is now shooting them in the foot real hard.
I hope we can get an open-source Linux AI that runs locally and integrates the way Apple’s AI does. It should actually be more feasible there, since at least all apps are mostly installed the same way(s) and are designed to depend on each other.
I’m not saying anything particularly new and I’m mostly repeating what I’ve been saying since the announcement, but I’d argue that all of those caveats are entirely down to branding and PR, not engineering.
App design, yes. Microsoft built their Timeline 2 so that the UI actually shows you all the screenshots it took of you doing stuff, and that’s creepy. Apple doesn’t tell you what they’re pulling, and they are almost certainly processing it further to get deeper insights… but they do it in the background, so you don’t have to think about it as much.
So again, better understanding of the user, messaging and branding. Same fundamental functionality. Way different reactions.
Yes, but Apple doesn’t need to screenshot shit, that’s the point. They trained their customers to only use Apple apps, where they have full control, and they force developers to use their AI API to stay relevant.
Microsoft failed to convince users to use Microsoft everywhere, except with Teams and the Office suite.
Google has the relevant data of most Microsoft users, and screenshotting it (like scraping) would have let Microsoft get at that data without paying Google for it.
But that is kinda shady and thus not widely accepted.