Google’s latest flagship smartphone raises concerns about user privacy and security. It frequently transmits private user data to the tech giant before any app is installed. Moreover, the Cybernews research team has discovered that it potentially has remote management capabilities without user awareness or approval.
Cybernews researchers analyzed the new Pixel 9 Pro XL smartphone’s web traffic, focusing on what a new smartphone sends to Google.
“Every 15 minutes, Google Pixel 9 Pro XL sends a data packet to Google. The device shares location, email address, phone number, network status, and other telemetry. Even more concerning, the phone periodically attempts to download and run new code, potentially opening up security risks,” said Aras Nazarovas, a security researcher at Cybernews…
… “The amount of data transmitted and the potential for remote management casts doubt on who truly owns the device. Users may have paid for it, but the deep integration of surveillance systems in the ecosystem may leave users vulnerable to privacy violations,” Nazarovas said…
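For anyone curious how this kind of claim gets checked, here is a minimal sketch of one way to watch a phone's periodic check-ins from your own network. It assumes you point the handset's Wi-Fi proxy at a machine running mitmproxy and install the mitmproxy CA on a test device; the host suffixes and output format are purely illustrative and not the Cybernews methodology. Much of the OS-level Google traffic is certificate-pinned, so pinned connections will fail under interception and show up only as handshake errors rather than readable requests, but even that is enough to see how often the device phones home.

```python
# log_google_traffic.py - mitmproxy addon, run with: mitmproxy -s log_google_traffic.py
# Prints a timestamped line for every HTTP(S) request to a Google-owned host,
# so periodic check-ins (e.g. one every ~15 minutes) stand out in the log.
import time

from mitmproxy import http

# Illustrative host suffixes only; adjust to whatever actually shows up on your network.
GOOGLE_SUFFIXES = (".google.com", ".googleapis.com", ".gstatic.com", ".android.com")


class GoogleTrafficLogger:
    def request(self, flow: http.HTTPFlow) -> None:
        host = flow.request.pretty_host
        if host.endswith(GOOGLE_SUFFIXES):
            stamp = time.strftime("%H:%M:%S")
            size = len(flow.request.raw_content or b"")
            print(f"{stamp}  {host}  {flow.request.path[:60]}  {size} bytes")


addons = [GoogleTrafficLogger()]
```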
It’s so ironic that Pixels are the go-to devices for privacy ROMs these days.
All this shit is probably happening at the hardware level too, with 100 different backdoors you can’t remove with your megamind plan of installing a custom ROM.
The silicon probably has the ability to live stream all sensor data directly to the NSA using the fanciest ML compression technology lmao.
> It’s so ironic that Pixels are the go-to devices for privacy ROMs these days.
It’s so ironic it’s a show-stopper for me. I’m not paying fucking Google to escape the Google dystopia. Nosiree! That’s just too rich for me.
This is why I own a Fairphone running CalyxOS. Yes, I know GrapheneOS is supposedly more secure - I say supposedly because I think 95% of users don’t really have a threat model that justifies the extra security. But I don’t care: my number one priority is not giving Google a single cent. If it means running a less secure OS, I’m fine with that.
There’s no way on God’s green Earth I’m buying a Pixel phone to run a deGoogled OS. That’s such an insane proposition I don’t even know how anybody can twist their brain into believing this is a rational thing to do.
> I say supposedly because I think 95% of users don’t really have a threat model that justifies the extra security.
Does street cred with my cybersecurity peers count as a threat model?
I’m definitely one of the users of GrapheneOS that you’re talking about. My threat model is “this is fucking cool!”
Also, the grass is always greener on the other side. I want a Fairphone.
What if you buy a used Pixel? Google was already getting that money, but you haven’t paid them…or would that just be a cop out?
I’ve argued this many times with many people, and everybody seems to adopt their own way of interpreting things to suit their preferences.
Here’s my line of thinking:
- If the first buyer buys a Google cellphone new for, say, $500 (no idea of the price, just making it up for the sake of explaining), this buyer gives $500 to Google.
- If I then buy this cellphone second-hand for, say, $300, the original buyer gets $300 back, meaning Google now has $300 of my money.
That’s a hard no.
Of course, there’s the argument that Google got $500 no matter what and they don’t know who the money is from. But that’s beside the point: I know Google got my money. I most definitely parted with $300 to acquire a Google cellphone, meaning as far as I’m concerned, I indirectly gave Google $300 of my money. And I refuse to give Google any money, however indirect the transaction might be. The only way I could become the owner of a Google phone is if someone gave one to me, I found it in the trash, or I stole it.
There’s also the argument that if I don’t buy the cellphone, it might end up in a landfill, so if I’m environmentally minded, I should save it from the landfill. That’s true, but my counter-argument is that a healthy second-hand market for Google phones gives them more value, which makes them more appealing to potential buyers and ultimately supports Google’s business.
I don’t like serviceable stuff being landfilled for no good reason (otherwise I wouldn’t pay extra to buy a Fairphone), but in the case of Google hardware, I reckon it should end up in the landfill as often as possible to diminish its value and hurt Google. Of course, I’m only one meaningless guy, but I reckon boycotting Google is a moral duty for anybody who’s concerned about privacy and civil liberties.
And of course, I don’t want a Google product in my pocket because it would make me nauseous. But that’s entirely subjective.
Citation needed. I get that it’s healthy not to trust anyone, but with the amount of security research that goes into these devices, if something like that were happening, we would know about it.
- Applies to every phone, smart or simple, can be combatted with a £5 Faraday bag
- That is about monitoring by your network, nothing to do with the phone manufacturer really
- A ten-year-old article about Samsung phones
- An exploit affecting lots of phones that seems like it was fixed
So a few interesting points, but nothing even slightly like what OP was suggesting.
Maybe, and maybe not. We need to encourage robust alternatives; unfortunately, that requires a ton of capital to develop hardware, reserve fab time, and get your devices fabricated when you’re not a major player like Google or Samsung.
We basically need something in the smartphone space equivalent to the Framework laptop: something that meets the hardware security requirements, allows bootloader unlock/relock, and supports GrapheneOS and other custom ROMs.