wait what’s bad about it?
i installed llama on my phone…
If it’s running locally, nothing.
If you install the actual deepseek app, you might as well post your prompts openly on social media.
nono, the whole thing is about some people putting personal info into these chatbots.
and even if not, they are guaranteed to train their newer models on the requests and generated responses.
if ur putting personal info in, running it locally/privately is kinda a must, if u care about security at all.
i think peeps try lewd prompts once, then find out it doesn’t work, and then give up. (they don’t know about huggingface)
I normally use it to help me with my English translations here on lemmy (I can more or less get by but I prefer it to be correct), so I already do it.
You can also use DeepL for that, which I’m willing to bet uses a lot less energy and is a very reliable translation service.
It doesn’t require any permissions.
Meanwhile Facebook, Google, Snapchat, fucking TEAMS, require you to give them access to your camera and microphone 24/7
But yeah app bad cuz china
A US senator has introduced a bill trying to criminalize it with a 20 year prison sentence.
Also, the US and El Salvador are doing a prisoner deportation deal.
I’m not making this up.
Wait, just the client? I thought the madlad ran the model on their phone!
apparently not. it seems they are referring to the official bs deepseek ui
for ur phone. running it on your phone fr is super cool! Imma try that out now - with the smol 1.5B model
i kno! i’m already running a smol llama model on the phone, and yeaaaa that’s like 2 tokens per second and it makes the phone lag like crazy… but it works!
currently i’m doing this with termux and ollama, but if there’s some better foss way to run it, i’d be totally happy to use that instead <3
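fwiw, here’s roughly what that setup looks like — a minimal sketch, assuming termux’s repo ships an ollama package and that the 1.5B distill is tagged `deepseek-r1:1.5b` on the ollama registry (check `pkg search ollama` / the ollama model library if not):

```shell
# update termux packages and install ollama (assumes the termux repo provides it)
pkg update && pkg upgrade
pkg install ollama

# start the ollama server in the background
ollama serve &

# pull the small 1.5B deepseek-r1 distill and open an interactive chat
ollama run deepseek-r1:1.5b
```

the 1.5B model is about the biggest that stays usable on most phones — bigger quants will swap and make the lag way worse.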