18 points

Wait, just the client? I thought the madlad ran the model on their phone!

20 points

apparently not. it seems they are referring to the official bs deepseek ui for ur phone. running it on your phone fr is super cool! Imma try that out now - with the smol 1.5B model

5 points

Good luck! You’ll need it to run that without a GPU…

13 points

i kno! i’m already running a smol llama model on the phone, and yeaaaa that’s about 2 tokens per second and it makes the phone lag like crazy… but it works!

currently i’m doing this with termux and ollama, but if there’s some better foss way to run it, i’d be totally happy to use that instead <3
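for anyone who wants to try the termux + ollama route mentioned above, a rough sketch of the setup. the package name and model tag are assumptions - check `pkg search ollama` and the ollama model library for what's actually available on your device:

```shell
# Rough sketch: run a small DeepSeek distill on-device via Termux + ollama.
# Assumes ollama is available in the Termux package repos and that the
# 1.5B distill is published under the deepseek-r1:1.5b tag - verify both.
pkg update && pkg upgrade        # refresh Termux package lists
pkg install ollama               # install the ollama CLI/server
ollama serve &                   # start the ollama server in the background
ollama run deepseek-r1:1.5b      # pull the model (~1 GB+) and open a chat
```

expect it to be slow on phone CPUs (a few tokens/sec at best, as the comment above says), and smaller quantized models will lag less.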
