
There are no lite versions? I was trying to find a small LLM I can run on an old machine, take it off the internet (or just firewall it), and play around with it to see if there's anything worth learning there for me. I was looking at the lite version of Llama, but when I tried to run the install on Mint I ran into some issues, and by then I'd had too many drinks to focus on it, so I went back to something else. Maybe next weekend. If you have any recommendations, I'm all ears.


There are fine-tunes of Llama, Qwen, etc., distilled from DeepSeek that implement the same pre-response thinking logic, but they are ultimately still the smaller models with some tuning. If you want to run locally and don't have tens of thousands to throw at datacenter-scale GPUs, those are your best option, but they differ from what you'd get in the DeepSeek app.
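If you want a concrete starting point, Ollama packages several of those DeepSeek-R1 distills, and the smallest ones run fine on a CPU-only machine. A rough sketch (the exact model tags below are Ollama's naming at time of writing and may have changed, so check their model library first):

```shell
# Install ollama (assuming its install script supports your distro; it works on most Debian-based systems like Mint):
curl -fsSL https://ollama.com/install.sh | sh

# Pull one of the small DeepSeek-R1 distills -- the 1.5b tag is a
# Qwen-based distill small enough for an old machine:
ollama pull deepseek-r1:1.5b

# Chat with it interactively in the terminal:
ollama run deepseek-r1:1.5b
```

Once the model is pulled, inference is entirely local, so you can firewall or unplug the box afterward and it keeps working.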

