I’ve recently been playing with the idea of self-hosting an LLM. I’m aware it won’t reach GPT-4 levels, but being free from having to hold back prompts containing confidential data is very valuable to me.

Has anyone got experience with this? Any recommendations? I have downloaded the full Reddit dataset, so I could retrain the model on it, since selected communities provide immense value and knowledge (hehe, this is exactly what Reddit, Twitter, etc. are trying to prevent…)

1 point

There is also runpod.io. You can rent quite powerful machines on an hourly basis, which makes it possible to run the large models. They also have templates, so the machine will be set up and ready to go in minutes. All you have to do is load the model you’d like to try via the oobabooga web interface on your machine.
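If you want to script against a setup like that, recent oobabooga (text-generation-webui) builds can expose an OpenAI-compatible API when launched with the API enabled. A minimal sketch, assuming a locally reachable instance (the base URL, port, and default parameters here are assumptions; adjust to your deployment):

```python
import json
import urllib.request


def build_completion_request(prompt: str, max_tokens: int = 128) -> dict:
    """Build an OpenAI-style completion payload (field names follow the
    OpenAI-compatible API that the web UI can expose)."""
    return {"prompt": prompt, "max_tokens": max_tokens, "temperature": 0.7}


def complete(base_url: str, prompt: str) -> str:
    """POST the prompt to <base_url>/v1/completions and return the generated text."""
    payload = json.dumps(build_completion_request(prompt)).encode()
    req = urllib.request.Request(
        f"{base_url}/v1/completions",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["text"]


# Example (assumes the web UI is running locally with its API enabled):
# print(complete("http://127.0.0.1:5000", "Name three self-hosted wikis:"))
```

This keeps the rented GPU box doing inference while your own scripts stay lightweight clients.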

1 point

Honestly, all of these are great suggestions for today, but this area is moving so fast that I’d almost suggest holding off six months to a year for a better solution to rise to the top. Capabilities grow daily, and you may put in the work to get this set up only to have a much more capable solution appear soon afterwards. Just a thought, though; if it’s mainly for a fun experiment, then try some of these out!

2 points

While yes, something else is going to move to the top, it’s still awesome to play with this today, and you should, because it’s really important that people learn how to run this stuff at home.

2 points

I tried quay.io/go-skynet/local-ai, but my server lacks the CPU instruction set it requires.
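Prebuilt llama.cpp-based images (which local-ai uses as a backend) are often compiled assuming newer SIMD extensions such as AVX2, so older CPUs hit exactly this problem. On Linux you can check what your CPU offers before pulling an image; a quick sketch (which flags actually matter depends on the specific build):

```python
def cpu_flags() -> set:
    """Return the CPU feature flags reported by /proc/cpuinfo (Linux only)."""
    flags = set()
    try:
        with open("/proc/cpuinfo") as f:
            for line in f:
                if line.startswith("flags"):
                    flags.update(line.split(":", 1)[1].split())
    except FileNotFoundError:
        pass  # not Linux, or /proc not mounted
    return flags


def supports(*wanted: str) -> bool:
    """True if every requested flag (e.g. 'avx2', 'f16c') is present."""
    return {w.lower() for w in wanted} <= cpu_flags()


if __name__ == "__main__":
    print("AVX2:", supports("avx2"))
```

If a required flag is missing, the usual workaround is building the backend from source so it targets only what the host CPU actually supports.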

2 points

KoboldCPP works with and without a GPU, and it’s quite easy to install and use. I’d recommend something like that for a beginner.

4 points

You might find some starting points or even projects or terms to look for in this article:

https://arstechnica.com/information-technology/2023/03/you-can-now-run-a-gpt-3-level-ai-model-on-your-laptop-phone-and-raspberry-pi/

permalink
report
reply

Selfhosted

!selfhosted@lemmy.world
