2 points

big sad :(

wish it were nice and easy to do stuff like this - yeah, hosting it somewhere is probably best for your money and your phone.

1 point

actually i think it kind of is nice and easy to do, i'm just too lazy/cheap to rent a server with 8 GB of RAM, even though it would only cost $15/month or so.

2 points

it would also be super slow - you usually want a GPU for LLM inference… but you already know this, you are Gandald der zwölfte after all <3
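
(for anyone curious just how slow: here's a minimal sketch of how you could measure CPU-only tokens/sec, assuming llama-cpp-python is installed and you have some quantized 7B GGUF file - the model filename and thread count are placeholders, not a recommendation:)

```python
# rough CPU-only speed check with llama-cpp-python
# (model path below is hypothetical - any ~4-bit 7B GGUF fits in ~8 GB RAM)
import time
from llama_cpp import Llama

llm = Llama(
    model_path="./mistral-7b-instruct.Q4_K_M.gguf",  # placeholder filename
    n_ctx=2048,    # context window
    n_threads=8,   # CPU threads; tune to your core count
)

start = time.time()
out = llm("Explain why GPUs speed up LLM inference.", max_tokens=128)
elapsed = time.time() - start

n_tokens = out["usage"]["completion_tokens"]
print(f"{n_tokens} tokens in {elapsed:.1f}s -> {n_tokens / elapsed:.1f} tok/s")
```

on a cheap CPU-only VPS this will often land in the low single digits of tokens/sec for a 7B model, which is exactly why you'd want a GPU.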
