

Hardware is fine. If you're not experienced, the 3000 km will fuck you though. Stuff will arise where you'll need to get at the machine.
I've been using two laptops as "servers" for years.
Well, the first one died after about six years of use.
But I can get at them reasonably easily.
Because you don’t train your self-hosted LLM.
As a result, you only pay for the electricity used to compute your tokens (your requests). This can be especially reasonable if the same machine also does local game streaming and/or transcoding, since it then already meets the requirements to host an LLM.
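For a sense of scale, here's a rough back-of-envelope sketch in Python. The power draw, token rate, and electricity price are assumed placeholder numbers, so plug in your own:

    # Rough estimate of electricity cost for local LLM inference.
    # All three inputs are assumptions, not measurements.
    power_draw_w = 300        # assumed GPU + system draw under load, watts
    tokens_per_second = 30    # assumed generation speed of a mid-size local model
    price_per_kwh = 0.30      # assumed electricity price per kWh

    seconds_per_million_tokens = 1_000_000 / tokens_per_second
    kwh_per_million_tokens = power_draw_w * seconds_per_million_tokens / 3_600_000
    cost_per_million_tokens = kwh_per_million_tokens * price_per_kwh

    print(f"{kwh_per_million_tokens:.2f} kWh, about {cost_per_million_tokens:.2f} per 1M tokens")

With those assumed numbers it works out to roughly 2.8 kWh, i.e. well under a euro/dollar per million generated tokens; the point is that the marginal cost is basically just the wall power.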
Unless you have rather unreasonable means, though, your local LLM will be much more limited in parameters (size) and won't be as good as the much larger hosted models.
Privacy, ethics, and personal interest are usually the largest drivers, from what I can tell.