- 4 Posts
- 22 Comments
Thanks, will do all that!
Start now! Install it, get a Python environment up and running if you haven’t already, and get that first play-around project working that you can build outwards from!
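Getting that first environment up is only a few commands. A minimal sketch, assuming a Unix-like system with Python 3 installed (the `.venv` directory name is just a convention):

```shell
# Create an isolated Python environment for the play-around project
python3 -m venv .venv

# Activate it (on Windows it's .venv\Scripts\activate instead)
. .venv/bin/activate

# Keep pip current inside the venv, then confirm you're in it
python -m pip install --upgrade pip
python -c 'import sys; print(sys.prefix)'
```

Once activated, anything you `pip install` stays inside `.venv` and can’t break your system Python.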
catty@lemmy.world (OP) to Selfhosted@lemmy.world • What is a self-hosted small LLM actually good for (<= 3B) • English
1 · 8 months ago
Thanks, when I get some time soon I’ll have another look at it, and at cherry ai with a local install of Ollama.
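For reference, a local Ollama install with a sub-3B model is only a couple of commands. A sketch, assuming the `ollama` CLI is already installed and its daemon is running; the model name is one example from the Ollama library:

```shell
# Pull a small (<= 3B) model and chat with it from the CLI
ollama pull gemma2:2b
ollama run gemma2:2b "Summarise this paragraph in one sentence: ..."

# Ollama also serves a local HTTP API (default port 11434)
# that desktop clients can point at instead of a cloud endpoint
```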
2 · 8 months ago
Any suggestions for solutions?
3 · 8 months ago
You’re conflating my asking how to use these tools with your misusing them. I see you still don’t accept that what you’re doing is wrong. But go you.
6 · 8 months ago
Please be very careful. The Python code it spits out will most likely be outdated and won’t work as well as it should (the code isn’t “thought out” the way a human would think it through).
If you want to learn, dive in, set yourself tasks, get stuck, and f around.
4 · 8 months ago
Yeah, shell scripts are one of those things where you never remember how to do something and always have to look it up!
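Case in point: the idioms below are exactly the kind of thing that gets looked up every time (plain POSIX sh; the variable and item names are just examples):

```shell
#!/bin/sh
# Shell idioms that are easy to forget between scripts

# Default value when a variable is unset or empty
name=${NAME:-world}
echo "hello $name"

# Loop over a list of items
for f in one two three; do
  echo "item: $f"
done

# Test for a file and branch on the result
if [ -e /etc/hostname ]; then
  echo "hostname file exists"
fi
```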
31 · 8 months ago
Was this system vibe coded? I get the feeling it was…
21 · 8 months ago
lol. Way to contradict yourself.
1 · 8 months ago
I haven’t actually found the coder-specific ones to be much (if at all) better than the generic ones. I wish I could have. Hopefully LLMs can become more efficient in the very near future.
3 · 8 months ago
Some questions, and, because you don’t actually understand, the answers too:
- What does the LLM take as context? (Other users’ data, owned by Twitch)
- How is the LLM fed that data? (You store it and feed it to the LLM)
- Do you use Twitch’s data and its users’ data through an AI without their consent? (Most likely, yes)
- Do you have consent from the users to store ‘facts’ about them? (You’re pissy, so obviously not)
- Are you then storing that processed data? (Yes, you are: it’s written to a file)
- Is the purpose of this data processing commercial? (Yes, it is: it’s designed to increase viewer count for the user of this system. And before you retort “OMG it helps Twitch too”: no, it doesn’t; Twitch has the viewers either way, because if they’re not watching him they’re watching someone else)
I mean yeah, it’s a use case, but own up to the fact that you’re wrong. Or be pissy. I don’t care.
3 · 8 months ago
Doesn’t Twitch own all data that is written there, and doesn’t their TOS state something like you can’t store that data yourself locally?
5 · 8 months ago
No, what is it? How do I try it?
1 · 8 months ago
Surely none of that uses a small LLM <= 3B?
catty@lemmy.world (OP) to Selfhosted@lemmy.world • What can I use for an offline, selfhosted LLM client, pref with images, charts, python code execution • English
1 · 8 months ago
But won’t this be a mish-mash of different Docker containers and projects, creating an installation, dependency, and upgrade nightmare?
3 · 8 months ago
But its website is in Chinese. Also, what’s the GitHub?
3 · 8 months ago
This looks interesting. Do you have experience with it? How reliable and efficient is it?
1 · 8 months ago
Try the beta on the GitHub repo, and use a smaller model!
4 · 8 months ago
I’m getting very near real-time on my old laptop, maybe a delay of 1-2s whilst it creates the response.
Sounds like a great first question! Go for it!