Hiya,

Recently upgraded my server to an i5-12400 CPU and have been wanting to push it a bit. I've been looking to host my own LLM tasks and workloads, for example building pipelines to scan open-source projects for vulnerabilities and insecure code, to mention one of the things I want to start doing. The inspiration came from reading about the recent scans of the curl project.

Sidenote: I have no intention of swamping devs with AI bug reports. I simply want to scan the projects I personally use, so I'm aware of their current state and of future changes before I blindly update the apps I host.

What budget-friendly GPU should I be looking for? Afaik VRAM is quite important, the more the better. What other features do I need to be on the lookout for?
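
For context, the kind of scan step I have in mind would look roughly like this sketch, assuming a local Ollama instance on its default port; the model name and file path are just placeholders:

```python
# Rough sketch: ask a local Ollama instance to review one source file for
# obvious security issues. Assumes Ollama is listening on its default port
# (11434); the model name and file path below are placeholders.
import pathlib
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"
MODEL = "qwen2.5-coder:14b"  # placeholder, whatever fits in VRAM


def review_file(path: str) -> str:
    code = pathlib.Path(path).read_text(encoding="utf-8", errors="replace")
    prompt = (
        "You are reviewing code for security issues. List any potential "
        "vulnerabilities (buffer overflows, unchecked return values, "
        "injection) with line references, or say 'no findings'.\n\n" + code
    )
    resp = requests.post(
        OLLAMA_URL,
        json={"model": MODEL, "prompt": prompt, "stream": False},
        timeout=600,
    )
    resp.raise_for_status()
    return resp.json()["response"]


if __name__ == "__main__":
    print(review_file("src/example.c"))
```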

  • snekerpimp@lemmy.world

    Everyone is mentioning Nvidia, but AMD's ROCm has improved tremendously in the last few years, making a 6900 XT 16 GB an attractive option for me. I currently have a 6700 XT 12 GB that works no problem with Ollama and ComfyUI, and an Instinct MI25 16 GB that works with some fiddling as well. From what I understand, an MI50 32 GB requires less fiddling. However, the Instinct line is passively cooled, so having to find a way to cool it might be a reason to stay away from them.

    Edit: I should add, my experience is on a few Linux distributions; I can't attest to the experience on Windows.