My rack is finished for now (because I’m out of money).

Last time I posted I had some janky cables running through the rack, and now we're using patch panels with color-coordinated cables!

But as is tradition, I'm thinking about upgrades, and I'm looking at that 1U filler panel. A mini PC with a 5060 Ti 16GB or maybe a 5070 12GB would be pretty sick to move my AI slop generation into my tiny rack.

I'm also thinking about the Pi cluster at the top. Currently that's running a Kubernetes cluster that I'm trying to learn on. They're all Pi 4 4GB, so I was going to start replacing them with Pi 5 8/16GB. Would those be better price/performance for mostly coding tasks? Or maybe a Discord bot for shitposting.

Thoughts? MiniPC recs? Wanna bully me for using AI? Please do!

  • muppeth@scribe.disroot.org
    1 day ago

Is the Mac mini really that good? Running 12–14B models on my Radeon RX 7600 XT is ok-ish, but I do "feel it", and running 7–8B models sometimes just doesn't feel like enough. I wonder where the Mac mini lands here.

    • nagaram@startrek.websiteOP
      1 day ago

      From what I understand it's not as fast as a consumer Nvidia card, but close.

      And you can have much more "VRAM" because they use unified memory. I think the max is 75% of total system memory going to the GPU, so a top-spec Mac mini M4 Pro with 48GB of RAM would have 32GB dedicated to GPU/NPU tasks for $2000.

      Compare that to JUST a 5090 32GB for $2000 MSRP and it's pretty compelling.

      $200 more and it's the 64GB model, with two 4090s' worth of VRAM.

      It's certainly better than the AMD AI experience, and it's the best price for getting into AI stuff, or so say nerds with more money and experience than me.
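The unified-memory math above can be sketched in a few lines. The cap fraction is an assumption here: commonly cited figures range from roughly 2/3 to 3/4 of total system RAM depending on configuration, which is why a 48GB machine lands somewhere around 32–36GB of GPU-usable memory:

```python
# Rough sketch of the unified-memory "VRAM" estimate from the comment above.
# The cap fraction is an assumption: commonly cited values range from ~2/3
# to ~3/4 of total system RAM, depending on configuration.

def usable_vram_gb(total_ram_gb: float, cap: float) -> float:
    """Estimate how much unified memory the GPU/NPU can use."""
    return total_ram_gb * cap

print(usable_vram_gb(48, 2/3))  # 32.0 -- matches the comment's 48GB -> 32GB figure
print(usable_vram_gb(48, 3/4))  # 36.0 -- what a strict 75% cap would give
print(usable_vram_gb(64, 3/4))  # 48.0 -- the 64GB model under a 75% cap
```

Either way, the takeaway stands: the GPU-usable pool scales with total RAM, unlike a discrete card's fixed VRAM.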
