Yeah, for sure. That I was aware of.
We were focusing on the Mini instead because… well, if the OP is fretting about going for a big GPU, I'm assuming we're talking user-level costs here. The Mini's reputation comes from starting at 600 bucks for 16 gigs of fast shared RAM, which makes it competitive with consumer GPUs as a standalone system. I wanted to correct the record about the 24GB starter speccing up to 64GB because the 64GB configuration is still in the 2K range, which is lower than the realistic market prices of 4090s and 5090s. So if my priority was running LLMs, there would be some thinking to do about which option makes the most sense in the 500–2K price range.
I am much less aware of larger options and their relative cost-to-performance because… well, I may not hate LLMs as much as is popular around the Internet, but I'm no roaming cryptobro either, and I assume neither is anybody else in this conversation.
A quick look at US Amazon spits out that the only 24GB card in stock is a 3090 for 1,500 USD. A look at the European storefront shows 2,400 EUR for a 4090. Checking other assorted stores just turns up a bunch of out-of-stock notices.
It’s quite competitive, I’m afraid. Things are very stupid at this point and for obvious reasons seem poised to get even dumber.