• 4 Posts
  • 42 Comments
Joined 2 months ago
Cake day: November 25th, 2024




  • The fix for that problem seems trivial, though.

    Hopefully some commercial FOSS fediverse FB alternatives will pop up, now that Facebook is about to go all in on the crazy. Or the EU will pour some big money into the existing alternatives.

    Anyway, Friendica seems to be the best option right now, so I’ll give it a serious try.


  • The best thing would probably be for me to quit social media. I find arguing on Facebook, Reddit and Lemmy very addictive.

    However, I’d prefer my local boomers to be on a slightly addictive but open and transparent social network that isn’t controlled by an oligarch or a random orange fascist.

    I want something so similar to Facebook that I can basically post a link on Facebook, and when the boomers follow that link they’ll effortlessly create a new account and start using the fediverse FB clone, because that’s where I’ll post my content from now on.




  • sith@lemmy.zip (OP) to Fediverse@lemmy.world · I need a Facebook replacement · edited · 7 days ago

    Can’t access OfferUp because I’m not in the US. What’s special about it? Is it on the fediverse?

    In Sweden, the most popular alternatives to FBM (Facebook Marketplace) are Blocket and Tradera/eBay. But they’re all commercial and not on the fediverse.

    What do you think of Matrix-based messaging apps? They seem like a better alternative to Signal and Wire, depending on what server you’re using.





  • Only problem is it’s mostly 50-year-olds complaining about their neighborhoods’ minor problems, or local contractors advertising themselves over and over again.

    This is basically what I’m looking for. For me it’s not a problem. Most of the active people on Facebook are boomers, and that’s the biggest reason I’ve been “forced” to use it. It’s also been the default communication platform of every local community I’ve been part of.






  • I’m an active user who posts and comments regularly, and I would say the experience is very similar to Reddit, except for fewer ads and smaller numbers on the main/all page. The experience is probably very different if you’re mainly a passive consumer of content.

    Though I’ve never been active in “large” subreddits and I tend to block them from my feed, so I guess I don’t know what I’m missing.



  • Is that still true, though? My impression is that AMD works just fine for inference with ROCm and llama.cpp nowadays. And you get much more VRAM per dollar, which means you can fit a bigger model. You might get fewer tokens per second compared with a similar Nvidia card, but that shouldn’t really be a problem for a home assistant, I believe. Even an Arc A770 should work with IPEX-LLM. Buy two Arc or Radeon cards with 16 GB of VRAM each, and you can fit a Llama 3.2 11B or a Pixtral 12B without any quantization. Just make sure that ROCm supports that specific Radeon card if you go for team red. The software side is roughly what the sketch below shows.
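
    To be clear, this is just a rough sketch using the llama-cpp-python bindings, not something I’ve verified on that exact hardware. The model file name is a placeholder, and it assumes the bindings were built with the ROCm/HIP backend enabled (the exact CMake flag varies between llama.cpp versions).

    ```python
    # Rough sketch: local inference with llama.cpp's Python bindings on an AMD GPU.
    # Assumes llama-cpp-python was installed with the ROCm/HIP backend enabled,
    # e.g. something like (flag name varies between llama.cpp versions):
    #   CMAKE_ARGS="-DGGML_HIP=ON" pip install llama-cpp-python
    from llama_cpp import Llama

    llm = Llama(
        model_path="./model-f16.gguf",  # placeholder: any GGUF file that fits in VRAM
        n_gpu_layers=-1,                # offload all layers to the GPU(s)
        n_ctx=4096,                     # context window size
        # tensor_split=[0.5, 0.5],      # with two cards, split the layers across them
    )

    out = llm("Turn off the kitchen lights.", max_tokens=64)
    print(out["choices"][0]["text"])
    ```

    A nice property of the GGUF route is that the same model file runs unchanged on CUDA, ROCm or Vulkan builds of llama.cpp, so the card you pick doesn’t lock you into a particular download.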