

Really depends on many factors. If you have everything in RAM, almost nothing matters.
If your dataset outgrows that capacity, various things start to matter depending on your workload: random reads need good indexes (so do writes into tables with unique columns), OLAP queries benefit from a higher work_mem, tables past ~100M rows need good partitioning, and OLTP may even need custom solutions if you have to keep a long history that isn't needed for every transaction.
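To make the index/partitioning/work_mem points concrete, here's a minimal sketch in Python with psycopg2. The `events` table, its columns, and the connection string are all hypothetical examples, not something from this thread; adapt to your own schema:

```python
import psycopg2

# Hypothetical DSN; point this at your own database.
conn = psycopg2.connect("dbname=app user=app")
conn.autocommit = True
cur = conn.cursor()

# Range-partition a large table by month so the hot partitions stay
# small enough to remain cached. The primary key must include the
# partition key on a partitioned table.
cur.execute("""
    CREATE TABLE IF NOT EXISTS events (
        id         bigserial,
        user_id    bigint NOT NULL,
        created_at timestamptz NOT NULL,
        payload    jsonb,
        PRIMARY KEY (id, created_at)
    ) PARTITION BY RANGE (created_at);
""")
cur.execute("""
    CREATE TABLE IF NOT EXISTS events_2024_01
        PARTITION OF events
        FOR VALUES FROM ('2024-01-01') TO ('2024-02-01');
""")

# An index keeps random reads (and lookups backing unique checks)
# off sequential scans; on the parent it cascades to all partitions.
cur.execute("""
    CREATE INDEX IF NOT EXISTS events_user_idx
        ON events (user_id, created_at);
""")

# Raise work_mem for this session only, so one big OLAP query gets
# more sort/hash memory without changing the server-wide default.
cur.execute("SET work_mem = '256MB';")
cur.execute("""
    SELECT user_id, count(*)
    FROM events
    WHERE created_at >= '2024-01-01'
    GROUP BY user_id;
""")
print(cur.fetchall())
```

The WHERE clause on created_at also lets the planner prune to the one relevant partition, which is most of the win from partitioning in practice.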
But even with billions of rows, Postgres can handle it with relative ease if you know what you're doing, usually even on hardware you would consider absolutely inadequate (last year I migrated our company DB from MySQL to Postgres, and even with more data and more complex workflows we downsized our RAM by more than half).
I've used the pip install without issues for 8 years.
Never join rooms on matrix.org from your server (or ideally any other big public server) and you'll be fine.
I have my entire inner circle on my homeserver without issues, sharing pictures and texting a lot, and in all that time we've used around 25 GB of disk space.
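If you want to check your own usage, here's a quick sketch, assuming a Postgres-backed homeserver (the `synapse` database name and DSN are hypothetical; note that uploaded media lives in a directory on the filesystem, not in the database, so count that separately):

```python
import psycopg2

# Hypothetical DSN for the homeserver's database.
conn = psycopg2.connect("dbname=synapse user=synapse")
cur = conn.cursor()

# Total on-disk size of the homeserver's database.
cur.execute("SELECT pg_size_pretty(pg_database_size(current_database()));")
print("database:", cur.fetchone()[0])

# The largest tables, which is where room state and history usually live.
cur.execute("""
    SELECT relname, pg_size_pretty(pg_total_relation_size(oid))
    FROM pg_class
    WHERE relkind = 'r'
    ORDER BY pg_total_relation_size(oid) DESC
    LIMIT 5;
""")
for name, size in cur.fetchall():
    print(name, size)
```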