I agree with you. However, I don't think most people understand that. Most folks have simply never been exposed to the darkest souls among us.
This really is a complicated problem, but if the fediverse wants to grow without relying on slave labor for moderation like Meta and the rest, then we have to find ways to lighten the load on moderators. That's why creating transparent pre-moderation tools, like the image scanners used by many fediverse instances, is so important.
Are there any moderation tool projects going on right now that one could throw a few bucks at to support?
This one is a very basic CSAM scanner that goes through Lemmy's image storage and just deletes anything it deems bad (roughly the idea sketched below): https://github.com/db0/fedi-safety I haven't tried it though, so I can't attest to its quality.
I'm sure there are tools made for Mastodon too, since it has a lot more users.
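For anyone curious what that kind of scanner boils down to, here's a rough Python sketch of the general idea, not fedi-safety's actual code: walk the instance's image storage, score each file with whatever classifier you trust, and delete (or better, quarantine) anything over a threshold. The storage path, the threshold, and `classify_image` are all placeholders you'd need to fill in for your own setup.

```python
# Rough sketch of the "scan and delete" idea (NOT fedi-safety's actual code).
# Assumptions: images live in a local pict-rs style directory, and you have
# some classifier that returns a 0..1 "disallowed" score for an image file.

from pathlib import Path

IMAGE_DIR = Path("/var/lib/pictrs/files")  # hypothetical storage path, adjust for your instance
THRESHOLD = 0.9                            # hypothetical confidence cutoff


def classify_image(path: Path) -> float:
    """Placeholder classifier: always returns 0.0.

    Swap this out for whatever scanning model or API your instance uses.
    """
    return 0.0


def scan_and_delete(image_dir: Path = IMAGE_DIR) -> None:
    """Walk the image store and delete anything the classifier flags."""
    for image_path in image_dir.rglob("*"):
        if not image_path.is_file():
            continue
        score = classify_image(image_path)
        if score >= THRESHOLD:
            # Deleting outright is blunt; quarantining for human review
            # is the gentler option if your moderators can handle it.
            image_path.unlink()
            print(f"deleted {image_path} (score {score:.2f})")


if __name__ == "__main__":
    scan_and_delete()
```

The blunt "just delete it" approach is what keeps the load off moderators, but a quarantine-and-review step would give humans the final say at the cost of some of that relief.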