When you moderate a platform that the public is free to upload anything to, you will see everything: insults, threats, gore, CSAM. That's how the internet is. That is the worst part in my opinion, because you can't just look away - you actually have to deal with it.
I agree with you. However, I don't think most people understand that. Most folks haven't been exposed to the darkest souls among us.
This really is a complicated problem, but if the fediverse wants to grow without relying on slave labor for moderation like Meta and the rest, then we have to find ways to lighten the load on moderators. That's why creating transparent pre-moderation tools, like the image scanners used by many fediverse instances, is so important.
Are there any moderation tool projects going on right now that one could throw a few bucks at to support?
PieFed has a number of features designed to democratize moderation - e.g. keyword filtering (allowing users to filter All, None, or even just Some content mentioning, say, Musk or Trump or USA) lets individual end-users curate their own experiences, so that mods don't have to be as aggressive about removing things.
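For anyone curious what that looks like mechanically, here's a rough Python sketch of an All/Some/None keyword filter - all the names here are invented for illustration, not PieFed's actual code:

```python
# Toy model of per-user keyword filtering: "none" shows everything,
# "some" collapses matching posts, "all" hides them entirely.
# (Invented names; this is not PieFed's real implementation.)

def filter_posts(posts, keywords, mode="some"):
    """Split posts into (visible, collapsed) based on keyword matches."""
    visible, collapsed = [], []
    for post in posts:
        text = post["title"].lower()
        matched = any(kw.lower() in text for kw in keywords)
        if matched and mode == "all":
            continue                      # hide entirely
        if matched and mode == "some":
            collapsed.append(post)        # show, but collapsed/labelled
            continue
        visible.append(post)
    return visible, collapsed

posts = [{"title": "Musk announces new rocket"},
         {"title": "Local bakery wins award"}]
visible, collapsed = filter_posts(posts, ["musk"], mode="some")
# visible   -> only the bakery post
# collapsed -> the Musk post, shown collapsed rather than removed
```

The point is that nothing gets deleted server-side - each user's own settings decide what they see.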
Another cool feature is the user icons - a brand-new account on the Fediverse gets an icon next to its name, as does someone who receives, say, >10x more downvotes than upvotes, or a potential unregistered bot account that posts 10x more often than average but never replies to comments. These icons don't remove content like a moderator would; they just label it, so you can use that knowledge however you wish.
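Those labels are essentially cheap heuristics over account stats. Something like this - the thresholds and field names are my guesses based only on the numbers above, not PieFed's actual rules:

```python
# Hypothetical label heuristics; thresholds and field names are invented,
# not taken from PieFed's codebase.

def account_icons(account):
    """Return warning labels for an account; labels inform, never remove."""
    icons = []
    if account["age_days"] < 7:
        icons.append("new account")
    if account["downvotes"] > 10 * max(account["upvotes"], 1):
        icons.append("heavily downvoted")
    if account["comments"] == 0 and account["posts"] >= 10:
        icons.append("possible bot")    # posts a lot, never replies
    return icons

suspect = {"age_days": 2, "upvotes": 3, "downvotes": 50,
           "posts": 12, "comments": 0}
# account_icons(suspect) -> all three labels
```

A reader sees the labels and decides for themselves, which is exactly the "inform, don't remove" idea.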
Another one: people looking for a less controversial discussion environment can auto-hide or even auto-remove content from their feed. I have these turned off, but someone who is easily offended and doesn't want to see heavily downvoted things has that option. Here it's the combination of the entire community's votes and the end user's personal tolerance threshold that decides what content appears in someone's feed. There are also options to count "community members only" votes, to help separate drive-by votes from people who haven't joined the community and were just scrolling All - e.g. for polls and such.
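In pseudo-code terms, the hide decision is just a per-user threshold applied to a (possibly members-only) community score - again an invented sketch, not the real code:

```python
# Invented sketch of threshold-based auto-hiding; all names are made up.

def community_score(votes, members, members_only=False):
    """Sum vote values, optionally counting only community members' votes."""
    return sum(v["value"] for v in votes
               if not members_only or v["user"] in members)

def should_hide(score, threshold=-10, auto_hide=False):
    """Hide only if the user opted in AND the score is below their threshold."""
    return auto_hide and score < threshold

votes = [{"user": "alice", "value": 1},     # community member
         {"user": "driveby", "value": -1}]  # scrolling All, never joined
members = {"alice"}
# community_score(votes, members)                    -> 0
# community_score(votes, members, members_only=True) -> 1
```

With auto-hide off (the default in this sketch), `should_hide` always returns False, which matches the opt-in behavior described above.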
Oh yeah, PieFed has polls. Also flairs - both user and post. And categories of communities that are user-customizable and shareable. It has a ton of new features, both related and unrelated to community moderation. Check it out!
This one is a very basic CSAM scanner that goes through Lemmy image storage and just deletes anything it deems bad: https://github.com/db0/fedi-safety I haven't tried it though, so I can't attest to its quality.
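To be clear, I haven't read fedi-safety's internals, so this is just the general shape of what a scan-and-delete tool like that does - a placeholder classifier walked over an image directory, nothing from the actual repo:

```python
# Generic scan-and-delete loop; NOT fedi-safety's actual code or interface.
from pathlib import Path

def looks_unsafe(path):
    # Placeholder: a real tool runs an ML image classifier here.
    # Always returns False in this sketch.
    return False

def scan_and_delete(storage_dir):
    """Walk the image storage and delete anything the classifier flags."""
    deleted = []
    for f in Path(storage_dir).glob("**/*"):
        if f.is_file() and looks_unsafe(f):
            f.unlink()              # remove the flagged file
            deleted.append(str(f))
    return deleted
```

The hard part is obviously the classifier, not the loop - and false positives mean legitimate content silently disappears, which is why quality matters so much for tools like this.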
I'm sure there are tools made for Mastodon too, since it has a lot more users.
That's why the bigger platforms resorted to AI over-moderation to deal with this.
I managed a forum for a decade and saw everything except CSAM. You just delete it.
It was on a .tk domain and it got suspended for content violations anyway.