hitagi (ani.social)

howdy

  • 1 Post
  • 68 Comments
Joined 1 year ago
Cake day: June 21st, 2023


  • This Lemmy instance is much harder to maintain because I can’t tell what images get uploaded here, which means anyone can use this as a free image host for illegal shit, and because there’s no user list that I can easily see. Moderation tools are nonexistent on here.

    0.19.4 provides a way to see uploaded images (although not the best), but this version was only recently released, so I can see where the frustration is coming from, especially since the CSAM attacks happened nearly a year ago. At the time, I had to make a copy of pictrs, view everything in a file manager, and manually remove those images. However, people can still upload images without anyone seeing them.

    It also eats up storage like crazy because it rapidly caches images from scraped URLs and from the few remaining instances that we still federate with.

    This was fixed in 0.19.3 (released 7 months ago), which lets you disable image “caching”. Together with pictrs’ image processing, this has solved storage costs for us.
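    As a rough sketch, that toggle lives in Lemmy’s lemmy.hjson config. The exact key has changed across 0.19.x releases, so treat the field names below as illustrative rather than authoritative:

    ```hjson
    {
      pictrs: {
        url: "http://localhost:8080/"
        # In recent 0.19.x releases caching is controlled by image_mode;
        # "None" stops Lemmy from storing copies of remote images
        # (older builds used a boolean setting instead).
        image_mode: "None"
      }
    }
    ```

    With caching off, posts embed the original remote URL instead of a locally stored copy, so pictrs only holds images that users upload directly.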

    plug in an expensive AI image checker to scan for illegal imagery

    It’s unfortunate that we need this. Not everybody has the resources to run fedisafety, nor does everyone live in the USA where they can use Cloudflare’s CSAM scanner. I think a good way to deal with the issue is to not store images that aren’t public (or to have no private images at all). That way, every image can easily be reported.

    Overall, I understand the frustration, and to some degree I feel the same, but I also limit my expectations considering the nature of the project.


  • I did state the complaint:

    The mod tools are too basic and barebones

    I also gave three ideas: (1) manually done by mods/admins, (2) manually done by users, and (3) automated by mods/admins.

    On a small subreddit I moderate, we have AutoModerator automatically queue posts from accounts under a certain age: if your account is less than a day old, your post is hidden from public view until the mods manually approve it. We had to deal with a lot of spam bots, and this was a good solution for us.
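    A rule like ours can be sketched in AutoModerator’s YAML config. The threshold and wording here are illustrative, not our production rule:

    ```yaml
    # Hold submissions from accounts under a day old for manual review.
    # "filter" removes the post and places it in the modqueue until
    # a moderator approves it.
    type: submission
    author:
        account_age: "< 1 day"
    action: filter
    action_reason: "New account; awaiting manual approval"
    ```

    Reddit’s AutoModerator documentation lists the supported author checks and time units, so it’s worth double-checking the account_age syntax against it.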

    I believe the way the report feature works on Lemmy is that reports are sent to the mods of the community and to all the admins. That doesn’t work well when I don’t know the rules of a community. “No memes on Mondays”: is that really something I (as an admin) should be worried about? That’s not my job.

    I hope this clears things up.


  • The mod tools are too basic and barebones. I’d appreciate it if we could get:

    • Post approval system for new or low-karma accounts (the threshold could be based on karma in local communities)
    • Reporting system with multiple options (very hard to implement in the fediverse since everyone has different rules) so that reports are sent to the proper channels. Admins shouldn’t have to deal with community-specific rules (for example, “memes only on Mondays”).
    • Automoderator (these might already exist? I haven’t tried them.)

    I don’t know if I should split this comment into three, but these are some general suggestions to address the complaint of basic mod tools.