Mastodon, a decentralized alternative to Twitter, has a serious problem with child sexual abuse material (CSAM), according to researchers at Stanford University. In just two days, the researchers found over 100 instances of known CSAM across over 325,000 posts on Mastodon, along with hundreds of posts containing CSAM-related hashtags and links pointing to CSAM trading and the grooming of minors. One Mastodon server was even taken down for a period of time because CSAM had been posted to it. The researchers suggest that decentralized networks like Mastodon need more robust moderation tools and reporting mechanisms to address the prevalence of CSAM.

  • crystal@feddit.de · 1 year ago (edited)

    I don’t know how you got the impression that this increases censorship.

    Instance admins already manually block content, and they are already able to do that to any extent they wish.

    This tool would simply automate that process.

    Admins would not gain or lose any ability to block content. Identifying child porn would simply be easier.

    (Imagine an admin going to their database and doing a CTRL+F with the term “child porn”, and then going through the posts to find offending ones. But instead of CTRL+F it’s an AI.)
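    A minimal sketch of what that automated pass might look like, assuming media uploads sit under a `media_uploads/` directory and a hypothetical `known_hashes.txt` file holds digests of known abusive material supplied by an external hash-sharing programme (both names are illustrative, not Mastodon's actual layout):

    ```python
    import hashlib
    from pathlib import Path

    def load_known_hashes(hash_file: Path) -> set[str]:
        """Load one hex digest per line from a hypothetical shared hash list."""
        return {line.strip() for line in hash_file.read_text().splitlines() if line.strip()}

    def scan_media_dir(media_dir: Path, known_hashes: set[str]) -> list[Path]:
        """Return files whose digests match the known-bad hash list."""
        flagged = []
        for f in media_dir.rglob("*"):
            if f.is_file():
                digest = hashlib.sha256(f.read_bytes()).hexdigest()
                if digest in known_hashes:
                    flagged.append(f)
        return flagged

    if __name__ == "__main__":
        known = load_known_hashes(Path("known_hashes.txt"))
        for f in scan_media_dir(Path("media_uploads"), known):
            # Surface matches to moderators for review rather than acting automatically.
            print(f"flag for admin review: {f}")
    ```

    An exact SHA-256 match only catches byte-identical files; real deployments use perceptual hashes (e.g. PhotoDNA or PDQ) so resized or re-encoded copies still match, which is why the "AI" framing above is closer to the truth than a literal CTRL+F. The flag-for-review workflow is the same either way.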

    (For some reason I don’t get a notification when you answer my comment. Is that a known issue? Did you block me or something?)