Mastodon, a decentralized alternative to Twitter, has a serious problem with child sexual abuse material (CSAM), according to researchers at Stanford University. In just two days, the researchers found over 100 instances of known CSAM across more than 325,000 posts on Mastodon, along with hundreds of posts containing CSAM-related hashtags and links pointing to CSAM trading and the grooming of minors. One Mastodon server was even taken offline for a period of time because of CSAM posted to it. The researchers suggest that decentralized networks like Mastodon need more robust moderation tools and reporting mechanisms to address the prevalence of CSAM.
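
For context on what "more robust moderation tools" typically means: large centralized platforms scan uploads against hash lists of known material maintained by clearinghouses such as NCMEC, usually via perceptual hashes like PhotoDNA. The Python sketch below is only a minimal illustration of that workflow; the file names are hypothetical, and the use of exact SHA-256 matching is a simplifying assumption, since a real scanner would use perceptual hashing so that re-encoded copies still match.

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Hash an uploaded file's raw bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def should_block(upload: Path, known_hashes: set[str]) -> bool:
    """True if the upload matches an entry on a known-content hash list."""
    return sha256_of(upload) in known_hashes

# Hypothetical usage by an instance admin: load a hash list supplied by a
# clearinghouse and check each incoming media attachment before storing it.
known = set(Path("known_hashes.txt").read_text().split())
if should_block(Path("incoming/upload.png"), known):
    print("rejected: matches known-content hash list")
```

Exact-hash matching is trivially defeated by re-encoding, which is why production systems rely on perceptual hashes; the point here is only the shape of the check an instance would run on inbound media.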

  • zygo_histo_morpheus@programming.dev · 1 year ago

    Is there any way Mastodon stands out from other self-hosted websites? Would the CSAM material be harder to distribute or easier to prosecute if they ran, say, a self-hosted bulletin board for it instead?

    • ParsnipWitch@feddit.de · 1 year ago

      Privately hosted websites are only useful for established clients. Distributors use social media and image-sharing platforms to reach new clientele, often through more or less hidden tags and codes meant to attract potential customers. When someone reacts to these, they carefully test whether that person can be trusted before granting access to private sharing channels. It's how online drug dealers and extremist groups sometimes operate.

    • Big P@feddit.uk · 1 year ago

      Probably just the ease with which you can find it. Since every instance is linked, the network basically becomes a search engine that might not have the same controls/protections as Google etc.