Quick question, I’m looking to make an Mbin account and just wanted to ask if there is any lemmy.ml type of situation to be aware of.

  • matcha_addict@lemy.lol · 3 months ago

    Is there still a concern for self-hosters of public instances regarding CSAM content? And if so, is there any guidance on how to mitigate it?

    I am very interested in self-hosting, but I am worried about the legal repercussions, especially since I am an immigrant in the country where I live and am afraid of getting into any legal trouble.

    • SLCW💥@newsie.social · 3 months ago

      @matcha_addict @melroy It’s not possible to stop every instance of a user posting illegal content. What’s important is that you have a clear policy, and a moderation team that consistently and proactively enforces that policy.

    • DarkThoughts@fedia.io · 3 months ago

      I would not risk it if you’re at risk of getting kicked out or something, especially if your host country is stricter on that type of content. Just join one of the existing instances and subscribe to whatever Lemmy communities you want to see federated with it.

    • Trainguyrom@reddthat.com · 3 months ago

      The EFF had a handy explainer a couple of years ago on basically that subject:

      https://www.eff.org/deeplinks/2022/12/user-generated-content-and-fediverse-legal-primer

      Child Sexual Abuse Material (CSAM): Service providers are required to report any CSAM on their servers to the CyberTipline operated by the National Center for Missing and Exploited Children (NCMEC), a private, nonprofit organization established by the U.S. Congress, and can be criminally prosecuted for knowingly facilitating its distribution. NCMEC shares those reports with law enforcement. However, you are not required to affirmatively monitor your instance for CSAM.

      By my understanding, you don’t have to set up proactive monitoring for CSAM being federated in, but if you specifically spot CSAM or it is reported to you, then you are legally obligated to report it.

      • schizo@forum.uncomfortable.business · 3 months ago

        As a slight side note, that really only directly applies to people in the US.

        Other countries have different requirements around what you’re expected to do; some are stricter and some are, uh, not as strict.

    • melroy@kbin.melroy.org · 3 months ago

      You should be fine as long as you report the content and act fast if that kind of material ends up on your site. You can also disable account registration, so that only you and maybe some friends you know have an account.

      Last but not least, you can also block domains in Mbin. I blocked some domains I just do not want to federate with, such as NSFW sites like lemmynsfw (dot) com. And yes, you can add those blocks in the admin panel yourself.
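
      To give a rough idea of what a domain block does: incoming federated activity gets filtered by the sender’s domain before it reaches your instance. Here is a tiny conceptual sketch in Python (not Mbin’s actual code; the names are made up purely for illustration):

        from urllib.parse import urlparse

        # Domains the admin does not want to federate with (illustrative list).
        BLOCKED_DOMAINS = {"lemmynsfw.com"}

        def should_accept_activity(actor_url: str) -> bool:
            # Drop any incoming federated activity whose actor lives on a
            # blocked domain; everything else passes through as normal.
            domain = urlparse(actor_url).hostname or ""
            return domain not in BLOCKED_DOMAINS

        print(should_accept_activity("https://lemmynsfw.com/u/someone"))  # False (blocked)
        print(should_accept_activity("https://fedia.io/u/someone"))       # True (allowed)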

      • matcha_addict@lemy.lol · 3 months ago

        Even if I disable account registration, I may still be liable if the content somehow ends up on my instance through federation, right? Or is that not a concern?

        • melroy@kbin.melroy.org · 3 months ago

          Well… the answer is always yes. But you can mitigate the problem further by not only disabling account registration but also enabling the setting “Force users to login before they can access any content”.

          So in the end, nobody can see any content without logging in. And since yours is the only account, only you can see the federated content.
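
          Conceptually, that setting is just an authentication gate in front of every content page. A minimal sketch of the idea (illustrative only, not Mbin’s real implementation):

            # Stands in for the “Force users to login before they can access any content” setting.
            FORCE_LOGIN = True

            def can_view_content(user_is_logged_in: bool) -> bool:
                # With the gate on, anonymous visitors see nothing;
                # with it off, federated content is publicly visible.
                return user_is_logged_in or not FORCE_LOGIN

            print(can_view_content(False))  # False -> anonymous visitor hits the login page
            print(can_view_content(True))   # True  -> the admin (the only account) can browse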