Today I posted a picture of a stamp with an animal in it and they said the picture contained nudity and made me take it down, but I reported a photo of a guy with a fully visible swastika tattoo and they said that’s fine.

I’d like to start a Lemmy community with photos of stuff that they refuse to remove called FacebookSaysItsFine.

  • ThePowerOfGeek@lemmy.world · 1 year ago

    It would definitely require some very active moderation and clearly-defined community rules. But it sounds like a great idea for a Lemmy community, if you have the time.

    • thantik@lemmy.world · 1 year ago

      Cloudflare has a free CSAM scanning tool available - they really just need to implement it.

      • Rai@lemmy.dbzer0.com · 1 year ago

        When did “CP” become “CSAM”?

        If you want to change the acronym, wouldn’t “CR” make more sense?

        • Cracks_InTheWalls@sh.itjust.works · 1 year ago (edited)

          'Cause porn is made with consenting adults; CSAM isn't porn. CR is typically what's depicted in CSAM (assuming the R stands for rape), but there are two (or more) separate though closely related crimes here. Also, SA (sexual assault) covers a wider breadth of activities, which is useful if a person wants to quibble over the term "rape" when, regardless, something horrific happened to a kid and videos/images of that horrific thing are now being shared among pedophiles.

          I'll note I've only seen "CSAM" used since I started using Lemmy, so I'm not really sure when people started using the term over "CP". I'm personally for it: it more accurately describes what it is. And while I haven't seen the term in the wild, "SAM" to describe video or images of non-consensual sex acts among adults would be good too.