• Phoenixz@lemmy.ca

    I’m very conflicted on this one.

    Child porn is one of those things that won’t go away if you prohibit it, like alcohol. It’ll just go underground and cause harm to real children.

    AI child pornography images, as disturbing as they might be, would serve a “need”, if you will, while not actually harming children. Since child pornography doesn’t appear to be one of those “try it and you’ll get addicted” things, I’m genuinely wondering if this would actually reduce the harm caused to real children. If so, I think it should be legal.

    • /home/pineapplelover@lemm.ee

      I’m thinking it should still be illegal, but if they get charged for it, make it less severe than being charged with actual CP. This might naturally incentivize that industry to go for AI-generated images instead of trafficking. Also, I think if they took an image of an actual child and used AI to do this stuff, it should be more severe than using a picture of a legal-aged person to make CP.

    • clausetrophobic@sh.itjust.works

      Normalisation in culture has effects on how people behave in the real world. Look at Japan’s sexualization of women and minors, and how it has huge problems with sexual assault. It’s not about whether or not real children are getting hurt; it’s about whether it’s morally right or wrong. And as a society, we’ve decided that CP is very wrong as a moral concept.

      • fubo@lemmy.world

        On the other hand, producing porn is illegal in India and they have huge problems with sexual assault too.

      • Ð Greıt Þu̇mpkin@lemm.ee

        Here’s the thing, though: being too paranoid about normalization also makes the problem worse, because the truth is that these are people with severe mental problems who, in most cases, likely want to seek professional help.

        The problem is the subject is SO taboo that even a lot of mental health professionals will chase them off like rabid animals when the solution is developing an understanding that can lead to a clinical treatment plan for these cases.

        Doing that will also help the CSAM problem, since getting people out of the alleyways and into professional help will shrink the market significantly, both immediately and over time, reducing the amount of content that gets made and, as a result, the number of children victimized to make that content.

        The key factor remains: we have to stop treating these people like inhuman monsters that deserve death and far worse whenever they’re found. They’re sick souls who need robust mental health care and thought-management strategies.

        • JoBo@feddit.uk

          None of that is an argument for normalisation via legalisation. Offenders and potential offenders should feel safe to seek help. Legalising AI-generated CSAM just makes it much less likely that they’ll see the need to seek help. In much the same way that rapists assume all men are rapists, because most men don’t make it clear that they’re not.

          • Phoenixz@lemmy.ca

            I’m sorry, should I make clear to every bank that I’m not a bank robber? Do I seriously have to tell every woman that I am not a rapist? That is a really bad argument. The vast VAST majority of men are not rapists, saying that it’s men’s fault because they don’t apologize or clarify that they’re not rapists is just… crazy

            • JoBo@feddit.uk

              Where did you get any of that from? Why does any of what I said somehow imply telling women anything at all?

              Get a fucking grip.

    • UsernameIsTooLon@lemmy.world

      It’s an ethical dilemma, just an extremely controversial one. You really have to weigh whether we should tolerate some chaos if it means the betterment of society as we advance.

      I don’t think things should be as black and white as legal or illegal. I think the answer lies somewhere in between, something like decriminalizing drugs: mostly illegal, but with room for those who are genuinely seeking help. It would just take a lot of convincing that an individual genuinely needs to seek out this material because they would otherwise be a danger to those around them.

    • JoBo@feddit.uk

      You can certainly argue that AI-generated CSAM does less harm but you can’t argue from that to legalising it because it still does a bucketload of harm. Consumers of CSAM are very likely to harm real children and normalising CSAM makes that much more likely.

      This argument is a non-starter and people really need to stop pushing it.

      • Phoenixz@lemmy.ca

        You’re literally claiming a bunch of things as facts. Any sources to back that up?

      • NightAuthor@lemmy.world

        Consumers of CSAM are very likely to harm real children and normalising CSAM makes that much more likely.

        If any of that was objectively true, then yeah, I agree. Problem is, it looks like you just pulled that out of your ass.

    • Discoslugs@lemmy.world

      I’m genuinely wondering if this would actually reduce the harm caused to real children. If so, I think it should be legal.

      So tired of seeing this point made. Allowing animated or AI-generated CSAM to exist openly and legally will not reduce violence against children. It will increase it. It will normalize it.

      You seem to think people who are willing and capable of committing sexual violence against children are going to do it less when there’s a robust market of legally accessible CSAM.

      It won’t. It will instead create predator pipelines. It will take people with mild sexual disorders and histories of being sexually assaulted as children themselves, and feed them CSAM. It will create more predators.

      It will allow communities of pedophiles to exist openly, findable in Google searches and advertised on regular porn sites.

      Also, the people who make AI-generated CSAM are not going to watermark it as AI-generated.

      They are going to make it as real as possible. It will be indistinguishable to the naked eye, and thus allow actual CSAM to masquerade as AI-generated.

      I could go on. But I’m not an expert on any of this.

      • Phoenixz@lemmy.ca

        You make a huge number of claims, all stated as fact. How do you know that any of it is true? I’m not trying to defend rapists and pedophiles, I’m trying to think rationally and pragmatically about how to solve, or at least improve, this problem. Your reaction to it seems more emotional than rational and factual.

        • Discoslugs@lemmy.world

          I’m trying to think rationally and pragmatically

          Ah yes, the rational thought process that leads you to think a government is capable of safely facilitating the production of CSAM?

          They are unable to stop child poverty, but totally capable of producing CSAM in a safe way…

          Spare me your fact-finding mission.

          I’m not an expert or a social worker, but I can tell you that drug addiction and pedophilia are not the same.

          To consider these two the same, as the original commenter did, is disgusting, offensive, and ignorant.

          There is no inherent victim with drug use. The same cannot be said of pedophilia and child sexual assault.

          While there is always a spectrum of people participating in child victimization, the people who create CSAM and those who participate in its distribution are not addicts. They are predators.

          I’m not trying to defend rapists and pedophiles

          Well you are…

      • shrugal@lemm.ee

        You completely ignored the “state controlled generation and access” part of the argument. Experience with addictive drugs has shown us that tightly controlled access, oversight and possibly treatment can be a much better solution than just making it illegal. The truth is that we just don’t know if it would work the same with CSAM, but we do know that making it a taboo topic doesn’t work.

        • JoBo@feddit.uk

          There’s no parallel here. Providing safe access to drugs reduces harm to the user and the harm done by the black-market drug trade. Normalising AI-generated CSAM might reduce the harm done to children during production of the material but it creates many more abusers.

          The parallel only works if the “state controlled generation and access” to drugs was an open shop handing out drugs to new users and creating new addicts. Which is pretty much how the opiate epidemic was created by drug companies, pharmacists and doctors using their legitimate status for entirely illegitimate purposes.

            • JoBo@feddit.uk

              Says me. And I explained exactly why. Feel free to engage with that argument.

          • shrugal@lemm.ee

            Normalising AI-generated CSAM might reduce the harm done to children during production of the material but it creates many more abusers.

            The problem with your argument is that you assume a bunch of stuff that we just don’t know, because we haven’t tried it yet. The closest analogue we do know is drugs, and for them controlled access has proven to work really well. So I think it’s at least worth thinking about and doing limited real-world trials.

            And I don’t think any sane person is suggesting we just legalize and normalize it. It would have to be a way for people to self-report and seek help, with conditions such as mandatory check-ins/counseling and not being allowed to work with children.

            • JoBo@feddit.uk

              The closest analogue we do know is drugs, and for them controlled access has proven to work really well.

              Controlled access to drugs does work well. But legalising AI-generated CSAM is much more analogous to the opiate crisis, which is an unmitigated disaster.

              • shrugal@lemm.ee

                How so, if you don’t commercialize it? No legal actor would have an incentive to increase the market for CSAM, and it’s not like people who are not already affected would or could just order some for fun.

                • JoBo@feddit.uk

                  That would be a discussion for an entirely different thread. I would still disagree with you but the people arguing in favour of CSAM on this thread don’t think it should be a crime to make it using AI.

              • Phoenixz@lemmy.ca

                Again, how do you know this for a fact? I see your argument as feelings over facts.

    • MrSqueezles@lemm.ee

      I heard an anonymous interview with someone who was sickened by their own attraction to children. Hearing that person speak changed my perspective. This person had already decided never to marry or have kids and chose a career to that same end, low likelihood that kids would be around. Clearly, since the alternative was giving up on love and family forever, the attraction wasn’t a choice. Child porn that wasn’t made with children, comics I guess, was used to fantasize to prevent carrying through on those desires in real life.

      I don’t get it, why anyone would be attracted to kids. It’s gross and hurtful and stupid. If people suffering from this problem have an outlet, though, maybe fewer kids will be hurt.

      • Skwerls@discuss.tchncs.de

        Yes, but not in the way I think you’re implying: it is not trained on CSAM images. It can put the pieces together with varying degrees of success. If you ask for a Martian hedgehog in a tuxedo riding a motorcycle, it can create something that looks like that without having been trained on exactly that thing.

        • LogicalDrivel@sopuli.xyz

          Martian hedgehog in a tuxedo riding a motorcycle

          Just to prove your point, I fed that into an AI (DreamShaper 8) with no other prompts or anything, and this was the first image it generated.
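
          For anyone who wants to try the same test, here is a minimal sketch using the Hugging Face diffusers library. The model id “Lykon/dreamshaper-8” is an assumption on my part; substitute whichever DreamShaper 8 checkpoint you actually use.

          # Minimal text-to-image sketch using the diffusers library.
          # Assumption: DreamShaper 8 is published under the model id
          # "Lykon/dreamshaper-8"; swap in your own checkpoint if not.
          import torch
          from diffusers import AutoPipelineForText2Image

          pipe = AutoPipelineForText2Image.from_pretrained(
              "Lykon/dreamshaper-8", torch_dtype=torch.float16
          ).to("cuda")  # use "cpu" and torch.float32 if no GPU is available

          # A single plain prompt, no negative prompt or extra tuning,
          # mirroring the comment above.
          image = pipe("Martian hedgehog in a tuxedo riding a motorcycle").images[0]
          image.save("martian_hedgehog.png")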

    • kromem@lemmy.world

      I’d go more in the direction of state-sponsored generation and controlled access.

      If you want legal, unlimited access to AI-generated CSAM, you need to register with the state for it, and in so doing also close off access to positions that would put you in situations where you’d be more able to act on it (e.g. employment in schools, children’s hospitals, church youth leadership, etc.).

      If you do that, and no children are harmed in the production of the AI-generated CSAM, then you have a license to view and possess (but not redistribute) the images registered with the system.

      But if you don’t have that license (i.e. you didn’t register as sexually interested in children) and possess them, or are found to be distributing them, then you face the full force of the law.

      • boogetyboo@aussie.zone

        I think this idea rests on the false premise that people both need and have a right to pornography.

        Many adults go about their lives without accessing it/getting off on it. It’s not a human need like food or shelter. So government isn’t going to become a supplier. Parallels could be made, I suppose, with safe injecting rooms and methadone clinics etc - but that’s a medical/health service that protects both the individual and the community. I don’t think the same argument could be made for a government sponsored porn bank.

        • kromem@lemmy.world

          You don’t think there’s an argument to be made that motivating people sexually attracted to children to self-report that attraction to the state in order to be monitored and kept away from children would have a social good?

          I guess I just don’t really see eye to eye with you on that then.

          • boogetyboo@aussie.zone

            That component I don’t have an issue with at all, actually. But providing government-sanctioned AI porn? Unlikely.

          • Kage520@lemmy.world

            This is such a touchy subject that I find it difficult to articulate what society actually needs. We need a system where PEDOPHILES are able to receive the mental health care they need before they become MOLESTERS.

            But any time you say something about helping someone who is attracted to children, the knee-jerk reaction is always “kill them. What, you don’t want them dead? Are YOU a pedophile?” And I end up unable to convince people that treating the mental health condition, so that these people never molest children, will actually result in fewer children being molested. I really feel like this reactionary public opinion is driving people underground and is actually causing more children to be harmed.

            • kromem@lemmy.world

              Agreed.

              There’s a world of difference between socially inappropriate desires that someone might be born with and can’t help and inappropriate behaviors that they chose to do.

              By all means, demonize the latter. But demonizing the former along with it likely increases the latter, by forcing a social climate where being open and transparent about the former, in order to avoid the latter, is far less common.

              People are bad enough at dealing with alcoholics or drug addicts, and at giving them the space and situational consideration they need to avoid temptation.

              All that said, IIRC the numbers are something like 50% of people with a sexual attraction to children having acted on it by college, so it’s understandable that animosity toward the former is often not kept far from animosity toward the latter.

              But I’m all for any social programs that provide support for helping the other 50% avoid going down that path.

              • dustyData@lemmy.world

                It’s impossible to talk about any kind of numbers or statistics regarding sexual attraction to minors. It all gets muddled up really fast. A lot of men normalize attraction to underage teenagers but don’t consider it pedophilia (legally it is). That brings up the question of classifying attraction, and in public speech it of course provokes the knee-jerk reaction of questioning and attacking the person for acknowledging that there’s a qualitative difference between attraction to a 5-year-old and a 17-year-old. But it does make the statistics really hard to define.

                Plus, most of the information we have comes from felons and convicted criminals, who are the worst or most extreme examples. Non-molester pedophiles have absolutely no incentive to tell anyone, not even their psychologists, which means we don’t know anything from them. A few researchers try to get into their world and derive some understanding, but it is always a hard sell for grants. A researcher friend once told me: if we could only interview felons, we would be convinced that most people will murder someone before turning 30. The truth is we don’t know, and we currently have no way of knowing the real numbers regarding pedophilia.

                • kromem@lemmy.world

                  ‘Pedophilia’ has no legal definition.

                  And the psychiatric definition is attraction to pre-pubescent children.

                  And the statistics aren’t actually as hard to define as you might think, though the amount of research this topic gets is woefully small relative to its social impact.

                  For example, on the topic of violent offenders in prison you brought up:

                  Of the 100 male inmates who participated in this study, 59% reported experiencing some form of sexual abuse before puberty, and all such instances occurred before or at the age of 13 years.

                  This issue has a much bigger and broader impact on society than most people realize.

                  • dustyData@lemmy.world

                    Read your sources again. I’m not sure they are saying what you think they are saying. It says the inmates were the victims (emphasis, not dismissal) of sexual abuse before puberty, not the perpetrators. Again, it’s scientifically disingenuous to extrapolate an observation from an extremely biased population like incarcerated inmates to the whole population. This is why academic research is full of disclaimers and qualifiers.

    • ParsnipWitch@feddit.de

      There are many things still unclear about whether or not this will increase harm.

      We don’t know how these images affect people and their behaviour. Many techbros online treat it as a fact that media does not influence behaviour and thought processes, but if you look at the research this isn’t clear-cut at all. Some research has indeed been able to show that specific media influences people.

      Additionally, and rarely talked about: these images, stories, and videos can be used to groom children and teenagers, either to become victims and/or to become consumers themselves. This was a thing in the past, and I bet it is still happening with manga depicting loli hentai. Making these images legal would give groomers an even better tool.

      • Phoenixz@lemmy.ca

        If loli porn could turn people into pedophiles, then I think humanity has bigger issues.