A South Korean man has been sentenced to jail for using artificial intelligence to generate exploitative images of children, the first case of its kind in the country as courts around the world encounter the use of new technologies in creating abusive sexual content.
Some of the comments here are so stupid: “either they unpedophile themselves or we just kill them for their thoughts”
Ok, so let me think this through. Sexual preferences of any kind are pretty normal and they don’t go away; if anything, trying to ignore them makes them stronger. Also, being a pedophile is not currently a crime; acting on it is. So what happens right now is that people bottle it up until it gets to be too much and they act on it in gruesome ways, because “if I’m going to prison anyway, I might as well make sure it was worth it.” Kids get hurt.
“But we could make thinking about it illegal!” No, we can’t. Say that’s the law; what then? If you don’t like someone, they’re a “pedophile”: yay, more false imprisonment. And what happens to actual pedophiles? They start committing more acts, because now there’s punishment even for restraint. The truth is that a lot of people have pedophilic tendencies; you will not catch all of them. Things will just get worse.
So why AI? Well, as the commenter above me already said, if there’s no victim, there’s no problem. That doesn’t make extortion legal (obviously, that’s a different law), but it could give people with those urges more reason to show restraint. We could even limit it to specific sites and make it non-shareable, so we’d have more control over it.
I know people still want the easy solution, which evidently doesn’t work, but in my opinion this is a perfect solution.
There’s also a difference (not sure if it’s a clinical one) between people who sexualize very young kids and people attracted to those just under whatever age a given society has decided separates children from adults. In the USA, porn depicting the latter is fine as long as everyone involved is of legal age, even if they dress up to look younger.
I think when people refer to pedophilia they usually mean the former, not the 30-year-old dating a 17-year-old or whatever. But the latter makes it a little weird: images of fictional people don’t have ages. Can you charge someone who has AI-generated porn with possessing CSAM if the people depicted sort of look underage?
AI-generated content is going to raise a lot of questions like these that we’re going to have to grapple with as a society.
The first part of your comment is rather confusing to me, but I fully agree with the latter part. Judging age from appearance alone is a problem that will haunt us even more with AI unless we come up with new approaches. It’s going to be one item on a long list of big questions that have to be answered alongside new AI laws.
I largely agree with what you’re saying and there definitely is no easy solution. I’ve never understood why drawings or sex dolls depicting underage people are illegal in some places, as they are victimless crimes.
The thing that differentiates AI generation a bit from the above is fidelity. You can tell a doll or an anime isn’t real, but a few years from now it’ll be difficult to spot AI-generated images. That isn’t unique to this scenario; it’s going to wreak havoc on politics, scams, etc. But there is the potential that real CP comes out of hiding and gets somewhat shielded by the AI-generated material.
Of course this is just speculation; I hope it goes the other way around, everyone just jacks off at their computers, and CP disappears completely. We need to stop focusing our attention on people with pedophilia, get them mental-health support, and focus on sex offenders who are actually hurting people.
I’m all for letting people have their dolls, drawings, and AI-generated stuff, but yeah… it would become easy for offenders to say “Naw, I snatched that shit off of DALL-E” and walk out of court, so some kind of forensic tool that can tell AI-generated images from real ones would have to be made…
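For what it’s worth, a crude version of such a check can already be sketched: several popular generators leave tell-tale metadata in the files they produce (for example, the Stable Diffusion web UI writes its generation settings into a PNG text chunk). The snippet below is a toy heuristic only, not a forensic tool; the marker keys and software strings are assumptions about common tools, and all of this metadata can be stripped or faked in seconds.

```python
# Toy heuristic, NOT a forensic tool: looks for metadata that common
# image generators are known (or assumed) to leave behind.
from PIL import Image

# Assumed marker keys / software strings; trivially stripped or spoofed.
GENERATOR_KEYS = {"parameters", "prompt", "workflow", "sd-metadata"}
SOFTWARE_HINTS = ("stable diffusion", "midjourney", "dall-e", "dall·e")

def looks_ai_generated(path: str) -> bool:
    img = Image.open(path)
    # PNG text chunks and similar per-format metadata land in img.info
    info_keys = {k.lower() for k in img.info if isinstance(k, str)}
    if info_keys & GENERATOR_KEYS:
        return True
    # EXIF "Software" tag (0x0131) sometimes names the generating program
    software = str(img.getexif().get(0x0131, "")).lower()
    return any(hint in software for hint in SOFTWARE_HINTS)

if __name__ == "__main__":
    import sys
    for p in sys.argv[1:]:
        verdict = "generator markers found" if looks_ai_generated(p) else "no markers"
        print(f"{p}: {verdict}")
```

The absence of markers proves nothing, which is why the more serious proposals run the other way: cryptographically signed provenance attached at generation time (C2PA-style content credentials) rather than after-the-fact detection.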
Actually, there are a lot of reasons we’d want a tool like that which have nothing to do with hypothetical solutions to kiddie diddling.
Can you imagine how easy extortion would become if you could show someone AI-generated pictures of their next-door neighbor killing some random missing person from the area? But every new technology enables crime until we work out the proper safeguards, so I’m not too worried about it in the long term.
It’s also worth remembering that, to make highly accurate pictures of children, the AI may have been trained on illegal photos (deliberately or not).
But that’s an issue for the AI’s creator.
Pedophilia isn’t a sexual preference any more than cannibalism is a dietary one…
You know what? Sure. Imagine I find people really tasty, especially hands, but I never chew on one; I just think about it. Literally the same thing: you should be rewarded for restraint of those urges. If I were going to be punished just for thinking about munching on a thumb, I’d at least take a hand with me to jail, since I’m going there anyway.
I pretty much agree. While we should never treat pedophilia as “just another perfectly valid sexuality, let’s throw a parade, it’s nothing to be ashamed of” (having the urge to prey on children is ABSOLUTELY something to be ashamed of, even if you can’t control it), we need to face facts… It isn’t someone waking up one day and saying “Wouldn’t it be funny if I took little Billy out back and filled him full of cock?”
It’s something going on in their head, something chemical, some misfiring of the neurons, just the way their endocrine system is built.
As much as I’d love to wave a magic wand over these people (whom I reluctantly call people) and cure them of their desires, we don’t have the power to do that. No amount of therapy in the world can change someone’s sexual tastes.
So in lieu of an ideal solution, finding ways to prevent pedophiles from seeking victims in the first place is the next best thing.
It’s not dissimilar to how, when we set up centers where drug-addicted people can get small doses of what they’re addicted to so they can manage withdrawal symptoms, crime and death rates go down. When you enact things like universal basic income and SNAP, people have less reason to rob banks and gas stations, so we see fewer of those crimes.
It’s not enough to punish people who do something wrong; we need to find out why they’re doing it and eliminate the underlying cause.