The Fediverse, a decentralized social network with interconnected spaces that are each independently managed with unique rules and cultural norms, has seen a surge in popularity. Decentralization h...
Not the best news in this report. We need to find ways to do more.
Basically we don't know what they found, because they just looked up hashtags and then didn't look at the results for ethics reasons. They don't even say which hashtags they looked through.
There are communities on NSFW Lemmy where people intentionally present as children engaged in sexual abuse scenarios. They’re adults and this is their kink that everyone is supposed to tolerate and pretend is ok.
See defederation drama over the last couple of days. What I’m saying is, the hashtags mean nothing.
No, that admin lied about the community.
then what’s the problem?
If you are referring to the community that was cited as the reason for defederation, this is completely false. The community in question is adorableporn, extremely similar to the subreddit of the same name. No one, in any manner, in either community, presents as a child. While yes, the women that post there tend to be on the shorter and thinner side, calling short, thin adults 'children' is not being honest.

To be clear, this community is about petite women. This community is NOT about women with a kink to present as a child.
And what of all the other bait communities? Come on. It’s not ok.
What other bait communities? We can’t just accept “think of the children” as an excuse. That doesn’t work.
Yes, no one wants actual CSAM to show up in their feed; we can all completely agree on that. But just because some middle-aged woman can't tell the difference between a 20-year-old and a 15-year-old doesn't make images of the 20-year-old CSAM.
We do know they only found, what, 112 actual images of CP? That’s a very small number. I’d say that paints us in a pretty good light, relatively.
It says 112 instances of known CSAM. But that's based on their methodology, right, and their methodology is not actually looking at the content, it's looking at hashtags and whether Google SafeSearch thinks it's explicit. Which I'm pretty sure doesn't differentiate what the subject of the explicitness is. It's just gonna try to detect breasts or genitals, I imagine.
Though they do give a few damning examples of things like actual CP trading, they also note that those have been removed.
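For context on what that kind of automated check looks like, here is a minimal sketch of hashtag-plus-SafeSearch flagging along the lines the comment describes. The hashtag list, the `flag_image` helper, and the likelihood threshold are all assumptions for illustration, not the report's actual pipeline; the point it shows is that SafeSearch returns a generic adult-content likelihood with no notion of who the subject is.

```python
# Hypothetical sketch of the kind of scan described above: collect posts by
# hashtag, then ask Google Cloud Vision's SafeSearch for an "adult" likelihood.
# The response says nothing about the subject's age, which is the commenter's
# point. Hashtags and the threshold here are placeholders, not the report's.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Placeholder terms; the report does not list the hashtags it searched.
SUSPECT_HASHTAGS = {"#exampletag1", "#exampletag2"}

def post_matches_hashtags(post_text: str) -> bool:
    """True if the post contains any of the watched hashtags."""
    words = set(post_text.lower().split())
    return bool(words & SUSPECT_HASHTAGS)

def safesearch_adult_likelihood(image_bytes: bytes) -> int:
    """Return SafeSearch's 'adult' likelihood (UNKNOWN .. VERY_LIKELY)."""
    image = vision.Image(content=image_bytes)
    response = client.safe_search_detection(image=image)
    return response.safe_search_annotation.adult

def flag_image(post_text: str, image_bytes: bytes) -> bool:
    """Flag for further review when hashtags match and SafeSearch says LIKELY or above."""
    if not post_matches_hashtags(post_text):
        return False
    return safesearch_adult_likelihood(image_bytes) >= vision.Likelihood.LIKELY
```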
How many of those 112 instances are honeypots controlled by the FBI or another law enforcement agency?
112 images out of 325,000 images scanned over two days is about 0.03%, so we are doing pretty well. With more moderation tools we could continue to knock out those sigmas.
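For anyone checking the arithmetic (the 112 and 325,000 figures come from the comments above):

```python
# Quick check of the rate quoted above: 112 flagged items out of 325,000 scanned.
matches = 112
scanned = 325_000
rate = matches / scanned
print(f"{rate:.4%}")  # 0.0345%, i.e. roughly 0.03%
```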