Deepfake pr0n. Apparently it’s a thing.

  • PaupersSerenade@beehaw.org

    In response to the title's 'apparently it's a thing': it is, and unfortunately has been for a while. This is non-consensual pornography, and it's awful.

  • conciselyverbose@kbin.social

    I don’t think that’s a particularly bad place to draw the line. That’s the bulk of the damage.

    The problem is that for non-celebrities, catching whether something is "based on a real person" is extremely difficult, and false negatives can genuinely be life-changing. Regardless of whether you think people should be more free with their bodies, or whatever "they're prudes" narrative you want (and eventually, as technology advances, it may well become normalized to expect people to see you naked in more than just imagination), at the end of the day people are taught to be embarrassed by their bodies, and in many cases they will be genuinely traumatized even by the less bad end of the spectrum of that kind of behavior*.

    *As opposed to sustained harassment and badgering. One time is still very harmful.

    • lemillionsocks@beehaw.org

      Yeah, especially as I imagine these AI and image searches may also start getting used as part of screening services. A job might run your ID through a face search, and because a service you uploaded a photo to was used to train AI faces, there may be content out there that bears more than a striking resemblance to you.

    • Dankenstein@beehaw.org

      I like this, but it leaves too large a gap. Wouldn't it be better to say that all pornography and erotic art requires consent from the person(s) whose likeness is intended to be represented?

      Then you can say "non-consensual pornography" for fakes whether or not they're generated by AI.

      Kinda weird to limit the language to AI. Kinda weird that people would want to see other people naked without their consent, but y'know.

      • conciselyverbose@kbin.social

        Revenge porn is already illegal in a lot of places, and I'd be shocked if Reddit's rules didn't have language against it long before this.

    • Evergreen5970@beehaw.org

      I have no shame and don’t care if people fantasize about me or make porn as long as they respect my “no”s for real physical contact.

      I am also aware that sometimes people don't like you and go for disproportionate retribution. I would not like to have porn deepfaked of me and have it spread around as "she took this of herself, what a slut, fire her/do not hire her." (I'm not exactly in a position to pick and choose companies who share my values and wouldn't care what non-hateful activities I get up to in my off hours; I need to increase my skills so I'm desirable enough to actually reject job offers.) I would also not like people to deepfake me having sex with a child and use a very good AI fake as "proof" that I am a pedophile. Such an accusation will stick. And it would feel especially bad because I'm a virgin by choice and don't desire sexual contact with anyone (I'm asexual), so for me to be painted as someone who would choose a nonconsensual sex act…

      I’d be okay with personal use of these images even if they were of me, but the reality is that it’s possible to use this for more than just an innocent masturbation session. There are more ways others’ judgment can affect you than just shame and feelings.

      Not to say that it would be right to make these images just because I would personally be unbothered if it were me; I'm just trying to add other ways this can negatively affect people, including people like me who have no shame.

      • conciselyverbose@kbin.social

        Yeah, you have a lot of examples from further towards the "really bad" end of the spectrum.

        I just wanted to make the point that even the "mild" end (e.g. nudes with no sex) has literally caused people to kill themselves because of the way some of society's pressures push them. Anything you accidentally let through has significant harm potential.

  • JaymesRS@midwest.social

    So basically, if I understand correctly, AI porn of:
    Emma Watson - Bad
    Hermione Granger - A-OK

    Seems potentially problematic.