shared via https://feddit.de/post/2805371

Each of these reads like an extremely horny and angry man yelling their basest desires at Pornhub’s search function.

    • Jordan Lund@lemmy.one · 1 year ago

      You say that NOW, but if people start using your images to generate revenge porn or, you know, really anything you didn’t consent to, that’s a huge problem.

      Both for the people whose images were used to train the model and for the people whose images are generated using the models.

      Non-consent is non-consent.

      This is how you get the feds involved.

      • Evergreen5970@beehaw.org · 1 year ago

        As someone who personally wouldn’t care at all if someone made AI porn of me and masturbated to it, I am incredibly uncomfortable with the idea that someone who doesn’t like me may have the option to generate AI porn of me having sex with a child. Now there’s fake “proof” I’m a pedophile, and my life gets ruined over sex I never had, over a violation of consent I never actually committed. Even if I’m vindicated in court, I might still be convicted in the court of public opinion. And people could post faked porn of me and send it to companies to say “Evergreen5970 is promiscuous, don’t hire them.” Not all of us have the luxury of picking and choosing between employers based on whether they match our values; some of us have to take what we can get, and sometimes that includes companies that would judge you for taking nude photos of yourself. It would feel especially bad given that I’m a virgin by choice who has never taken nudes, let alone sent them. Punished for something I didn’t do.

        Not everyone is going to restrict their use to their private wank sessions, to making a real image of the stuff they probably already envision in their imagination. Some will do their best to make its results public with the full intention of using it to do harm.

        And once faking abuse with AI porn becomes well-known, it might discredit actual photographic/video proof of CSAM happening. Humans already struggle to tell whether an image was captured by a camera or generated by AI, and AI detectors don’t catch AI-generated images with perfect accuracy either. So the question becomes “how can we trust any image anymore?” Not to mention the ability to generate new CSAM with AI. Some more mainstream AI models might tweak their algorithms to prevent people from generating any porn involving minors, but there’ll probably always be some floating around with those guardrails turned off.

        I’m also very wary of dismissing other peoples’ discomfort just because I don’t share it. I’m still worried for people who would care about someone making AI porn of them even if it was just to masturbate with and kept private.

      • ram@lemmy.ca · 1 year ago

        Let’s not forget that these AI aren’t limited by age. Like fuck am I gonna be out here defending tech that would turn my kid into CSAM. Fucking disgusting.

        • MaggiWuerze@feddit.de · 1 year ago

          On the other hand, this could be used to create material without causing any new suffering. So it might reduce the need for actual children to be abused in its production.

          • ram@lemmy.ca · 1 year ago

            Ya, no, those people need psychological help. Not to feed the beast. This is nonsense.

        • PelicanPersuader@beehaw.org · 1 year ago

          Worse, people making AI CSAM will wind up causing police to waste resources investigating abuse that didn’t happen, meaning those resources won’t be used to save real children in actual danger.

    • Lionir [he/him]@beehaw.org · 1 year ago

      Everybody gets horny, idiot.

      Please don’t call people idiots needlessly.

      Does it matter if someone jerks off to JaLo in the Fappening or some random AI generated BS?

      The issue is that this technology can be used to create pornographic material of anyone that has some level of realism without their consent. For creators and the average person, this is incredibly harmful. I don’t want porn of myself to be made and neither do a lot of creators online.

      Not only are these images an affront to people’s dignity, but it can also be incredibly harmful for someone to see porn of themselves that they never made: their face on someone else’s body.

      This is a matter of human decency and consent. It is not negotiable.

      As mentioned by @ram@lemmy.ca, this can also be used for other harmful things like CSAM which is genuinely terrifying.

      • TheFriendlyArtificer@beehaw.org · 1 year ago

        I have to disagree (but won’t downvote!)

        AI porn is creepy. In multiple ways!

        But it’s also a natural evolution of what we’ve been doing as a species since before we were a species.

        Does imagining a different partner while having sex or masturbating count? I would imagine most people would say, “no”.

        How about if somebody draws a crude stick figure of somebody they met on the street? Unless you’re Randall Munroe, this is probably harmless too.

        Now a highly skilled portrait artist paints a near replica of somebody he knows, but has never seen in the nude. They never mention their friend by name, but the output is lifelike and unmistakably them.

        Maybe a digital artist finds a few social media pictures of a person and decides to test-drive Krita, manipulating them into appearing nude.

        Or, and this happened to me quite recently, you find your porn doppelganger. My spouse found mine and it ruined her alone time. And they really did look just like me! Taking that a step further, is it illegal to find somebody’s doppelganger and to dress them up so that they look more like their double?

        Like you, I don’t want people like this in my life. But it feels like this is one of those slippery slopes that turns out to be an actual slippery slope.

        You can’t make it illegal without some serious downstream effects.

        If you did, the servers would just get hosted in an Eastern European country that is happy to lulwat at American warrants.

        I don’t have any answers, just more Devil’s advocate-esque questions. If there was a way to make it illegal without any collateral damage, I’d be proudly behind you leading the charge. I just can’t imagine a situation where it wouldn’t get abused, a’la the DMCA.

        • Lionir [he/him]@beehaw.org · 1 year ago

          Does imagining a different partner while having sex or masturbating count? I would imagine most people would say, “no”.

          You can’t share that though so while I still think it is immoral, it is also kind of impossible to know.

          Now a highly skilled portrait artist paints a near replica of somebody he knows, but has never seen in the nude. They never mention their friend by name, but the output is lifelike and unmistakably them.

          Maybe a digital artist finds a few social media pictures of a person and decided to test drive Krita and manipulates them into appearing nude.

          Those would be immoral and reprehensible. The law already protects against such cases on the basis of using someone’s likeness.

          It’s harmful because it shares images of someone doing things they would never do. It’s not caricature, it’s simply a fabrication. It doesn’t provide criticism - it is simply erotic.

          Taking that a step further, is it illegal to find somebody’s doppelganger and to dress them up so that they look more like their double?

          If the goal is to look like you, I would imagine it is possible to defend by law. Otherwise, it is simply coincidence. There’s no intent there.

          I don’t think it is a stretch or slippery slope. Just as a picture is captured by a camera, a drawing is captured by a person or a machine.

          Both should be the same and it is often already the case in many jurisdictions around the world when it comes to CSAM.

  • SteleTrovilo@beehaw.org · 1 year ago

    The tech isn’t there yet. There are so often distracting flaws around the hands/feet. The AI doesn’t really know what a human is; it’s just endlessly re-combining existing material.

    • Catsrules@lemmy.ml · 1 year ago

      Key word is yet.

      Yeah some body parts are a little weird today, but what about tomorrow, next week, next month, next year?

      I really haven’t given this much attention, but the last time I did, maybe 6-8 months ago, most of the photos had hands that were the stuff of nightmares. Looking at them again today, after a quick 10 minutes of browsing, they have improved significantly. Yeah, they are still far from perfect, but a handful are very good, most are passable, and a few are still nightmare fuel.

    • rhabarba@feddit.de (OP) · 1 year ago

      As much as I loathe having to reveal this to you, the shapeliness of the hands should be semi-negligible to most people who would love to have an image created from the statement “I want to see Billie Eilish’s boobs”.

      • CraigeryTheKid@beehaw.org · 1 year ago

        Agreed, that was a strange take. Can you usually tell it’s AI/fake? Yes. Is it still achieving the goal of the creator/user? Yes.

          • SteleTrovilo@beehaw.org · 1 year ago

            I’m not into feet specifically, but when I ask for “Veronica Mars in a string bikini” I don’t want to get “Veronica Mars with unattached toes.” It’s distracting AF.

            Doesn’t happen with real models, or even human-made hentai.

            • lloram239@feddit.de · edited · 1 year ago

              Doesn’t happen with real models

              That actually happens quite a lot. Hollywood movie posters are especially full of it, as they rarely do specific photo shoots for the poster and instead just copy&paste together whatever random images they can find. So you end up with stuff like the 300 poster, where the sword doesn’t attach to the handle. Magazine covers adding an extra hand or leg isn’t all that uncommon either.

              And that’s the expensive stuff, once you go into low-budget productions like self-published book covers, it’s all just crude stock image copy&paste at best.

  • Melmi@lemmy.blahaj.zone · 1 year ago

    I worry that the cat is out of the bag on this. The tech for this stuff is out there, and you can run it on your home computer, so barring some sort of massive governmental overreach I don’t see a way to stop it.

    They can’t even stop piracy and there’s the full weight of the US copyright industry behind it. How are they going to stop this tech?

    • Rekorse@kbin.social · 1 year ago

      The point isn’t that it’s too late, it’s that the hype is overblown. Go to the website this article mentions and follow the instructions (explore, turn on nsfw, type name and nude) and you will VERY QUICKLY realize the technology is just shit.

      You can see some resemblances and sometimes one close one, but like another poster said they just look like the same shitty fan fiction we’ve had since Photoshop came about.

      Also, this could end porn blackmail once no one can tell whether it’s real or not. People will start judging the person who is supplying the material rather than the person in it.

    • raccoona_nongrata@beehaw.org · 1 year ago

      I don’t think just giving up and allowing porn deep fakes and stuff of people is really an acceptable answer here. The philosophical discussion of whether it’s “actually them” or not doesn’t really matter, it’s still intrusive, violating and gross. In the same way that stealing someone’s identity is illegal, it doesn’t matter that the identity created by the con man isn’t the real you, damage can be done with identity theft.

      Maybe there’s nothing you can do about it on the dark web, but sites absolutely can manage deepfakes, in the same way that PornHub will take down non-consensual ex-girlfriend type content.

      The way people just throw up their hands at every new problematic issue with AI is not a good way of dealing with them; we don’t have to tolerate this stuff, and now is the best time to deal with it and start creating rules and protections.

      • davehtaylor@beehaw.org · 1 year ago

        The way people just throw up their hands at every new problematic issue with AI is not a good way of dealing with them; we don’t have to tolerate this stuff, and now is the best time to deal with it and start creating rules and protections.

        Exactly. In another thread on here recently someone said something that basically boiled down to “your protest against AI isn’t going to stop it. There’s too much corporate power behind it. So you might as well embrace it” and I just cannot get my head around that mentality.

        Also, you can absolutely see the models who were used as references in some of the images generated by apps these days. Like that popular one right now that everyone is using to make idealized images of themselves. A few of my family and friends used it recently, and in some of the pics you could clearly see the A-list celebs who were used as pose references, like Gal Gadot, Scarlett Johansson, etc. It’s creepy as hell.

          • davehtaylor@beehaw.org · 1 year ago

            I never said it was. But like the person I was replying to said: we need to take a good hard look at what the hell these tools are doing and allowing and decide as a society if we’re going to tolerate it.

            The real issue here is what things like deepfakes can do. It’s already starting, and it’s going to continue accelerating, generating mis- and disinformation: for private citizens, celebs, and politicians. While you might say “it’s creepy, but there’s nothing we can do about people deepfaking Nancy Pelosi’s face onto their spank material,” it’s extremely problematic when someone decides to make a video where Joe Biden admits to running a CP ring, or some right wing chud makes a video of Trump appearing to say something they all want to hear, and it leads to a civil war. That’s the real stakes here. How we react to what’s happening with regular folk and celebs is just the canary in the coal mine.

  • stown@sedd.it · edited · 1 year ago

    This doesn’t even feel like an article; it reads more like one long advertisement. The second paragraph of the article launches into a review of the “Erect Horse Penis - Concept LoRA”.

      • LogicalDrivel@sopuli.xyz · 1 year ago

        Yeah, I picked up quite a few tips for generating good AI porn from this article. I’m still not sure if it was satire or not; it basically gives you a step-by-step on how to do this.

        • kniescherz@feddit.de · 1 year ago

          I think it’s great to explain the concept and even some details of the process. It removes the mysticism from the topic, and someone with no deep knowledge of it may understand that it is a pretty easy process nowadays, and can give up the mental image of a hackerman in a black hoodie.