A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10 if users send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nude images of celebrities, including images of minors in swimsuits, but is particularly notable for plainly and openly demonstrating one of the most severe harms of generative AI tools: how easily nonconsensual pornography of ordinary people can be created.

  • GrymEdm@lemmy.world

    To people who aren’t sure if this should be illegal or what the big deal is: according to Harvard clinical psychiatrist and instructor Dr. Alok Kanojia (aka Dr. K from HealthyGamerGG), once non-consensual pornography (which deepfakes are classified as) is made public, over half of the people involved will have the urge to kill themselves. There’s also an extreme risk of depression, anger, anxiety, etc. The analogy given is that it’s like watching a video the next day of yourself undergoing sex without consent, as if you’d been drugged.

    I’ll admit I used to look at celeb deepfakes, but once I saw that video I stopped immediately, and now I avoid them as much as I possibly can. I believe porn can be done correctly, with participant protection and respect. Regarding deepfakes and revenge porn, though, that statistic about suicidal ideation puts them outside anything healthy or ethical. Obviously I can’t make that decision for others or purge the internet, but the fact that there’s such regular and extreme harm to the (what I now know are) victims of non-consensual porn makes it personally immoral. Not because of religion or society, but because I want my entertainment to be at minimum consensual, and hopefully fun and exciting, not killing people or ruining their happiness.

    I get that people say this is the new normal, but it’s already resulted in trauma and will almost certainly continue to do so. It may even get worse as the deepfakes become more realistic.

    • Regrettable_incident@lemmy.world

      I’m wondering if this may already be illegal in some countries. Revenge porn laws exist in several, and I’m not sure whether the legislation specifies how the material has to be produced to qualify. And if the image is based on a minor, that’s often going to be illegal too; in some places, I hear, even pornographic cartoons are illegal if they feature minors. In my mind, people who do this shit are doing something pretty similar to putting hidden cameras in bathrooms.

    • lud@lemm.ee

      once non-consensual pornography (which deepfakes are classified as) is made public, over half of the people involved will have the urge to kill themselves.

      Not saying that they’re justified or anything, but wouldn’t people stop caring once these reach a critical mass? I mean, if everyone could make fakes like these, people would care less, since any such image could just be dismissed as a fake.

      • eatthecake@lemmy.world

        The analogy given is that it’s like watching a video the next day of yourself undergoing sex without consent, as if you’d been drugged.

        You want a world where people just desensitise themselves, through repeated exposure, to things that make them want to die. I think you’ll get a whole lot of complex PTSD instead.

    • spez_@lemmy.world

      The technology will become available everywhere and run on every device over time. Nothing will stop this.

  • JackGreenEarth@lemm.ee

    That’s a ripoff. It costs them at most $0.10 to do a simple Stable Diffusion img2img pass. And most people could do it themselves; they’re purposefully exploiting people who aren’t tech savvy.

    • echo64@lemmy.world

      The people being exploited are the victims of this, not the people who paid for it.

      • Dkarma@lemmy.world

        No one’s a victim, no one’s being exploited. It’s the same as taping someone’s head onto a porno mag.

    • IsThisAnAI@lemmy.world

      A scam is another thing. Screw these people selling it.

      But come on, dude, they aren’t taking advantage of anyone buying the service. That’s not how the fucking world works. It turns out that even if you have money, you can pay people to do shit like clean your house or do an oil change.

      NOBODY on that side of the equation is being exploited 🤣

    • M500@lemmy.ml

      Wait, this is a tool built into Stable Diffusion?

      As for people doing it themselves, it might be a bit too technical for some to set up. But I’ve never tried Stable Diffusion.

  • SendMePhotos@lemmy.world

    I’d like to share my initial opinion here. “Non-consensual AI-generated nudes” are technically a freedom, no? Like, we can bastardize our presidents and paste people’s photos onto devils or other characters, so why are AI nudes where the line is drawn? The internet made photos of Trump and Putin kissing shirtless.

    • Maggoty@lemmy.world

      It’s a far cry from making weird memes to making actual porn, especially when it’s not easily seen as fake.

        • Maggoty@lemmy.world

          Psychological trauma. Normal people aren’t used to dealing with that, and even celebrities seek help for it. Throw in the transition period where this technology is not widely known, and you have careers on the line too.

    • abhibeckert@lemmy.world

      The internet made photos of Trump and Putin kissing shirtless.

      And is that OK? I mean, I get it, free speech, but just because Congress can’t stop you from expressing something doesn’t mean you actually should do it. It’s basically bullying.

      Imagine you meet someone you really like at a party; they like you too, and they look you up on a social network… and find galleries of hardcore porn with you as the star. Only you’re not a porn star; those galleries were created by someone who specifically wanted to hurt you.

      AI porn without consent is clearly illegal in almost every country in the world, and in the ones where it’s not illegal yet, it soon will be. The First Amendment will be a stumbling block, but it’s not an impenetrable wall: Congress can pass laws that limit speech in certain edge cases, and this will be one of them.

      • WaxedWookie@lemmy.world

        The internet made photos of Trump and Putin kissing shirtless.

        And is that OK?

        I’m going to jump in on this one and say yes: it’s mostly fine.

        I look at these things through the lens of the harm they do and the benefits they deliver: consequentialism and act utilitarianism.

        The benefits are artistic, comedic and political.

        The “harm” is that Putin and/or Trump might feel bad, maaaaaaybe enough that they’d kill themselves. All of that goes back up under benefits as far as I’m concerned; they’re both extremely powerful monsters who have done and will continue to do incredible harm.

        The real harm is that such works risk normalising this treatment of regular folk, which is genuinely harmful. I think that’s unlikely, but it’s impossible to rule out.

        Similarly, the dissemination of the kinds of AI fakes under discussion is a negative, because they do serious, measurable harm.

    • afraid_of_zombies@lemmy.world

      Public figures vs. private figures. Fair or not, a public figure is usually open season. Go ahead and make a comic where Ben Stein rides a horse home to his love nest with Ben Stiller.

    • UsernameIsTooLon@lemmy.world

      Lemme put it this way: freedom of speech isn’t freedom from consequences. You talk shit, you’re gonna get hit. Is it truly freedom if you’re infringing on someone else’s rights?

      • John_McMurray@lemmy.world

        Yeah, you don’t have the right to prevent people from drawing pictures of you, but you do have the right not to get hit by some guy you’re drawing.

    • LadyAutumn@lemmy.blahaj.zone

      They’re making pornography of women who have not consented to it, which is an extremely invasive thing to do and has massive social consequences for women and girls. This could (and almost certainly will) be used on kids too; it can literally become a tool for the production of child pornography.

      Even with regard to adults, do you think this will be used exclusively on public figures? Do you think people aren’t taking pictures of their classmates, of their co-workers, of women and girls they personally know, and having this done to them? It’s fucking disgusting and horrifying. Have you ever heard of the correlation between revenge porn and suicide? People literally end their lives when pornographic material of them is made and spread without their knowledge and consent. It’s terrifyingly invasive and exploitative. It absolutely can and must be illegal to do this.

      • cley_faye@lemmy.world

        It absolutely can and must be illegal to do this.

        Given that it can be done in a private context, and that there is absolutely no way to enforce such a law without looking into random people’s computers unless the images are posted publicly online, you’re just asking for a new law that reassures people while having no effect. That’s useless.

        • WaxedWookie@lemmy.world

          Strange of you to respond to a comment about the fakes being shared in this way…

          Do you have the same prescriptions in relation to someone with a stash of CSAM, and if not, why not?

          • cley_faye@lemmy.world

            No. Because in one case, someone ran a program on their computer and the output might hurt someone else’s feelings if they ever find out; in the other case, people and kids were exploited for sexual purposes to begin with, and their lives were torn apart regardless of whether the material was ever distributed.

            How is that a hard concept to understand?

            • LadyAutumn@lemmy.blahaj.zone

              How can you describe your friends, family, co-workers, and peers making and sharing pornography of you, and say that it comes down to hurt feelings??? It’s taking someone’s personhood, their likeness, their autonomy, their privacy, and reducing them to a sexual act of which they have no knowledge and to which they gave no consent. And you think this stays private?? Are you kidding me?? Men have literally been caught making Snapchat groups dedicated to sharing their partners’ nudes without their consent. You either have no idea what you’re talking about or you are intentionally downplaying the seriousness of what this is. Like I said in my original comment, people contemplate and attempt suicide when pornographic content is made and shared of them without their knowledge and consent. This is an incredibly serious discussion.

              It is people like you, yes, you specifically, who provide the framework by which the sexual abuse of women is justified.

    • GhostTheToast@lemmy.world

      Don’t get me wrong, it’s unsettling, but I agree: I don’t see the initial harm. I see it as creating a physical manifestation of someone’s inner thoughts. I can definitely see how it could lead to or encourage dangerous situations, but banning it for that would be like banning alcohol because it could lead to drunk driving or sexual assault.

      • Mastengwe@lemm.ee

        Innocently drinking alcohol is in NO WAY comparable to creating deepfakes of people without consent.

        One is an innocent act with potentially harsh consequences; the other is a disgusting, invasive, violating act that has the potential to ruin an innocent person’s life.

  • Mastengwe@lemm.ee

    As long as there are simps, there will always be this bullshit. And there will always be simps, because it isn’t illegal to be pathetic.

  • afraid_of_zombies@lemmy.world

    It’s stuff like this that turns me against copyright laws. To me it is clear and obvious that you own your own image, and it is far less obvious that a company can own an image its creator drew decades ago that everyone can identify. And yet one is protected and the other isn’t.

    What the hell do you own if not yourself? How come a corporation has more rights than we do?

  • Ultragigagigantic@lemmy.world

    It’s gonna suck no matter what now that the technology is available. Perhaps in a bunch of generations there will be a massive cultural shift to something less toxic.

    May as well drink the poison if I’m gonna be immersed in it. Cheers.

    • VinnyDaCat@lemmy.world

      I was really hoping that with the onset of AI people would be more skeptical of content they see online.

      This was one of the reasons. I don’t think there’s anything we can do to prevent people from acting like this, but what we can do as a society is adjust so that it’s not as harmful. I’m still hoping that once the technology becomes easily accessible and widely used, people will learn to look at all content much more closely.

  • Cris@lemmy.world

    God, generative AI is such a fucking caustic technology. I honestly don’t see anything positive and non-disgusting enabled by this tech.

    Edit: I see people don’t agree, but like, why can’t AI stick to translating stuff and being useful, rather than making horrifically unethical porn, taking the humanity out of art, and replacing people’s jobs with statistical content generation? I hate it here.

  • bigkahuna1986@lemmy.ml

    This business is going to get out of control. It’s going to get out of control and we’ll be lucky to live through it.

  • Kuinox@lemmy.world

    The root problem is governments not enforcing the law on the internet. Deepfakes have existed for years.
    Law enforcement should be more proactive online.

  • echo64@lemmy.world

    Every time this comes up, all the tech nerds here like to excuse it as fine and not a bad thing at all. I’m hoping that won’t happen this time, but knowing Lemmy’s audience…

    • cley_faye@lemmy.world

      It’s not a matter of excusing it. Distributing someone’s picture without their explicit consent, and anything like that, is inexcusable. But we’re talking about the generation of said content, which technically can’t be stopped without seriously restricting everything.

    • Thorny_Insight@lemm.ee

      I’m not saying it’s not a bad thing, but it’s inevitable. The problem will just keep getting worse, and there’s no stopping it. It’s something we’re just going to need to accept as a new normal. If we can deal with living under the constant threat of nuclear armageddon, then I think we can live with fake nudes as well.

      • echo64@lemmy.world

        Yeah, it’s this shit I’m talking about. We have a whole legal and justice system to deal with this. No one needs to accept sexual abuse as a new normal. This shit is weird.

        • Thorny_Insight@lemm.ee

          I’m not saying there shouldn’t be consequences for someone who spreads these pictures with the intention of harming someone’s reputation, but it’s incredibly naive to think that the justice system is going to stop deepfakes when it can’t even prevent bike theft. 12-year-olds are making these with their smartphones. The technology is extremely accessible and easy to use, and that is not going to change. I’m sorry, but you’re not putting the toothpaste back in the tube. Wait a few years and you’ll be able to generate photorealistic porn videos of anyone you want.

          • echo64@lemmy.world

            We can’t stop bike theft, so women are free game, coz this guy said so.

              • echo64@lemmy.world

                You might want to look up what strawmanning means. I’m just flat out mocking what you said.

        • Dkarma@lemmy.world

          No, we don’t. What is happening here is not covered by current laws.

        • 0x0@programming.dev

          Sexual abuse?

          Child pornography involves molesting a child and is a crime, as it should be.

          Fake nudes have been a thing for ages and are only an issue if the targeted party takes offense. It may be slander but it’s certainly not sexual abuse.

          No one is accepting sexual abuse, so drop it down a notch, Karen.

          • eatthecake@lemmy.world

            From another comment:

            To people who aren’t sure if this should be illegal or what the big deal is: according to Harvard clinical psychiatrist and instructor Dr. Alok Kanojia (aka Dr. K from HealthyGamerGG), once non-consensual pornography (which deepfakes are classified as) is made public, over half of the people involved will have the urge to kill themselves. There’s also an extreme risk of depression, anger, anxiety, etc. The analogy given is that it’s like watching a video the next day of yourself undergoing sex without consent, as if you’d been drugged.

            Try to imagine watching a realistic video of yourself being abused; imagine your mother watching it. That will absolutely mess some people up, and a lot of those victims are going to be children. Shit is going to get bad.

            • 0x0@programming.dev

              I wouldn’t put actual non-consensual pornography and fake pornography of any kind in the same bag, but, geez, I’m not a doctor.

              Deepfakes do improve on the (technical) realism of ’90s Photoshop fakes, for sure. Doesn’t that still qualify as slander? (Also not a lawyer.)

              • eatthecake@lemmy.world

                The question is why, with an internet full of porn, men want non-consensual pornography that they know women are opposed to. It’s as if the hurtfulness, the lack of consent, and the control over the woman in the video are actually the point.