Instagram is profiting from several ads that invite people to create nonconsensual nude images with AI image generation apps, once again showing that some of the most harmful applications of AI tools are not hidden in the dark corners of the internet, but are actively promoted to users by social media companies unable or unwilling to enforce their policies about who can buy ads on their platforms.

While parent company Meta’s Ad Library, which archives ads on its platforms, who paid for them, and where and when they were posted, shows that the company has previously taken down several of these ads, many ads that explicitly invited users to create nudes, and some of the ad buyers, were still up until I reached out to Meta for comment. Some of these ads were for the best-known nonconsensual “undress” or “nudify” services on the internet.

  • InternetPerson@lemmings.world · 6 months ago

    To add to this:

    Imagine someone sneaking into your home and stealing your shoes, socks and underwear just to get off on them or to give them to someone who does.

    Wouldn’t that feel wrong? Wouldn’t you feel violated? It’s the same with such AI porn tools. You are made to satisfy someone else’s sexual desires and you are given no choice. Whether you want it or not, you become part of their act, and becoming an unwilling participant in such a way can feel similarly violating.

    They are painting and using a picture of you that is not how you would choose to represent yourself. You have no control over this and thus feel violated.

    This reminds me of the fetish where one person acts like a submissive pet and gets treated like one by their “master”. They get aroused by doing that in public, one walking the other on a leash like a dog on hands and knees. People around them become passive participants in that spectacle, and those often feel violated. Becoming a participant, active or passive, in someone else’s sexual act, unwillingly and unasked, and having little or no control over it, feels wrong and violating to a lot of people.
    In principle that even shares some similarities with rape.

    There are countries where you can’t just take pictures of someone without asking them beforehand, and there are rules on how such a picture may be used. Those countries acknowledge and protect the individual’s right to their own image.

    • scarilog@lemmy.world · 6 months ago

      Just to play devil’s advocate here, in both of these scenarios:

      Imagine someone sneaking into your home and stealing your shoes, socks and underwear just to get off on them or to give them to someone who does.

      This reminds me of the fetish where one person acts like a submissive pet and gets treated like one by their “master”. They get aroused by doing that in public, one walking the other on a leash like a dog on hands and knees. People around them become passive participants in that spectacle, and those often feel violated.

      The person has the knowledge that this is going on. In the situation with AI nudes, the actual person may never find out.

      Again, not to defend this at all; I think it’s creepy af. But I don’t think your arguments were particularly strong in supporting the AI nudes issue.

      • CleoTheWizard@lemmy.world · 6 months ago

        In every chat I find about this, I see people railing against AI tools like this, but I have yet to hear an argument about it that makes much sense to me. I don’t care much either way, but I want a grounded position.

        I care about harms to people and in general, people should be free to do what they want until it begins harming someone. And then we get to have a nuanced conversation about it.

        I’ve come up with a hypothetical. Let’s say that you write naughty stuff about someone in your diary. The diary is kept in a secure place and in private. Then, a burglar breaks in and steals your diary and mails that page to whomever you wrote it about. Are you, the writer, in the wrong?

        My argument would be no. You are expressing a desire in private and only through the malice of someone else was the harm done. And no, being “creepy” isn’t an argument either. The consent thing I can maybe see but again do you have a right not to be fantasized about? Not to be written about in private?

        I’m interested in people’s thoughts, because it bugs me not to have a good answer to this argument.

        • Resonosity@lemmy.world · edited · 6 months ago

          Yeah it’s an interesting problem.

          If we go down the path of ideas in the mind and the representations we create and visualize in our mind’s eye, then to forbid people from conceiving of others sexually means there is really no justification for conceiving of people at all.

          If we do seek a justification, where is that line drawn? What is sexual, and what is general? How do we enforce this, or at least catch people in the act and shame them into stopping their behavior, especially if we don’t possess the capability of telepathy?

          What is harm? Is it purely physical, or also psychological? Is there a degree of harm that should be allowed, or that is inescapable despite our best intentions?

          The angle that you point out regarding writing things down about people in private can also go different ways. I write things down about my friends because my memory sucks sometimes and I like to keep info in my back pocket for when birthdays, holidays, or special occasions come. What if I collected information about people that I don’t know? What if I studied academics who died in the past to learn about their lives, like Ben Franklin? What if I investigated my neighbors by pointing cameras at their houses, or installing network sniffers or other devices to try to collect information on them? Does the degree of familiarity with those people I collect information about matter, or is the act wrong in and of itself? And do my intentions justify my actions, or do the consequences of said actions justify them?

          Obviously I think it’s a good thing that we as a society try to discourage collecting information on people who don’t want that information collected, but there is a portion of our society specifically allowed to do this: the state. What makes their status deserving of this power? Can this power be used for ill and good purposes? Is there a level of cross collection that can promote trust and collaboration between the state and its public, or even amongst the public itself? I would say that there is a level where if someone or some group knows enough about me, it gets creepy.

          Anyways, lots of questions and no real answers! I’d be interested in learning more about this subject, and I apologize if I steered the convo away from sexual harassment and violation. Consent extends to all parts of our lives, but sexual consent does seem to be a bigger problem given the evidence of this post. Looking forward to learning more!

          • CleoTheWizard@lemmy.world · 6 months ago

            I think we’ve just stumbled on an issue where the rubber meets the road as far as our philosophies about privacy and consent go. I view consent as important mostly in areas that pertain to bodily autonomy, right? So we give people rights over the use of their likeness for profit, promotion, or distribution. And what we’re giving people is a mental permission slip to use the idea of the body, or the body itself, for specific purposes.

            However, I don’t think these things really pertain to private matters, because the consent issue only applies when there are potential effects on the other person. If I say that imagining a celebrity sexually does no damage because you don’t know them, I think most people would agree. So if what we care about is harm, there is no potential for harm here.

            With surveillance matters, the consent does matter because we view breaching privacy as potential harm. The reason it doesn’t apply to AI nudes is that privacy is not being breached. The photos aren’t real. So it’s just a fantasy of a breach of privacy.

            So for instance if you do know the person and involve them sexually without their consent, that’s blatantly wrong. But if you imagine them, that doesn’t involve them at all. Is it wrong to create material imaginations of someone sexually? I’d argue it’s only wrong if there is potential for harm and since the tech is already here, I actually view that potential for harm as decreasing in a way. The same is true nonsexually. Is it wrong to deepfake friends into viral videos and post them on twitter? Can be. Depends. But do it in private? I don’t see an issue.

            The problem I see is the public stuff. People sharing it. And it’s already too late to stop most of the private stuff. Instead we should focus on stopping AI porn from being shared and posted and create higher punishments for ANYONE who does so. The impact of fake nudes and real nudes is very similar, so just take them similarly seriously.

        • KidnappedByKitties@lemm.ee · 6 months ago

          What I find interesting is that for me personally, writing the fantasy down (rather than referring to it) is against the norm, a.k.a. weird, but not wrong.

          Painting a painting of it is weird and iffy, hanging it in your home is not ok.

          It’s strange how it changes along that progression, but I can’t rightly say why.

      • InternetPerson@lemmings.world · 6 months ago

        The person has the knowledge that this is going on.

        Not necessarily, no. They might just think they’ve misplaced their socks. If you’ve lived in an apartment building with shared laundry spaces, it’s not so uncommon to lose some minor pieces of clothing. But just because they never get to know about it, it isn’t any less wrong, nor should it be any less illegal.

        In the situation with AI nudes, the actual person may never find out.

        Also, in connection with my remarks before:
        A lot of our laws apply even if no one is knowingly harmed (yet). (This may of course depend on the legislation of wherever you live.)
        Merely intending to commit a crime can sometimes be reason enough to bring someone to court.
        We can argue how much sense that makes, but as things stand, we as a society have decided that doing certain things should be illegal even if the damage has not yet manifested. And I see many good reasons to handle such AI porn tools the same way.

    • devfuuu@lemmy.world · 6 months ago

      Traumatizing rape victims with nonconsensual imagery of them naked and doing sexual things with others, and sharing it, is totally not going to fuck up society even more and lead to a bunch of suicides! /s

      Ai is the future. The future is dark.

      • Kedly@lemm.ee · 6 months ago

        tbf, the past and present are pretty dark as well

      • InternetPerson@lemmings.world · 6 months ago

        That’s why we need strong legislation. Most countries worldwide are missing crucial windows for making such laws. At least some are catching up, as the EU did recently with its first AI Act.