Instagram is profiting from several ads that invite people to create nonconsensual nude images with AI image generation apps. This shows, once again, that some of the most harmful applications of AI tools are not hidden in the dark corners of the internet but are actively promoted to users by social media companies that are unable or unwilling to enforce their policies about who can buy ads on their platforms.

Parent company Meta’s Ad Library, which archives ads on its platforms along with who paid for them and where and when they were posted, shows that the company has taken down several of these ads before. Even so, many ads that explicitly invited users to create nudes, and some of the accounts buying them, remained live until I reached out to Meta for comment. Some of these ads were for the best-known nonconsensual “undress” or “nudify” services on the internet.

  • intensely_human@lemm.ee · 6 months ago

    This is not okay, but this is nowhere near the most harmful application of AI.

    The most harmful application of AI that I can think of would be disrupting a country’s entire culture via gaslighting social media bots, leading to increases in addiction, hatred, suicide, and murder.

    Putting hundreds of millions of people into a state of hopeless depression would be more harmful than creating a picture of a naked woman with a real woman’s face on it.

    • Katrisia@lemm.ee · 6 months ago

      I don’t want to fall into a slippery slope argument, but I really see this as the tip of a horrible iceberg. Seeing women as sexual objects starts with this kind of nonconsensual media, but it also includes nonconsensual approaches (like a man who thinks he can subtly touch women on crowded public transport and excuse himself with the lack of space), sexual harassment, sexual abuse, forced prostitution (it’s hard to know for sure, but possibly the majority of prostitution), human trafficking (in which 75%–79% of victims end up in forced prostitution, which is why trafficking mostly targets women), and even other forms of violence, torture, murder, etc.

      Thus, women live their lives in fear, to varying degrees depending on their country and circumstances. They are restricted in many ways, and all of this even in first world countries. For example: homeless women fearing shelters because of the sexual assault and trafficking that happen there; women leaving or never entering certain jobs (the military, scientific exploration, etc.) because of their hostile sexual environments; being alert and often scared when alone because they can be targets; and so on. I hopefully don’t need to explain the situation in third world countries; just look at what’s legal there and imagine from that.

      This is a reality, one that is, as you put it, “putting hundreds of millions of people into a state of hopeless depression.”

      Again, I want to be very clear: I’m not equating these tools to the horrible things I mentioned. I’m saying that they are part of the same problem in a lighter presentation. They are the tip of the iceberg, a symptom of a systemic and cultural problem. The AI by itself may be less catastrophic in its consequences, rarely leading to permanent damage (I can only see that happening if the victim develops chronic or pervasive health problems from the stress of the situation, like social anxiety, or dies by suicide). It is still important to acknowledge the whole machinery so we can grasp the scale of what we are facing, and to actually face it, because something must change. The first steps might be against these superficially “not very harmful” forms of sexual violence.