Nearly a year after AI-generated nude images of high school girls upended a community in southern Spain, a juvenile court this summer sentenced 15 of their classmates to a year of probation.

But the artificial intelligence tool used to create the harmful deepfakes is still easily accessible on the internet, promising to “undress any photo” uploaded to the website within seconds.

Now a new effort to shut down the app and others like it is being pursued in California, where San Francisco this week filed a first-of-its-kind lawsuit that experts say could set a precedent but will also face many hurdles.

  • @ikidd@lemmy.world
    1 month ago

    This is a dog and pony show. Beyond the unlikelihood of a municipality holding any sway over an internet site, it completely ignores the futility of trying to close the Pandora’s box that is AI imagery. I can download half a dozen deepfake video models that would run on a home server, let alone still-image models.

    This is about the level of technological savvy I would expect from a city councillor.

    • @Samvega@lemmy.blahaj.zone
      1 month ago

      Predictions:

      Men: “I didn’t film gay porn, that’s a deepfake.” The issue is dropped.

      Women: “Those videos you’re sharing aren’t real, they use my image without my consent, and I want them removed.” She is called a slut.