In 2023, more deepfake abuse videos were shared than in every other year in history combined, according to an analysis by independent researcher Genevieve Oh. What used to take skillful, tech-savvy experts hours to Photoshop can now be whipped up at a moment’s notice with the help of an app. Some deepfake websites even offer tutorials on how to create AI pornography.

What happens if we don’t get this under control? As politics grow more and more polarized, it will further blur the line between what’s real and what’s not. What will happen when voters can’t separate truth from lies? And what are the stakes? As we get closer to the presidential election, democracy itself could be at risk. And, as Ocasio-Cortez points out in our conversation, it’s about much more than fake images.

“It’s so important to me that people understand that this is not just a form of interpersonal violence, it’s not just about the harm that’s done to the victim,” she says about nonconsensual deepfake porn. She puts down her spoon and leans forward. “Because this technology threatens to do it at scale — this is about class subjugation. It’s a subjugation of entire people. And then when you do intersect that with abortion, when you do intersect that with debates over bodily autonomy, when you are able to actively subjugate all women in society on a scale of millions, at once digitally, it’s a direct connection [with] taking their rights away.”

  • RainfallSonata@lemmy.world · 9 months ago

    “The legislation amends the Violence Against Women Act so that people can sue those who produce, distribute, or receive the deepfake pornography, if they “knew or recklessly disregarded” that the victim did not consent to those images.”–The article.

    • dsemy@lemm.ee · 9 months ago

      I read the article… amending a law doesn’t make the problem go away.

      Maybe if more attention had been given to the politicians talking about this half a decade ago (instead of focusing on AOC, who honestly realized this issue way too late), something more meaningful could have been done.

        • umbrella@lemmy.ml · 9 months ago

          It isn’t either/or.

          Kind of, but this is like playing the climate-change blame game 30 years from now.

          I’m sure dealing with it before it became so ubiquitous would have been easier.

          • DarkThoughts@fedia.io · 9 months ago

            My issue with the topic is that everyone targets the wrong thing and just jumps on the media hysteria. They are not going to be able to stop the production or distribution of deepfakes, and in my opinion they shouldn’t, because these are basically just an advanced form of the photo and video editing that has existed for decades, and it did not bother anyone until “AI” became a media scapegoat. What they should bother to enforce is the illegitimate use of such material, for things like blackmail, bullying, disinformation, etc. Some neckbeards wanking one out to a clearly marked deepfake porn video isn’t really going to harm the person depicted. Using such a video while claiming it is real in order to smear or blackmail them, on the other hand, is. This type of bullying has also been going on for decades through classical photo and video manipulation, and again, it did not bother anyone until now. By focusing on this idiotic media hype instead of the real issue, we basically make sure it keeps happening, just like with climate change.

            • umbrella@lemmy.ml · 9 months ago

              Alright, you have a point, although AI made this process infinitely simpler. If you have a remotely sufficient computer and some knowledge of how to operate the tools, you can do it in a matter of a few hours of actual work.

              I’m surprised there aren’t more deepfakes.

              • DarkThoughts@fedia.io · 9 months ago

                It’s usually quite a hassle to set those tools up, especially if you don’t have much technical knowledge. A lot of the more resource-heavy tasks are also not really possible on a home computer and require big servers with multiple GPUs and absurd amounts of VRAM, or very specific APUs, but those are still very early. The majority of what you can do at home is typically limited to generating pictures, and even there it takes quite a bit of time if you want really high-quality results. For a lot of more complex tasks you’re simply resource-limited. And that time estimate only covers the actual generation process; getting good results, and learning how to get them, is another lengthy process that many people underestimate. It’s not a magic button, because those generative models are pretty damn stupid, actually.

    • starman@programming.dev · 9 months ago

      people can sue those who produce, distribute, or receive the deepfake pornography

      So can I send someone deepfake porn and then sue them?