‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity

It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

  • @[email protected]
    2 points · 7 months ago

    So it’s okay to make nudes of someone as long as they aren’t realistic?

    Where is the line drawn between being too real and not real enough?

    • @[email protected]
      5 points · 7 months ago

      If you found out that someone had made a bunch of art of you naked you’d probably think that was weird. I’d argue you shouldn’t do that without consent. Draw lines wherever you think is best.

      • @[email protected]
        5 points · 7 months ago

        I’d definitely think it was weird! And probably not hang out with them anymore (unless it was really good!)

        But I don’t think there should be a law against them doing that. I can moderate them myself by avoiding them and my friends will follow suit.

        At that point, all they have are nudes of me that nobody I care about will pay attention to. It’s a good litmus test for shitbags!

        • @[email protected]
          1 point · 7 months ago

          Agreed, but legal and moral are different. The law isn’t really about right and wrong per se.

        • @[email protected]
          1 point · 7 months ago

          This is about producing convincing nude reproductions of other people, however, which has a very different psychological impact.

          This technology allows someone to make pornography of anyone else and spread that pornography on the internet. It can cause massive distress, trauma, and relationship problems, and can impact people’s careers.

          Your freedom to make nude AI images of other people is not worth that. I don’t understand why anyone would think it was okay.