‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity::It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

  • @[email protected]
    -7 • edited • 1 year ago

    That the thread is full of people defending this is disgusting. This is different from cutting someone’s face out of a photo and pasting it on a magazine nude, or imagining a person naked. Deepfakes can be difficult to tell apart from real media. This enables the spread of nonconsensual pornography that an arbitrary person cannot necessarily tell is fake. Even if it were easy to tell, it’s an invasion of privacy to use someone’s likeness against their will, without their consent, for these purposes.

    The fediverse’s high expectations for privacy seem to go right out the window when violating it gets their dick hard. We should be better.

    • @[email protected]
      32 • 1 year ago

      What are you arguing with here? No one is saying that. Stop looking for trouble, it’s weird.

      • @[email protected]
        -6 • 1 year ago

        So it’s fine to violate someone’s privacy so long as you don’t share it? Weird morals you got there.

        • @[email protected]
          12 • edited • 1 year ago

          Am I violating privacy by picturing women naked?

          Because if it’s as cut and dried as you say, then the answer must be yes, and that’s flat out dumb

          I don’t see this as a privacy issue, and I’m not sure how you’re squeezing it into that frame. I’m not sure what it is, but you cannot violate a person’s privacy by imagining or coding an image of them. It’s weird and creepy, and because it can be mistaken for a real image, it’s not proper to share.

          Can you actually stop clutching pearls for a moment to think this through a little better?

          • @[email protected]
            -7 • 1 year ago

            Sexualizing strangers isn’t a right or moral afforded to you by society. That’s a braindead take. You can ABSOLUTELY violate someone’s privacy by coding an image of them. That’s both a moral and legal question with an answer.

            Your comment is a self report.

            • @[email protected]
              6 • 1 year ago

              That’s a braindead take

              Projection. Since you have no room for new thoughts in your head, I consider this a block request.

    • @[email protected]
      9 • 1 year ago

      it’s an invasion of privacy to use someone’s likeness against their will

      Is it? Usually photography in public places is legal.

          • @[email protected]
            -3 • edited • 1 year ago

            I think it’s immoral to do street photography to sexualize the subjects of your photographs. I think it’s immoral to then turn that into pornography of them without their consent. I think it’s weird that you don’t. If you can’t tell the difference between street photography and manipulating photos of people (public or otherwise) into pornography, I can’t fuckin help you.

            If you go to a park, take photos of people, then go home and masturbate to them you need to seek professional help.

            • @[email protected]
              2 • edited • 1 year ago

              What’s so moronic about people like you is that you think anyone looking to understand an issue beyond your own current thoughts is clearly a monster harming people in the worst way you can conjure in your head. The original commenter saying it’s weird that you’re looking for trouble couldn’t have been more dead on.

              • @[email protected]
                -1 • 1 year ago

                This is an app that creates nude deepfakes of anyone you want it to. It’s not comparable to street photography in any imaginable way. I don’t have to conjure any monsters, bro. I found one, and they’re indignant about being called out as a monster.

                • @[email protected]
                  2 • 1 year ago

                  This has been done with Photoshop for decades. Photocollage for a hundred years before that. Nobody is arguing that it’s not creepy. It’s just that nothing has changed.

    • @[email protected]
      2 • 1 year ago

      So it’s okay to make nudes of someone as long as they aren’t realistic?

      Where is the line drawn between being too real and not real enough?

      • @[email protected]
        5 • 1 year ago

        If you found out that someone had made a bunch of art of you naked you’d probably think that was weird. I’d argue you shouldn’t do that without consent. Draw lines wherever you think is best.

        • @[email protected]
          5 • 1 year ago

          I’d definitely think it was weird! And probably not hang out with them anymore (unless it was really good!)

          But I don’t think there should be a law against them doing that. I can moderate them myself by avoiding them and my friends will follow suit.

          At that point, all they have are nudes of me that nobody I care about will pay attention to. It’s a good litmus test for shitbags!

          • @[email protected]
            1 • 1 year ago

            Agreed, but legal and moral are different. The law isn’t really about right and wrong per se.

          • @[email protected]
            1 • 1 year ago

            This is about producing convincing nude reproductions of other people, however. It has a very different psychological impact.

            This technology allows someone to make pornography of anyone else and spread it on the internet. It can cause massive distress, trauma, and relationship issues, and it can impact people’s careers.

            Your freedom to make nude ai images of other people is not worth that. I don’t understand why anyone would think it was okay.

    • @[email protected]
      0 • 1 year ago

      Unfortunately, this sounds like par for the course for the internet. I’ve come to believe that the internet has its good uses for things like commerce and general information streaming, but by and large it’s bringing out the worst in humanity far more than the best. Or it’s all run by ultra-horny psychopathic teenagers pretending to be adults, living by “I’m 13 and this is deep” logic.

      • @[email protected]
        0 • 1 year ago

        I dunno why I am perpetually surprised by this, though. It’s such a cut-and-dried moral area, and the people who say it isn’t are so clearly telling on themselves that it’s kind of shocking. But I guess it shouldn’t be.

        • @[email protected]
          3 • 1 year ago

          I think the distinction is that half of the thread is treating it as a moral issue, and half of it is treating it as a legal issue. Legally, there’s nothing wrong here.

    • @[email protected]
      -1 • 1 year ago

      People need better online safety education. Why TF are people even posting public pictures of themselves?