• @[email protected]
    1 year ago

    Deepfakes of an actual child should be considered defamatory use of a person’s image; but they aren’t evidence of actual abuse the way real CSAM is.

    Remember, the original point of the term “child sexual abuse material” was to distinguish images/video made through the actual abuse of a child, from depictions not involving actual abuse – such as erotic Harry Potter fanfiction, anime characters, drawings from imagination, and the like.

    Purely fictional depictions, not involving any actual child being abused, are not evidence of a crime. Even deepfake images depicting a real person, but without their actual involvement, are a different sort of problem from actual child abuse. (And should be considered defamatory, same as deepfakes of an adult.)

    But if a picture does not depict a crime of abuse, and does not depict a real person, it is basically an illustration, same as if it was drawn with a pencil.

    • Uranium3006
      1 year ago

      Remember, the original point of the term “child sexual abuse material” was to distinguish images/video made through the actual abuse of a child, from depictions not involving actual abuse – such as erotic Harry Potter fanfiction, anime characters, drawings from imagination, and the like.

      Although that distinction lasted about a week before the same bad actors who cancel people over incest fanfic started calling all of the latter CSEM too.

      • @[email protected]
        1 year ago

        As a sometime fanfic writer, I do notice when fascists attack sites like AO3 under a pretense of “protecting children”, yes.

        • Uranium3006
          1 year ago

          And it’s usually fascists, or at least people who may not consider themselves as such but think and act like fascists anyways.

    • @[email protected]
      1 year ago

      Add in an extra twist: hopefully, if the sickos are at least happy with AI stuff, they won’t need the “real” thing.

      Sadly, a lot of it does evolve from wanting to “watch” to wanting to do.

      • @[email protected]
        1 year ago

        Sadly, a lot of it does evolve from wanting to “watch” to wanting to do.

        This is the part where I disagree, and I would love for people to prove me wrong, because whether this is true or false will probably be the deciding factor in allowing or restricting “artificial CSAM”.

      • @[email protected]
        1 year ago

        Sadly, a lot of it does evolve from wanting to “watch” to wanting to do.

        Have you got a source for this?

        • JohnEdwa
          1 year ago

          Some people are sadists and rapists, yes, regardless of what age group they’d want to do it with.