• Corkyskog@sh.itjust.works

    It’s not a difficult test. If a person can’t reasonably distinguish it from an actual child, then it’s CSAM.

    • Phoenixz@lemmy.ca

      Just to play devil’s advocate:

      What about hentai where little girls get fondled by tentacles? (Please, please, please don’t let this become my most upvoted post.)

      • bitfucker@programming.dev

        Yeah, no. The commenter said an actual child, not a cartoon one. That is a different discussion entirely, and a good one too, because artwork is part of freedom of expression. Artwork CAN be made without hurting or abusing anyone. We know that humans have the creative capability to come up with something without that something ever having existed, which implies that a human can come up with CSAM without ever having seen CSAM.

        • Phoenixz@lemmy.ca

          Yeah, but then it gets very messy and complicated fast. What about photo-realistic AI pornography of minors? When and where do you draw the line?

    • bitfucker@programming.dev

      What he probably means is that for a “photo”, an actual act of photography must be performed, while “artwork” can be fully digital. Legal definition aside, the two acts are indeed different even if the resulting “images” are bit-for-bit identical: a computer can output something akin to a photograph even though no actual act of photography took place. I say “legal definition aside” because I know the legal definition only looks at the resulting image. I’m just trying to convey the commenter’s point more clearly.

      Edit to clarify a few things.

    • Madison420@lemmy.world

      This would also outlaw “teen” porn, since its performers are explicitly trying to look more childlike, as well as models who merely appear to be minors.

      I get why people think it’s a good thing, but all censorship has to be narrowly tailored to specific content, lest it be too vague or overly broad.

      • Corkyskog@sh.itjust.works

        And nothing was lost…

        But in all seriousness, as you said, they are models who are in the industry, verified, etc. It’s not impossible to maintain a whitelist of actors, and if anything there should be more scrutiny on the unknown “actresses” portraying teenagers…

        • Madison420@lemmy.world

          Except jobs, dude. You may not like their work, but it’s work. That law ignores verified age, and that’s not an insignificant part of my point…