• 0x0001@sh.itjust.works · 4 months ago

    One thing to consider: if this turned out to be accepted, it would make it much harder to prosecute actual CSAM, since people could claim “AI generated” for real images.

    • theherk@lemmy.world · 4 months ago

      I get this position, truly, but I struggle to reconcile it with the feeling that artwork of something and photos of it aren’t equal. In a binary way they are, but with more precision they’re pretty far apart. But I’m not arguing against it; I’m just not super clear how I feel about it yet.

      • Madison420@lemmy.world · 4 months ago

        So long as the generation doesn’t use actual minors as model examples, there’s nothing technically illegal about having sexual material of what appears to be a child. There would then be a mens rea question and a content question: what actually defines, in a visual sense, a child? Could those same criteria equally describe a person of smaller stature? And finally, could someone like Tiny Texie be charged with producing CSAM, since by all appearances, out of context, she looks to be a child?

          • Madison420@lemmy.world · 4 months ago

            The real images wouldn’t have to be CSAM, just images of children; the sexual component could theoretically be trained on legal sexual content, and the AI would connect the dots.

        • Fungah@lemmy.world · 4 months ago

          It is illegal in Canada to have sexual depictions of a child, whether it’s a real image or you’ve just sat down and drawn it yourself. The rationale is that the behavior escalates, and looking at images leads to wanting more.

          It borders on thought crime, which I feel kind of iffy about, but only pedophiles suffer, which I feel great about. There’s no legitimate reason to have sexualized images of a child, whether computer generated, hand drawn, or whatever.

          • Madison420@lemmy.world · 4 months ago

            This article isn’t about Canada, homeboy.

            Also, that theory is not provable and never will be. A morality crime is a thought crime, and thought crime is horseshit. We criminalize criminal acts, not criminal thoughts.

            Similarly, you didn’t actually offer a counterpoint to any of my points.

      • Corkyskog@sh.itjust.works · 4 months ago

        It’s not a difficult test. If a person can’t reasonably distinguish it from an actual child, then it’s CSAM.

        • phoenixz@lemmy.ca · 4 months ago

          Just to play devil’s advocate:

          What about hentai where little girls get fondled by tentacles? (Please please please don’t make this be my most up voted post)

          • bitfucker@programming.dev · 4 months ago

            Yeah, no. The commenter said an actual child, not a cartoon one. That’s a different discussion entirely, and a good one too, because artwork is a part of freedom of expression. Artwork CAN be made without hurting or abusing anyone. We know full well that humans have the creative capability to come up with something without that thing existing beforehand, which implies that humans can come up with CSAM without ever having seen CSAM.

            • phoenixz@lemmy.ca · 4 months ago

              Yeah, but then it gets very messy and complicated fast. What about photo-perfect AI pornography of minors? When and where do you draw the line?

        • bitfucker@programming.dev · 4 months ago

          What he probably means is that for a “photo,” an actual act of photography must be performed, while “artwork” can be fully digital. Now, legal definition aside, the two acts are indeed different even if the resulting “image” is a bit-for-bit equivalent: a computer could output something akin to a photograph even though no actual act of photography has taken place. I set the legal definition aside because I know the legal definition only looks at the resulting image. Just trying to convey the commenter’s words better.

          Edit to clarify a few things.

        • Madison420@lemmy.world · 4 months ago

          This would also outlaw “teen” porn, since those performers are explicitly trying to look more childlike, as well as models who merely appear to be minors.

          I get why people think it’s a good thing, but all censorship has to be narrowly tailored to content, lest it be too vague or overly broad.

          • Corkyskog@sh.itjust.works · 4 months ago

            And nothing was lost…

            But in all seriousness, as you said, they are models who are in the industry, verified, etc. It’s not impossible to have a whitelist of actors, and if anything there should be more scrutiny on the unknown “actresses” portraying teenagers…

            • Madison420@lemmy.world · 4 months ago

              Except jobs, dude. You may not like their work, but it’s work. That law ignores verified age, and that’s not an insignificant part of my point…