- cross-posted to:
- [email protected]
AI-generated child sex imagery has every US attorney general calling for action::“A race against time to protect the children of our country from the dangers of AI.”
I’m not the guy you’re replying to, but I will say this is a topic that will never reach a good consensus, because two moral principles are at play which, under normal circumstances, are completely uncontroversial. In this context, however, they collide:
1. Pornography depicting underage persons is reprehensible and should not exist
2. The production and related abuse of children should absolutely be stopped
To allow AI child porn is to say that, to some extent, we permit the material to exist, even if it depicts an approximation of a person, real or not, but with the potential gain of undercutting the industry producing the real thing. To make it illegal is to agree with the consensus that it shouldn’t exist, but it maintains the status quo on issue #2 and, in theory, causes more real children to be harmed.
Of course, the argument goes much deeper than that. If you try to dig into it, you end up in recursive branches that lead in both directions. I’m not trying to dive into that rabbit hole here; I simply wanted to illustrate the moral dilemma.
So should we ban books like Lolita, since they can be interpreted as porn, or is it only visual material that should be banned? If books are okay, is an image of stick figures with a sign reading “child” okay? How much detail can a visual image have before it gets banned?
How about 1000 year old dragons in a child’s body? How about images of porn stars with very petite bodies?