A South Korean man has been sentenced to jail for using artificial intelligence to generate exploitative images of children, the first case of its kind in the country as courts around the world encounter the use of new technologies in creating abusive sexual content.
I'd go more in the direction of state sponsored generation and controlled access.
If you want legal unlimited access to AI generated CSM, you need to register with the state for it and in so doing also close off access to positions that would put you in situations where you'd be more able to act on it (i.e. employment in schools, child hospitals, church youth leadership, etc).
If doing that, and no children are harmed in the production of the AI generated CSM, then you have a license to view and possess (but not redistribute) the images registered with the system.
But if you don't have that license (i.e. didn't register as sexually interested in children) and possess them, or are found to be distributing them, then you face the full force of the law.
I think this idea rests on the false premise that people both need and have a right to pornography.
Many adults go about their lives without accessing it/getting off on it. It's not a human need like food or shelter. So government isn't going to become a supplier. Parallels could be made, I suppose, with safe injecting rooms and methadone clinics etc - but that's a medical/health service that protects both the individual and the community. I don't think the same argument could be made for a government sponsored porn bank.
You don't think there's an argument to be made that motivating people sexually attracted to children to self-report that attraction to the state in order to be monitored and kept away from children would have a social good?
I guess I just don't really see eye to eye with you on that then.
That component I don't have an issue with at all, actually. But providing government-sanctioned AI porn? Unlikely.