In a recent government meeting, officials emphasized the urgent need for legislative changes to address the growing threat of computer-generated child sexual abuse material (CSAM), commonly referred to as deepfakes. The discussions highlighted the psychological dangers posed by such material: it desensitizes offenders and normalizes abusive behaviors, making those offenders more likely to commit actual crimes against children.
One key point raised was the necessity of amending existing laws to allow for the prosecution of deepfake CSAM. Current legal frameworks primarily cover material produced with actual children, a limitation established by the Supreme Court's ruling in Ashcroft v. Free Speech Coalition. However, officials noted that obscene material receives no First Amendment protection and can be outlawed if it meets the three-part obscenity test: it must appeal to a prurient interest in sex, be patently offensive by contemporary community standards, and lack serious literary, artistic, political, or scientific value.
Drawing on Missouri's legal framework, officials proposed that similar statutes could be enacted to criminalize the possession and distribution of obscene material that appears to depict children. This approach would define such material within the broader context of child pornography laws, creating a pathway to prosecute offenders who use computer-generated images.
The meeting underscored a collective recognition of the importance of addressing this issue, with officials expressing confidence that juries could be convinced of the need for stringent laws against this type of material. The proposed changes aim to strengthen protections for children and curb the normalization of the harmful behaviors associated with CSAM.