The Protect Act must also apply to AI-generated child pornography
By Straight Arrow News
The U.S. Department of Justice recently announced that it charged 42-year-old Steven Anderegg with producing, distributing and possessing AI-generated images of minors engaged in sexually explicit activity. This raises the question of whether the Protect Act, which targets perpetrators of child pornography, might apply to AI-generated child pornography in the same way it does to real child pornography.
Watch the above video as Straight Arrow News contributor Adrienne Lawrence argues that there should be no distinction between the two in terms of prosecution. Lawrence says that the government must implement legal safeguards before law enforcement becomes completely overwhelmed.
The following is an excerpt from the video above:
Also, remember that AI-generated images aren’t just pulled out of thin air; rather, they are the product of technology curating and repurposing large datasets of real images. And generating content depicting minors still relies on scraped data that includes real sexual abuse material. So these fake images capitalize off real harm.
The degree of depravity required with sexualizing children is filth. The government should have free rein to prosecute and punish individuals furthering such depravity, particularly now that AI technology is everywhere. And many of those abusing it are operating in the open.
In September, for example, [Steven] Anderegg allegedly posted on Instagram, in his stories, a realistic AI-generated image of a minor wearing bondage-themed leather clothes, and wrote a message encouraging others to come check out what they’re missing on the messaging app Telegram.
As the National Center for Missing and Exploited Children told The Washington Post last month: “Law enforcement is going to be overwhelmed pretty soon with AI-generated images of children. Given the advancements of technology, authorities will struggle to distinguish between what’s real and what’s AI, whether an image depicts a real child who may need help versus one that is a twisted figment of a computer’s imagination.”
These authorities shouldn’t have to struggle when it comes to prosecuting defendants. Real or fake, child pornography must be prosecuted the same.
With the expansion of technology comes the expansion of the criminal mind. Artificial intelligence is just the latest technological advancement that criminals are looking to exploit and prosecutors are looking to prosecute. The U.S. Department of Justice recently announced that it charged a 42-year-old man in Wisconsin with producing, distributing and possessing images of minors engaged in sexually explicit activity. Now this charge, which is unquestionably abhorrent, may not seem particularly uncommon on its face. But what makes this case novel is that the images weren’t real. Steven Anderegg is accused of using AI image-making software to create over 13,000 highly photorealistic sexual depictions of children. But again, none of these images were real.

The synthetic nature of the images brings this case within the purview of a largely untested area of the law, particularly as it concerns whether the Protect Act can be used to punish people for AI-generated images. The Protect Act is a preeminent federal law that punishes people for creating, owning or sharing child pornography. The act criminalizes any kind of visual depiction of a minor engaging in sexually explicit conduct in a graphic or obscene manner. And I’m grateful that this law exists, that our government prohibits such disgusting depictions of children. But I also know that the constitutionality of this law hasn’t necessarily been battle-tested as it concerns AI-generated images where minors are not actually involved. Said another way, there’s room for defendants to argue that they deserve a pass here because the images did not involve actual children.

Now that he’s facing at least seven years in prison, Mr. Anderegg will likely argue just that. He may liken his images to the type of pornography that involves adults who are dressed to look like children, with the pigtails and the schoolgirl outfits and so on.
He may even claim that AI-generated images are better for society because they don’t involve abusing actual children. Well, those arguments are trash. There should be no question that child pornography created by AI must be treated the same in court as real child pornography. Our laws prohibit sexually explicit images of children at the state and federal level for good reason. Such depictions of children are dehumanizing abuse of our most vulnerable members of society. Such abuse should never be tolerated under any circumstances, whether the child is alive, no longer with us, or a creation of artificial intelligence, because real or fake, sexual depictions of children violate the dignity of all children. And the proliferation of this form of child pornography fuels the normalization of the sexualization of children. That desensitization is devastating for our society in the long run.

Also, remember that AI-generated images aren’t just pulled out of thin air; rather, they are the product of technology curating and repurposing large datasets of real images. And generating content depicting minors still relies on scraped data that includes real sexual abuse material. So these fake images capitalize off real harm.

The degree of depravity required with sexualizing children is filth. The government should have free rein to prosecute and punish individuals furthering such depravity, particularly now that AI technology is everywhere. And many of those abusing it are operating in the open. In September, for example, Anderegg allegedly posted on Instagram, in his stories, a realistic AI-generated image of a minor wearing bondage-themed leather clothes, and wrote a message encouraging others to come check out what they’re missing on the messaging app Telegram. As the National Center for Missing and Exploited Children told The Washington Post last month, law enforcement is going to be overwhelmed pretty soon with AI-generated images of children.
Given the advancements of technology, authorities will struggle to distinguish between what’s real and what’s AI, whether an image depicts a real child who may need help versus one that is a twisted figment of a computer’s imagination. These authorities shouldn’t have to struggle when it comes to prosecuting defendants. Real or fake, child pornography must be prosecuted the same.