Opinion

Deepfake nudes show the danger of AI technology

Adrienne Lawrence Legal analyst, law professor & award-winning author

The majority of Americans view artificial intelligence and tools like ChatGPT as a threat to the future of humanity, according to a recent poll. AI is also being used by some to harass women online.

Straight Arrow News contributor Adrienne Lawrence says one TikTok user’s harrowing story about deepfake nude images should serve as a warning about how dangerous AI technology can be in the wrong hands.


I recently watched a young woman on TikTok weeping as she explained that someone had used artificial intelligence to create nudes of her, transposing her head onto another person's body in a very realistic way. Not only did they make these deepfakes for their own entertainment, but also to share with other people in her circle. This young woman seemed devastated, and rightly so. As impactful and advantageous as technological advancements can be, we as a society are not prepared for how evil people will leverage AI to harm others, particularly women.

Although artificial intelligence is already widely used in data analytics, digital marketing, supply chain logistics and so on, only now is it making its way into everyday consumer use. We're in the midst of an AI awakening, whether we like it or not. The research may not be out there yet, but I can just about guarantee that when it comes to harmful AI practices, women will predominantly be on the receiving end of the sexual harassment.

Just as with social media and online usage now, women bear the brunt of sex-based shade. According to Pew Research Center, women are at least three times more likely than men to report being sexually harassed online. That's an experience one-third of women under the age of 35 say they have endured. With the growing prevalence of AI, we should not be shocked to hear that people are using the technology to generate convincingly fake nudes of women in their social and social media circles. It was really only a matter of time.

Back in 2019, the app DeepNude would generate deepfake nudes, but they weren't necessarily photorealistic. Today's AI is a whole different world, making it much more difficult to spot a fake. We recently saw firsthand how convincing AI-generated photos can be when millions of people worldwide were duped by a viral image of the Pope in a puffer jacket. If most folks can't figure out that an 86-year-old Pope Francis wouldn't have that much drip, I don't think they're going to be able to accurately assess whether the purported nude of Savannah in the sales department is legit.

The technology has gotten better, and the bad guys have stayed pretty bad. Creating deepfake nudes of former or aspiring lovers, rivals or strangers can have a lasting impact on someone's life. It's shame-inducing, particularly when you didn't consent. We live in a puritanical society that sexualizes the human form and shames nudity. How does it affect a woman's mental health to know she will have to answer to loved ones and colleagues for sexual photos of her that she never authorized or consented to? How will it impact her economic viability when it comes to job opportunities? With those photos forever existing on the internet in some form, the impact of AI deepfake nudes will be lasting, and they'll only become more common.

Lawmakers need to step up by passing laws that expand the criminalization of AI-generated nudes, enhance the civil penalties available to victims and create more robust restoration options for victims seeking to scrub pictures of themselves from the cyber world. Lawmakers also need to allocate greater resources to trained law enforcement authorities who can devote specialized talent and time to prosecuting unauthorized deepfake creation.

We need to treat this crime like the crime that it is. 


AI can be a godsend. When there’s a clear task and an automated process, it’s wonderful. But society has a duty to protect people from what AI can also be, which is a nightmare.
