
Deepfake abuse used to generate nude photos of kids on the rise


A newly released report by Thorn, a tech company that works to combat the spread of child sex abuse material, shows deepfake abuse is becoming all too common, driven in part by the rise of artificial intelligence.

Thorn said that is because so-called “undressing” apps are widely available and inexpensive, as are other user-friendly, easily accessible, AI-powered tools that can be used to create deepfake nude photographs that look alarmingly believable.


In a recent survey of more than 1,000 students ages 9 to 17, Thorn found 11% had friends or classmates who had used AI to generate deepfake nudes of classmates. Another 10% of students refused to answer the question.

The survey also found that almost 1 in 4 kids (24%) ages 13 to 17 said they had been sent or shown an actual nude photo or video of a classmate or peer without that person’s knowledge. Only 7% of the students surveyed admitted they had personally shared a nude photo or video without the other person’s knowledge.

However, sharing real nudes is a fairly widespread problem among teens. Almost 1 in 3 (31%) teenagers surveyed said it is normal for people their age, though only 17% admitted to sharing nudes of themselves.

Thorn noted this trend coincides with a rise in sextortion among young people. About 6% of those surveyed said someone had blackmailed them by threatening to share their nude photos unless they sent money or did something else the blackmailer asked.


[LAUREN TAYLOR]

THE RISE IN ARTIFICIAL INTELLIGENCE IS BOTH A GIFT AND A CURSE. 

WHILE IT CAN ULTIMATELY BE USED TO MAKE EVERYDAY LIFE A LITTLE SIMPLER… JUST LIKE ANYTHING ELSE SPAWNING FROM THE INTERNET – IT CAN ALSO BE USED TO TERRORIZE PEOPLE.  

A NEWLY RELEASED REPORT BY “THORN” – A TECH COMPANY THAT WORKS TO COMBAT THE SPREAD OF CHILD SEX ABUSE MATERIAL – SHOWS DEEPFAKE ABUSE IS BECOMING ALL TOO COMMON.  

THORN SAYS THAT’S BECAUSE SO-CALLED “UNDRESSING” APPS ARE WIDELY AVAILABLE AND INEXPENSIVE… AS ARE OTHER USER-FRIENDLY A-I POWERED TOOLS THAT CAN BE USED TO CREATE DEEPFAKE VERSIONS OF NUDE PHOTOGRAPHS THAT LOOK ALARMINGLY BELIEVABLE. 

IN A RECENT SURVEY OF MORE THAN A THOUSAND STUDENTS AGES 9 TO 17 YEARS OLD – THORN FOUND 11 PERCENT OF THEM HAD FRIENDS OR CLASSMATES WHO’D USED A-I TO GENERATE DEEPFAKE NUDES OF CLASSMATES… 

WHILE ANOTHER 10 PERCENT REFUSED TO ANSWER THE QUESTION. 

THE SURVEY ALSO FOUND THAT ALMOST 1 IN 4 KIDS 13 TO 17 YEARS OLD SAID THEY HAD BEEN SENT OR SHOWN AN *ACTUAL* NUDE PHOTO OR VIDEO OF A CLASSMATE OR PEER WITHOUT THAT PERSON’S KNOWLEDGE. 

AND – PERHAPS UNSURPRISINGLY – ONLY 7 PERCENT OF THE STUDENTS SURVEYED ADMITTED THAT THEY HAD PERSONALLY SHARED A NUDE PHOTO OR VIDEO WITHOUT THE OTHER PERSON’S KNOWLEDGE. 

HOWEVER – SHARING *REAL* NUDES IS A PRETTY BIG PROBLEM AMONG TEENS… WITH ALMOST 1 IN 3 (31%) SAYING THAT’S NORMAL FOR PEOPLE THEIR AGE… 

BUT ONLY 17 PERCENT OF THEM ADMITTED TO SHARING NUDES OF THEMSELVES. 

THORN NOTES THIS TREND COINCIDES WITH A RISE IN SEXTORTION AMONG YOUNG PEOPLE… WITH 6 PERCENT OF THOSE SURVEYED SAYING SOMEONE HAD BLACKMAILED THEM – THREATENING TO SHARE THEIR NUDE PHOTOS UNLESS THEY SENT MONEY OR DID SOMETHING ELSE THE BLACKMAILER ASKED.