
Warning for parents: AI fake nudes trend is targeting girls


Parents of teens and tweens may worry that their child will someday be coerced into sending a nude photo, or will take that risk on their own. With the recent disturbing trend of AI-generated fake nudes, however, the risks of exploitation multiply.

Now, you don't even have to take a nude photo to discover very realistic fake ("deepfake") nude photos, or even convincing pornographic videos, of yourself circulating on the internet. That is especially true if you are a woman or a girl.

AI fake nudes used as pornography are a fast-growing problem. Genevieve Oh, an AI industry analyst, recently told the Washington Post that on the top 10 websites that host AI-generated porn photos, the number of fake nudes has increased by more than 290 percent since 2018.

This nightmare of a trend is “now very much targeting girls…young girls and women who aren’t in the public eye,” warned Sophie Maddocks, a researcher and digital rights advocate at the University of Pennsylvania.

What are “deepfake” images and videos, and why are they a problem?

Many social media users are accustomed to seeing fake or altered images, which are often created just for fun, such as "face swapping" or "age yourself" apps that let friends share what they would look like in another form. Most people can tell that these types of images are doctored.

"Deepfake" images and videos, however, are created using deep learning: an algorithm is fed real-world examples and learns to produce new images or videos that resemble them. As AI technology advances, deepfakes are becoming so realistic that most people struggle to tell what's real and what isn't.

When real people’s images or videos are used as vehicles to create deepfakes (especially without their consent), an erosion of trust is bound to happen. Even climate activist Greta Thunberg was deepfaked recently — that particular video was labeled “satire,” but not every creator is as transparent.

Due to the rise of cheap AI technology such as "nudifier" apps, fake nudes can be created quickly and easily. As a result, sexual exploitation abounds in the world of AI fakes. A study by Sensity AI, a company that monitors deepfakes, found that 96 percent of deepfakes circulating online are pornographic, and 99 percent of those depict women.

How fake nudes impact teens’ mental health and safety

AI fake nude images can be used to bully, blackmail, or exploit young women, sometimes under the guise of personal entertainment. On pornography forums, users have been encouraged to share pictures of their "crush" so that others can generate fake nude images of them.

Just as real nude photos can, fake nude photos shared online can effectively destroy someone's social life, academic standing, or career path. Many forms of cyberbullying and online exploitation leave victims feeling humiliated, isolated, and powerless, increasing the risk of anxiety, depression, insomnia, self-harm, substance abuse, suicide, and more. If kids are too embarrassed or afraid to get help, the effects are likely to be worse.

What parents can do

Policies are evolving, but they have yet to catch up with all of the AI technology and social media trends affecting kids and families. While exploitation can happen to anyone, awareness and discussion can help your child stay on the safer side.

Above all, commit to having open, nonjudgmental conversations with kids about these issues so that they won't be afraid to come to you with any problems, mistakes, or questions in the future.

Joanna Eng is a staff writer and digital content specialist at ParentsTogether. She lives with her wife and two kids in New York, where she loves to hike, try new foods, and check out way too many books from the library.