The men believed what they saw online: a website filled with explicit AI-generated images of her face on someone else’s body, alongside her real home address. The site claimed her home was a brothel.

Sweet Anita, one of Twitch’s most recognizable streamers, understood the risks of being a woman online and took precautions. She hid her real name, set boundaries, and hired a team. None of that stopped someone from deepfaking her.

Nothing prepared her for what came next.

Sweet Anita’s story

An anonymous man commissioned AI-generated pornography of her, then built a fake website framing her as a sex worker — and listed her home address. The entire setup was designed to lure men to her home with the false expectation they could force her into sex. “After he did that, I started getting men coming to my house, especially when I would livestream and I was away from home,” she said. “I don’t feel safe in my own home.”

Strangers with duffel bags tried to break in. She was afraid to get mail from the postman. She had to stop answering her door and leaving the house unattended. She has moved before, but they keep finding her.

And the abuse wasn’t limited to her front step. Her late mother, proud of her daughter’s work, would check Anita’s social media and be forced to scroll past explicit, AI-generated images. Friends and coworkers saw them too. “I could delete everything off of YouTube, Twitch, all my socials — and there would still be loads of images of my face in videos that could be used as a reference,” she said.

Deepfakes are the new bullying

Deepfakes use AI to generate hyper-realistic photos and videos of people without their consent. They are often created for sexual abuse, harassment, or intimidation. Most of the victims are women and girls.

According to research by Girlguiding, more than one in four teenagers say they’ve seen an explicit deepfake of someone they know — a classmate, a teacher, a friend, or even themselves. CNN reports that 40% of students and 29% of teachers said they were aware of a deepfake of someone connected to their school in the past year, and 15% of students said they had seen explicit versions. Most schools don’t have policies for dealing with this form of abuse.

The harms are severe. Victims can suffer anxiety, depression, and even PTSD. They may withdraw socially, lose friends, or avoid applying for jobs or colleges because explicit fakes could surface during a search. Some are even forced to spend money trying to scrub content from the internet.

The World Economic Forum notes that while deepfakes didn’t destabilize elections as many feared, they have become a tool for harassment and scams. Nonconsensual pornography is the most extreme form, but attackers also use cloned voices and fake video calls to commit fraud. In 2024, a finance worker in Hong Kong was conned into paying $25 million after joining a Zoom call in which every other participant — including the CFO — was a deepfake.

“For the rest of my life, I don’t see a way I could ever personally avoid this,” Anita said. “And I don’t think I should be responsible for that either. It’s not my issue, and it’s not my fault.”

This could happen to anyone

It only takes a handful of images. A public Instagram. A school photo. A shared album. Once a face is online, it can be copied, scraped, and misused.

According to Professor Carsten Maple from the University of Warwick’s Cyber Security Centre, with today’s AI tools, as few as 20 photos are enough to create a realistic profile of someone, or even a 30-second video of them, expanding the scope of potential dangers. It’s a fact that 53% of recently surveyed parents did not know.

The New York Times reported on the rise of so-called “nudifier apps” that can strip clothes from photos using AI. These apps are cheap, easy to use, and widely available. Investigators estimate the industry brings in around $36 million a year. Despite a new US law making it illegal to post nonconsensual fake nudes, the apps themselves remain legal. “Any kid with access to the internet can both be a victim or a perpetrator,” Alexios Mantzarlis, whose team investigated 85 such sites, told the New York Times.

Private accounts are not a guarantee of safety. Abusers are often people the victim knows. Even seemingly harmless posts, such as birthday parties, can reveal details that fuel identity theft, which affected an estimated 1.1 million children in 2024.

“People who [make deepfakes] often forget that it’s creepy, that it’s inappropriate — like, if their co-workers and friends knew, they’d get fired and ostracized. They’d lose everyone they know, their parents would be disappointed in them, right?” Anita said. “But, when you’re online and you’re making this material for each other with a bunch of like-minded creeps, then it’s just a hobby and a pastime to them. They completely get to sidestep and forget and be deluded about the impact it has on people’s lives.”

The deepfake risk goes far beyond social media

Today, the biggest risk may come from what gets shared on social media. But what about all the photos you don’t share at all?

Google, Apple, Amazon, Meta, and others all offer cloud storage for your photos, sometimes free of charge. But that convenience comes at a cost. Google has been using your photos to train AI since at least 2015, when it was forced to apologize after its photo service classified a Black couple as “gorillas.” Since then, all the major storage providers have gotten much better at facial recognition. How do you think that happened? More photos, constantly feeding the AI.

The billions of photos and videos being stored with Big Tech are a vast trove of material that could one day be fodder for generating incredibly detailed and lifelike deepfakes.

All it takes is a quiet change in the Terms of Service.

Protect your memories like you protect your passwords

You can’t control what others do with your image. But you can control where your images live and who has access to them. Proton Drive gives you a private, encrypted way to store the photos and files that matter most.

  • End-to-end encryption: Your files are encrypted before they ever leave your device. Unlike Big Tech, not even Proton can see them.
  • Private by default: Your data is never sold, shared, or used to train AI.
  • Secure sharing: Add passwords, set expiration dates, or revoke access anytime.
  • Automatic backup: Even if your phone is lost or stolen, your memories stay safe.

With Proton Drive, your data stays yours — always.