Adolescence and the internet we handed to kids

Netflix’s recent hit show, Adolescence, struck a nerve, and for good reason. Inspired by true events, it shows how easily a teenager can spiral when outrage, shame, and silence collide — especially in a digital world designed to keep children scrolling, reacting, and comparing rather than protecting their online privacy.

The internet was never designed for kids; it was designed for growth, speed, and engagement. Those priorities haven’t changed. While parents worry about screen time and schools barely scratch the surface with digital literacy, social platforms quietly shape how kids think, feel, and see the world, often invisibly and almost always without their informed consent.

In this article, we examine the systems shaping children’s digital lives, what needs to change, and how to make sure those changes actually happen.

What is Adolescence about, and why does it matter?

Adolescence is a four-part British drama that follows the police investigation of a violent act involving a 13-year-old boy. Told through unbroken, single-take episodes, it immerses viewers in tense conversations between teenagers, parents, psychologists, and police — leaving little room to look away.

It also captures the quiet devastation of parents realizing they weren’t equipped to see it coming, let alone stop it. Its intent is clear: to examine how easily kids can be pulled into dark emotional spirals when no one around them understands what they’re going through. The result is unsettling, intimate, and deeply believable.

How platforms track, manipulate, and radicalize

Adolescence shows how easy it is for kids to get caught in systems that influence them without realizing it, and how this influence takes hold long before anything goes wrong. The show never depicts the algorithm directly, but it hints at how it works: Jamie’s father casually mentions how a simple search for gym tips led him into manosphere content. It’s a subtle example of how algorithms don’t radicalize in a single leap, but through many emotional nudges that redirect curiosity toward something darker.

Even though laws like COPPA (Children’s Online Privacy Protection Act) in the US are supposed to protect kids under 13, many platforms still track location, device data, browsing habits, and engagement, often without parental consent or by skirting regulation.

Once the algorithm collects enough data, it shows content that triggers strong emotions like excitement, fear, and outrage. This keeps kids glued to screens, shortens their attention spans, and makes them more impulsive. It also floods them with narrow ideas of beauty, success, and popularity, shaping how they see themselves and the world. Over time, kids are pushed toward products, trends, radical ideas, and harmful content without realizing it.
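To make that incentive concrete, here is a deliberately simplified Python sketch of an engagement-driven ranker. Every name, signal, and weight below is invented for illustration; real recommendation systems are far more complex, but they optimize for similar reaction-heavy signals.

```python
from dataclasses import dataclass

@dataclass
class Post:
    topic: str
    watch_time: float       # average seconds watched (hypothetical signal)
    outrage_reactions: int  # angry or shocked reactions (hypothetical signal)
    shares: int

def engagement_score(post: Post) -> float:
    """Toy ranking function: content that provokes strong reactions and
    keeps viewers watching scores higher. The weights are invented, but
    they capture the incentive: outrage is worth more than calm."""
    return 1.0 * post.watch_time + 3.0 * post.outrage_reactions + 2.0 * post.shares

feed = [
    Post("calm explainer", watch_time=40, outrage_reactions=2, shares=5),
    Post("outrage bait", watch_time=25, outrage_reactions=80, shares=60),
]

# The most provocative post ends up at the top of the feed,
# regardless of whether it is good for the viewer.
feed.sort(key=engagement_score, reverse=True)
print([p.topic for p in feed])  # ['outrage bait', 'calm explainer']
```

Because strong reactions carry the most weight, the provocative post tops the feed even though the calm explainer holds attention longer per view. Repeated thousands of times, that is the nudge the show hints at.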

The limits of parental control

Adolescence also illustrates how out of touch adults can be — not out of neglect, but because they simply didn’t know what to look for. Jamie’s parents believed he was safe in his room on his computer, but they had no idea what he was watching or how isolated he had become.

Parental control tools like screen time limits, content filters, monitoring apps, and device restrictions are helpful, but they alone cannot prevent kids from encountering disturbing content online. Even if your child’s personal device is locked down, friends’ devices or school computers might be less protected. Besides, kids are wired to explore and discover, and peer pressure is powerful. If your child’s friends use certain apps or visit risky sites, they may feel left out and motivated to bypass restrictions.

Harm can also come through “safe-looking” content like toxic influencers, subtle cyberbullying, or extremist ideas disguised as memes — the kind of threats that often slip through platform filters focused only on explicit dangers. Adolescence captures this disconnect by showing adults failing to recognize the “secret language” of teens — like emojis used to humiliate and provoke — even when it’s happening in plain sight on public Instagram posts.

To stay safe, kids need internal skills such as critical thinking, emotional resilience, and the confidence to say no — things that tech tools cannot provide. Ongoing communication, trust, and digital education are far more effective than relying on parental controls alone.

What needs to change

The current social media system works exactly as it was designed to — to capture attention, extract data, and maximize engagement. Protecting the next generation will take systemic change. Here are four areas where change needs to happen:

Design defaults that put people first

Most major platforms are systems built for engagement. Features like infinite scroll, autoplay, and endless notifications are deliberate design choices to keep you hooked at the cost of your time, attention, and mental health.

New engagement-driven features are often introduced by default, hidden inside terms and conditions, and treated as if a quick click to “accept” counts as meaningful consent. But kids aren’t in a position to give meaningful consent, and parents often aren’t even aware of what their children are agreeing to.

A platform that truly respects its community would leave emotional targeting features off by default — and give both kids and parents a clear, honest choice to opt in. Small steps, like teen accounts with private default settings, are a start.
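As a sketch of what those defaults could look like in code, here is a hypothetical teen-account settings model (all feature names are invented, not any platform’s actual API): every invasive feature starts disabled, and enabling one requires both an explicit request and parental approval.

```python
from dataclasses import dataclass

@dataclass
class TeenAccountSettings:
    """Hypothetical teen-account defaults for a platform that puts people
    first: every invasive feature starts off and stays off until there is
    an explicit, informed opt-in."""
    profile_public: bool = False
    personalized_ads: bool = False
    emotional_targeting: bool = False
    autoplay: bool = False
    infinite_scroll: bool = False

def opt_in(settings: TeenAccountSettings, feature: str, parent_approved: bool) -> None:
    """Enable a feature only when the teen asks for it AND a parent has
    explicitly approved it; silence is never treated as consent."""
    if not hasattr(settings, feature):
        raise ValueError(f"Unknown feature: {feature}")
    if not parent_approved:
        raise PermissionError(f"'{feature}' requires parental approval")
    setattr(settings, feature, True)

settings = TeenAccountSettings()  # safe by default, no hidden toggles
opt_in(settings, "autoplay", parent_approved=True)
print(settings.autoplay)  # True, and only after explicit approval
```

The design choice is the point: consent is an action someone takes, not a default someone forgets to change.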

Better moderation tools with real accountability

Content moderation today is reactive, inconsistent, and often fails to protect people. Social media platforms have missed harmful content, allowed it to spread, and stepped back from fact-checking.

Harmful ideas that subtly promote violence should be taken just as seriously as explicit material. Platforms that let these communities grow should face more than bad press and fines small enough to pay and forget: real, enforceable consequences that make it harder to ignore harm and profit from it.

Algorithmic transparency that can be verified

Recommendation systems control what people see and how they experience the internet, but most platforms still treat their inner workings as trade secrets.

For instance, TikTok offers basic procedural transparency (explaining in broad terms that engagement shapes recommendations), but it stops short of real openness — the actual source code is not public, independent audits are rare or nonexistent, and people have no meaningful way to challenge how the system operates.

True algorithmic transparency would make it possible to hold platforms accountable for the environments they create, especially when they affect a vulnerable audience like adolescents.
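One way to picture verifiable transparency: a recommender could emit a machine-readable explanation for every ranking decision, which independent auditors could check against the platform’s published weights. The record format below is a hypothetical Python illustration, not any platform’s actual output.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class RankingExplanation:
    """A machine-readable record of why a post was recommended. The format
    is invented for illustration: publishing records like this, along with
    the weights behind them, would let independent auditors verify what a
    recommender actually optimizes for."""
    post_id: str
    signals: dict   # signal name -> observed value
    weights: dict   # signal name -> weight applied
    final_score: float

def rank_with_explanation(post_id: str, signals: dict, weights: dict) -> RankingExplanation:
    # A transparent ranker returns its reasoning alongside its score
    score = sum(weights[name] * value for name, value in signals.items())
    return RankingExplanation(post_id, signals, weights, score)

record = rank_with_explanation(
    "post_123",
    signals={"watch_time": 25.0, "outrage_reactions": 80.0},
    weights={"watch_time": 1.0, "outrage_reactions": 3.0},
)
print(json.dumps(asdict(record), indent=2))  # one auditable log entry
```

Regulators or researchers could sample records like these at scale to test whether a platform’s stated policies match what its ranker actually rewards.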

Open-source model for digital education

Digital education should borrow from open-source software’s approach, which prioritizes collaboration, transparency, and constant evolution. This is the only way concerned parents and policymakers can keep up with billion-dollar platforms.

Here’s what all of us can do:

  • Governments should introduce digital safety laws like the EU’s Digital Services Act, with clear requirements for protecting young users; set age-appropriate design standards; and fund nationwide digital literacy programs.
  • Schools should teach students how algorithms push content that triggers strong emotional reactions, how outrage is monetized, and how to question what they see online so they can make informed decisions.
  • Parents should be given real tools and support to talk openly with their kids about how social media platforms are designed to be addictive, not just limit screen time or block apps.
  • Tech companies committed to doing better should design for privacy by default and support real digital literacy efforts instead of leaving parents and children to figure it out alone.

At Proton, we follow that principle by building open-source, independently audited apps that protect your privacy by default. Our products don’t rely on ads or surveillance — and they’re designed to give people, not algorithms, control over their digital lives.
