Children’s Privacy Online: Who’s Actually Protecting It?

[Image: A young child illuminated by a tablet screen, surrounded by ghostly icons of data and social media in a dark room]

Published in: Regulation Watch + Ethics

Estimated reading time: 6 minutes


Introduction

From smart toys and educational apps to YouTube and social media, children today are more connected than ever. But in the rush to digitize everything—classrooms, games, entertainment—one crucial question often goes unanswered: Who’s actually protecting kids’ privacy online?

Despite laws like the Children’s Online Privacy Protection Act (COPPA), many platforms still collect, track, and monetize kids’ data in ways that are anything but child-friendly. In this post, we examine the legal landscape, enforcement challenges, and ethical dilemmas at the heart of children’s digital privacy.


What the Law Says (And Doesn’t Say)

The United States’ main federal law protecting children’s privacy is COPPA, enacted in 1998. It requires websites and apps to do three things (see the sketch after this list):

  • Obtain verifiable parental consent before collecting data from children under 13
  • Provide a clear privacy policy
  • Allow parents to review and delete collected information
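
To make the first requirement concrete, here is a minimal sketch of what a COPPA-style consent gate could look like in application code. Everything in it is illustrative: the types, the `mayCollectData` helper, and the consent flag are assumptions for this example, not any platform’s real API.

```typescript
// Hypothetical sketch of a COPPA-style consent gate.
// All names here are illustrative, not a real platform API.

interface UserProfile {
  userId: string;
  age: number; // self-declared or verified age in years
}

interface ConsentRecord {
  verifiedParentalConsent: boolean; // e.g., confirmed via signed form or payment check
  grantedAt?: Date;
}

const COPPA_AGE_THRESHOLD = 13;

// Returns true only if data collection is permissible for this user.
function mayCollectData(user: UserProfile, consent: ConsentRecord | null): boolean {
  // Users 13 and over fall outside COPPA's scope (other laws may still apply).
  if (user.age >= COPPA_AGE_THRESHOLD) {
    return true;
  }
  // Under 13: collection requires verifiable parental consent on file.
  return consent?.verifiedParentalConsent === true;
}

// Analytics events are dropped, not queued, when consent is absent.
function trackEvent(user: UserProfile, consent: ConsentRecord | null, eventName: string): void {
  if (!mayCollectData(user, consent)) {
    return; // default to "no collection"
  }
  console.log(`recording "${eventName}" for user ${user.userId}`);
}
```

The design point worth noticing: the default answer is “no collection” until verified consent actually exists on file.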

But here’s the catch: COPPA was written before smartphones, TikTok, AI algorithms, and YouTube Kids. Its protections end at age 13. Enforcement is slow. And kids over 13? They’re treated like adults under most U.S. privacy laws.

Other countries have taken a more expansive approach:

  • The U.K. introduced the Age-Appropriate Design Code
  • Europe’s GDPR includes specific protections for minors, though the age of digital consent varies by member state, from 13 to 16
  • California’s CPRA adds safeguards for those under 16

Yet loopholes, vague language, and limited oversight remain common threads.


The Loopholes Platforms Love

Big Tech doesn’t always break the law—but it often bends it:

  • “General audience” apps like YouTube have skirted COPPA by claiming not to knowingly target kids (until the FTC fined Google $170 million in 2019).
  • EdTech platforms frequently collect data in school settings, often without meaningful parental consent or clear data use disclosures.
  • Games and mobile apps embed tracking tools that aren’t transparent, even when labeled “kid-safe.”

In many cases, the burden falls on parents and schools to navigate complex terms, configure privacy settings, and trust platforms to do the right thing.


Ethical Questions We’re Not Asking

Even when laws are followed, ethical red flags remain:

  • Should kids be tracked at all?
  • Is it fair to monetize their behavior for ad revenue or algorithmic optimization?
  • Can children truly consent—or even understand—what they’re agreeing to?
  • Are digital products being designed with children’s rights in mind, or just their engagement?

The issue isn’t just legality—it’s what kind of digital future we’re building for the next generation.


Who’s Really Watching the Watchers?

While regulators have begun to ramp up enforcement, they face limitations:

  • The FTC’s power is reactive, not preventive.
  • Federal privacy legislation remains stalled.
  • States like California are leading the way, but a patchwork of rules creates confusion.

Advocacy groups like Common Sense Media, the Electronic Frontier Foundation, and the Campaign for a Commercial-Free Childhood continue to push for stronger protections, but progress is slow.


A Better Model: Privacy by Design for Kids

Some companies are beginning to adopt privacy-by-design frameworks tailored for children. Best practices include the following (see the sketch after this list):

  • Turning off data collection by default
  • Limiting personalization and ads
  • Providing child-friendly explanations of data use
  • Building opt-in rather than opt-out models
  • Enabling full parental control and review access
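
What do those defaults look like in practice? Below is an illustrative sketch of an opt-in settings model. The `ChildPrivacySettings` type and its field names are hypothetical, invented for this example rather than drawn from any real SDK.

```typescript
// Illustrative privacy-by-design defaults for a children's app.
// The ChildPrivacySettings type is hypothetical, not from any real SDK.

interface ChildPrivacySettings {
  analyticsEnabled: boolean;        // behavioral analytics
  personalizedAdsEnabled: boolean;  // ad targeting and personalization
  locationTrackingEnabled: boolean; // geolocation collection
  parentCanReviewData: boolean;     // parental review-and-delete access
}

// Opt-in model: every collection feature starts OFF.
const DEFAULT_CHILD_SETTINGS: ChildPrivacySettings = {
  analyticsEnabled: false,
  personalizedAdsEnabled: false,
  locationTrackingEnabled: false,
  parentCanReviewData: true, // the one setting that should always be on
};

// A parent's explicit opt-in is the only way a collection flag turns on.
function applyParentalOptIn(
  current: ChildPrivacySettings,
  optIns: Partial<ChildPrivacySettings>
): ChildPrivacySettings {
  return { ...current, ...optIns };
}

// Example: a parent enables analytics but nothing else.
const settings = applyParentalOptIn(DEFAULT_CHILD_SETTINGS, { analyticsEnabled: true });
```

The point of this structure is that a parent’s explicit opt-in is the only path to enabling collection; nothing defaults to on except the parent’s own review access.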

Ultimately, we need a shift from “what can we get away with?” to “what do children deserve?”


Conclusion: Beyond Compliance

Children’s data is not just another asset class. It reflects their thoughts, behaviors, locations, friendships, and future selves. Protecting that data should be a shared obligation—between parents, regulators, educators, and the companies that profit from young users.

Until laws catch up and enforcement gains teeth, we need to ask tough questions, demand transparency, and hold digital platforms accountable—not just for compliance, but for care.


Want more privacy updates like this?

Subscribe to The Privacy Brief for policy analysis, ethics deep dives, and the latest on digital rights.
