The Federal Foundation: COPPA
Children's data privacy in the United States is governed primarily by the Children's Online Privacy Protection Act (COPPA), a federal law enacted in 1998 and significantly updated by FTC rulemaking in 2013. COPPA is one of the few areas of U.S. law where federal privacy protections are genuinely strong.
COPPA applies to operators of websites and online services directed at children under 13, and to general-audience websites that knowingly collect personal information from children under 13. It requires these operators to:
- Provide clear notice of data collection practices
- Obtain verifiable parental consent before collecting, using, or sharing any personal information from children under 13
- Give parents access to the data collected about their child
- Give parents the ability to delete their child's data
- Maintain the confidentiality and security of children's data
- Not make participation in activities contingent on disclosing more information than is necessary
The FTC enforces COPPA and has levied some of the largest privacy fines in U.S. history against violators, including a $170 million settlement with YouTube and a $5.7 million settlement with TikTok predecessor Musical.ly for COPPA violations.
COPPA's Limitations
More than 25 years after its enactment, COPPA has significant gaps that advocates have long pushed to close:
The 13+ loophole: COPPA only protects children under 13. Teenagers aged 13-17 have almost no federal privacy protections specific to their age group. Yet the teenage years are precisely when social media use, behavioral data collection, and algorithmic influence are most impactful and potentially harmful.
The "general audience" exception: Sites not specifically directed at children can often avoid COPPA by claiming they don't "knowingly" collect children's data, even when they have every reason to believe children are using their services. YouTube's COPPA settlement was partly about this: YouTube claimed it wasn't directed at children, even while running a massive children's content operation.
Verification challenges: "Verifiable parental consent" remains difficult to actually verify online. Self-reported age and a parent's email address don't provide much assurance that actual parental consent was obtained.
Federal COPPA 2.0 proposals: Congress has repeatedly attempted to update COPPA to cover teenagers (13-17), require age verification, and address social media platforms specifically. A COPPA 2.0 bill passed the Senate in 2024, but as of 2025 no federal update has been enacted.
State Laws: Children's Enhanced Protections
Many states have enacted or proposed laws that go beyond COPPA to protect minors:
California Age-Appropriate Design Code (AADC, AB 2273): California's landmark 2022 law (though facing legal challenges) would require online services likely to be accessed by children under 18 to design their products with children's best interests in mind, disable certain features for children by default, and conduct data protection impact assessments.
Maryland's Age-Appropriate Design Code: Maryland enacted its own version of the AADC in 2024, applying to minors under 18 and requiring privacy-by-default design.
State privacy laws' sensitive data provisions: Under California, Virginia, Colorado, Connecticut, and most other state privacy laws, personal data of children (under 13 or sometimes 16) is classified as sensitive personal data β requiring explicit opt-in consent rather than just opt-out rights.
KOSA (Federal): The Kids Online Safety Act, passed by the Senate in 2024, would create a federal duty of care requiring platforms to protect minors from harmful content. Its fate in the House remains uncertain as of this writing.
What Parents Can Do Right Now
Under COPPA, parents of children under 13 have the following rights with any COPPA-covered service:
Right to review data: Request to see what personal information has been collected about your child. Services must provide this and must verify that you are indeed the parent or legal guardian.
Right to delete: Request deletion of your child's personal information. This should include any data shared with third parties, though the service may be limited in its ability to require third parties to delete already-shared data.
Right to stop collection: Withdraw consent for future collection of your child's personal information (which will likely mean your child can no longer use the service).
To exercise these rights: visit the service's privacy policy (look for a section on "Parental Rights" or "Children's Privacy"), find their designated contact method for COPPA requests, and submit a request in writing with proof of your relationship to the child.
Age Verification: The Unresolved Problem
The fundamental challenge in children's privacy is age verification: how do you confirm that a user is actually the age they claim to be, without invasively collecting identity documents from everyone?
This question has no easy answer, and different states and proposals are taking different approaches:
Self-declaration plus parental consent: The current COPPA approach, in which users declare their age, and if under 13, parental consent is required. Easy to circumvent by simply lying about age.
Algorithmic age estimation: Some states have proposed allowing platforms to use AI-based analysis of user behavior or device data to estimate whether a user is a minor. Privacy concerns are significant.
Digital ID verification: Some proposals would require age-verified digital IDs, perhaps through state-issued digital driver's licenses. Civil liberties concerns are significant, since this would require giving platforms access to government ID information.
Device-level age verification: Some proposals would move age verification to the operating system level (Apple/Google) so individual apps could query "is this user an adult?" without receiving identity information. This approach has garnered the most bipartisan support.
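To make the device-level idea concrete, here is a minimal sketch of what such an OS-level query might look like. No such API is standardized today; every name here (AgeBracket, AgeSignalProvider, getAgeBracket) is an illustrative assumption, not a real Apple or Google interface. The point is the privacy property: the app learns only a coarse age bracket, never a birthdate or identity document.

```typescript
// Hypothetical OS-level age signal. The operating system verifies age
// once (e.g., at account setup) and answers queries with only a bracket.
type AgeBracket = "under13" | "teen13to17" | "adult18plus";

interface AgeSignalProvider {
  getAgeBracket(): Promise<AgeBracket>;
}

// Mock provider simulating an OS that verified the user as a teenager.
const mockOS: AgeSignalProvider = {
  getAgeBracket: async () => "teen13to17",
};

// App-side logic: feature defaults are gated on the bracket alone,
// so the app never handles identity information.
async function configureDefaults(os: AgeSignalProvider) {
  const bracket = await os.getAgeBracket();
  return {
    personalizedAds: bracket === "adult18plus",        // off for all minors
    publicProfileByDefault: bracket === "adult18plus", // private for minors
    parentalConsentRequired: bracket === "under13",    // the COPPA trigger
  };
}

configureDefaults(mockOS).then((settings) => console.log(settings));
```

In this sketch, a "teen13to17" answer yields conservative defaults (no personalized ads, private profile) without triggering COPPA's parental-consent machinery, which applies only to the "under13" bracket.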
The debate continues, and parents should expect significant legislative activity on this issue in the coming years at both state and federal levels.