The Federal Trade Commission (FTC) closed out its rulemaking year with a proposal aimed at improving privacy online for underage users, calling for new terms to be added to the existing Children’s Online Privacy Protection Rule (COPPA). The amendments would bolster children’s privacy by further restricting how companies can collect, use and monetize the data of underage users, shifting a greater share of responsibility in this area to service providers.
The notice of proposed rulemaking was issued just before Christmas and is currently in a mandatory 60-day comment period. Since 2022, members of Congress have been calling for an even more expansive update of COPPA, which was last amended in 2013; some FTC Commissioners have in turn responded by calling on Congress to first pass new legislation before COPPA updates or new data privacy terms are taken up by the agency.
FTC tackles long-awaited children’s privacy update
The COPPA Rule has been in place since the turn of the millennium, and specifically addresses the handling of personal data of internet users under the age of 13. Current children’s privacy protections include a requirement for websites and services to verify parental permission before collecting or making use of children’s personal information, and limitations on what kinds of data can be collected and how it must be secured and retained. The 2013 update addressed issues of privacy online specifically related to smartphones and social media platforms, expanding the scope of covered material to include geolocation information and media.
The FTC has been deliberating over another COPPA update since 2019, when it collected public comment on desired children’s privacy protections in the face of rapidly changing technology. Some members of Congress have since pressed the agency to move faster on a revision, but the position of some Commissioners has been to wait for Congress to address the possibility of a broader data privacy bill that might pre-empt or conflict with any new COPPA terms regarding privacy online.
If the revision is adopted, it would expand an existing parental consent requirement to include the provision of children’s data to any third parties (unless the data collection is considered “integral” to provision of the service). Companies would also be forbidden from requiring consent to collection of these protected categories of data as a term of service. This includes individual elements of a site or service, such as requiring a child to furnish personal information to participate in a contest or game.
The new rules would also close something of a loophole in the existing children’s privacy terms, in which a service provider is exempted from COPPA consent and notification requirements if they declare that a collected personal identifier is solely for the internal use of the service. Service providers would have to publicly disclose the specific internal function such identifiers are collected for and specify how they are ensuring that the identifier cannot be otherwise used to violate children’s privacy online.
Push notifications that serve as “nudges” to stay online would also be subject to new restrictions. These “nudges” would not be allowed to draw on children’s personal information, including the child’s phone number, essentially forbidding unsolicited text messages prompting the child to return to the platform. Any use of such notifications would have to be flagged in mandatory direct and online notices. Service providers would also be required to establish a security program that specifically addresses children’s privacy and documents exactly what safeguards are in place to ensure privacy online.
Current guidance that directs the Ed Tech industry to separate collected children’s information from commercial operations would also be codified, and data retention limits for all service providers would be tightened so that information could be held only as long as necessary to fulfill the specific purpose for which it was collected.
Violations of minors’ privacy online developing much faster than new legislation
As COPPA’s timeline demonstrates, updates to children’s privacy regulations often take several years to put into place. Means of violating privacy online that skirt these regulations or exploit previously unaddressed legal ground develop at a much faster pace.
The primary shield that companies use to head off regulatory trouble in this area is to ask users for their age when they access a service, but these systems are almost entirely self-reported and not subject to parental verification. And even when a company does attempt to go above and beyond in policing underage user access and addressing children’s privacy, wily adolescents quickly find workarounds; this was demonstrated in recent years by a large wave of underage Instagram users converting their personal accounts to the free “business” version, which removed restrictions on communication and display of personal information regardless of age.
While Congress continues to work on new legislation and the FTC on new rules to improve privacy online, the regulator spent 2023 taking aim at certain specific targets that overstepped children’s privacy bounds. The FTC slapped Meta with a blanket prohibition on collecting and using the data of minors in May, citing prior COPPA violations by the social media giant and forbidding it from monetizing information collected from children. Meta recently appealed this decision to a federal judge, asserting that it is unconstitutional.