TikTok continues to remain entangled in children’s privacy issues, as an ongoing investigation by the Federal Trade Commission (FTC) has now been referred to the Department of Justice (DOJ) for potential violation of the Children’s Online Privacy Protection Act (COPPA).
News of the FTC investigation broke in March. An anonymous inside source told media outlets that the agency had been examining TikTok’s data and security practices after reports that engineers in China continued to have access to US user data into 2023, in spite of the company’s promises to silo data geographically. The source also said that potential children’s privacy violations were being looked into. The FTC has now made a formal statement on the investigation, but the DOJ has so far declined to comment.
Children’s privacy case could date back to Musical.ly settlement
The FTC does not routinely issue public notifications when it refers investigations to the DOJ, but said that it has done so in this case due to significant public interest.
The agency also indicated that its investigation included a compliance review of a 2019 settlement with TikTok over prior children’s privacy violations. That settlement dates back to the app’s prior US incarnation, Musical.ly, which paid $5.7 million over allegations that it had collected personal information from minors without parental consent dating back to 2017. A violation of those settlement terms may be what triggered the referral to the DOJ over COPPA concerns.
TikTok said that it has been working on the issue of Chinese employee access with the FTC and other authorities for over a year now, and is disappointed by the referral of the investigation to the DOJ. The move allows the FTC to potentially pursue civil penalties against TikTok, but the DOJ must first opt to take on the case within 45 days. If it declines, the case returns to the FTC.
COPPA regulations apply only to children under the age of 13. If a website or digital service knowingly onboards minors as users, or directs content to them, and collects their personal information, it must implement certain privacy policy terms that include obtaining consent from a parent or guardian. The platform must also keep minors’ personal information out of targeted advertising, and control what advertising is displayed to minors to screen out harmful material.
A TikTok statement noted that it has made many changes to its children’s privacy measures over the past few years, including assorted safety features aimed at users under the age of 16 (such as family account pairing and screen-time limits) and more active measures to detect and remove underage users who falsely report their ages.
Children’s privacy concerns remain unresolved as forced sale deadline looms
Children’s privacy has been one of the major regulatory issues TikTok has grappled with ever since the app first launched, and these issues have not been limited to the US. The UK’s privacy watchdog hit TikTok with a fine equalling about $16 million USD in 2023 for allowing some 14 million children under the age of 13 to use the app in 2020. The Irish Data Protection Commission (DPC) followed that up with a separate €345 million fine for similar issues later that year. TikTok had already implemented “age gating” at that point, but it was a relatively weak self-verification system that savvy kids could easily bypass if parents were not closely monitoring their phones.
TikTok’s user base has a decidedly young skew, with the largest group (roughly a third, according to various studies conducted at different times) under the age of 20 and almost another third under the age of 30. TikTok users under the age of 13 are supposed to be guided into a gated area of the app that severely limits their contact with older users and advertising, as well as any potentially harmful content, but kids naturally want to be part of the “big kids” version and often circumvent these measures if parents are not keeping close tabs on their device use. TikTok has faced heavy criticism for not doing enough to police signups by younger users who falsely report their age to create a standard account, though it claims it has stepped up algorithmic detection and spot investigations of accounts by its staff.
All of this unfolds as the “sell or quit” deadline set by the Biden administration continues to loom. TikTok has until early January to sell the business to a US-based entity or essentially quit the country, as the app would otherwise be banned from app stores. This deadline may yet be extended, however, as TikTok has filed a lawsuit against the legislation arguing that it violates the free speech rights of both the company and its massive American user base.