TikTok continues to struggle with child privacy issues as the Justice Department has announced a new lawsuit against it, citing violations of the Children’s Online Privacy Protection Act (COPPA) from 2019 to the present.
The suit was filed in California and will also see participation from the Federal Trade Commission (FTC), which initiated an investigation into the company’s child privacy practices several months ago. The company remains under a prior court order that dates back to its initial appearance in app stores as “Musical.ly” and requires it to implement enhanced measures for the screening and verification of accounts of users aged 13 or younger.
TikTok accused of failing to implement required child privacy measures
Under the 2019 order that addressed the company’s prior COPPA violations, TikTok is required to restrict children from creating regular accounts. It is instead supposed to steer these users to special age-restricted accounts that are largely siloed from interaction with older platform users. There are also special data handling requirements for child accounts, and the company is required to delete these accounts and all the data they hold promptly upon parental request.
The FTC’s recent investigation found that the company is falling short on all of these counts. The Justice Department says that TikTok has not done enough to prevent children from creating regular accounts, citing “deficient and ineffectual internal policies and processes” for identifying them. It additionally finds that even when a child’s account is properly created in “Kids Mode,” TikTok collected and retained email addresses and other personal data that it is not supposed to store. And when parents requested that child accounts be deleted, the company is accused of failing to consistently honor these requests and of putting parents through a “convoluted” request process.
The data of younger platform users also made its way to third parties, including Facebook and the third-party analytics firm AppsFlyer, for use in targeting users who had not logged into the platform in some time. The Justice Department notes that “Kids Mode” did not prevent data from being shared in this way.
Since the 2019 child privacy actions, TikTok has implemented age gating; however, it is mostly voluntary at account creation and readily bypassed if parental controls have not been placed on the child’s phone. Even when parental controls are present, the suit notes that they can be circumvented by using Google or Instagram credentials to log in. TikTok’s enforcement of age requirements generally comes after the fact, with algorithms and human moderators reviewing user videos to spot users who may be misrepresenting their age (or requiring proof of age when a user applies for the ability to monetize an account).
The prior child privacy action against Musical.ly and TikTok required the company to pay a $5.7 million fine and delete all of the data collected from underage users. If it is found to be in violation, it could be subject to penalties of $51,000 per instance per day. TikTok has issued a statement disputing the Justice Department and FTC findings, claiming that it meets its obligations in detecting and removing underage users and that it is within bounds of compliance regarding its existing child privacy and parental controls. The Justice Department points to internal TikTok communications obtained by the FTC investigation that indicate the company has not been staffing a sufficient number of moderators to keep pace with the number of accounts being created. The suit claims that in 2020, these moderators were spending an average of only five to seven seconds per account in verifying user age.
TikTok legal issues may come to a head in 2025
TikTok has been under fire for child privacy violations since the app’s initial launch, and not just in the United States; it was fined $368 million in Ireland last year for making the videos of teen users public by default, and faces ongoing privacy investigations of various types around the world. The Justice Department’s suit also comes as the Senate just passed a bill to extend COPPA’s terms to children up to the age of 17 and ban all targeted advertising to minors, though that remains subject to a House vote and will likely not be addressed until later in the year.
All of these child privacy developments run parallel to the looming prospect of a forced TikTok sale, something mandated by legislation that President Biden signed in April as part of a national security package. ByteDance has taken the issue to court, arguing that it violates constitutional rights and that the same standards are not applied to other social media giants. The Justice Department recently issued a statement indicating that it believes US residents do not have a “First Amendment right to TikTok.” The battle has become something of a political issue in an election year, not just due to the app’s massive popularity in the country but also the fact that thousands of Americans now make at least part of their living from monetizing their videos.