Popular video app TikTok, which rocketed into the ranks of the world’s most-downloaded social media apps in late 2018, is facing fresh child privacy regulation problems. The embattled app has already paid one monster fine for allowing children under the age of 13 to sign up for a TikTok account without parental consent; a new complaint alleges that some videos posted by minors are still available, and that the app’s method of demonstrating parental consent when a new account is created is inadequate to the task.
TikTok’s ongoing child privacy problems
Founded by Beijing-based Bytedance in 2016, TikTok made its way to the rest of the world in 2017. A merger with similar service Musical.ly in the summer of 2018 propelled the app to mass popularity, making it the most-downloaded social media app in late 2018 and one of the top 10 most-downloaded apps in the world.
Prior to the merger, Musical.ly had been allowing children under the age of 13 to sign up without demonstrating parental consent. The app did not ask for a new user’s age until mid-2017, and for some time after that allowed new users to sidestep age verification by signing up through several major social media platforms. That put it in violation of the Children’s Online Privacy Protection Act (COPPA), which requires parental consent to be obtained before any personal information is collected from children in this age group. When TikTok acquired Musical.ly, it also acquired all of its prior COPPA violations — something that ultimately cost the company a $5.7 million FTC fine in 2019.
Part of the prior judgment against TikTok was a requirement to remove any videos on the platform that were uploaded by users under the age of 13. The current complaint alleges that this has not happened; a coalition of about 20 advocacy groups claims that some of these videos are still available, and that it has identified numerous accounts belonging to users under the age of 13 that continue to share new videos. The coalition consists of a diverse array of groups, including the Campaign for a Commercial-Free Childhood, the Center for Digital Democracy, and the United Church of Christ, to name just a few.
The crux of the complaint is that TikTok does not make a reasonable effort to collect parental consent when new accounts are created. The complaint notes that TikTok never asks for any parental contact information or makes any attempt to get in touch with parents, nor does it prevent a child who initially discloses their real age from re-registering with a different account and lying about their age.
TikTok now shunts new signups under the age of 13 to a special “younger users account” that prevents children from sharing videos, but the complaint also alleges that these accounts are collecting an unacceptable amount of personal information without parental consent: geolocation data, email addresses, the content of messages sent through the platform, and usage history, some of which is shared with third-party advertisers.
The complaint also notes several other violations of COPPA’s child privacy terms: failure to make the privacy notice clear and easily accessible, failure to notify parents of the procedures to access the child’s data, inadequate business contact information, and lack of a mechanism for parents to opt the child out of collection and retention of their personal information.
Child privacy is a particular problem for TikTok because of the youthful skew of the platform’s demographics. At least 41% of the platform’s users are under the age of 24, and this percentage is most likely higher, as it does not account for users under 16 who are lying about their age. Many of TikTok’s most-followed performers are under the age of 18. The complaint also makes the informal observation that many of the platform’s most-liked videos are clearly targeted at children, as is much of its advertising.
Concerns about child privacy are also greatly heightened during the coronavirus pandemic, with kids out of school for the spring 2020 semester and the possibility of closures continuing into the fall.
A TikTok spokesperson told The Verge that the company is “… committed to helping ensure that TikTok continues to be a safe and entertaining community for our users.” In April, the company rolled out a new feature called “Family Pairing” that allows parents to link their accounts to their children’s accounts to do things like look in on messages and limit usage time.
Evolving COPPA rules and enforcement
COPPA went into effect in 2000 and was revised in 2013 in response to the growing popularity of social media platforms and the increasing ubiquity of mobile devices. Enforcement related to child privacy has stepped up in recent years, as various data breaches and questionable advertising practices have put more watchful eyes on the larger social media platforms.
TikTok is far from the only company caught up in this wave of child privacy mishaps. A $170 million FTC fine against YouTube for COPPA violations in 2019 led to sweeping changes to that platform early this year, requiring content creators to clearly label any videos intended for minors and preventing targeted ads from collecting data from children under 13. And Facebook’s Messenger Kids app was investigated by Congress after a design flaw was found to allow kids to enter group chats with unfamiliar adults.
The FTC is also currently considering another revision to COPPA. It announced in mid-2019 that it would take comments on how services currently police content directed at minors, how parental control is handled, and how these aspects might be improved.
TikTok has also had broader security and safety issues that go beyond child privacy violations. Though the platform has most recently been in the news for showcasing dancing nurses, prior to that it was banned by the TSA, the Department of Homeland Security, and the US Armed Forces over concerns about the app’s potential ties to the Chinese government.