TikTok has been in the news lately as a subject of government bans over national security concerns, but for years it has also been dealing with regulatory issues focused more on how it protects the privacy and personal information of minors. The app has already been warned or fined several times over its use of children’s data, and that tally has now increased by £12.7 million courtesy of the UK’s Information Commissioner’s Office (ICO).
The fine centers on TikTok’s failure to police underage users who sneak onto the platform, and on its collection and use of children’s data without the required parental consent. The ICO says TikTok should have been aware that well over a million underage users were present in the system, and that it failed to implement adequate checks to screen them out.
Children’s data collected without required authorization from parents or carers
The ICO found that about 1.4 million UK children under the age of 13 were using TikTok from 2018 to 2020, despite the platform’s own terms of service setting 13 as the minimum age for an account. TikTok was fined both for failing to keep these accounts from being established in the first place without parental involvement, and for collecting and making use of the children’s data.
Social media platforms are allowed to have users under the age of 13 in the UK, but national data privacy law requires that parents or carers provide consent before children’s data can be collected. That requirement tends to conflict with the way TikTok’s biggest user base, ages 10 to 19 (who make up roughly a quarter to a third of the platform according to several studies), wants to use it. Underage users simply lie about their age to avoid parental involvement, which leaves TikTok with an obligation to detect these users so as to avoid collecting children’s data.
The ICO established that TikTok should reasonably have been aware that underage users would attempt to sneak onto the platform, given that it appeals primarily to younger people. The investigation found that senior TikTok employees did discuss internally the fact that underage users were not being removed from the platform, but ultimately not enough was done about it and children’s data continued to be collected.
The platform was also fined for not being transparent enough about its collection of children’s data to meet General Data Protection Regulation (GDPR) requirements. The ICO noted that children could not reasonably be expected to understand how their data was being collected, used and shared as they made use of the app.
The fine is a reduced amount, as the ICO’s original notice of intent had called for £27 million. The large reduction appears to be due to a change in direction in prosecuting unlawful use of special category data: the fine is based solely on the collection of children’s data, rather than also adding the broader charge of improperly processing sensitive personal demographic identifiers.
In addition to the creation of personal advertising profiles for children, regulators have long been concerned with how TikTok’s algorithm chooses to deliver content to them. The rise of TikTok “challenges,” beginning with the infamous “Tide Pod challenge,” sparked these concerns. Children are sometimes exhorted by these trends to do dangerous or even illegal things, and it has never been entirely clear whether the platform’s proprietary algorithm recognizes when it is prompting kids to eat concentrated laundry detergent or to choke themselves unconscious (the “blackout challenge,” which has been linked to at least 20 deaths thus far).
Children’s data being hoarded by brokers before they are even issued identification
Apps such as TikTok now feed children’s data to brokers and advertising networks years before most will ever get a government-issued photo ID or have a credit reporting file established.
TikTok has indicated that its primary measure for policing underage users on the system is reactive: a team of some 40,000 safety moderators is trained to look for signs that a child under 13 is using an account, and to manually flag it for further review. But the ICO has said this is not enough, both under the GDPR terms that applied during the period covered by the fine and under current UK law.
TikTok has previously been fined €750,000 by the Dutch Data Protection Authority (DPA) for similar issues involving children’s data, and $5.7 million in the United States in a case that stretched back to its initial days as Musical.ly. Ireland’s Data Protection Commission (DPC) has an ongoing investigation into the company that was initiated in September 2021, but is expected to require resolution by the European Data Protection Board and could still take more than a year to settle.
Ray Walsh, Digital Privacy Expert at ProPrivacy, feels that none of these actions (drops in the bucket compared to the tens of billions the app earns annually) will do much to change the company’s behavior: “TikTok’s consistent misuse of children’s data serves as further evidence of ByteDance’s complete lack of commitment to user privacy. We can only hope that the UK regulators’ fine will encourage TikTok to treat user data, particularly that of children, with more respect.”