TikTok logo on smartphone showing UK ICO finding on TikTok for children's privacy

TikTok Faces £27 Million Fine From UK ICO Over Failure To Protect Children’s Privacy

Video sharing sensation TikTok is particularly popular with minors, and a UK ICO investigation has reached a provisional finding that the platform failed to protect children’s privacy from 2018 to 2020. If this finding holds up, TikTok could be on the hook for a £27 million fine given the extended window and the nature of the violations.

UK ICO could levy largest fine yet for TikTok privacy violations

If the findings and the fine amount hold up, it would be the largest penalty UK ICO has ever issued. The current record holder is British Airways, fined £20 million in 2020 for exposing the personal and financial information of over 400,000 customers in a breach; Marriott International was hit with an £18.4 million penalty around the same time for a breach that leaked the personal information of hundreds of millions of hotel guests. UK law allows for a maximum fine of up to 4% of annual global turnover, and TikTok is estimated to have made a little over $4 billion in revenue in 2021.
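As a rough back-of-the-envelope check, the 4% turnover cap can be compared against the proposed fine. The revenue figure below is the ~$4 billion estimate mentioned above, and the USD/GBP rate is an illustrative assumption, not an official figure:

```python
# Rough sanity check of the statutory cap described above.
# Assumptions: ~$4.0B estimated 2021 revenue (from the article) and an
# illustrative USD/GBP rate of 0.85 -- both figures are approximate.

annual_revenue_usd = 4.0e9   # estimated 2021 global turnover
usd_to_gbp = 0.85            # illustrative exchange rate, not official
max_fine_gbp = annual_revenue_usd * 0.04 * usd_to_gbp  # 4% statutory cap

proposed_fine_gbp = 27e6     # the £27 million proposed by UK ICO

print(f"Statutory cap: ~£{max_fine_gbp / 1e6:.0f}M")          # ~£136M
print(f"Proposed fine: {proposed_fine_gbp / max_fine_gbp:.0%} of the cap")
```

Under these assumptions the proposed £27 million would sit at roughly a fifth of the maximum the law allows, leaving the regulator considerable headroom.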

At present, UK ICO has issued a letter of intent to TikTok based on its provisional findings, essentially a warning that it intends to issue a fine. TikTok now has a chance to respond during a 30-day period before a final decision is made, and this process has reduced fines in the past for other companies. Marriott was initially facing a proposed £100 million fine in 2020, but the amount was reduced substantially based on its actions in promptly contacting customers and correcting the security deficiencies that led to the breach.

UK ICO’s investigation has found that TikTok was in breach of the national data protection law between May 2018 and July 2020, with the key issues involving children under the age of 13. The company was found to have processed the data of minors without the required parental consent, processed “special category” data (such as ethnicity or biometric data) without legal grounds, and failed to inform its users of data collection practices in a clear, concise and transparent way.

UK ICO information commissioner John Edwards said that the investigation into TikTok is part of a larger campaign of surveying online platforms for potential children’s privacy issues, with 50 in total being examined and six having active investigations opened into them.

TikTok’s history of children’s privacy issues stretches back to the beginning

The UK ICO’s proposed action is far from the first trouble TikTok has faced in this area. The video sharing app has been dogged by children’s privacy issues since it debuted as “Musical.ly” in 2014. The app’s user base has always skewed very young, and in its early years it was lax about screening users by age. The availability of profile information of underage users and the processing of their data for advertising eventually culminated in a Federal Trade Commission investigation and a $5.7 million settlement in 2019. That settlement also led to a separate version of the app for users under the age of 13, and an end to the sending of virtual gifts to users under the age of 18 in the standard version of the app.

TikTok’s initial problems with children’s privacy were mostly about making their profile information too accessible and making it too easy to communicate with them (the app allowed users to see everyone else within a 50-mile radius until 2016). Over time, focus has shifted more to the use of children’s personal information and browsing habits for targeted advertising and to feed algorithmic recommendations. Ad algorithms sometimes show kids ads for games with gambling aspects or intentionally addictive components that are fed by micropayments, and TikTok has received heavy criticism for promoting dangerous “challenges” that spread among underage users and that can cause serious harm.

TikTok has faced prior problems of this nature throughout Europe as well. A class action lawsuit initially filed by a 12-year-old was recently approved to go forward by the UK’s High Court, though it was later dropped due to concerns about financial risk to the participants. The data protection authority of the Netherlands fined TikTok €750,000 in 2021 for children’s privacy issues (along with failure to communicate policies clearly in Dutch). And in July of this year, Italy’s privacy watchdog agency formally warned the platform that it was in breach of EU privacy rules.

TikTok has generally not faced as much regulatory scrutiny in these regions as bigger tech companies such as Facebook and Google have, but attention to children’s privacy protection is expected to pick up as the platform has grown to field nearly as much traffic from underage users as YouTube does.