TikTok app on screen showing GDPR fine in children's privacy case

TikTok Receives €345 Million GDPR Fine in Years-Old Children’s Privacy Case

A children’s privacy complaint dating back to 2021 has resulted in a major GDPR fine for TikTok. The complaint is something of a “second generation” issue for the company, stemming from inadequate safety measures that were themselves meant to address prior problems with keeping minors safe on the platform.

The decision by the Irish DPC found that the “Family Pairing” safety setting did not ensure that accounts were paired to an actual family member, and that privacy protections were inadequate for users under the age of 13.

TikTok hit with €345 Million GDPR fine, required to make changes within three months

The investigation began in September 2021, but addresses changes made to protect children’s privacy that began in 2020. The company made the changes after being fined in the US the prior year, but has been dealing with assorted issues related to children’s privacy since its predecessor Musical.ly debuted in the mid-2010s.

It is also beginning to build a record of GDPR fines and regulatory actions in this category, with a similar prior action by the Netherlands’ DPA in 2021 (though for a much smaller amount). The UK ICO also handed down a £12.7 million fine earlier this year for GDPR-related children’s privacy failings running from 2018 to 2020. And in 2021, after a young girl’s accidental death was linked to a “TikTok challenge” she had seen on the platform, Italy issued an emergency order requiring that users whose age could not be verified be blocked.

The current GDPR fine largely centers on the “Family Pairing” feature introduced in 2020. This mode allows a parent or guardian to link their own account with that of their child and manage the child’s messaging settings and screen time. However, there was no real verification process ensuring that the linked account actually belonged to a parent. An outside adult in control of a linked account could thus lift the child’s content and messaging restrictions and open direct messaging between themselves and the child.

And in spite of all the children’s privacy issues the company faced to close out the 2010s, the investigation found that accounts for children still had public viewing and comments enabled by default. The GDPR fine was also based on failure to keep contact and other personal information of children under 13 sufficiently private.

Eight articles of the regulation were cited in total in the GDPR fine, but one area where TikTok did manage to escape trouble is the evaluation of its age verification process. The platform has had legal troubles in this area before: it has been forced to do more than simply take users’ word about their age, without going so far as to “card” its entire userbase. The compromise it has settled on is algorithmic, detecting when a user may be under 13 and lying about their age based on their posting behavior, then flagging the account for further verification (such as an ID check or facial age estimation via selfie). The decision stopped short of endorsing this approach, but found that it did not have enough evidence about the technology in use to establish a GDPR violation.

Children’s privacy issues increasingly becoming expensive oversights for tech platforms

TikTok now has three months to bring the platform into compliance, or it could face further GDPR fines. The company has already announced that it will clarify the difference between public and private accounts to users, and that accounts for users aged 16 or 17 will now default to “private” at creation. TikTok had already changed the default setting to private for users aged 15 or younger in January 2021, after the period for which the present fine was levied.

The main concern for TikTok has always been protecting children from predators, but the platform is also home to a variety of trendy “challenges” that have proven harmful or even fatal to young users (typified by the “Tide Pod Challenge” several years ago). Regulators have focused on the platform algorithm’s role in picking up and promoting these challenges, particularly in how it uses age and personal information to recommend videos about them to minors. The new terms of the Digital Services Act, which went into effect late last year, put a greater share of legal obligation for identification and removal of such potentially harmful content on big social platforms like TikTok.

Big GDPR fines and other punitive measures are meant to demonstrate to platforms that ignoring children’s privacy and safety is no longer an option. Meta was already hit with a €405 million fine in a similar case last year, after it was found that underage Instagram users could sign up for business accounts to circumvent the platform’s age restrictions.