
“Privacy and Security Issues” Involving Children, Chinese Data Access Prompt Reported FTC Investigation of TikTok

Anonymous sources have told Politico and CNN that a new FTC investigation of TikTok is taking place, with the probe covering both data access by China-based engineers and whether the platform is adequately protecting the privacy and security of children.

The remote data access issue has been in the news for some time, and is part of what prompted the Biden administration to threaten TikTok with a ban from US app stores if parent company ByteDance does not sell it to a US buyer. But the FTC is also reportedly looking at potential violations of the Children’s Online Privacy Protection Act (COPPA), which could result in a new lawsuit against the company.

FTC investigation includes new charges involving under-13 users

The new FTC investigation comes as ByteDance is watching a bill develop that might eventually force it to either divest TikTok or cease officially operating in the US. The bill passed the House with very strong bipartisan support in mid-March, and the issue will now be taken up by the Senate. Should the present terms ultimately hold, ByteDance would be given six months to find a US buyer for TikTok or see it pulled from app stores.

The anonymous sources say that a separate COPPA privacy and security issue is now being investigated, likely involving parental consent for underage platform users or proper notification of their activity. The probe also includes a possible violation of the FTC Act for “unfair or deceptive” business practices, with regard to the company’s prior promises that engineers in China would be walled off from US user data. Internal leaks published last year revealed that ByteDance’s China-based team was still able to access US data in a number of cases, seemingly due to a lack of staff on the US side who understand the deepest technical details of the platform.

The insiders did not provide much more detail on the FTC investigation, but did say that it is in its late stages. While it does not have a set end date, it could wrap up in a matter of weeks. That could then lead to an announcement of a suit against TikTok, or possibly a settlement. Any suit would have to first be referred to the Justice Department, which would have 45 days to decide whether to take the case on or send it back to the FTC for further action.

TikTok’s long chain of privacy and security failings comes to a head amidst FTC investigation

TikTok has already faced several actions involving the privacy and security of children, who have always been one of the biggest demographic groups on the platform. In 2019, the company paid a $5.7 million settlement in an FTC action that dated back to when the platform launched as Musical.ly several years prior. TikTok was accused of failing to police the creation of accounts by users under the age of 13, leading to what the FTC claimed were thousands of complaints by parents.

This led to the creation of a separate segment of the app meant exclusively for users under the age of 13, which is supposed to be almost completely siloed off from communication with older users. But TikTok ran into trouble again in the UK in 2022, when it was fined £27 million under that country’s Children’s Code for similar privacy and security issues.

There was another privacy and security fine of the equivalent of $368 million in the EU in September 2023, for violations involving children that dated back to 2020. The Irish Data Protection Commission (DPC) specifically cited TikTok for setting children’s profiles to public by default and for allowing older platform users to comment on their videos, issues that may come to light again as part of the FTC investigation.

The central concern about children’s privacy and security on TikTok is the ability of predators to make contact with and track them, but other issues have also contributed to regulatory scrutiny and calls for bans and age restrictions. The periodic “challenges” the platform has become known for, most infamously the “Tide Pod Challenge,” have caused serious injury or death in multiple cases. Rampant cyberbullying on the app has also become a major parental concern, as has the ease with which children can access inappropriate content (and even be enticed into making it themselves).

Concerns such as these (and the issues raised by the FTC investigation) led to a general social media ban in the state of Florida that was signed into law a week ago, prohibiting anyone under the age of 14 from using platforms like TikTok and requiring parental permission for those aged 14 and 15. Similar legislation is being considered in several other states, but they are likely waiting to see the results of inevitable legal challenges to the Florida law before going forward.