In 2019 the Federal Trade Commission (FTC) served Facebook with an order requiring it to make sweeping privacy improvements, with the terms coming into effect in 2020. The agency has since found that Facebook has not only failed to fully comply with those terms, but has also been in violation of children’s privacy regulations with its Messenger Kids app.
The possible consequences include a blanket ban on profiting from the personal data of anyone under the age of 18, restrictions on the company’s use of facial recognition technology, and a mandatory review process for any new products. These terms would apply to all subsidiary companies, such as WhatsApp and Instagram, but they are not yet a lock: Meta is allowed a 30-day response period, to be followed at some point by a final vote.
String of children’s privacy, data handling issues dates back to 2012
This new development is an extension of a legal chain of events that began over a decade ago. Facebook received its first privacy order from the FTC in 2012, after a series of complaints that its privacy policy and in-app information misrepresented how user data was being handled and handed off to third parties. It took years to reach a settlement, which was then almost immediately violated by the data misuse at the center of the Cambridge Analytica scandal. That in turn led to the FTC handing the company a $5 billion fine in 2019 and subjecting it to an array of new privacy requirements.
The new proposal notes that Facebook never really got into gear with that order, with compliance issues remaining rampant even after the 2020 deadline had passed. In terms of children’s privacy, the FTC says that the Messenger Kids app led parents to believe they had a greater ability to restrict who their kids communicated with than was actually the case.
The determination came from an independent third-party assessor assigned to periodically inspect Facebook’s compliance practices. The children’s privacy findings are new: the assessor says that the Messenger Kids app was out of compliance from late 2017 into mid-2019. At the time, the app made it appear to parents as if they had complete control over the contacts their children could communicate with, but children could get outside of this fence via group text chats and video calls. The assessor says that this is a violation of the Children’s Online Privacy Protection Act of 1998 (COPPA), which requires online services to fully inform parents and verify that their consent has been obtained before collecting the personal information of users under the age of 13.
Meta is now facing the prospect of being barred from monetizing the data of anyone under the age of 18 in any of its apps, and there would be additional restrictions on what the company could collect. It would be limited to collecting information that is strictly necessary for the function of that particular service, or for legitimate security purposes. Meta would also have to silo this data in perpetuity, keeping it out of its commercial advertising streams after the user turns 18.
The company would also be more limited in the circumstances in which it is allowed to use facial recognition technology, and would have to present any new apps and products to the assessor for a privacy inspection before they are released.
Meta poised to fight order, calls it a “political stunt”
Meta’s counter-argument seems to ignore the children’s privacy issues entirely, framing the whole thing as a “stunt” designed by the FTC to “usurp” power from Congress. Meta pointed to TikTok, claiming that the service “operates without constraint” on American soil (though TikTok itself has already faced penalties for children’s privacy issues and continues to be threatened with a full ban from the country).
Meta said that it plans to “vigorously” fight the FTC’s proposal. The company may well be able to make a case: Commissioner Alvaro Bedoya voted in favor of the proposal, but stated that he was not certain the FTC could legally modify its original order to the proposed extent.
Related concerns about children’s privacy also ended up forcing Meta to indefinitely suspend its plans for an “Instagram for kids” targeted at users under the age of 13. The company announced that project in March of 2021, but by September of that year had suspended it amid a barrage of criticism and petitions, including from a majority of the state attorneys general.
Other popular platforms have tried spinning off a “kids only” version to head off children’s privacy concerns, but the strategy does not appear to be working for everyone thus far. A lawsuit filed against YouTube Kids at the end of 2022 alleges COPPA violations, claiming that between 2013 and 2020 a number of popular children’s brands (such as Cartoon Network and Hasbro) improperly lured kids to their channels to target them with ads.