A class action lawsuit against YouTube could cost the tech giant $3.2 billion if decided in favor of the plaintiffs. The lawsuit alleges privacy violations affecting at least five million children in the UK as a result of YouTube’s data collection practices. The suit cites both the General Data Protection Regulation (GDPR), by whose terms the UK remains bound until 2021, and the similarly structured UK Data Protection Act.
Children’s privacy violations endemic at YouTube?
The case centers on a privacy problem common to big tech companies that deal in targeted advertising: screening out minors, who are entitled to enhanced data protection rights. YouTube was already taken to task for a very similar children’s privacy issue in the United States in 2019, when the Federal Trade Commission (FTC) and the New York Attorney General fined it $170 million for collecting minors’ data for targeted advertising purposes without parental consent, in violation of the Children’s Online Privacy Protection Act (COPPA).
The penalties are much more substantial under the GDPR, should EU data protection authorities (DPAs) determine that children’s privacy was violated. But that is a separate issue from the pending class action lawsuit; Articles 80 and 82 of the GDPR allow individuals to seek damages in this way in addition to any fines that might be levied. The lawsuit could translate into hundreds of pounds for each claimant.
YouTube has stated that the platform is “not for children under 13,” but a substantial amount of its programming is designed to appeal to young viewers. A 2019 report from a UK media regulator found that 80% of UK children aged 5 to 15 are video-on-demand consumers, as are about 50% of children aged 3 to 4. The second-most viewed video in YouTube history is the “Baby Shark Dance,” at over 6.5 billion views, and two videos from the Cocomelon Nursery Rhymes channel were among 2018’s top-viewed videos with more than two billion views each. YouTube’s highest earner for 2019, toy unboxer Ryan Kaji, made $26 million from the platform’s ads and is eight years old. The lawsuit also points out that children’s brands such as Hasbro and Mattel promote YouTube in their marketing materials as a favorite website of kids.
The video platform has safeguards during account creation intended to steer users under the age of 13 to the alternative YouTube Kids app, which does not collect personal data and applies extra content filtering. This assumes a parent controls the process, however, and little stops an older child from creating an account and reporting a false age. None of this necessarily matters in terms of potential privacy violations, though, as the site’s content is almost entirely available without logging in. YouTube tracks even viewers who are not logged in, using cookies and other measures to collect data that feeds its web-spanning targeted ads network.
Though the fine amount was trivial for the tech giant, the 2019 FTC ruling did trigger some big changes at YouTube that kicked in this year and are aimed squarely at protecting children’s privacy. New labels are required for any content that is intended for children, and the site applies its AI algorithms to identify these types of videos and ensure that they are labeled properly. YouTube content creators were also notified of COPPA requirements and must manually flag each video they produce that is intended for kids.
One of the most important changes YouTube made was to stop delivering targeted ads to anyone on the platform who had watched a video intended for children, regardless of their age. If this technology is properly in place, it would prevent future privacy violations (and lawsuits) of this nature, but it would not apply to claims predating the recent implementation of these new platform rules.
Can YouTube really keep kids off the platform?
YouTube finds itself in a difficult position with respect to these privacy violations, caught between its user base, its monetization systems and its methods of content delivery. It is impossible to protect children by screening them out of the mainstream site without mandating both user account logins and some form of identity check, which would be clear non-starters for the rest of the user base.
The new policy of automatically removing viewers from the targeted ads ecosystem after they watch just one video intended for children illustrates how precarious the position is, and suggests that the platform may need to look to new monetization methods before long. While its problems are limited to the EU and US at present, the ongoing trend of data protection laws being adopted around the world means that YouTube could face fines and legal complications in nearly every major market before long. Traditional non-targeted advertising, based on general demographic observations rather than personal data, would solve the problem of violating children’s privacy, but it would also likely reduce ad revenues substantially and have a major ripple effect across the entire content creation ecosystem.