EU legislators have agreed on the final terms of the Digital Services Act, a new law that focuses on large social media and retail platforms. The full text has yet to be released to the public, but the European Parliament and European Commission have outlined some of its central terms: new restrictions on how targeted advertising can use sensitive personal information, a ban on dark patterns, and a requirement that the inner workings of recommender algorithms be visible to the public.
Digital Services Act moves forward with final terms, awaits final vote
The Digital Services Act has been moving in tandem with the Digital Markets Act, which received similar agreement on its terms in late March; together the two serve as a legislative check on the powers of large “gatekeeper” platforms in the social media, messaging and online retail spaces. The terms have been hammered out between the European Parliament, Council and Commission from proposals that date back to early 2021.
Though the focus of these bills has been on so-called gatekeepers, previously defined as platforms with at least 45 million active users per month, some terms in the Digital Services Act pertaining to the handling of personal information appear to apply to smaller platforms as well.
The full text of the Digital Services Act has yet to be made available to the public, but the European Parliament has enumerated many of its key terms in press releases. One of the headline items is “algorithmic accountability,” which requires a look under the hood at the processes that use user activity and personal information to furnish recommendations on things like new videos to watch or products to buy. There are also numerous new safeguards aimed at curtailing a range of illegal activity, along with new mandatory measures giving users greater control over their personal data.
Fines are also steeper than those set by the General Data Protection Regulation (GDPR), and enforcement is handled differently. Gatekeeper-class violators will answer directly to the European Commission and face maximum penalties of up to 6% of global annual turnover.
Recommender algorithms a point of focus
One of the headline Digital Services Act items is the requirement that recommender algorithms be more transparent to both legislators and the general public. The European Commission is requiring that gatekeeper platforms provide both it and EU member states with access to the inner workings of recommender algorithms for scrutiny, and the public will be offered more information about them as well.
Though aimed at EU residents, this particular item will have an impact on the operations of big tech companies around the world. It is likely that large platforms will simply implement universal policies that voluntarily provide users in other countries with more information about recommender algorithms; even if they do not, more information about the technical guts of these platforms will inevitably make its way out to the rest of the world.
The Digital Services Act also stipulates that the end user must be offered at least one recommender algorithm option that is “not based on profiling,” meaning one that does not draw on the sort of information usually collected for targeted advertising. One simple model is the “chronological feed,” which lists all posts from the accounts a user follows in the order they were posted, rather than engaging a recommender algorithm as a filter. This feed type was more common on social media platforms in the early 2010s, and several (including Twitter and Instagram) have already re-introduced it as an option.
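As a rough illustration of the distinction, the sketch below contrasts a profiling-based ranked feed with a purely chronological one. It is a simplified, hypothetical model (the Post fields and function names are assumptions made for illustration), not a description of how any platform, or the Act itself, defines these feeds.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List, Set

# Hypothetical post model for illustration; real platforms track far richer signals.
@dataclass
class Post:
    author: str
    created_at: datetime
    predicted_engagement: float  # score derived from profiling signals


def profiled_feed(posts: List[Post]) -> List[Post]:
    """Ranked feed: ordering depends on a model trained on user profiling data."""
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)


def chronological_feed(posts: List[Post], following: Set[str]) -> List[Post]:
    """'Not based on profiling': only follow relationships and timestamps are used."""
    return sorted(
        (p for p in posts if p.author in following),
        key=lambda p: p.created_at,
        reverse=True,
    )
```

The point of the comparison is that the second option orders content using only data the user has explicitly provided (who they follow and when posts were made), which is roughly what “not based on profiling” implies.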
The tighter regulation of recommender algorithms is based on growing concerns about the subtle harm that they can do. Cases have been made that recommender algorithms tend to promote more sensational and extreme content, enable the spread of disinformation, discriminate by design, and take users down a “rabbit hole” of increasingly divisive ideological content that can ultimately radicalize them.
Targeted advertising also faces tougher restrictions
Already under substantial restrictions from the GDPR, and increasing voluntary limitations imposed by companies such as Apple, the targeted advertising industry is not getting any good news from the Digital Services Act.
Targeted advertising is being banned entirely from using certain categories of sensitive personal information: a recent press release names sexual orientation, religion and ethnicity as specific examples. It is also banned from targeting minors. The bill also promises all users “better control” over how personal data is used.
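What compliance might look like in practice is a targeting pipeline that refuses to use the named categories and declines to target minors at all. The snippet below is a minimal, hypothetical sketch under those assumptions; the attribute names and the TargetingProfile structure are illustrative, not taken from the Act’s text.

```python
from dataclasses import dataclass, field
from typing import Dict

# Categories the Act reportedly bars from ad targeting; field names are illustrative.
SENSITIVE_ATTRIBUTES = {"sexual_orientation", "religion", "ethnicity"}


@dataclass
class TargetingProfile:
    age: int
    attributes: Dict[str, str] = field(default_factory=dict)


def build_targeting_segment(profile: TargetingProfile) -> Dict[str, str]:
    # Minors may not be targeted at all.
    if profile.age < 18:
        raise ValueError("targeted advertising to minors is not permitted")
    # Strip sensitive categories before any targeting logic can use them.
    return {k: v for k, v in profile.attributes.items() if k not in SENSITIVE_ATTRIBUTES}
```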
The “dark patterns” that often govern the opt-out process for information sharing are also being banned entirely. And users must be able to opt out of services that make use of targeted advertising just as easily as they opted in.
In addition to the restrictions on targeted advertising, merchant platforms will face more stringent requirements to identify and remove illegal products and services. Notices of the collection of personal information for targeted advertising must also be non-arbitrary and non-discriminatory, and must respect fundamental rights such as freedom of expression and data protection.
Mandar Shinde, CEO of Blotout, hopes to see this move spur similar legislation in the homeland of most of these giant tech platforms: “This is a step in the right direction; while businesses and consumers benefit from these large platforms, it is important to keep their power in check. Archaic laws of the US set in the late nineties are not relevant in 2022. Steps can be taken to hold them accountable like this new legislation passed by the EU but the United States is slow to make progress on this front. The Digital Services Act ensures consumers and businesses benefit while being protected, even if at the cost of shareholders — which has really been the only focus for these companies.”
The Digital Services Act is currently being reviewed by legal and technical experts before heading to a final vote by the EU Parliament and Council, a process it is expected to easily clear. It would come into force 20 days after publication in the EU Official Journal, and enforcement would begin 15 months after that point, though an unspecified “longer period to adapt” for impacted small and medium-sized businesses has been promised.