European law

DSA is an EU effort to manage systemic online risks

The EU’s Digital Services Act (DSA) is innovative legislation. It tackles serious problems of online communication, including how platforms moderate content to curb both illegal and “awful but lawful” material.
The toughest new EU rules apply to online giants like Facebook, Twitter and Google. (Photo: picture-alliance/NurPhoto/Jakub Porzycki)

If the DSA proves effective, it may have an impact far beyond European borders. The act is quite complex, and not all its norms are defined precisely. Jurists warn that implementation will be difficult.

The DSA came into force in November last year, and all of its obligations will apply in early 2024 at the latest. Its twofold goal is:

  • to ensure the online sphere is safe and open, with all parties’ fundamental rights being protected and
  • to facilitate healthy business competition in ways that encourage entrepreneurship and foster innovation.

In regard to fact-based journalism and its important opinion-shaping function in democratic societies, freedom of speech matters in particular. This fundamental right is sometimes misunderstood as a permission to lie and deceive. For good reason, the DSA takes a more nuanced approach in its effort to promote truthfulness in the public sphere.

The DSA is asymmetric. Very Large Online Platforms (VLOPs) and Very Large Search Engines (VLSEs) must fulfil the most duties. They count as “very large” if they attract at least 45 million monthly users in the EU. Giving them particularly onerous obligations makes sense because they pose the greatest risks. On the other hand, forcing each and every start-up to fulfil the same duties would give the internet giants a competitive advantage, because smaller businesses obviously do not have the same resources as they do.

The DSA gives users a minimum say in the recommendation systems that the VLOPs use. There must be an option to switch off individual profiling, for example. Moreover, users should be given choices regarding what kind of content they want to be made aware of.

The DSA aspires to restrict illegal content such as hate speech. Nonetheless, it does not make the VLOPs liable for distributing such content unless they are fully aware of its illegal nature. The background is that stringent liability might make them excessively careful and restrict speech that should be free. However, the DSA obliges the VLOPs to establish a system which allows users to report problematic posts easily and, on that basis, to remove obviously illegal content immediately. They must pay particular attention to “trusted flaggers” with a history of giving useful input. VLOPs, moreover, must inform the authorities in cases of doubt. In this regard, the DSA mainstreams existing rules in EU member states.

While conspiracy theories and disinformation are not always illegal, the DSA is meant to put a check on them too. For this purpose, the DSA introduces new due diligence obligations for content moderation. Platforms basically moderate content according to their own rules. The DSA demands that they spell out clear terms and conditions regarding what they permit or forbid. The DSA thus respects the fact that private businesses must be able to manage their affairs as they think best, but it also acknowledges that the VLOPs’ impact on the public sphere is too strong for them to be left entirely to themselves.

To protect users’ rights, the DSA demands that terms and conditions must be unambiguous and non-discriminatory. Moreover, the VLOPs must explain every restrictive decision to the user concerned, and that person must have an opportunity to express an objection and appeal for reversal.

The DSA aspires not only to protect individual users. It is also supposed to manage the systemic risks that result from the spread of disinformation online. For this purpose, it obliges the VLOPs to provide annual assessments of the systemic risks they perceive and report on how they intend to modify their content moderation accordingly. On request, the platforms must share relevant data with European authorities as well as with independent researchers vetted by European authorities.

Scholars’ assessment

However, the DSA spells out clearly neither how these new transparency requirements relate to private-sector companies’ protected trade secrets (concerning algorithms, for example) nor how they affect users’ data privacy. Florence G’Sell, who teaches digital governance at Sciences Po in Paris, regrets that “the provisions on controlling systemic risks and fostering dialogue between platforms and regulators on this matter are not more precise”.

According to Mattias Wendel, a law professor at Leipzig University, problems of this kind haunt the entire DSA. He explicitly wonders whether the EU is “taking or escaping legislative responsibility”.

The background is that the DSA tries to balance several fundamental rights, including the freedom of speech, data privacy and business freedoms (contract, occupation, property). Moreover, it is balancing private businesses’ profit interests with safeguarding the public sphere. Finally, it relies on different sets of rules, which include not only EU legislation, but also member states’ laws as well as corporate terms and conditions. In Wendel’s eyes, the DSA often fails to define precisely which norm gets preference in which context. He therefore expects law courts to resolve many open questions case by case. Final decisions will thus rest with the European Court of Justice (ECJ).

Recent history shows that EU regulations sometimes have an impact on the internet even beyond European borders. One reason is that they are copied in other jurisdictions. Moreover, online giants may comply with European rules voluntarily for either business or political reasons.

As stated above, there is generally reason to worry about censorship when government authorities get involved in issues of content distribution. The EU is probably setting a good example since the DSA is keenly aware of the fundamental rights of both providers and users. “The aim is to strengthen and consolidate European democrac(ies), and this proves that the European Union sees itself not only as an economic union but as a political union with common fundamental values,” argues Antje von Ungern-Sternberg, who has edited a useful collection of essays on content regulation in the EU (full disclosure: every literal quote in this article is from the book).

A law professor at Trier University, she sees ample reason for optimism in the experience that “there is nothing to suggest so far that freedom of expression is not in good hands with the ECJ”. The European Court of Justice, of course, will be the ultimate arbiter in DSA enforcement. To what extent the DSA actually protects business interests, and to what extent it safeguards public discourse, will depend on its judgements.

Link
von Ungern-Sternberg, A., ed., 2023: Content regulation in the European Union
https://irdt-schriften.uni-trier.de/index.php/irdt/catalog/view/3/3/25

Hans Dembowski is the editor-in-chief of D+C/E+Z. He drafted the manuscript after a long conversation.
euz.editor@dandc.eu