The Digital Services Act (DSA) is an EU Regulation that became fully applicable on 17 February 2024. The DSA applies to entities based both within and outside the EU that provide services to users in the Single Market. Although the DSA states that it aims to “create a safer online environment for digital users and companies”, its interpretation and application might, in fact, violate the fundamental right to freedom of expression, enshrined in Article 11 of the EU Charter and Article 10 of the European Convention on Human Rights (ECHR).
Illegal content
The DSA introduces obligations for platforms to block “illegal” and “harmful” content and to counter “disinformation”, but leaves these concepts largely undefined. The DSA describes “illegal content” as any information that, in itself or in relation to an activity, such as the sale of products or the provision of services, is not in compliance with Union law or the national law of a Member State, irrespective of the subject matter or nature of that law. Anyone, be it an individual or an entity, can flag content they believe to be illegal, and online platforms would have to comply by removing that content for fear of the large financial penalties which the European Commission might impose.
Hate speech, misinformation, and information manipulation
The concept of “hate speech,” which the DSA aims to tackle, is not found in any international convention; its legal basis is the EU’s Framework Decision of 28 November 2008, which defines “hate speech” as incitement to violence or hatred against a protected group of persons or a member of such a group. This circular definition of “hate speech” as incitement to hatred has triggered well-founded criticism from authors and members of civil society, who argue that whoever has the power to define hatred has the power to determine what speech is deemed “legitimate and lawful” and what is classified as “unlawful”, and therefore subject to removal and criminalisation.
Because of their loose and subjective nature, “hate speech” laws are inconsistently interpreted and arbitrarily enforced, relying more on the subjective perception of hearers than on the objective harm done. In view of these conceptual issues, the definition of “hate speech” is not harmonised at the EU level, meaning that speech deemed illegal in one Member State may be lawful in another.
The terms “misinformation” and “information manipulation” are even more difficult to define, interpret, and apply. While the DSA does not use “misinformation” in its main articles, it employs the term no fewer than 13 times in the recitals and identifies it as a key phenomenon to counter when discussing, in Article 36, the crisis response mechanism for extraordinary circumstances that pose a serious threat to public security or public health.
Freedom of expression
Given its global scope, the DSA’s application is likely to result in limitations on free speech, amounting to a potential infringement of Article 11 of the EU Charter and Article 10 of the ECHR.
The DSA’s approach to loose concepts such as “misinformation,” “disinformation,” “hate speech,” and “information manipulation” may lead to the sweeping removal of online content, wrongfully qualified as “illegal speech”, that would not meet the strict limitations test of the EU Charter and the ECHR (necessary, proportionate, and the least restrictive means of achieving a legitimate aim). Instead, the DSA lays the ground for shadow banning and institutionalised censorship, as several Members of the European Parliament (EP) highlighted in the recent EP Plenary debate on “the need to enforce the DSA to protect democracy on social media platforms including against foreign interference and biased algorithms”.
Similar concerns were raised in the Parliamentary Assembly of the Council of Europe (PACE) debate on Report 16089 on Regulating content moderation of social media to safeguard freedom of expression, adopted on 30 January 2025. In the Explanatory Memorandum (paras. 6.4-6.7), the Rapporteur referenced cases predating the entry into force of the DSA in which the application of freedom of expression to social media platforms was doubtful (see, for example, a 2021 case in which the German Federal Court of Justice found that Facebook is not bound by freedom of expression in the same way as the State).
Moreover, the Commission has the power to determine what constitutes “illegal content”, and, by the DSA’s design, such a determination is final, i.e. not subject to judicial review by an independent court or tribunal. This clear lack of checks and balances is at odds with the rule of law, one of the core values of the EU, and may prove a key factor in a future court ruling on the DSA’s infringement of free speech.