Abstract
Following the adoption of the Digital Services Act (DSA), a breakthrough piece of legislation on platform regulation and content moderation, many awaited the decision of the Grand Chamber of the European Court of Human Rights (ECtHR) in Sanchez v. France. The main legal issue centred on whether the European-style safe harbour model and the accompanying ban on tracking or monitoring private users would apply. The Grand Chamber's judgment, however, has raised more questions than it has answered. In the present study, we argue that the Grand Chamber's decision in Sanchez v. France represents a striking departure from previous ECtHR judgments and the principles set forth by the Court, particularly in the Delfi and Tamiz cases. We claim that, as a result of the Sanchez judgment, certain individuals, such as public figures and politicians, may be expected to monitor their social media pages continuously in order to avoid liability for third-party comments.
1 Responsibility for Internet content in the European Union in the 2000s
Media in Europe play a prominent role in maintaining and transmitting democratic rules and in maintaining, developing and disseminating cultural, social and societal values.1 It is worth noting that the Commission's 1999 statement has remained unchanged over the years: ‘Audiovisual media play a central role in the functioning of modern democratic societies. Without the free flow of information, such societies cannot function. Moreover, audiovisual media also play a fundamental role in developing and transmitting social values.’2
This is how the concept of information society services was born in Europe, and it has become one of the key concepts3 in the twenty years since the adoption of the Directive on electronic commerce (hereinafter ‘E-Commerce Directive’)4 on 8 June 2000. The central rules on the liability regime for online content were set out in Section 4 of the E-Commerce Directive, entitled ‘Liability of intermediary service providers’. The rules used a threefold conceptual framework, with the first two categories (‘mere conduit’5 and ‘caching’6) granting providers immunity from liability comparable to that afforded under the US system.
More interesting, however, was the issue of the liability of hosting providers, the rules for which are set out in Article 14. Under this, the provider is - as a general rule - liable for the content it hosts and is exempted from liability only if:
it has no actual knowledge of any unlawful activity or information and, as regards claims for damages, no knowledge of facts or circumstances which would clearly indicate unlawful activity or information;7 or
as soon as it becomes aware of such facts, it promptly takes steps to remove or restrict access to the information.8
Thus, the (relative) novelty of the European system9 in 2000 was the ‘notice and takedown system’ (NTDS10), which has since become common practice. It introduced a multi-stage system of conditions and procedures: the intermediary service provider must have a certain knowledge of content that is clearly illegal and must take steps to remove it within a certain time. It can, therefore, be concluded that, in contrast to the US legislation, the European Union opted for a different model (commonly referred to as the ‘safe harbour model’), which centres on a non-automatic exemption.11
In addition to the NTDS, special mention should be made of Article 15 of the E-Commerce Directive,12 under which Member States shall not impose a general obligation on service providers to A) monitor information they transmit or store, or B) investigate facts or circumstances indicating illegal activity (the ‘general prohibition of monitoring or tracking’).13 This rule, therefore, does not oblige service providers, including social media platforms, to monitor content uploaded to them on a continuous basis.14
2 Relevant ECtHR case law
One of the significant problems with the E-Commerce Directive was that it did not provide a satisfactory answer to a number of questions in the case law, for example:
when it can be declared that the provider has actual knowledge;
what constitutes manifestly illegal content;
what is the time limit within which the provider must act;
whether we are talking about an active or passive type of provider when analysing liability issues.
Without examining these questions, the fundamental question of whether content has been removed lawfully or whether there are censorship effects cannot be answered. For this reason, the jurisprudence of supranational courts and tribunals is of particular importance, as it can offer several points of reference even if it does not follow a clear pattern on specific issues. The ECtHR has clarified the application of the rules of the E-Commerce Directive in many important cases.
A key feature of the ECtHR's practice is that it consciously seeks to establish more generally applicable tests that can assist parties as well as national enforcers. The ECtHR's test of liability follows this path and clarifies the criteria for assessing whether the E-Commerce Directive and the European Convention on Human Rights (ECHR) are applicable. Thus, it is necessary to examine:
the context of the comments;
the measures taken by the portal to ‘prevent or remove’ the comments in question;
the possibility of holding the actual author(s) liable as an alternative to the portal's liability;
the consequences of the national judgment for the portal;
the conduct of the victim; and
the consequences of the comments for the victim.15
In addition to the tests, the ECtHR has made important clarifying statements about the NTDS, which, as it said in Delfi, is in many cases an appropriate means of balancing the rights and interests of all concerned.16 Nevertheless, the ECtHR has, in its decisions, clearly recognised that different types of procedures are necessary for various types of content, with particular emphasis on the distinction between content that reaches and content that falls below the level of hate speech.17 In the Høiness case, the ECtHR held that, although some comments may be vulgar, the fundamental questions to be examined are whether they reached the level of hate speech and whether they amounted to incitement to violence.18 If these can be ruled out, then there is no need for an in-depth examination of the nature of the comments.19
It is worth noting that in several cases the ECtHR has distinguished between platform providers and other fora on the Internet and has not considered the Delfi judgment authoritative in relation to the latter.20 In the same case, the ECtHR stressed that content considered defamatory must reach ‘a certain level of seriousness’,21 since millions of users post on the internet every day, and most of the content is insignificant or of low reach and thus cannot cause significant damage to someone's reputation. Moreover, politicians are expected to display a greater degree of tolerance than the average user. Finally, the ECtHR has consistently maintained in its decisions that only the neutral, passive type of service provider can be exempted from liability in the application of the rules.
3 Responsibility for content on the internet in the European Union after the adoption of the DSA
While the ECtHR sought to refine the relevant practice through its decisions, the applicable legislation was adopted in the 2020s, prompted by grave concern regarding platforms' own content filtering mechanisms. Giant platforms have, in many cases, acted like quasi-states with respect to the content posted on their sites,22 and the European Union has sought to establish itself as a model for regulating technology companies, following the path it took when it adopted the General Data Protection Regulation (GDPR).23 This proactive role on the part of the European Union has ‘shifted the balance of power away from the United States and towards Europe for the first time in the history of the internet’.24 The Digital Services Act (DSA)25 was published in the Official Journal on 27 October 2022 and entered into force on 16 November 2022. The Regulation expressly seeks to put on a new footing a regulatory environment that had remained essentially unchanged for 20 years and had been overtaken by time and technological progress.
With regard to the subject of the present study, i.e. liability for online content, the Regulation builds significantly on the previous tripartite division of intermediary service providers and on NTDS practice, clarifying rather than fundamentally transforming them. It maintains the tripartite division of liability for intermediary service providers (‘mere conduit’, ‘caching’ and hosting)26 without substantially changing the liability formulae established in the E-Commerce Directive.
However, the NTDS was replaced by a so-called ‘notice and action mechanism’ (NAM27), which differs from the previous rules in some minor respects. Under it, if the service provider receives a notification containing the four required elements,28 the content concerned will be deemed to have been effectively brought to the attention of the service provider,29 and ‘a decision will be taken in a timely, diligent, non-arbitrary and objective manner in relation to the information which is the subject of the notification.’30 Of particular importance is the fact that the DSA also confirms the prohibition on requiring general monitoring.31
All of these issues were raised before the ECtHR prior to the DSA's entry into force, when a French national, Julien Sanchez, applied to the ECtHR in early autumn 2015, alleging a violation of his rights under Article 10 ECHR.
4 The national procedure
‘This BIGWIG has turned NIMES into ALGIERS, there’s not a street without a KEBAB SHOP and MOSQUE; DRUG DEALERS AND PROSTITUTES REIGN SUPREME, NO SURPRISE HE’S CHOSEN BRUSSELS CAPITAL OF THE NEW WORLD ORDER OF SHARIA… CHEERS UMPS [amalgam of UMP and PS, Socialist Party], AT LEAST WE DON’T HAVE TO PAY FOR THE FLIGHTS AND HOTEL… JUST LOVE this free version of CLUB MED… Thanks FRANCK and KISSES TO LEILLA… AT LAST, A BLOG THAT CHANGES OUR LIFE …’32
After S.B.’s comment, L.R. also made three comments which, in both their wording and their content, vividly and unmistakably blamed Muslims for the difficulties experienced by the city and its white population. A few days later, L.T., who was F.P.’s partner, noticed the post and the comments and felt that they were racist and directly and personally offensive to him. One reason for this was that S.B. had added an extra ‘l’ to L.T.’s name in his comment to make it sound more North African, thereby discrediting F.P.’s political work. A day later, Sanchez asked the users of his FB wall to ‘be careful with the content of the comments’33 but left the comments already posted intact and did not moderate or remove them.
The investigation by the French gendarmerie revealed that S.B. did not know that Sanchez's FB wall was public, and it also became clear that he was referring to L.T. in his post. During the investigation, Sanchez indicated two things: first, that he had 1,800 contacts and could not monitor all the comments that appeared, and second, that his post was not directed against any ethnic group and that he was not the author of the comments in question. At the same time, he argued that such comments should ‘fit within the limits’ of freedom of expression in a political debate, especially during an election period. Nevertheless, he made his FB wall private a few days before the gendarmerie hearing to ‘avoid further incidents that were not his fault.’34
The position of the Criminal Court of Nîmes (hereinafter ‘CCN’) was that the text of the comments made it clear that the Muslim community and those of the Muslim faith were targeted and that the comments were likely to incite hostility, hatred or violence against them. The distorted use of the name Leilla only added to this effect and was capable of insulting L.T. personally. As for Sanchez's FB wall, the CCN concluded that since he had created this electronic communication platform of his own volition and had not removed the offending posts, he was liable as the ‘publisher’ of an online public communication site.35 The CCN found Sanchez, S.B. and L.R. guilty as charged and ordered each of them to pay a fine of €4,000. Sanchez and S.B. were also ordered to jointly pay €1,000 to L.T. as compensation for the non-material damage he suffered. Although the prosecutor's office had also requested that Sanchez be disqualified from standing in elections, the CCN did not consider it necessary to impose this sanction.
Sanchez appealed the judgment; the Nîmes Court of Appeal (hereinafter ‘NCA’) upheld the lower court's decision but reduced the fine to €3,000. In its reasoning, the NCA followed the arguments made by the CCN, adding two further observations: first, that Sanchez had deliberately made his FB wall public in order to reach a wider audience during the election campaign, and second, that his political position required him to be more vigilant than the average user.36 The French Court of Cassation rejected Sanchez's application for review, and he thereby exhausted the national remedies available to him.
5 Decisions of the ECtHR
5.1 Decision of the Fifth Section
On 15 September 2015, Sanchez filed an application with the ECtHR, alleging a violation of Article 10 ECHR, the article guaranteeing freedom of expression. The case was heard by the members of the Fifth Section of the ECtHR. The main issue was whether Mr Sanchez, as an individual, was responsible for hateful and exclusionary comments posted by third parties on his social networking site.
As the applicant, Sanchez argued that he had taken diligent care with regard to comments on social media platforms to ensure that comments deemed unlawful were removed, and he highlighted the ECtHR's practice regarding the specific protection of political speech (expression) in the Court's case law.37 Mr Sanchez stressed, again referring to ECtHR case law,38 that it would have been an excessive burden for him to close the space where the comments could be made in order to prevent the publication of offensive content, as this would have had a chilling effect39 on freedom of expression.40 By contrast, the respondent Government pointed out that FB walls ‘fall within the category of online public communication’ and that it is the content producer ‘who initiated the creation of an online public communication service for the exchange of opinions on predefined topics’, effectively framing Sanchez's page as a discussion forum.41 Relying on Section 24 of the nearly 150-year-old Act of 29 July 1881, which penalises conduct that creates feelings of hostility, rejection or hatred towards members of a community, the Government argued that the interference with Sanchez's right to freedom of expression was lawful and served the purposes recognised by the ECHR, also taking into account the heightened protection of freedom of expression that the applicant enjoyed as a politician.42 With regard to the comments, the Government underlined that the impugned comments conveyed a strong sense of rejection and hatred in terms of their purpose and that leaving L.R.’s comments in place, particularly given the salience of the political context, was likely to convey the message that Sanchez did not consider exclusionary expressions to be problematic.43
The ECtHR used the well-established three-part test44 to assess the case, reviewing the legality of the interference, the legitimate aims pursued and the necessity of the interference. In the following paragraphs, the parts of the tripartite test are summarised, with special attention to the question of necessity. As for legality, an essential point in the ECtHR's reasoning was that the liability of a Facebook account owner for content posted on the FB wall was a new legal issue; the judges nevertheless found that the restriction did not violate the principles of availability and foreseeability. In light of this, the Court concluded that the interference was in accordance with the law.45 As regards the existence of a legitimate aim,46 the Court found that the interference served the legitimate aim of protecting the reputation or rights of others.47
The question of necessity centred on two key issues: whether the impugned comments posted by third parties constituted hate speech and, if so, whether Sanchez should be held liable for them. As for the hateful nature of the comments, the judges referred to the general principles for assessing the necessity of an interference with freedom of expression, which the Court has often confirmed in previous cases.48 As is already well known – and richly documented49 – from ECtHR case law, freedom of expression is the ‘cornerstone’ of a democratic society;50 and it extends not only to ‘accepted ideas’ but also to offensive or even disturbing communications.51 The Court ‘scrutinised’ the contextualised application of these principles to the present case, particularly regarding the comments posted by S.B. and L.R., discussed above. It was clearly established that the comments constituted unlawful hate speech and that they were anti-Muslim. It is of particular relevance that, although the Court underlined that political communications indeed enjoy broader protection, the State – in this case, the French State – has a margin of appreciation regarding the application of restrictions.52 In an electoral context, racist or xenophobic discourse may contribute to incitement to hatred and intolerance, since candidates may rely on slogans rather than rational arguments, and the effect of such discourse becomes even more damaging.
Concerning the hateful nature of the expressions, the Court also stressed the responsibility of politicians to combat hate speech. In examining the comments made by S.B. and L.R., who were not politicians or official representatives of a political party, the Court concluded that the findings of the domestic courts were well founded.53 Here, the Court had to consider whether the statements were lawful regardless of the political context in which they were made. The Court found that irrespective of their political nature, the comments were manifestly inflammatory and hateful. The ECtHR supported this reasoning by arguing that incitement to hatred does not necessarily require an explicit call to violence but includes attacks on individuals or groups that undermine their dignity or safety,54 so it is clear that the intention of the comments55 was to degrade Muslims.56
The second key issue of the case involves the ECtHR's consideration of online liability.57 In this context, the Court again noted that, despite the political context, there are limits to comments made in the context of political debate, particularly regarding respect for the reputation and rights of others.58 The Court emphasised the need to combat racial discrimination and intolerance, with reference also to the Le Pen and Féret cases,59 particularly in an electoral context characterised in this case by tensions specific to the city concerned. The ECtHR also examined Sanchez's position and underlined that his status as a politician does not exempt him from restrictions on hate speech; drawing on its previous case law, it held that politicians, when making public statements, must avoid comments that promote intolerance and must be particularly careful to protect democracy and its principles - all the more so during elections.60 In addition to the status test, the Court pointed out that the applicant's argument that he did not have time to check the comments on his platform was unfounded, as Sanchez himself had indicated that he checked his FB wall on a daily basis.61 Relatedly, it was a significant finding of the ECtHR that Sanchez was also responsible for the comments because, beyond daily monitoring, he was expected, as a political actor who had made his wall public, to exercise increased vigilance and take action concerning polemical comments.62 The ECtHR also underlined the timing of the comments and of their removal: while S.B.’s comment was removed within twenty-four hours, L.R.’s comment was visible and accessible on Sanchez's platform for almost six weeks.63
In sum, the Court established that (1) the comments disseminated by third parties on Sanchez's Facebook wall constituted hate speech and (2) Sanchez, as the ‘publisher’ of the site, was aware of the presence of hateful comments on it, yet failed to act to remove them. The applicant was therefore liable for the comments.64 Consequently, the Court found that the judgment of the national courts was well founded and that the interference with Sanchez's freedom of expression was necessary in a democratic society; therefore, no violation of Article 10 ECHR could be established.65
In the case, Judge Mourou-Vikström dissented, referring to the ECtHR's previous judgment in the Delfi case, and pointed out that, when determining liability for third-party online comments, a distinction must be made between commercially run, professionally operated news portals and platform providers on the one hand and the pages of individuals registered on social media platforms on the other.66 According to the judge, Sanchez's FB wall fell into the second category, i.e. that of pages registered by private individuals, and she stressed that, in this light, the ECtHR's judgment imposing an excessive duty of control on Sanchez was the wrong decision. Against this background, Judge Mourou-Vikström criticised the majority approach and expressed fears that the judgment would mean a reduction in freedom of expression on the Internet.
5.2 Decision of the Grand Chamber
Following the decision of the Fifth Section, the case was heard by the Grand Chamber of the ECtHR (hereinafter ‘Grand Chamber’), as Sanchez had requested that the case be referred. The Grand Chamber delivered its final judgment in Sanchez's case on 15 May 2023.
As noted in the summary of the Fifth Section's judgment, the primary issues to be deliberated were the necessity of the interference with Sanchez's right to freedom of expression and the liability of the ‘owner’ of a Facebook wall in the context of allegedly hateful speech disseminated on the platform by third parties. The Grand Chamber found that Sanchez's criminal conviction constituted an interference with his right to freedom of expression under Article 10 ECHR. The judges also agreed that the interference was prescribed by law and served a legitimate aim, and therefore examined whether it was necessary and proportionate in a democratic society.67 To investigate the necessity issue more profoundly, the Grand Chamber analysed three critical circumstances based on the ECtHR's case law as previously described: A) the context of the comments, B) the conduct and actions of the applicant, Sanchez, in seeking to have the comments removed, and C) the legal action taken against Sanchez.
In his application, the applicant submitted that he had not been informed that the polemical comments should have been removed and that checking the legality of the comments would impose an excessive obligation on him as the ‘publisher’ of the FB wall, as such an obligation would in essence amount to censorship.68 Sanchez contested the judgment against him on all criteria, claiming that his conviction was not foreseeable and contesting the necessity of the interference. In the latter context, he stressed that one of the messages for which he was convicted had been removed within twenty-four hours and that the ‘left-on’ messages posted by another user (L.R.) were not manifestly illegal and formed part of political discourse.69 The respondent Government maintained that the interference was justified and compatible with the ECHR, countering the applicant's arguments concerning the lack of accessibility, foreseeability and necessity of the interference with explanations of the applicable laws and regulations.70
Four third-party interveners submitted comments in the case, namely A) the Government of the Slovak Republic, B) the Government of the Czech Republic, C) Media Defence and the Electronic Frontier Foundation and D) the European Information Society Institute. The Slovak Government stressed that the internet is the primary venue for political discourse and public debate and presented statistics on the amount of political content accessed in Slovakia,71 while proposing a cautious approach to criminalising politicians for online hate speech.72 The Czech Government underlined that criminal prosecution could have a chilling effect and that the Court should, therefore, clarify the liability of a social media platform's owner, particularly with regard to compliance with the principle of foreseeability.73 Media Defence and the Electronic Frontier Foundation argued that users of social media should not be obliged to decide whether or not posts made by third parties on their accounts are lawful, as this is a matter for national courts.74 The European Information Society Institute echoed this position, stressing the principle of gradualism in intervention and supporting the NTDS model in cases such as Sanchez's.75
Similarly to the Fifth Section, the Grand Chamber applied the aforementioned tripartite test. It first examined the legality of the interference in the light of foreseeability. Here, the judges explained in detail the principles of foreseeability, with ample reference to previous decisions,76 and concluded that, although the application of French law in the context of an FB wall is a legal novelty and creates a new legal situation, the interpretation of the national courts was lawful. As it is a borderline case, the mere fact that the application of the law is not entirely foreseeable does not render the interference clearly unlawful.77 The Grand Chamber also found that the interference pursued a legitimate aim, since it served not only the protection of the reputation or rights of others but also the prevention of disorder and crime.78 The key question, therefore, remained whether the interference was necessary in a democratic society. As mentioned above, the two fundamental questions before the Court were whether the comments constituted hate speech and, subsequently, whether Sanchez should be held liable for them.
In its judgment, the Grand Chamber stressed that free political debate is an essential feature of a democratic society and that the States' margin of appreciation in such matters is narrow.79 Yet, political communication is not ‘absolute’ and can be restricted if it constitutes an infringement,80 in particular if it is discriminatory and conveys hatred towards certain groups.81 In this context, the Grand Chamber also noted that, as a political actor, Sanchez should take particular care to avoid speech that could reinforce intolerance, drawing an analogy with the Erbakan case.82 Finally, with regard to the political context, and again with reference to the Féret case,83 it was also noted that increased freedom of political discourse carries the risk that the effects of xenophobic and racist discourse may be greater and more damaging in such a situation.
The Grand Chamber then discussed whether Sanchez was liable for comments posted by third parties on his FB page, particularly in light of the Delfi case.84 In this context, the first point highlighted by the judges was that the comments clearly constituted hate speech. The terms used in the comments, such as ‘kebab’, ‘sharia’, ‘drug trafficking by Muslims’ or even ‘veiled women’, clearly and, as the judgment put it, ‘perfectly’ described a particular subject, the Muslim community.85 The Court then considered the steps taken by the applicant. Here, the Court first indicated that a minimum degree of filtering of unlawful (in particular, manifestly unlawful) content by an account holder would be welcome,86 but it also underlined that in the present case there was no requirement for Sanchez to filter comments. Nevertheless, the ECtHR found Sanchez liable: given that his FB page was publicly accessible and given the political context of the elections, Sanchez must have been aware that his page could become a source of tension.87
The Grand Chamber's reasoning even cited statements by Facebook to establish Sanchez's liability, which is a novelty. Although Facebook's rules state that each user is responsible for the legality of the content they post, the ECtHR drew attention to the fact that Sanchez had posted a message on his FB wall urging his followers to ‘be careful’ about the content of their comments.88 According to the Court, this supports the presumption that Sanchez knew, or at least could have foreseen, that comments on his FB wall might contain potentially illegal content. The ECtHR also highlighted that Sanchez monitored his social media page on a daily basis. In addition, perhaps the most important argument put forward by the Grand Chamber was the finding that ‘notoriety and representativeness necessarily confer a certain resonance and authority on the words, acts or omissions of the person in question’.89 By declaring an omission, the ECtHR ultimately expressed that Sanchez was responsible for the comments posted on his site and not removed.
In light of the above, the Grand Chamber found that the interference with the right under Article 10 ECHR was proportionate and lawful.90 The judgment was accompanied by one concurring opinion, two dissenting opinions containing a wide range of arguments and one joint dissenting opinion.
In his dissenting opinion, Judge Bošnjak explains that, contrary to the majority decision, the interference violates Article 10 ECHR. He bases his opinion on two arguments. First, he disagrees with the majority's view that the applicant's conviction was foreseeable, submitting that the wording of the law itself does not support the possibility of prosecuting both parties in the context of ‘cascading’ criminal liability. Second, he questions the need to convict the applicant for the comment published by S.B., considering that, given the factual circumstances of the case, the applicant should not have been held liable for the failure to delete it.91 This opinion was joined by Judge Kūris, who ultimately delivered a concurring opinion, arguing that, although the majority was right that the applicant's right under Article 10 ECHR had not been infringed, the Court should have given a more substantial and more precise judgment in cases involving hate speech.92
In a dissenting opinion, Judge Ravarani expressed regret that he was unable to vote in favour of the operative part of the judgment despite agreeing with much of the Court's reasoning. While agreeing with the legality of the interference as a whole, he considered that the application of the concept of ‘publisher’ to the applicant was unfounded. Judge Ravarani also agreed with the majority's conclusion that the domestic courts had not breached Article 10 ECHR in relation to the comments posted by L.R. on the applicant's FB wall, but questioned whether S.B.’s comment should have counted among the disputed comments, as it had been deleted within twenty-four hours, and by the author of the comment rather than the applicant. Judge Ravarani argues that, by imposing liability on Sanchez in this regard, the ECtHR is overextending the scope of the obligations imposed on him.93
In their joint dissenting opinion, Judges Wojtyczek and Zünd argue that applying French law to the present situation does not satisfy the requirement of foreseeability. The judges underline that foreseeability must be assessed from the point of view of the addressees of the norm, i.e. ordinary people, which the majority decision failed to do. Echoing the obligation dilemma articulated by Judge Ravarani, they point out that monitoring comments and assuming a quasi-censorship role impose a disproportionate and excessive burden on the owner of an FB wall, and they suggest the introduction, in future cases of this kind, of a proportionate time limit for prior notification and removal of illegal content.94
6 Brief reflections on the likely consequences of the case
The long-term effects of the Sanchez decision are to be treated with caution, but it is evident that the judgment will strongly influence online political discourse. While the ECtHR has repeatedly underlined, in the judgment itself, the narrow scope for restricting political communication and political discourse, the present decision may be seen as a break with the Court's consistent practice to date concerning political speech and communication.95 While not disputing the illegality of the comments, it is important to recall that the ECtHR has maintained that the protection of Article 10 ECHR extends to political and social information even if it is ‘shocking’ or offensive.96 As for the ‘shockingness’ of the impugned comments, Jacob van de Kerkhof notes that it can be argued that the expressions in this case were ‘not so clearly unlawful that an uninformed individual – such as a person monitoring their Facebook profile – would be able to spot their unequivocal unlawfulness.’97 It should also be noted that, although the ECtHR underlined that the French election campaign was a tense period, and the Erbakan case cited above may undoubtedly be a case in point, Sanchez himself did not publish any hateful content. The Court's argument that the requirement of foreseeability becomes quasi ‘malleable’ whenever a new legal situation comes before the courts seems strained, and it could become a hotbed for self-proclaimed censors, with a chilling effect on commentators, which would be inconsistent with the protection and promotion of political dialogue by the ECtHR, as outlined above. The consistency of the decision with the value of pluralism of opinion and the principle of public information and participation in debate is also questionable.98
From a practical point of view, the ruling raises several controversies, not least regarding its precedent value. Although the Sanchez case concerned a local election, can its reasoning be applied to a national one? Would the ECtHR have ruled similarly if, instead of an election in an area with a population of some 150,000, a contest generating mostly local activity, similar hate speech had been disseminated on a candidate's page in Paris?
From another standpoint, it is instructive to highlight that the Sanchez ruling is not in line with the current legislative framework on content liability introduced by the European Union, best exemplified by the Digital Services Act (DSA) adopted in October 2022.99 The DSA, though considerably widening the scope of liability, does not impose individual obligations on users of online platforms to engage in content moderation. Instead, it establishes a standardised and more transparent mechanism for users to report allegedly unlawful content,100 obliging the intermediaries, or online platforms in the terminus technicus of the DSA, to moderate, provide reasoning and transparently101 enforce content moderation on their respective platforms. In the Sanchez ruling, however, the Court opted for a much broader understanding of content moderation: one that extends far beyond intermediaries, commercial organisations and online platforms, and one that obliges the user to act.
The table below compiles the key differences between the Sanchez ruling and anterior case law precedent:
Anterior case law precedent | | Sanchez ruling
Heightened protection of politicians' speech despite its vehemence during a public debate or political campaign. | ←→ | Heightened accountability obligations for politicians to monitor supposedly hateful and/or xenophobic expressions, as these are magnified during a political debate.
Limited liability for comments on online platforms, as best presented in the Delfi case. | ←→ | Heightened liability for users regarding the unlawful comments of others.
Symbolic sanctions for politicians' unlawful expression during a political debate or campaign period. | ←→ | Rigorous sanctions for a politician's failure to monitor others' unlawful expression.
Limited scope of liability as an internet intermediary. | ←→ | Potentially, all users can be held liable as internet intermediaries.
Consistent attitude of preventing a possible chilling effect even in the light of offensive speech. | ←→ | New attitude setting the level of protection at a lower threshold, possibly opening a wider margin of interference in the case of offensive speech.
7 Summary
Political debates are a cornerstone of our society, especially during an election process. If states can easily restrict freedom of expression in such cases, democratic discourse, and ultimately democratic societies, may suffer.102 Departing from the position adopted in Delfi and Tamiz,103 the ECtHR did not merely impose liability for comments on large, commercially run and professionally operated portals but also made it appear that certain individuals are now expected to monitor their pages continuously. ‘In practice, this ruling means that some prominent social media personalities would have the same obligations as platforms.’104 Such an extension of responsibility for online content carries a number of dangers: it could expose certain prominent users to incriminating attacks that would be almost impossible to defend against.105 And all it takes is a few offending comments that are not removed, whether made by real people, trolls or bots.106
To further support our claim that the decision at hand takes a somewhat opaque (or, as critics of the decision would suggest, ‘awkward at best… and untenable at worst’107) approach to the assessment of responsibility in hate speech cases, it is vital to revisit the severity of the sanction imposed on Sanchez. Päivi Korpisaari recalls for comparison the Brasilier v. France case,108 in which Benoît Brasilier was ordered to pay a symbolic 1 franc in damages for disseminating defamatory slogans against his political opposition.109 In the Sanchez ruling, however, the applicant was fined 5,000 euros in total,110 a markedly more rigorous approach to sanctioning politicians participating in a political debate. The severity of the sanction, as well as the Court's finding that administrator liability, traditionally associated with large commercial organisations, can be extended to politicians' Facebook walls, further suggests a striking divergence from previous case law findings and principles. Though Korpisaari finds the Court's decision adequate with regard to combating hateful and xenophobic expression, and she rightly invokes the margin of appreciation doctrine afforded to the state,111 we also suggest taking into account van de Kerkhof's alarming warning about the existing possibility of holding everyone with a social media presence liable as an intermediary entity.112
Two paths then emerge, neither of which is reassuring:
politicians and public figures will prevent certain people – purely based on their views – from accessing their public walls and commenting, as this could put those participating in democratic debates at legal risk or
out of caution, because they lack the knowledge and time to judge illegality, they will delete all potentially dangerous comments, which will undoubtedly result in the removal of legitimate content.113
Either way, the chilling effect – which Koen Lemmens warned against in his study immediately after the Fifth Section's judgment114 – will help to keep the lively democratic debate burning at a lower flame. Unfortunately, the path the ECtHR has taken with this decision has brought us one step closer to an already growing digital authoritarianism.115
Acknowledgment
Gergely Gosztonyi is supported by the János Bolyai Research Scholarship of the Hungarian Academy of Sciences.
Gergely Ferenc Lendvai is supported by the EKÖP-24-3 New National Excellence Program of the Ministry for Culture and Innovation from the source of the National Research, Development and Innovation Fund.
References
Alves Das Chagas, C., ‘Balancing Competences and the Margin of Appreciation: Structuring Deference at the ECtHR’ (2022) 1 ICL Journal 1–26. https://doi.org/10.1515/icl-2021-0009.
Atamanchuk v. Russia App no. 4493/11 (11 February 2020).
Bietti, E., ‘A genealogy of digital platform regulation’ (2023) 7 Georgetown Law Technology Review 1–68.
Bradford, A., Digital Empires. The Global Battle to Regulate Technology (Oxford University Press 2023). https://doi.org/10.1093/oso/9780197649268.003.0002.
Brasilier v. France App no. 71343/01 (11 April 2006).
Castells v. Spain App no. 11798/85 (23 April 1992).
Celeste, E., Palladino, N., Redeker, D. and Yilma, K., The Content Governance Dilemma. Digital Constitutionalism, Social Media and the Search for a Global Standard (Palgrave Macmillan 2023). https://doi.org/10.1007/978-3-031-32924-1.
Communication from the Commission to the Council, the European Parliament, the Economic and Social Committee and the Committee of the Regions: Principles and guidelines for the Community’s audiovisual policy in the digital age. COM/99/0657 final.
De Streel, A., Defreyne, E., Jacquemin, H., Ledger, M., Michel, A., Innesti, A., Goubet, M. and Ustowski, D., Online Platforms’ Moderation of Illegal Content Online. Law, Practices and Options for Reform (European Parliament 2020).
Delfi AS v. Estonia App no. 64569/09 (16 June 2015).
Digital Millennium Copyright Act, Pub. L. No. 105–304, 112 Stat. 2860 (Oct. 28, 1998).
Dink v. Turkey App nos. 2668/07, 6102/08, 30079/08 et al. (14 September 2010).
Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (Directive on electronic commerce).
Erbakan v. Turkey App no. 59405/00 (6 July 2006).
Féret v. Belgium App no. 15615/07 (16 July 2009).
Gerards, J., ‘How to improve the necessity test of the European Court of Human Rights’ (2013) 11 International Journal of Constitutional Law 466–90. https://doi.org/10.1093/icon/mot004.
Gosztonyi, G., Censorship from Plato to Social Media (Springer 2023). https://doi.org/10.1007/978-3-031-46529-1_6.
Gosztonyi, G., Galewska, E., and Školkay, A., ‘Challenges of Monitoring Obligations in the European Union’s Digital Services Act’ (2024) 1 ELTE Law Journal 45–60. https://doi.org/10.54148/ELTELJ.2024.1.45.
Gosztonyi, G. and Lendvai, G. F., ‘Online platforms and legal responsibility: a contemporary perspective in view of the recent U.S. developments’ (2024) 18 Masaryk University Journal of Law and Technology 125–41. https://doi.org/10.5817/MUJLT2024-1-5.
Gowder, P., The Networked Leviathan. For democratic platforms (Cambridge University Press 2023). https://doi.org/10.1017/9781108975438.
Handyside v. United Kingdom App no. 5493/72 (7 December 1976).
Høiness v. Norway App no. 43624/14 (19 March 2019).
Husovec, M., ‘Rising above liability: The Digital Services Act as a blueprint for the second generation of global Internet rules’ (2023) 38 Berkeley Technology Law Journal 883–920. https://doi.org/10.2139/ssrn.4598426.
Husovec, M., Grote, T., Mazhar, Y., Mikhaeil, C., Escalona, H. M., Kumar, P. S. and Sreenath, S., ‘Grand confusion after Sanchez v. France: Seven reasons for concern about Strasbourg jurisprudence on intermediaries’ (2024) Maastricht Journal of European and Comparative Law (Ahead of print). https://doi.org/10.1177/1023263X241268436.
Khadija Ismayilova v. Azerbaijan App nos. 65286/13 and 57270/14 (10 January 2019).
Kim, N. S. and Telman, D. A. J., ‘Internet Giants as Quasi-Governmental Actors and the Limits of Contractual Consent’ (2015) 3 Missouri Law Review 723–70.
Korpisaari, P., ‘From Delfi to Sanchez – when can an online communication platform be responsible for third-party comments? An analysis of the practice of the ECtHR and some reflections on the Digital Services Act’ (2022) 2 Journal of Media Law 352–77. https://doi.org/10.1080/17577632.2022.2148335.
Korpisaari, P., ‘Sanchez v. France: ECtHR judgment raises questions about politician’s liability to moderate his own Facebook wall’ (2023) Journal of Media Law 1–12. https://doi.org/10.1080/17577632.2023.2287954.
Kovács, A., ‘Botok, automatizált fiókok a közösségi médiában’ (Bots, automated accounts on social media) in Nagy, M. and Fazekas, M. (eds), Jogi Tanulmányok 2022 (Eötvös Loránd Tudományegyetem Állam- és Jogtudományi Kar Állam- és Jogtudományi Doktori Iskola 2022) 209–23. https://doi.org/10.56966/2022.14.Kovacs.
Kovács, I., Polyák, G. and Urbán, Á., Media landscape after a long storm. The Hungarian media politics since 2010 (Mérték Médiaelemző Műhely 2021).
Le Pen v. France App no. 18788/09 (20 April 2010).
Lemmens, K., ‘Freedom of Expression on the Internet after Sanchez v. France: How the European Court of Human Rights Accepts Third-Party Censorship’ (2022) 3 European Convention on Human Rights Law Review 525–50. https://doi.org/10.1163/26663236-bja10046.
Lendvai, G. F., ‘Media in War: An Overview of the European Restrictions on Russian Media’ (2023) 8 European Papers 1235–45. https://doi.org/10.15166/2499-8249/715.
Magyar Tartalomszolgáltatók Egyesülete and Index.hu Zrt. v. Hungary App no. 22947/13 (2 February 2016).
Melnychuk v. Ukraine App no. 7707/02 (19 October 2004).
Müller and Others v. Switzerland App no. 10737/84 (24 May 1988).
Observer and Guardian v. the United Kingdom App no. 13585/88 (26 November 1991).
OHCHR: The Rabat Plan of Action, A/HRC/22/17/Add.4.
Oster, J., Media Freedom as a Fundamental Right (Cambridge University Press 2015). https://doi.org/10.1017/CBO9781316162736.
Oster, J., European and International Media Law (Cambridge University Press 2017). https://doi.org/10.1017/9781139208116.
Peguera, M., ‘The DMCA Safe Harbors and Their European Counterparts: A Comparative Analysis of Some Common Problems’ (2009) 4 Columbia Journal of Law & the Arts 481–512.
Penney, J., ‘Understanding Chilling Effects’ (2021) 106 Minnesota Law Review 1451–530.
Perinçek v. Switzerland App no. 27510/08 (15 October 2015).
Pihl v. Sweden App no. 74742/14 (9 March 2017).
Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation).
Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act).
Sanchez v. France App no. 45581/15 (2 September 2021).
Sanchez v. France App no. 45581/15 (15 May 2023).
Stoll v. Switzerland App no. 69698/01 (10 December 2007).
Tamiz v. United Kingdom App no. 3877/14 (19 September 2017).
UNHRC Report of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression. 2011, UN Doc A/HRC/17/27.
Uran, P., ‘Freedom of Expression as the Cornerstone of Democracy’ (2010) 3 International Journal of Arts and Sciences 483–93.
van de Kerkhof, J., ‘Sanchez v. France: The Expansion of Intermediary Liability in the Context of Online Hate Speech’, Strasbourg Observers (17 July 2023) <https://strasbourgobservers.com/2023/07/17/sanchez-v-france-the-expansion-of-intermediary-liability-in-the-context-of-online-hate-speech> accessed 10 October 2024.
Van Hoboken, J., Quintais, J., Poort, J. and Eijk, N., Hosting intermediary services and illegal content online. An analysis of the scope of article 14 ECD in light of developments in the online service landscape: final report (Luxembourg, European Commission 2018).
Vérités Santé Pratique Sarl v. France App no. 74766/01 (1 December 2005).
Voorhoof, D., ‘Same Standards, Different Tools? The ECtHR and the Protection and Limitations of Freedom of Expression in the Digital Environment’ in O’Boyle, M. (ed), Human Rights Challenges in the Digital Age: Judicial Perspectives (Council of Europe 2020) 11–46.
Voorhoof, D., ‘European Court of Human Rights: Sanchez v. France’ (2021) 1 IRIS 1–4.
Willem v. France App no. 10883/05 (16 July 2009).
Wilman, F., The Responsibility of Online Intermediaries for Illegal User Content in the EU and the US (Elgar 2020). https://doi.org/10.4337/9781839104831.
Wingrove v. the United Kingdom App no. 17419/90 (25 November 1996).
Links
Link1: ‘European Court of Human Rights: Dangerous decision for free speech online’, Article-19 (15 May 2022) <www.article19.org/resources/european-court-of-human-rights-dangerous-decision-for-free-speech-online> accessed 10 October 2024.
Link2: ‘Intervention in Sanchez v. France’, Media Defence (8 April 2022) <https://www.mediadefence.org/wp-content/uploads/2022/05/20220408-FINAL-Sanchez-v-France-MD-EFF-Written-Comments-1.pdf> accessed 10 October 2024.
Link3: ‘The return of digital authoritarianism’, AccessNow (24 May 2022) <www.accessnow.org/cms/assets/uploads/2022/05/2021-KIO-Report-May-24-2022.pdf> accessed 10 October 2024.
Communication from the Commission to the Council, the European Parliament, the Economic and Social Committee and the Committee of the Regions: Principles and guidelines for the Community's audiovisual policy in the digital age. COM/99/0657 final.
On different platform regulation concepts cf. Bietti (2023) 32–40.
Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (Directive on electronic commerce).
E-Commerce Directive Article 12.
E-Commerce Directive Article 13.
E-Commerce Directive Article 14.
The procedure first appeared in the US Digital Millennium Copyright Act (Pub. L. No. 105–304, 112 Stat. 2860 (Oct. 28, 1998)), but it applies only to copyright infringement; Peguera (2009).
However, it is important to note that under Article 14(3) of the E-Commerce Directive, Member States have the possibility to establish procedures to regulate the removal of information or the withdrawal of access.
Oster (2017) 234–36.
It should be stressed, however, that according to Recital 47 of the E-Commerce Directive: ‘this does not concern monitoring obligations in a specific case’.
Van Hoboken et al. (2018) 45–47.
Delfi AS v. Estonia App no. 64569/09 (16 June 2015), [144]–[161]; Magyar Tartalomszolgáltatók Egyesülete and Index.hu Zrt. v. Hungary App no. 22947/13 (2 February 2016), [80]–[85]; Pihl v. Sweden App no. 74742/14 (9 March 2017).
Tamiz v. United Kingdom App no. 3877/14 (19 September 2017), [85]; Delfi AS v. Estonia App no. 64569/09 (16 June 2015), [159].
Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation).
Bradford (2023). Also, cf. Wilman (2020) and Gosztonyi and Lendvai (2024).
Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act).
DSA Articles 4–6.
DSA Article 16.
DSA Article 16(2).
DSA Article 16(3).
DSA Article 16(6).
DSA Article 8. Cf. Gosztonyi et al. (2024).
Korpisaari (2022) 369.
Magyar Tartalomszolgáltatók Egyesülete and Index.hu Zrt. v. Hungary App no. 22947/13 (2 February 2016).
Penney (2021); UNHRC: Report of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression. 2011, UN Doc A/HRC/17/27, 26., 28.
Oster (2015) 123–24.
Handyside v. United Kingdom App no. 5493/72 (7 December 1976); Perinçek v. Switzerland App no. 27510/08 (15 October 2015).
Handyside v. United Kingdom App no. 5493/72 (7 December 1976), [49]; Dink v. Turkey App nos. 2668/07, 6102/08, 30079/08 et al. (14 September 2010), [137]; Vérités Santé Pratique Sarl v. France App no. 74766/01 (1 December 2005); Observer and Guardian v. the United Kingdom App no. 13585/88 (26 November 1991), [59]; Khadija Ismayilova v. Azerbaijan App nos. 65286/13 57270/14 (10 January 2019), [158].
Uran (2010) 483–85.
Le Pen v. France App no. 18788/09 (20 April 2010); Féret v. Belgium App no. 15615/07 (16 July 2009).
Sanchez v. France App no. 45581/15 (15 May 2023), [112]; Cengiz and Others v. Turkey App nos 48226/10 and 14027/11 (1 December 2015), [49].
Sanchez v. France App no. 45581/15 (15 May 2023), [144]; Perinçek v. Switzerland App no. 27510/08 (15 October 2015), [153].
Sanchez v. France App no. 45581/15 (15 May 2023), [146]; Willem v. France App no. 10883/05 (16 July 2009).
Wingrove v. the United Kingdom App no. 17419/90 (25 November 1996), [58]; Stoll v. Switzerland App no. 69698/01 (10 December 2007), [106].
Melnychuk v. Ukraine App no. 7707/02 (19 October 2004), [2]; Magyar Helsinki Bizottság v. Hungary App no. 18030/11 (8 November 2016), [163].
Cf. Lendvai (2023).
Cf. Gowder (2023).
Kovács (2022) 209–12.