On 20 July 2022, ARTICLE 19 and Digital Security Lab Ukraine submitted a third-party intervention to the European Court of Human Rights in a case concerning a user’s liability for third-party comments posted under his social media post. The Applicant in the case was ordered by a national court to remove the comments and to pay damages. This case is extremely concerning as it de facto imposes content moderation obligations on ordinary users, alongside social media companies. We urge the European Court to reject such an interpretation of intermediary liability and to exclude ordinary users from liability for third-party comments.
In the case of Patrascu v Romania, the applicant, an ordinary Facebook user, had been using the platform to publish content related to his love of opera. His Facebook posts were open to third-party comments. He was later found liable by the High Court for comments posted under one of his posts concerning tensions within the Bucharest National Opera. The Applicant was ordered to delete the comments and pay damages to two employees of the Bucharest National Opera. The national court relied heavily on the European Court’s earlier judgment in Delfi AS v Estonia, in which the Court established the criteria for assessing the proportionality of an interference with the freedom of expression of an intermediary.
In our view, people who use social media for their own personal comments should under no circumstances be held liable for comments posted under or alongside their social media posts. Social media companies are themselves already struggling with content moderation, and ordinary users lack the capacity and skills to act as content moderators. If ordinary individuals are given the same responsibilities as social media companies, they will most likely disable comments on their social media accounts altogether. Since online exchanges in comment threads are a source of information and contribute to diversifying online engagement, removing these sections severely restricts not only users’ own right to information but also others’ right to freedom of expression. This would amount to a form of self-censorship and could deter individuals from using social media to participate in public debate.
In our submission, we urge the Court to reject the idea that ordinary users should be held liable for third-party comments, on the following grounds:
- Social media users should not be the ones to decide whether content is ‘legal’ or not; that determination can only be made by a court or other independent authority.
- Imposing on ordinary users obligations similar to those imposed on social media platforms is deeply problematic. Platforms themselves already struggle to apply their content moderation practices consistently. Content moderation is a complex undertaking, and ordinary users are not equipped to determine which content should be removed.
- In the wider context, this form of liability could be instrumentalised to target specific people or organisations. For example, individuals could post comments under a person’s posts with the deliberate intention of exposing that person to liability. Potential targets include civil society organisations as well as ordinary users who find themselves on the receiving end of another individual’s ill intent.