Following the leak of the EU’s draft Terrorist Content Regulation (Draft Regulation) on 6 March 2020, ARTICLE 19 has set out, in a new briefing, our serious concerns with what is, in our view, an extremely regressive piece of legislation that fails to protect human rights, in particular the rights to freedom of expression, privacy and data protection. We are particularly concerned about proposals for mandatory filters, obligations to remove broadly defined terrorist content within one hour, and insufficient procedural safeguards for the protection of freedom of expression and privacy online.
The Draft Regulation is currently under discussion in trilogue negotiations between the European Parliament, the European Commission and the Council of the European Union. As the end of the trilogue negotiations draws near, ARTICLE 19 urges the Commission and the Council to follow the lead of the European Parliament, protect users’ rights to freedom of expression and data protection, and amend the Draft Regulation accordingly.
The Commission and Council’s proposals for the Draft Regulation continue to display the fanciful thinking that automated filters can resolve difficult free speech issues within one hour, and that rights can be protected as an afterthought through remedies that are, more often than not, underdeveloped and under-funded both within companies and at state level. The drafters appear not to have considered the potential impact of automated filters on the freedom of speech of marginalised communities. In addition, the proposals barely mention the General Data Protection Regulation (GDPR), despite the fact that filters interfere with the rights to privacy and data protection.
Importantly, it is not clear that the Draft Regulation is necessary, coming as it does on the heels of the Terrorism Directive, which was also supposed to deal with online terrorist content. The Directive is still being implemented at domestic level, so its impact has yet to be determined.
Although not perfect, the European Parliament’s proposal goes a long way towards mitigating the worst aspects of the Draft Regulation. As such, ARTICLE 19 largely supports it, along with the recommendations of our partners at European Digital Rights (EDRi).
We call on the EU institutions to protect freedom of expression by:
- Limiting the scope of the Terrorist Content Regulation to the public dissemination of ‘terrorist’ content so that private messaging and other communication services are out of scope;
- Making clear that content published for journalistic, educational, research, artistic or other lawful purposes is excluded from the definition of terrorism;
- Ensuring that removal orders are made by courts or tribunals and that compliance takes place within a reasonable timeframe, e.g. seven days. In urgent cases, e.g. where someone’s life is at risk, law enforcement should be given statutory powers to order the immediate removal of, or blocking of access to, the content at issue. Any such order should be confirmed by a court within a specified period, e.g. 48 hours;
- Ensuring that there is no general obligation to monitor or mandatory proactive measures. Specific measures should only be ordered by a court and only apply to content that is identical to content that has been declared unlawful by a court;
- Ensuring that hosting providers are transparent about their use of automated tools, including by providing meaningful information allowing the public to understand how algorithms are used to moderate ‘terrorist’ content; they should also ensure that automated tools are only used to identify or detect content rather than to remove it, which should involve human decision-making and verification;
- Ensuring that users have a right to challenge removal decisions as part of the cross-border consultation or cooperation processes. Further, or in the alternative, the European institutions should explore the possibility of a public interest advocate or defender of fundamental rights being informed of removal decisions, so that they can raise their compatibility with fundamental rights and take the matter to court, where appropriate. Remedies and complaint mechanisms should be appropriately resourced to be effective.
In addition, we believe that the relationship between the Draft Regulation and other relevant EU legislation, including the E-Commerce Directive and the Audio-Visual Media Services Directive, should be clarified. Moreover, the European institutions should explain the extent to which the proposed measures under the Terrorist Content Regulation are compatible with the GDPR.
The Draft Regulation could significantly influence the types of measures we see in the upcoming EU Digital Services Act. No one is denying that governments should do something about terrorist content, but they must also respect our free speech and data protection rights. As the end of the trialogue negotiations approaches, now is the time for the EU institutions to negotiate an alternative that will protect those European values.