ARTICLE 19 calls on the EU institutions to ensure strong protection of freedom of expression in the draft Regulation on Terrorist Content Online. Current proposals do not go far enough: mandatory automated censorship is not ruled out and cross-border removal orders could lead to a race to the bottom on the protection of freedom of expression. We urge EU lawmakers to listen to the concerns and recommendations of civil society to ensure that the draft Regulation fully respects fundamental rights.
On 29 September 2020, the German Presidency released its proposal for the EU Regulation on preventing the dissemination of Terrorist Content Online (TCO). This proposal comes on the heels of the fourth trilogue meeting between the European Parliament, the European Commission and the Council of the European Union on 24 September 2020. ARTICLE 19 had previously analysed a leaked draft Regulation in March 2020. Unfortunately, this new proposal largely fails to address the concerns we outlined then.
We call on the rapporteurs leading the negotiations to stand up for freedom of expression and the other rights of internet users, and to heed our recommendations. In particular:
- The overbroad definition of terrorist content must be removed or narrowed: Under the new proposal, the definition of ‘terrorist content’ would include “material that is actually used by terrorist groups for recruitment and radicalisation purposes”, meaning that “material soliciting criminal activities of a terrorist group should be included”. Although the German Presidency says that this proposal is in line with the spirit of the Directive on Combating Terrorism, which criminalises supporting activities, as well as with the jurisprudence of the European Court of Human Rights, it does not explain why this is the case, nor does it cite any case-law from the European Court of Human Rights in support of its assertion. As Access Now and EDRi have noted, the newly proposed terms, such as supplying information or material resources, funding terrorist activities in any way, or otherwise supporting criminal activities, are very vague and highly likely to lead to abuse and overreach by the authorities. These terms should be removed or further narrowed.
- The scope of the free expression exemption must be broadened: The overbroad definition is not remedied by the proposed exemption for “freedom of expression and information, the freedom of the arts and sciences, the freedom and pluralism of the media protected under Union law”. While welcome, this exemption is insufficient. ARTICLE 19 reminds EU legislators that when the drafters of the EU Charter of Fundamental Rights and the European Convention on Human Rights put forward text protecting the right to freedom of expression, they did not specify that the right was protected only for a limited number of purposes. The danger is that content might be removed even though it is perfectly legitimate for other reasons. Humour, for instance, is not listed as a ‘legitimate purpose’ that would shield content from removal. This means that an individual making a joke in bad taste could see their content removed because it does not fall under the limited exceptions proposed by the German Presidency. Equally, ‘freedom of the media’ suggests a very narrow understanding of journalism, one that excludes civil society, bloggers and others who disseminate information in the public interest. Freedom of expression should not be protected à la carte for a limited number of purposes deemed legitimate by governments. We therefore invite EU legislators to broaden the scope of the exemption for the protection of freedom of expression.
- Content removal orders must be made by courts and be geographically limited: The German Presidency’s proposal is essentially premised on the assumption that content must be removed first and mistakes rectified, if at all, later. It thereby inverts the principle and the exception that underpin the protection of human rights. Under international human rights law, states must justify restrictions on fundamental rights, not the other way around. The question is therefore not whether a provider has “reasonable grounds to believe that the removal order manifestly and seriously breaches the fundamental rights and freedoms set out in the EU-Charter of Fundamental Rights”. It is for the authorities to seek a court order and demonstrate that the content at issue is unlawful and that a removal order is necessary and proportionate.
Moreover, the Presidency wrongly assumes that the courts of all Member States interpret ‘incitement to terrorism’ laws in the same way. Terrorism is not exclusively international in nature; it often reflects domestic or regional issues. In 2017, for instance, the Spanish authorities sought to block websites encouraging participation in the referendum on Catalan independence. It is not difficult to see how such calls could rapidly be equated with incitement to terrorism when tensions are high. What might be dubbed terrorist activity in some Member States could well be viewed as a legitimate protest movement in others. The upshot of pan-European blocking orders is that freedom of expression would be diminished across all EU countries. In our view, content removal orders should be made by courts and limited geographically to the country where the content is deemed illegal.
- Unduly short timeframes should be abandoned: Despite the inclusion of a 12-hour warning period, the timeframes for removal remain unduly short. Companies are required to remove content within one hour of receiving a removal order. That is clearly insufficient time for companies to consider whether an order infringes the right to freedom of expression and should be challenged or appealed. Instead of imposing rigid and unduly short timeframes, the text could simply refer to ‘prompt’ removals.
- Specific measures akin to general monitoring should be excluded: ARTICLE 19 is concerned that ‘specific measures’ are ultimately a byword for proactive measures, i.e. automated filters. It is unclear how particular images, let alone text, can be blocked without filtering every other piece of content online. In other words, we are concerned that specific monitoring requires general monitoring to take place. Moreover, even if a particular image is illegal when used with intent to incite the commission of acts of terrorism, it may well be legitimate when used for journalistic purposes. Filters are currently unable to tell the difference or take context into account. In practice, this means that specific measures may well end up targeting perfectly legitimate content. In our view, ‘specific measures’ should be removed from the text.
- Designated competent authorities must be independent: ARTICLE 19 is very concerned by proposals that seek to avoid a requirement for removal orders to be issued by courts or independent administrative authorities. It is simply not good enough for law enforcement authorities to issue such orders. Insofar as other public authorities such as broadcasting regulators might be put in charge of issuing orders, we note that their independence is likely to be called into question in some Member States. It is vital for the independence of any authority issuing content removal orders to be guaranteed in the text.
- Public authorities should not be able to rely on referrals: We note that the Presidency’s proposal retains the references to ‘referrals’ found in earlier versions of the draft Terrorist Content Online Regulation. As we have noted previously, referrals are not good practice. They enable law enforcement agencies and other public authorities to sidestep the normal legal process. They are also generally harder to challenge, since companies retain extremely wide discretion in applying their terms of service. Public authorities should not be able to rely on referrals.