On 13 May 2020, the French National Assembly adopted the Bill on Countering Online Hatred, the so-called Avia Law (Projet de loi Avia). The Law requires the removal of “manifestly illegal” hate speech within 24 hours of notice. Systemic failure to cooperate with the authorities could lead to fines of up to 4% of global turnover. ARTICLE 19 deplores the adoption of the Law, which will incentivise over-removal of content at the expense of freedom of expression.
ARTICLE 19 has previously raised concerns that the draft Avia Law failed to comply with international standards on freedom of expression. The final version of the Law is no different: in practice, the text has not substantially changed since our last analysis. Its scope remains extremely broad, both in terms of the companies covered and the subject matter.
Removals within unduly short timeframes
A key concern is that the Law requires the removal of a broad range of “manifestly illegal” content within 24 hours of notice. Failure to remove a piece of content could lead to a EUR 250,000 fine for individuals and a EUR 1,250,000 fine for companies. Intent can be inferred from the absence of a proportionate and necessary examination of the notified content. Whilst the latter element is meant to offer some protection to intermediaries, it is unclear why a ‘proportionate and necessary’ examination of the content would be needed if it is “manifestly” unlawful.
In addition, the Avia Law now requires websites and ISPs to remove ‘terrorist’ content and child sex abuse material within one hour of being notified by a special branch of the police. It therefore seems to pre-empt the adoption at EU level of similar provisions as part of the controversial Terrorist Content Regulation.
More generally, the Avia Law seeks to make it easier for Internet users to report ‘illegal’ content. In particular, the law mandates uniform and easily accessible notification mechanisms and requires less information for notices to be valid. At the same time, notifications made in bad faith are punishable by one year’s imprisonment and a EUR 15,000 fine.
Failure to comply with the duty to cooperate can lead to fines of up to 4% of global turnover
As we noted previously, the Conseil Supérieur de l’Audiovisuel (CSA) will be tasked with overseeing that companies within scope comply with their ‘duty to cooperate’. In practice, this means compliance with a range of transparency and due process obligations, including:
- Providing clear information about their content moderation processes;
- Informing the author of the notification of its follow-up and reasons for the decision;
- Putting in place internal complaints mechanisms;
- Providing information about the technical and human means to deal with content removal requests;
- Promptly informing competent authorities about illegal content notified to them;
- Designating a single point of contact responsible for receiving requests from the judicial authorities or CSA.
The CSA can request all information necessary to perform its supervisory function and to ensure that companies within scope comply with their obligations under the law. In particular, the CSA can demand access to the principles and methods used to develop algorithms, as well as the data used to train them. In determining whether or not a platform or search engine has failed to comply with its obligations under the law, the CSA takes into account the procedural and technical/human means put in place by the company.
If the platform fails to comply with the CSA’s formal notices and recommendations, the CSA may impose a pecuniary fine. The amount must take into account the seriousness of the breach and may not exceed EUR 20 million or 4% of the total annual worldwide turnover for the previous financial year, whichever is higher. Formal notices and penalties may be made public.
In addition, the CSA can encourage the adoption of cooperation tools to prevent the dissemination of content identical or equivalent to content removed under the notification procedure. Similarly, the final version of the law retains provisions that enable administrative authorities to order the blocking and filtering of online services mirroring content that has been deemed illegal by a court decision.
A missed opportunity
The Avia Law is also a missed opportunity to tackle the challenges linked to the dominance of some social media platforms. Although the new obligations should only apply to platforms or search engines above a user threshold set by government decree, the law fails to take into account other factors in determining the applicability of the new rules, such as a company’s gatekeeping role in access to the market.
Crucially, the law fails to introduce new unbundling or interoperability requirements that would help tackle the imbalance of power between dominant platforms and users and would more meaningfully address the current information asymmetry as well as the barriers to entry in social media markets.
Instead, the Avia Law risks entrenching the gatekeeping role of the biggest players by giving them the power to decide whether content is illegal. In practice, it is highly unlikely that anyone but the most powerful companies will be able to afford the human and technological means needed to meet the targets set by the regulator.
Next steps
A group of MPs are challenging the constitutionality of the law before the Conseil constitutionnel. Notwithstanding this challenge, the adoption of the Avia law is likely to have a significant impact on the debate about the Digital Services Act (DSA) that will impose new rules on online platforms in the EU. It will put pressure on the European Commission and the European Council to design similar notice and takedown rules for illegal content whilst imposing significant fines for systemic failures to comply with the recommendations of the new regulator.
ARTICLE 19 believes that the French model seriously undermines freedom of expression. It is also likely to be counter-productive and to lead to the removal of content from the very people whose voices the law is ostensibly seeking to protect. After over 30 Facebook accounts had content removed, some LGBTQI groups are already raising the alarm that Facebook is pre-emptively applying the law by relying on aggressive filters. ARTICLE 19 supports those whose voices are not heard as part of our Missing Voices campaign. As the next battleground for freedom of expression online heads to Brussels, we will advocate for strong immunity from liability for online intermediaries and a notice-and-action model that protects free expression and due process rights.