In August 2017, ARTICLE 19 analysed the Act to Improve Enforcement of the Law on Social Networks (the Act), which the Federal Council (Upper Chamber) of the German Parliament approved on 7 July 2017 and which is expected to be published in the Gazette shortly. The Act will enter into force on 1 October 2017. ARTICLE 19 previously analysed an earlier draft of the Act.
ARTICLE 19 is deeply concerned that the Act will severely undermine freedom of expression in Germany, and that it is already setting a dangerous example for other countries that apply criminal provisions more vigorously to quash dissent and criticism, including against journalists and human rights defenders.
The Act establishes an intermediary liability regime that incentivises, through severe administrative penalties of up to 5 million Euros, the removal and blocking of “clearly violating content” and “violating content” within time periods of 24 hours and 7 days respectively. Because these are regulatory offences, the maximum sanction may be multiplied by ten, to 50 million Euros.
Though the Act does not create new content restrictions, it compels content removals on the basis of select provisions of the German Criminal Code. Many of these provisions raise serious freedom of expression concerns in and of themselves, including prohibitions on “defamation of religion”, broad concepts of “hate speech”, and criminal defamation and insult. Deputising private companies to engage in censorship on the basis of these provisions is deeply troubling, not least because such restrictions should not be criminal offences in the first place.
We are particularly concerned that the obligation to remove or block content applies without any prior determination by a court of the legality of the content at issue, and with no guidance to Social Networks on respecting the right to freedom of expression. Such private enterprises are not competent to make these complex factual and legal determinations, and the Act provides no recourse to users whose content is unfairly blocked or deleted. Though the Act creates a system for recognising “self-regulation institutions” to act as secondary review bodies for “unlawful content”, this recognition is conditional and is granted by an administrative body that is not insulated from political influence.
The likelihood of Social Networks being over-vigorous in deleting or blocking content is compounded by the legal uncertainty pervading the Act. The threshold at which failures in a Social Network’s content removal and blocking processes will be considered systemic enough to attract administrative liability is unclear, and ambiguity in the definitions of key terms (including “Social Network”) is likely to create an environment in which lawful content is routinely blocked or removed as a precaution. The secondary review that would be provided by “self-regulation institutions” and the limited oversight provided by the Administrative Courts do nothing to address over-blocking, and provide little protection or due process to Social Networks that, in good faith, refrain from blocking or removing content in order to respect freedom of expression.
Summary of recommendations
- The Act should be repealed, with consideration given to retaining Section 2 on reporting requirements in alternative legislation to increase transparency around online content moderation by private actors;
- The German Criminal Code should be comprehensively revised to remove offences that are not compatible with international human rights law on freedom of expression, including but not limited to those listed in the Act.