It is estimated that over 2 billion people are now connected to the internet.
Internet intermediaries, such as internet service providers (ISPs), search engines and social media platforms, play a crucial role in enabling people around the world to communicate with each other.
Because of their technical capabilities, internet intermediaries are under increasing pressure from governments and interest groups to police online content.
At the same time, intermediaries can and do ban certain types of content, and this censorship usually takes place outside the scope of internationally recognised standards governing permissible limitations on freedom of expression.
The problem of intermediaries censoring content is further compounded by the lack of transparency in the way these restrictions are implemented; the lack of clear guidelines that users can refer to; and the absence of appropriate mechanisms for users to appeal against decisions by ISPs to censor user-generated content. This effectively means that online content is increasingly being regulated and censored via private contracts that offer limited transparency and accountability.
In response to this situation, ARTICLE 19 has developed a policy brief to address questions about intermediary liability. Drawing on international freedom of expression standards and comparative law, the brief explains how a widespread regime of liability poses risks to the exercise of freedom of expression online.
ARTICLE 19 proposes a number of alternative models that can already be found in some jurisdictions and which offer stronger protection for online freedom of expression.
We hope that this policy brief will help legislators, policy makers, judges and other stakeholders strike the right balance between the protection of freedom of expression online and the protection of other interests, such as the prevention of crime and the rights of others.
ARTICLE 19’s key recommendations:
- Web hosting providers or hosts should in principle be immune from liability for third party content when they have not been involved in modifying the content in question.
- Privatised enforcement mechanisms should be abolished. Hosts should only be required to remove content following an order issued by an independent and impartial court or other adjudicatory body, which has determined that the material at issue is unlawful. From the hosts’ perspective, such orders also provide a much greater degree of legal certainty.
- Notice-to-notice procedures should be developed as an alternative to notice-and-takedown procedures. These would allow aggrieved parties to send a notice of complaint to the host. Notice-to-notice systems should meet a minimum set of requirements, including conditions about the content of the notice and clear procedural guidelines for intermediaries to follow.
- Clear conditions should be set for content removal in cases of alleged serious criminality.
You can download our policy brief here – in English, Spanish, French and Bahasa Indonesia.