ARTICLE 19 welcomes the report of the UN Special Rapporteur on freedom of expression, David Kaye, calling upon the information and communications technology (ICT) sector to respect human rights. It will be presented and discussed by states and other stakeholders during the UN Human Rights Council’s 32nd Session in June 2016 (HRC 32).
In our submission to the report, we emphasised that the Internet has fundamentally changed the way in which people communicate and go about their daily lives, and that understanding the role and responsibilities of private actors is key to protecting freedom of expression online.
“The Internet is the front line in the battle for freedom of expression and civic space,” said Thomas Hughes, Executive Director at ARTICLE 19.
“The pressures that States and others exert on private actors to control the information people can access and share have received too little attention from the UN, which this report redresses. At the same time, the report makes clear that the private sector, as well as the Internet standard-setting organisations managing and maintaining the Internet’s infrastructure, can and should do a lot more to protect and promote freedom of expression,” Hughes added.
The Special Rapporteur’s report recognises that it is overwhelmingly private actors who create, maintain and operate the infrastructure and spaces for freedom of expression and access to information. These actors are diverse, comprising an “ecosystem” that includes telecommunication providers, hardware and software developers, email and social media providers, as well as hosting and search companies. Technical bodies and regulators at the international and national level also play a key role. The Special Rapporteur identifies that while private actors have vastly expanded the opportunities for free expression, the decisions they take and the pressures they are subject to have a profound impact on the exercise of rights.
The report is intended as an initial mapping of the private actors in the ICT sector that influence freedom of expression; it identifies legal and policy issues at the intersection of state and private sector conduct, and outlines areas where further normative guidance will be provided in future reports.
Among the numerous concerns outlined in the report, the Special Rapporteur identifies the following:
Content restrictions
The Special Rapporteur raises concerns about the proliferation of vague laws targeting online content, whose broad offences essentially give authorities unfettered discretion to criminalise any expression they find objectionable.
“Excessive intermediary liability” is also singled out as a key problem, in particular where private companies are required or pressured, by law or through extra-legal measures, to regulate content transmitted through or hosted on their services. Such measures range from licensing requirements imposed on internet access providers to threats that local access to those services will be blocked. They also include requirements or legal incentives imposed on intermediaries to assess the validity of State and private actors’ requests to remove various types of content, from data protection to copyright claims. In this regard, the Special Rapporteur seriously questions whether private companies are best placed to make determinations of content illegality. He also raises questions as to the appropriate balance between the rights to privacy and protection of personal data, on the one hand, and the right to seek, receive and impart information on the other – an area in which ARTICLE 19 is taking on a leading role with the development of Principles on Freedom of Expression and Privacy.
The Special Rapporteur’s concerns are not limited to State regulation and extra-legal restrictions imposed by States, however. He also highlights a number of issues raised by companies’ internal policies and practices, from the inconsistent application of companies’ broad Terms of Service to filtering, network or service shutdowns and “zero-rated” services. The report also mentions the impact of design and engineering choices, which shape the way in which users access and view information on platforms.
Technical standards
For the first time, the Special Rapporteur recognises that technical standards have profound implications for freedom of expression, and expresses concern at the lack of sufficient human rights considerations in organisations that play a key role in developing and maintaining the Internet’s infrastructure, for example the Internet Corporation for Assigned Names and Numbers (ICANN), the Internet Engineering Task Force (IETF), and the Institute of Electrical and Electronics Engineers (IEEE). At the same time, the Special Rapporteur acknowledges that the lack of human rights perspectives in technical standards development processes is often due to a lack of sufficient technical expertise.
The report positively references initiatives to develop human rights expertise in Internet Standard Developing Organisations (ISDOs), in which ARTICLE 19 is playing a leading role. These include:
- The Human Rights Protocol Considerations group (HRPC) at the IETF, which researches whether standards and protocols can enable, strengthen or threaten human rights. Its work includes two new Internet Drafts (I-Ds) setting out the first iteration of human rights protocol considerations, which guide engineers in documenting and potentially mitigating the impact of their standards on human rights.
- The Cross Community Working Party on Human Rights (CCWP HR) at ICANN, which maps the policies, procedures and operations that impact human rights, provides information, suggestions and recommendations to chartering organisations and the ICANN community, proposes procedures and mechanisms for a human rights impact assessment, and works to ensure that the responsibility to respect human rights becomes an integral part of ICANN. Recently, ICANN committed to respecting human rights in its bylaws.
At the same time, the Special Rapporteur also recognises that few ISDOs currently undertake human rights impact assessments or allocate the necessary resources to include a human rights perspective in their work.
Surveillance and digital security
The Special Rapporteur reiterates that state surveillance and corporate data collection and retention raise substantial issues for freedom of expression. Some of the most pressing areas of concern include the way in which companies respond to legal requests for customer data, the sale of surveillance and censorship equipment, and companies’ human rights responsibilities where they become aware of covert surveillance conducted by the State. The Special Rapporteur also makes reference to encryption and anonymity, as well as the impact of mutual legal assistance treaties and data localisation laws on state surveillance.
Transparency
The Special Rapporteur emphasises the importance of transparency as a means of ensuring that the subjects of Internet regulation are able to meaningfully predict their legal obligations and challenge them where appropriate. However, he notes that there is little agreement on the scope of transparency reporting when it comes to government requests for user information or content removal. Statistics concerning companies’ compliance with such requests, or with requests made by private actors, also remain limited.
Remedies
Although the right to an effective remedy is widely recognised under international law, the Special Rapporteur notes a general lack of guidance as to what the right to an effective remedy for violations of the right to freedom of expression might look like in practice in the ICT sector. The scope of corporations’ responsibility to provide a remedy in circumstances where they interpret or enforce relevant State laws too strictly is also contested. The Special Rapporteur concludes that further analysis and research are needed in this area, including as to the appropriate role of the State in supplementing or regulating corporate remedial mechanisms.
Next steps
Looking ahead, the Special Rapporteur plans on developing detailed recommendations in the following areas:
- Restrictions on the provision of telecommunications and Internet services
- Content restrictions under terms of service and community standards
- Liability for content hosting
- Censorship and the surveillance industry
- Efforts to undermine digital security
- Internet access
- Internet governance
At a minimum, the Special Rapporteur strongly emphasises the need for private actors to develop and implement transparent human rights impact assessments. More generally, he encourages private actors to do more to consider the impact of their work on human rights, and develop strategies in line with the UN Guiding Principles on Business and Human Rights, also known as ‘the Ruggie Principles’.
Conclusion
ARTICLE 19 shares the Special Rapporteur’s concerns and strongly encourages all stakeholders, in particular states and private actors in the ICT sector, to take on board the report’s conclusions and recommendations, and engage in the future reports the Special Rapporteur plans on this topic.
In particular, ARTICLE 19 calls on all states to engage fully with the Special Rapporteur’s report at HRC 32, notably during the interactive dialogue on 16 June 2016. In the resolution on the Internet and human rights to be tabled at HRC 32, we urge states to include positive references to the Special Rapporteur’s report, as well as to the mandate’s commitment to sustained attention on this issue.