In today’s world, dominant tech companies hold a considerable degree of control over what their users see or hear daily. Current practices of content moderation on social media offer very little transparency and virtually no remedy to individual users. The impact that content moderation and distribution (in other words, the composition of users’ feeds and the accessibility and visibility of content on social media) have on the public sphere is not yet fully understood, but legitimate concerns have been expressed, especially in relation to platforms that operate at such a level of market dominance that they can exert decisive influence on public debates.
This raises questions under international law on freedom of expression and has become a major issue for democratic societies. Legitimate concerns motivate various efforts to address this issue, particularly regarding the capacity of giant social media platforms to influence the public sphere. However, as with many modern communication technologies, the benefits that individuals and societies derive from the existence of these platforms should not be ignored. The responsibilities of the largest social media companies are currently being debated in legislative, policy and academic circles across the globe, but many of the initiatives put forward do not sufficiently account for the protection of freedom of expression.
In this consultation paper, ARTICLE 19 outlines a roadmap for the creation of what we have called Social Media Councils (SMCs), a model for a multi-stakeholder accountability mechanism for content moderation on social media. SMCs aim to provide an open, transparent, accountable and participatory forum to address content moderation issues on social media platforms on the basis of international human rights standards. The Social Media Council model puts forward a voluntary approach to the oversight of content moderation: participants (social media platforms and all stakeholders) sign up to a mechanism that does not create legal obligations. Its strength and efficiency rely on voluntary compliance: by signing up, platforms commit to respecting and implementing the SMC’s decisions (or recommendations) in good faith.
With this document, we present these different options and submit them to a public consultation. The key issues we seek to address through this consultation are:
- Substantive standards: could SMCs apply international standards directly or should they apply a ‘Code of Human Rights Principles for Content Moderation’?
- Functions of SMCs: should SMCs have a purely advisory role or should they be able to review individual cases?
- Global or national: should SMCs be created at the national level or should there be one global SMC?
- Subject-matter jurisdiction: should SMCs deal with all content moderation decisions of social media companies, or should they have a more specialised area of focus, for example a particular type of content?
The consultation also seeks input on a number of technical issues that will arise in any configuration of the SMC, such as:
- Constitution process
- Structure
- Geographic jurisdiction (for a national SMC)
- Rules of procedure (if the SMC is an appeals mechanism)
- Funding
An important dimension of the Social Media Council concept is that the proposed structure has no exact precedent: the issue of online content moderation presents a new and challenging area. Resolving the complex issues raised by the creation of this new mechanism will require a certain degree of creativity.
ARTICLE 19’s objective is to ensure that decisions on these core questions and the solutions to practical problems sought by this initiative are compatible with the requirements of international human rights standards, and are shaped by a diverse range of expertise and perspectives.
Read the consultation paper
Complete the consultation survey
[Extended deadline: 30 November 2019]