In June 2018, ARTICLE 19 reviewed the compatibility of Facebook’s Community Standards with international standards on freedom of expression. Facebook published the latest version of its Community Standards in April 2018.
Facebook’s Community Standards are divided into six sections: (i) violence and criminal behaviour; (ii) safety; (iii) objectionable content; (iv) integrity and authenticity; (v) respecting intellectual property; and (vi) content-related requests. Although the latest version of the Community Standards is much more transparent and detailed than previous iterations, our analysis shows that they continue to fall below international standards on freedom of expression. This is especially true of the Community Standards on ‘hate speech,’ ‘terrorism,’ and bullying and harassment, as well as of Facebook’s content removal procedures.
ARTICLE 19 encourages Facebook to bring its Community Standards in line with international human rights law and to continue to provide more information about the way in which those standards are applied in practice.
Summary of recommendations
- Facebook’s definition of ‘hate speech’ should be more closely aligned with international standards on freedom of expression, including Article 20(2) of the ICCPR, which requires States to prohibit advocacy of hatred that constitutes incitement to discrimination, hostility or violence. Facebook should also provide case studies or more detailed examples of the way in which it applies its policies on ‘hate speech.’ A user’s failure to express intent in relation to ‘hate speech’ (whether educational or otherwise) should not lead to automatic removal of that content in practice;
- The definitions of ‘hate organisations’ and ‘attack’ should be narrowed, including by adding a requirement of intent to harm particular groups;
- Facebook should align its definition of ‘terrorism’ with that recommended by the UN Special Rapporteur on counter-terrorism. In particular, it should avoid the use of vague terms such as ‘praise,’ ‘express support,’ ‘glorification’ or ‘promotion’;
- Facebook should give examples of organisations falling within its definition of a terrorist organisation. In particular, it should explain how it complies with various governments’ designated lists of terrorist organisations, especially in circumstances where groups designated as terrorist by one government may be considered legitimate (e.g. freedom fighters) by others. It should also provide case studies explaining how it applies its standards in practice (e.g. on beheading videos);
- Facebook should strive to narrow its definitions of bullying and harassment in order to prevent legitimate content from being removed. It should ensure that the definitions of bullying and harassment remain distinct. Facebook should explain in more detail the relationship between ‘threats,’ ‘harassment,’ and ‘online abuse’/‘bullying,’ and distinguish these from ‘offensive content’ (which should not, as such, be restricted). Further, Facebook should provide detailed examples or case studies of the way in which it applies its standards in practice, including with a view to ensuring protections for minority and vulnerable groups;
- Facebook should state more clearly that, as a matter of principle, offensive content will not be taken down unless it violates other rules;
- Facebook should make more explicit reference to the need to balance the protection of the right to privacy with the right to freedom of expression. In so doing, it should also make reference to the criteria developed, inter alia, in the Global Principles on the Protection of Freedom of Expression and Privacy;
- Facebook should improve its voluntary initiatives aimed at tackling ‘fake news’ by adopting a charter of ethics comparable to the highest professional standards of journalism and by involving a wide range of stakeholders;
- Facebook should explain in more detail how its algorithms detect ‘fake accounts’ or produce more ‘reliable’ results, including by listing the criteria on the basis of which these algorithms operate;
- Facebook should ensure that its appeals process complies with the Manila Principles on Intermediary Liability, particularly as regards notice, the giving of reasons, and appeals processes;
- Facebook should be more transparent about its use of algorithms to detect various types of content, such as ‘terrorist’ videos, ‘fake’ accounts or ‘hate speech’;
- Facebook should provide information about its trusted flagger system, including identifying the members of the scheme and the criteria for joining it;
- Facebook should refrain from putting in place contact points in countries with a poor record on the protection of freedom of expression;
- Facebook should provide case studies of the way in which it applies its sanctions policy;
- Facebook should provide disaggregated data on the types of sanctions it applies in its Transparency Report;
- Facebook should stop requiring its users to use their real names. It should not require users to prove their identity.