On 19 March 2020, Facebook announced that due to the Covid-19 pandemic it was sending its content moderators home, and as a result making changes to the way it enforced its community standards. This meant a switch to heavier reliance on automation to remove content, and a suspension of its appeals process. The full impact of these measures on freedom of expression has yet to be seen.
Although Facebook has said that access to information was crucial to the success of efforts to tackle the pandemic, its own transparency reporting continues to offer insufficient information about the enforcement of its community standards. ARTICLE 19 urges Facebook to provide more clarity about the details of its content moderation practices, including the full impact of AI-based content moderation on freedom of expression.
The latest community enforcement report: January-March 2020
Facebook’s latest report on enforcement of its community standards covers the period from January to March 2020. In particular, it shows that:
- There has been a marked increase in content taken down in the areas of nudity, hate speech and dangerous organisations. Whereas Facebook took action on about 25 million pieces of nudity-related content in January-March 2019, it took action on nearly 40 million pieces of such content in the same period in 2020. In relation to hate speech, Facebook took action on about 3.5 million pieces of content in January-March 2019, whereas nearly 10 million pieces of content were actioned in the same period in 2020. The Community Enforcement Report also shows that Facebook significantly increased the action taken on organized hate as part of its dangerous organisations policy in January-March 2020 (nearly 5 million pieces of content) compared with the previous quarter, October-December 2019 (about 1.8 million).
- Facebook calculates the prevalence of nudity and related content, i.e. the proportion of content views that were of violating content (a sketch of how such a metric is typically calculated is set out after this list). It estimates that prevalence was just under 0.15% in January-March 2019 and closer to 0.05% in January-March 2020. This prevalence calculation is not yet available for hate speech or organized hate-related content.
- Content reinstated on appeal has decreased for nudity-related content: whereas nearly 4 million pieces of content had been appealed in April-June 2019, this was down to about 2.3 million in January-March 2020. The number of pieces of content restored has also decreased, from just over one million in April-June 2019 to just over 600,000 in January-March 2020.
- The number of appeals on hate speech grounds remains stable, hovering around 1.3 million in both January-March 2019 and the same period in 2020. Since a peak of around 170,000 pieces of content restored in July-September 2019, the amount of reinstated content has steadily decreased to just over 60,000.
- Appeals against decisions on organized hate have increased but have not led to a significant amount of reinstated content. Appeals against content actioned on grounds of organized hate have increased since October-December 2019, reaching just over 230,000 in January-March 2020. The amount of restored content is low, at about 50,000 pieces. Some content is restored automatically, primarily terrorist-related content, with nearly 300,000 pieces restored without appeal.
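Facebook does not publish the precise formula behind its prevalence metric. As a hedged sketch, consistent with the company’s public description of prevalence as the estimated percentage of content views that were of violating content (the sampling and estimation details are not disclosed, so the formula below is our reading rather than Facebook’s published method):

$$
\text{prevalence} \approx \frac{\text{estimated views of violating content in the period}}{\text{total content views in the period}} \times 100\%
$$

On this reading, a prevalence of 0.05% would mean that roughly 5 in every 10,000 content views were of violating nudity-related content.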
Insufficient information and impact of the pandemic
ARTICLE 19 believes that Facebook’s Community Enforcement Report does not go far enough and fails to provide a genuinely transparent account of the content removal decisions Facebook makes. A significant amount of information continues to be absent:
- There is no information about the amount of content removed as a result of flagging by trusted flaggers, or about who those flaggers are, including whether they include law enforcement or other government agencies.
- When Facebook refers to ‘action’ being taken, it is unclear whether this refers to posts, pages or accounts.
- The report fails to break down removal decisions by country, so it is difficult to draw firm conclusions on how moderation practices have affected online expression.
- It would be helpful to provide information about ‘actioned’ content on a monthly basis in order to link content removals more closely to particular events, such as elections.
- Facebook provides information about content reinstated ‘without appeal’, yet no information is provided about how this automated reinstatement process works.
Facebook further seems to suggest that the prevalence of certain types of content, such as adult nudity and sexual activity, has decreased because its algorithms are getting better at detecting content that is identical or near-identical to existing violations in its database. While this may well be true, and appears to be borne out by the lower numbers of appeals and reinstated content in the nudity category, Facebook does not explain how those algorithms have been improved or the extent to which those improvements account for the increase in content removed. Given the well-documented problem of wrongful takedowns, including the removal of artists’ and journalists’ content and pages, we believe Facebook must provide more information to explain how its algorithms distinguish ‘identical’ content when the context or purpose of publication (e.g. humour) is different from that of the original content.
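Facebook does not disclose how this matching works. Purely as an illustration, and not as a description of Facebook’s actual system, the sketch below shows a generic hash-based matcher of the kind commonly used to detect re-uploads of known violating media; every name in it is hypothetical. It makes concrete why this style of detection cannot, by itself, distinguish a journalistic or satirical re-use of an image from a straightforward re-upload.

```python
import hashlib

# Hypothetical sketch of a generic "identical or near-identical" matcher.
# Not Facebook's system: its actual methods are not public.

known_violation_hashes: set = set()  # fingerprints of previously removed items


def fingerprint(content: bytes) -> str:
    # Real media-matching systems typically use perceptual hashes that
    # survive re-encoding or cropping; an exact SHA-256 hash is used here
    # only to keep the sketch short.
    return hashlib.sha256(content).hexdigest()


def matches_known_violation(content: bytes) -> bool:
    # The decision depends only on the bytes of the content itself: the
    # identity of the poster, the caption and the purpose of publication
    # (e.g. news reporting, satire) are invisible at this stage.
    return fingerprint(content) in known_violation_hashes
```

Any assessment of context has to happen in a separate step, and it is precisely that step which Facebook’s reporting does not describe.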
More generally, these numbers do not help us understand how Facebook’s pandemic-related changes to its operations have affected freedom of expression. For instance, it is impossible to tell what impact, if any, the rise in hate speech and discrimination against specific groups during the pandemic has had on online hate speech, and consequently on content moderation decisions. In any event, ARTICLE 19 is concerned about the impact on freedom of expression of suspending Facebook’s appeals process and of its increased reliance on automation to remove content. The Center for Democracy & Technology has, for example, stated that algorithms may have an error rate of up to 20%; purely as an illustration, an error rate of that order applied to the nearly 10 million pieces of hate speech content actioned in January-March 2020 would amount to roughly 2 million erroneous decisions in a single quarter. These systems are currently unable to take context duly into account and as such are prone to making mistakes. It is therefore important that changes implemented because of the pandemic, an extraordinary situation, do not become the “new normal” for Facebook.
Recommendations to Facebook
Facebook still has a long way to go in terms of transparency and accountability for its content moderation processes. With the changes in its enforcement processes during the pandemic, it is now more important than ever that the company take action to address these problems and prevent violations of freedom of expression in content moderation from becoming further entrenched on the platform. Below, ARTICLE 19 sets out a series of recommendations for Facebook to improve its appeals processes and transparency reporting:
General recommendations
ARTICLE 19 reiterates our calls on Facebook to:
- Create a comprehensive appeals mechanism. While this exists for some types of content, it should be applied across the board, including for account suspensions. If Facebook decides that content should be removed or other restrictive measures should be applied, users should be notified of the reasons for the decision and given a right to challenge the decision, for all types of removals.
- Improve the clarity and detail of its transparency reporting on enforcement of community standards. In particular, Facebook should provide more specific information about the following (a purely illustrative sketch of what one row of such reporting could look like is set out after this list):
- Types of content being ‘actioned’, e.g. text, video, account;
- Types of action being taken: whether content is removed, downranked or labelled;
- List of trusted flaggers per country and per category of content;
- Action taken on the basis of community standards (or local law) as a result of flagging from a trusted flagger, including when those trusted flaggers are law enforcement or other government agencies;[1]
- Content restrictions, especially removals, by country or region relative to the amount of content posted in that country or region. Consideration should be given to publishing data about content restrictions on a monthly basis;
- Time taken to review appeals and reach a decision on them;
- Facebook’s estimate of the error rate of its algorithms per type of content;
- Number of pieces of content flagged by users, trusted flaggers or algorithms as violating Facebook’s community standards that are not removed;
- More information about its internal processes, including how and why content is reinstated without appeal.
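The sketch below is purely illustrative: it restates the recommendations above as one possible shape for a row of more granular reporting. None of the field names or categories are taken from Facebook’s actual reports; they are our own hypothetical labels.

```python
from dataclasses import dataclass
from typing import Optional


# Hypothetical example of what one row of more granular enforcement
# reporting could contain; all field names are illustrative only.
@dataclass
class EnforcementReportRow:
    month: str                    # e.g. "2020-03" (monthly rather than quarterly)
    country: str                  # country or region where the content was posted
    policy_area: str              # e.g. "hate speech", "organized hate"
    content_type: str             # "text", "image", "video", "page" or "account"
    action: str                   # "removed", "downranked" or "labelled"
    flag_source: str              # "user", "trusted flagger" or "algorithm"
    trusted_flagger: Optional[str] = None  # named flagger, incl. state agencies
    appeals_received: int = 0
    content_restored_on_appeal: int = 0
    content_restored_without_appeal: int = 0
    median_days_to_decide_appeal: Optional[float] = None
    estimated_algorithm_error_rate: Optional[float] = None
```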
These changes should be implemented across the board and as part of the company’s specific Covid-19 transparency report covering the period since 19 March.
Covid-19 specific recommendations
While Facebook acknowledges that the latest report does not fully reflect the impact of Covid-19-related measures, the company must go further and commit to providing a full report on content moderation and curation during this period. In particular, ARTICLE 19 urges Facebook to take the following steps:
- Facebook must ensure that increased reliance on AI to remove content without human review does not become the new norm. In particular, we urge the company to provide a timeline for its efforts to restore full human content review across its services. The company should also share details of its contingency plans and of the human rights due diligence it says it has conducted on the reduced service.
- Facebook should produce a specific transparency report for the period in which its content moderation processes have been affected by the pandemic, i.e. from March 2020 until ‘normal’ service is resumed. This report should identify any noticeable trends that are likely to be the result of the changes made to Facebook’s processes during the pandemic.
- Facebook must ensure that data on content removed automatically during this period of reduced human review is preserved, so that appeals and the restoration of wrongly removed content remain available to users as remedies at a later date.
[1] The focus of this statement is Facebook’s transparency report on the enforcement of its community standards, but we note that this information is also missing from its transparency report on legal requests and should be provided there too.