- Five years after the Manila Principles on intermediary liability were published, processes for content takedowns are still flawed
- Tech companies are set to rely more on automated content takedowns because of the coronavirus outbreak
- The need for transparency and fair appeals is even more urgent as people rely on social media for communication during the pandemic
ARTICLE 19 has urged tech companies to be more transparent about their content moderation practices and to improve their appeals processes as they enact plans to remove misinformation about coronavirus. The call comes on the fifth anniversary of the publication of the Manila Principles, a set of standards for censorship and takedown laws. Together with the Santa Clara Principles, which provide guidelines on how companies can create meaningful, fair, unbiased and proportional processes that respect the rights of platform users, they should guide the responses of social media platforms and tech companies to the coronavirus pandemic.
Acting Executive Director Quinn McKew said:
“Tech companies have a role to play in helping everyone access comprehensive and accurate information during the coronavirus pandemic, and this may include removing content.
“However, there are existing flaws in the platforms’ processes for identifying and removing content that could threaten freedom of expression during this time. There is a lack of transparency about how algorithms determine what content is removed or promoted, moderation rules are applied inconsistently, and processes for appeal are inconsistent.
“During the coronavirus pandemic, we can expect more content to be removed and a greater reliance on automated removals. More than ever, we need companies to be transparent about the content they are removing and have proper processes in place so that all users can appeal if they think that content has been taken down in error.
“The ability to communicate on social media will be vital for many people during times of social isolation. Therefore, we would urge companies to take particular care about suspending or terminating users’ accounts during this time.”
Response to coronavirus by tech companies
Last week, Twitter announced that it will temporarily rely more on technology and that its automated systems will start removing some content without human review. However, it will not permanently suspend accounts based solely on these automated enforcement systems. Twitter will also broaden its definition of “harm” to address content that “goes directly against guidance from authoritative sources of global and local public health information”.
YouTube has said that it will increase its use of automated content takedowns as fewer staff will be available to review content, and that this may lead to “increased video removals, including some videos that may not violate policies”. While the company says that users can appeal content takedowns, it admits that “workforce precautions will also result in delayed appeal reviews”.
Facebook has said that with “a reduced and remote workforce, we will now rely more on our automated systems to detect and remove violating content and disable accounts”. Last Tuesday, Facebook incorrectly removed a large number of posts, many of which were reported to have been about the coronavirus. Guy Rosen, Vice President of Integrity, later tweeted: “We’ve restored all the posts that were incorrectly removed, which included posts on all topics – not just those related to coronavirus. This was an issue with an automated system that removes links to abusive websites, but incorrectly removed a lot of other posts too.”
Manila Principles
ARTICLE 19’s call for greater transparency comes on the fifth anniversary of the publication of the Manila Principles. Produced by a coalition of international freedom of expression organisations, they provide best practice guidance on how tech companies can respond to content removal requests while also protecting freedom of expression. In particular, the Manila Principles state that intermediaries should publish their content restriction policies online in clear language and accessible formats, keep them updated as they evolve, and notify users of changes where applicable. They also stipulate that intermediaries, governments and civil society should work together to develop and maintain independent, transparent and impartial oversight mechanisms to ensure the accountability of content restriction policies and practices.
Missing Voices
Last year, ARTICLE 19 launched its Missing Voices campaign, which is calling for Facebook, Twitter and Google to:
- Provide more transparency about the number of content removals, the types of flaggers, the reasons for removal, how many appeals they receive, and the outcomes of those appeals.
- Provide more transparency about the algorithms used to remove content, which can perpetuate bias or fail to detect context.
- Give all users the right to appeal when their content is removed or their account is closed down.
ARTICLE 19 will ask people to share examples of content about coronavirus that has been incorrectly removed to help improve transparency and access to information during the pandemic.
For more information, contact [email protected]