Social media platforms can be a space for free expression, democratic debate, and participation. But weak content moderation can transform them into hotbeds of ‘disinformation’, ‘hate speech’, and discrimination. This is especially concerning in post-conflict countries, where tensions between groups can erupt into violence.
ARTICLE 19’s new research investigates how content is moderated on major social media platforms in three post-conflict countries – Bosnia and Herzegovina, Indonesia, and Kenya – with a particular focus on ‘harmful content’ (such as ‘hate speech’ and ‘disinformation’).
Our research has found that social media companies don’t listen to local communities. They also fail to consider context – cultural, social, historical, economic, political – when moderating users’ content.
This can have a dramatic impact, online and offline. It can increase polarisation and the risk of violence – as when Facebook allowed incitement to genocide against the Rohingya in Myanmar.
Bridging this gap between global companies and local communities is therefore vital to ensuring sustainable peace and democracy in post-conflict countries.
Content moderation and freedom of expression handbook
Social Media for Peace in Bosnia, Indonesia and Kenya
Empowering Indonesia: The Damai Coalition’s Impact on #SocialMedia4Peace
Transforming Kenya: The Impact of #SocialMedia4Peace
Empowering Change in Bosnia #SocialMedia4Peace
Read our country reports
Bosnia and Herzegovina
Indonesia
Kenya
Colombia
Global problem, local solution
ARTICLE 19, together with our research participants, has proposed a solution: local Coalitions on Freedom of Expression and Content Moderation.
These coalitions would allow consistent engagement between social media platforms and local civil society organisations, helping to bridge the gap between global tech giants and local communities.
Our research provides more information on these coalitions. For each country, we outline practical steps for creating them, provide detailed risk assessments, and identify potential members.
Edwin’s story
What are we asking social media companies to do?
Comply with international standards on freedom of expression and content moderation
Based on the UN Guiding Principles on Business and Human Rights, companies must comply with international standards on freedom of expression. The Santa Clara Principles detail what this entails.
As part of this compliance, content-moderation rules must be clear and accessible to all users, and moderation resources must be allocated fairly. Meta, for example, spends 87% of its budget for tackling ‘misinformation’ on English-language content, even though only 9% of its users speak English. This points to discriminatory resource allocation between the Global North and the Global South, which must be urgently addressed.
Twelve major companies – including Apple, Facebook (Meta), Google, Reddit, and Twitter – have endorsed the Santa Clara Principles. But our research shows they are failing on one important principle: Cultural Competence.
To honour their commitments, social media companies must ensure they understand the content they are moderating – which means understanding local languages, cultures, and political and social contexts.
Ensure internal complaints mechanisms are effective
Social media companies must ensure that mechanisms for appealing against wrongful removal of content and other restrictions on users’ freedom of expression are effective. These must be easily accessible – including to speakers of languages other than English. Companies must respond to complaints quickly and appropriately.
Publish comprehensive transparency reports
These must include information about decision-making processes, tools used to moderate content (e.g. algorithms and trusted-flagger schemes), and content-removal requests received and actioned on the basis of their Terms of Service.
Participate in new, independent, self-regulatory mechanisms
Companies should work with local civil society organisations to find solutions to current content-moderation problems. They should participate in the local Coalitions on Freedom of Expression and Content Moderation that ARTICLE 19 recommends establishing. These coalitions could be an interim step towards a full Social Media Council.
Be easily accessible to local stakeholders
Social media companies should make themselves easily and transparently accessible to local stakeholders, via online channels that enable local actors to engage with them effectively.
Learn more about ARTICLE 19’s work on content moderation
Regulating content moderation: Who watches the watchmen?
Social Media Councils: One piece in the puzzle of content moderation
Side-stepping rights: Regulating speech by contract
International: The Santa Clara Principles and the push for transparency
EU: Digital Services Act crisis response mechanism must honour human rights
Explore our campaigns
“The launch on the first International Day to Counter Hate Speech of ARTICLE 19’s reports is a significant contribution of the UNESCO ‘Social Media 4 Peace’ project to curb hate speech on social media while protecting freedom of expression.”
Mr Tawfik Jelassi, UNESCO Assistant Director-General for Communication and Information
“In conflict-affected and fragile contexts, there is often a lack of governance mechanisms to deal with the growing challenge of harmful content online. This is why we support efforts to address this growing challenge.”
Marc Fiedrich, Acting Director and Head of Service for Foreign Policy Instruments (FPI) – European Commission