Summary of the event’s discussions and key recommendations from ARTICLE 19 and the Keep It Real ambassadors on the threat disinformation poses to free speech.
1. A look at the threat disinformation poses in Ireland
Online disinformation can have real-life impact. In Ireland, false claims about the alleged convictions of George Nkencho, a young Black man shot by police in December 2020, were shared on social media and picked up by more traditional media, sparking outrage and showing the dangers disinformation can pose to citizens. Around the world, misinformation and disinformation spread on social media about drugs rumoured to cure COVID-19, alongside false claims about COVID-19 vaccines, created confusion and spread distrust among many communities, affecting people’s health choices and fuelling uncertainty about their safety.
Young people in Ireland are not sufficiently supported in discerning truth from “fake news”, nor encouraged to discuss with family, friends and communities how to avoid being trapped by disinformation. Despite existing efforts by civil society in Ireland, such as the BeMediaSmart campaign launched by Media Literacy Ireland, young people feel that the education system does not adequately build students’ understanding of disinformation, which deeply affects their development as active citizens and their ability to participate fully in the democratic process.
Journalists in Ireland are often the targets of online harassment from fake social media accounts or trolls, attacks that damage their reputation, their safety and the public’s trust in the news they report. The impact of online harassment is particularly severe for female journalists, who are disproportionately targeted by online threats compared to their male colleagues and are often publicly demonised in sexualised terms.
2. How could online disinformation be countered?
Role of States
· States have an obligation to protect freedom of expression under international human rights law (UDHR, Article 19). Freedom of expression does not protect only truthful information, and restrictions on this right are permitted only in narrowly limited circumstances. States should strike a balance between protecting the human right to freedom of expression and countering disinformation and other forms of harmful speech.
· States should promote media and information literacy, as well as digital literacy, among their citizens, and provide tools to help them critically analyse the news they receive and access a full range of sources.
· Open and transparent media are the backbone of a democratic society. States should guarantee media freedom and independence and ensure the safety of journalists. A plural media landscape enables citizens to access a variety of information, including accurate and reliable news.
· States must ensure the dissemination of accurate information from official sources; this is especially crucial in times of emergency, such as the COVID-19 pandemic.
Role of media
· Media should comply with journalistic ethical standards, take part in self-regulation mechanisms such as ombudspersons and press councils, and verify the accuracy of the information they report.
· Media should fact-check false information shared in the public domain. In Ireland, TheJournal.ie created a web section to debunk fake news and provide accurate accounts of stories.
Role of social media and other private companies
· Social media companies have implemented some measures to address concerns over problematic content. For example, Facebook and other social media companies have stepped up initiatives to curb “fake news”, including fact-checking, content labelling and highlighting the sources of online content. Facebook has also recently published its first human rights policy, more than 15 years after its foundation; the policy will guide the application of human rights to the company’s policies and processes, and commits it to reporting on the human rights issues it deals with, including decisions to limit speech on its platforms.
· Social media companies hold enormous influence over the way information is shared and accessed worldwide. Although platforms do not create disinformation, they act as amplifiers for all types of harmful content, including disinformation. International human rights law binds States to comply with internationally agreed standards; however, do social media companies play a quasi-State role in light of the power they hold globally? Should they be held accountable to international human rights standards? How should they deal with online content that is “awful but lawful”? Digital governance is going through a period of significant confusion, and civil society should be involved and consulted in social media platforms’ decision-making processes, especially when those decisions affect citizens’ human rights.
· Social media companies must increase transparency on how they moderate content on their platforms.
· Social media companies must provide internal avenues for users to appeal decisions over content takedowns.
· A number of efforts have been established at the global level to provide external oversight of social media platforms. The Facebook Oversight Board is an independent body set up last year to provide independent oversight of Facebook’s content moderation decisions and internal policies. Its effectiveness is yet to be assessed: concerns have been raised over whether a small group of experts can grasp the complexity (languages, culture, politics, etc.) of cases coming from all regions of the world, and whether the Board can serve as an effective mechanism for individuals and grassroots communities at the local level who are affected by Facebook’s content moderation decisions.
· In addition, research and innovation efforts can play a part in improving the presence of reliable and accurate information online. In Ireland, for instance, Kinzen aims to develop cutting-edge solutions to fight online misinformation and improve content moderation on social media.
Role of citizens
· Individuals can apply strategies such as information awareness and critical thinking when analysing news from traditional media or social media.
· Actively engage in matters of public concern and be open to engaging with diverse viewpoints and a variety of sources.
· A kind, non-judgemental approach to expressing concern about “fake news” is conducive to constructive and open discussions with friends and family members who have been influenced by disinformation.
3. A Social Media Council in Ireland
· ARTICLE 19 is promoting the establishment of a Social Media Council (SMC) in Ireland: a voluntary forum where measures to deal with disinformation and other problematic content could be discussed, fine-tuned, assessed or reviewed by representatives of social media companies and other stakeholders (media, academia and civil society, among others), while balancing these measures against the protection of the right to freedom of expression.
· Given the need for oversight of how social media companies deal with “lawful but harmful” content, the SMC could provide an effective platform for designing a society-orientated approach to these issues and for ensuring public ownership of decisions on content moderation. Thanks to its participatory and transparent nature, the SMC would enable all participants to develop a common understanding not only of the types of content that ought to be moderated, but also of appropriate and realistic technical approaches to moderation, and would help find solutions to disinformation.
· The SMC could also operate an appeals mechanism: users would have access to an independent, external body that can make decisions on disputes related to content moderation.
· The Irish Oireachtas is discussing the adoption of a new law creating a legal framework for the regulation of social media platforms. The Online Safety and Media Regulation Bill is intended to set up a regulatory body (“the Media and Online Safety Commission”) that would oversee and provide strict guidance to social media companies on the regulation of content on their platforms.
· The future Media Commission would also adopt safety codes on harmful content. These codes will define the measures that social media platforms should take to minimise the availability of harmful online content on their services. The Bill also makes it clear that, when elaborating such codes, the protection of users’ fundamental rights needs to be taken into consideration.
· ARTICLE 19 suggests that there is space for complementarity between the future Media Commission and an Irish SMC. The SMC, a voluntary mechanism, would be on the front line, operating the appeals mechanism, while the regulator could take a more helicopter view, making sure that self-regulation implemented through the SMC is effective in dealing with online harms.
Key Recommendations
Following the discussions at the final event, and in view of the seriousness of disinformation, its impact on young people in Ireland, and the importance of freedom of expression, the ambassadors of the Keep It Real campaign and ARTICLE 19 present the following recommendations to conclude the campaign:
To policy makers in Ireland:
1. Promote media and information literacy, in particular regarding the threats disinformation can pose to freedom of expression, and include such topics in the national education curriculum.
2. Refrain from overly broad regulation of online platforms, so as to strike a balance between countering harmful content and protecting citizens’ right to freedom of expression online.
3. Support and enable the creation and adoption of participatory solutions to the challenges posed by disinformation, such as a Social Media Council to work in collaboration with the future Media Commission.
4. Promote an independent and free media landscape.
To media:
1. Ensure transparency and accuracy of the news and follow ethical standards of journalism.
2. Debunk “fake news” by providing full accounts of news stories and public access to verified information, and ensure a right of reply.
To social media companies:
1. Increase transparency regarding content moderation decisions and ensure a right to appeal for users.
2. Respect the right to freedom of expression online and apply international human rights principles to guide decision-making processes about content moderation.
3. Engage with government and civil society to find participatory solutions, such as the Social Media Council, to the threats that disinformation can pose to free speech, and to address current shortcomings in content moderation.
4. Have clear guidelines on gender-based harassment, improve internal redress mechanisms and ensure transparency, and partner with civil society organisations to develop practical strategies to address gender-based online violence.
To civil society:
1. Establish or take part in participatory mechanisms to discuss solutions to threats to online free speech, and collaborate with social media platforms and governments to find collective solutions to them.
2. Continue advocating for the adoption of a balanced approach between protecting users’ safety and their right to freedom of expression and access to information online.
3. Reinforce initiatives that promote citizens’ media and information literacy and digital literacy.
About the Event
Visit the Keep It Real online event page to find out more about the event and the speakers.