The crisis resulting from violent white supremacist riots that breached the US Capitol on 6 January placed in the spotlight, once again, questions about freedom of speech and the role of digital platforms in our lives.
Facebook and Twitter’s decisions to suspend Trump’s accounts; Google and Apple’s suspension of Parler from their app stores; and Amazon’s decision to remove Parler from its cloud have provoked differing reactions, broadly along partisan lines. These range from welcoming the decisions as safeguards of democracy to condemning them as private censorship.
The underlying question in this debate is whether private companies should be the ones who decide which content can be made available and shared among millions of people and which cannot. Do they have an obligation to suspend content that constitutes incitement to violence when there is a risk of damage to people or property? If so, how should they decide when this is the case, and on which parameters? Is there a robust system in place for this kind of moderation, or are we simply subject to what are essentially one-man decisions? Does Trump’s speech deserve different treatment from that of an average user? Do platforms apply their policies evenly around the world?
There are so many questions to answer, and while none of these questions are new, civil society, academics, decision makers, regulators and the platforms themselves have been struggling to define answers to them for years.
What seems clear from Big Tech’s moves following 6 January is that they are emergency responses, attempts to fill a huge gap in law and in government policy. Trump has long used Facebook and Twitter as vehicles to drive responses from followers based on disinformation. Since the election, he has prepared the ground for the assault on the Capitol through a variety of spokespeople, many of whom found a home in the social network Parler. Parler’s carefully crafted image as a place to share hateful and violent content has been three years in development. Parler continued to stoke hate and distrust until 5 January; it waited until it all ignited into terrible violence, the loss of five people’s lives, and what many now realise is something much more dangerous.
While ARTICLE 19 welcomes the decisions taken by Big Tech in the last few days, the enormity of the challenges posed by the equivocal position of these companies requires a determined effort to prevent rushed, reactive decision-making in the future.
We will not protect free speech and democracy, or counter the toxic polarisation and violence we have seen develop in various contexts in recent years, unless we develop a structural response. We need procedures based on international human rights standards, not the apparent goodwill of a powerful man under political pressure. The problem we have is that most of the world’s attention is on the visible part of the problem, not the part that lies under the bonnet. Like an iceberg, its greatest mass lies under the water, and that is why we must look at the layers of internet infrastructure and identify where the problem really lies.
Yes, rules on content moderation are essential. Demanding that social media platforms be guided in their content moderation practices by human rights standards, including the Rabat Plan of Action, is fundamental if we are to avoid censorship and other violations of individuals’ free expression rights, as well as to guarantee that illegal content, such as clear incitement to violence, is not spread online. But violations, and especially censorship, can occur at different points in the system, as the events of the past week clearly demonstrate.
The infrastructure underpinning social content means that if people want to share or access content, they have to use a social media application whose terms of service allow that content to circulate. They also need an app store where they can find and download the app. And the app, in order to work, requires a cloud provider to host it. Those are the main, although not the only, intermediary gateways our free speech online passes through. Some call them ‘chokepoints’; others use the word ‘gatekeepers’ to describe these intermediaries. They guard, and can impede, a business’s access to the market, and they control the flow of content in the digital environment.
As we saw, a number of gatekeepers took action following the US Capitol assault. Apple gave Parler 24 hours to mitigate the “planning of illegal and dangerous activities” occurring on its service or face expulsion from its App Store. The App Store is the only means to download mobile apps on Apple devices and Parler’s removal means the app can no longer be downloaded on devices where it isn’t already present or updated on devices where it is.
Google has suspended Parler from its Play Store on similar grounds: as it recently stated, ‘for us to distribute an app through Google Play, we do require that apps implement robust moderation for egregious content’. However, as Google allows other app marketplaces on Android, and its decision applies only to the Play Store, people with Android devices will still be able to get the app if they are willing to take additional steps.
Amazon notified Parler that it would be removing it from its cloud hosting service, Amazon Web Services. It argued that calls for violence propagating across the social network violated its terms of service, and said it was unconvinced that the service’s plan to use volunteers to moderate calls for violence and hate speech would be effective. The suspension is extremely problematic for Parler, which is currently offline and unable to operate until it can find another hosting service.
There are various ways to read these moves. These decisions can be considered part of Apple, Google and Amazon’s contractual freedom. As economic actors, they are in principle free to decide who they want to provide their services to, and on what terms and conditions. Nonetheless, the impact of these decisions on users’ free expression should be considered. Do we want the Big Tech giants to ‘police’ how the businesses that need their services act towards end-users? Do we want them to set the standards not only in the markets where they operate (in the cases at stake: app stores and cloud services), but also in those where their clients operate (in this case: social networks)? And do we want them to do so based on obscure, profit-oriented logic, which they can change at any time out of convenience or for other economic reasons?
One way to solve this problem would be to put limits on Big Tech’s contractual freedom. However, this solution looks only at the content or service these companies might carry or not, rather than at the overall marketplace. Any invasive regulatory intervention with regard to content should be accompanied by a careful balancing of conflicting rights, and by a strict necessity and proportionality test.
Yet all the challenges above are amplified by the fact that, as explained, we are talking about key intermediaries, or gatekeepers. Indeed, a closer look at market dynamics unveils a fundamental factor at stake: the players taking action these days all enjoy a position of power in the market and constitute the main channels through which people exercise their freedom of expression and information rights online. The problem, then, lies in the fact that we are not talking about one cloud provider or app store among many, but about companies that account for a very large percentage of the market and are therefore able to rule the roost. If an app does not meet their standards, whatever those are and however they are set, end-users might not be able to access it at all, or only with certain limits or hurdles. Do we really want Amazon, Apple and Google to decide which apps we can use and which we cannot?
With this premise, a better way to solve the challenge is to counterbalance the power of Big Tech by diminishing it, rather than regulating it: to return, or resort, to competition rather than to regulated monopolies or oligopolies. Indeed, the main problem, as we have repeatedly said, is the excessive centralisation of power at various layers of the communication infrastructure required to exercise our free expression rights.
To solve the numerous challenges linked to content moderation, the spread of incitement to violence, censorship and so on, we certainly need standards based on human rights law, but we also need to diminish the control that a handful of players are able to exercise over the communication infrastructure. It is not with chokepoints controlled by profit-oriented companies that our rights will be respected and guaranteed, and our democracy protected. It is by opening markets to new and innovative players, ensuring fair competition, and giving people real choices. We need decentralised power and a variety of providers with different business models and systems that compete fairly; we need viable alternatives and the option for users to switch. Any other solution appears short-sighted: a patch that treats the symptoms but leaves the cause unaddressed.
Decision makers and regulators should impose pro-competitive measures to achieve those objectives, and to implement and protect an internet infrastructure which is free, open and non-discriminatory. They should call for remedies that lower barriers to market entry and forbid gatekeepers from excluding competitors or from leveraging their market power to control access in other markets. They should also put in place measures to empower users, providing them with viable alternatives. Interoperability, fair and non-discriminatory access, and transparency are among the tools that could help, at any layer of the infrastructure.
Protecting free speech and democracy requires more than looking at how Facebook and Twitter moderate content. It also requires taking a step back and looking at the entire iceberg. We need measures that shape the marketplace not as the home of gigantic power in the hands of a few, but as an open place for innovation, competition and the improvement of people’s welfare. This way, our rights will no longer depend on the ad hoc decisions of unaccountable monopolies or gatekeepers.