A judge in São Paulo, Brazil, has ruled that the city’s metro must stop using facial recognition technology on its network. The ruling is a victory for civil society activists, including ARTICLE 19, who argued the practice undermined people’s fundamental rights and launched a lawsuit against it.
“This decision is very important because it acknowledges that the capturing and processing of biometric data affects the fundamental rights of citizens and deserves judicial protection,” said Raquel da Cruz Lima, advisor to the Legal Center for ARTICLE 19’s Brazil and South America office.
Over 4 million people use the Companhia do Metropolitano de São Paulo (Metro) network daily. Under the proposed system, their movements would have been monitored and tracked through facial recognition and other surveillance equipment, and their information would then have been stored by the authorities.
ARTICLE 19 Brazil and South America, the Public Defender of the State of São Paulo, the Public Defender of the Union, the Brazilian Institute for Consumer Defense (Idec), Intervozes – Coletivo Brasil de Comunicação Social, and the Human Rights Advocacy Collective (CADHu) brought the lawsuit, arguing that the proposed facial recognition system failed to meet legal requirements set out in the General Law of Data Protection (LGPD), the Code of Consumer Protection, the Code of Users of Public Services, the Statute of Children and Adolescents, and the Federal Constitution. They also argued that the system violated international human rights standards.
The use of facial recognition and other surveillance technologies has increased exponentially in Brazil in recent years. Studies indicate that 21 of the country’s 27 federative units have already implemented some type of facial recognition system in public places. The Public Policy and Internet Lab (LAPIN) points out that authorities use the technology for identity verification and to monitor school attendance, as well as part of public security, mobility, and urban transportation initiatives.
LAPIN revealed that, in the states of Bahia and Paraíba alone, information about 18 million festival-goers and other people taking part in public events has been collected using these technologies.
How did we get here?
ARTICLE 19 and the civil society organisations behind the campaign believe the growing use of facial recognition and other surveillance technologies gives authorities nearly unrestricted access to people’s data across Brazil. These technologies also help to normalise such practices, and their deployment rests on a false dichotomy between public and private spaces. This leads to the erroneous assumption that the rights to privacy and data protection do not apply in public places, including public transportation, when in fact the right to privacy, and human rights generally, apply fully to public spaces and are not limited to people’s homes and private lives.
The surveillance system proposed by São Paulo’s metro contravenes international treaties to which Brazil is a signatory. As human rights advocates and experts have pointed out, it threatens people’s right to freedom of expression and association, not least because some people may be wary of taking part in demonstrations, fearful of being under constant observation. The initiative is part of a larger agenda to normalise mass surveillance, and reflects the government’s tactics of suppressing individual and collective expression. Because this technology uses physical characteristics to identify people, it has a particularly intense impact on Brazil’s minority communities, who face discrimination and have been repeatedly targeted by authorities.
“Frequently these technologies hit non-binary and transitioning people, for example, because they do not recognise the changes in them,” continued Raquel da Cruz Lima. People have been prohibited from using a ticket they had bought because the technology failed to identify them as the person who initially made the purchase. “There are also ‘false positives’, where black people are more likely to be affected,” she added.
According to international human rights standards, constant surveillance leads people to self-censor in order to protect their privacy and anonymity. Rafaela de Alcântara, a digital rights consultant for ARTICLE 19 Brazil and South America, cited a United Nations Human Rights Council report published at the end of 2021 that ‘explicitly registers concern about the conduct of arbitrary and illegal surveillance supported by the use of tools such as facial recognition’ and ‘outlines a series of criteria and requirements for this kind of technology to be used in a way that doesn’t curtail the exercise of human rights’.
She added: “In general, the criteria for the implementation and use of facial recognition by public authorities in Brazil, as in the case of the São Paulo subway, show how far we are from considering the prior analysis necessary for human rights principles to be respected. In practice, we often observe that the tacit justification for adopting this type of project is the simple fact that the technology is available on the market.”
It is important to note that, in 2021, the UN High Commissioner for Human Rights called for a moratorium on the use of facial recognition in public spaces until safeguards for rights are established. Several civil society organisations and experts have pointed out that a moratorium is not enough in some cases, especially when initiatives by their very nature violate human rights, such as plans for massive biometric surveillance. Such projects should be banned. This is the case for the public civil action against the São Paulo Metro’s use of facial recognition technology, which has been shown to be incompatible with human rights.
In her written decision, Judge Cynthia Thome of the 6th Court of the Public Treasury, who issued the preliminary injunction against the Metro, said the company had not presented sufficient information about what it would do with the data it collected. ‘The use of the system to serve public agencies, for now, is no more than conjecture, a fact that, in itself, indicates the insecurity of the system that [the Metro] intends to deploy,’ she said. ‘There are a number of technical issues that will require further evidence to be resolved.’ She added that there was a potential danger that the ‘fundamental rights of citizens’ could be affected if the system were implemented.