The Olympic Games, a Trojan horse for algorithmic video surveillance?

The National Assembly is due to adopt an "Olympic law" on Tuesday ahead of the Paris 2024 Olympics. Among its provisions, the most controversial, Article 7, aims to authorize algorithmic video surveillance (VSA), which uses algorithms to detect abnormal behavior more effectively. The measure worries the left and human rights organizations.

Opening shops on Sundays, establishing a health center in Seine-Saint-Denis, administrative background checks on future accredited staff… It is a catch-all law that the National Assembly is due to adopt in a solemn vote on Tuesday, March 28, to prepare for the 2024 Olympics. But Article 7 of this law is particularly contested: it authorizes, on an experimental basis, the use of algorithmic video surveillance (VSA) to secure the Games. Human rights organizations denounce it as a dangerous technology.

During the general consideration phase, this article of the Olympic bill passed with 59 votes in favor (presidential majority, Les Républicains, National Rally) to 14 against (New Ecological and Social People's Union). It provides, on an experimental basis, that the security of "large-scale sporting, recreational or cultural events" may be ensured by algorithms.

A technology in question

“Algorithmic video surveillance is a new technology based on the use of computer software capable of analyzing images captured by surveillance cameras in real time,” explains Arnaud Touati, a lawyer specializing in digital law. “The algorithms used in the software are notably based on machine learning technology, which allows VSA systems, over time, to continue to improve and adapt to new situations.”

Proponents of this technology tout its ability to anticipate crowd movements and to detect abandoned luggage or thrown projectiles. Compared with conventional video surveillance, everything is automated, with algorithms handling the analysis – which, its defenders argue, limits human error and inattention.

"As France presents itself as a champion of human rights around the world, its decision to allow mass surveillance assisted by artificial intelligence during the Olympics will lead to a general offensive against the right to privacy, the right to protest and the rights to freedom of assembly and expression," Amnesty International denounced in a press release after the adoption of the article in question.

France, future herald of video surveillance in Europe?

Katia Roux, Technology and Human Rights specialist at the NGO, details the fears crystallized by this technology: "Under international law, legislation must strictly meet the criteria of necessity and proportionality. Here, the legislator has not demonstrated this," she stresses. "We are talking about an evaluation technology, one that must assess behaviors and categorize them as at risk in order to then take measures."

"This technology is not legal today. In France, there have been experiments, but never with the legal basis that this law proposes to create," she recalls. "Nor at the European level. It is even part of the ongoing discussions in the European Parliament on technology and the regulation of artificial intelligence systems. The legislation could therefore also violate the European regulation currently being drawn up."


Video: Olympic Games 2024: cameras, anti-drone lasers, video surveillance – tech plays its Games © FRANCE 24

"By adopting this law, France would pose as the champion of video surveillance within the EU and would create an extremely dangerous precedent. It would send an extremely worrying signal to states that could be tempted to use this technology against their own populations," she continues.

Discriminatory biases?

One fear is that the algorithm, seemingly cold and infallible, in fact conceals discriminatory biases: "These algorithms will be trained on a set of data decided and designed by humans. They can therefore simply absorb the discriminatory biases of the people who designed and conceived them," notes Katia Roux.

"VSA has already been used for racist ends, notably by China for the targeted surveillance of the Uyghurs, a Muslim minority present in the country," recalls lawyer Arnaud Touati. "Because ethnic minorities are under-represented in the data provided to the algorithms for their training, there are significant discriminatory and racist biases. According to an MIT study, while the facial recognition error rate is 1% for white men, it reaches 34% for black women."

Arnaud Touati, however, wants to see the glass half full: "The use of VSA during events of such magnitude could also shed light on the algorithm's discriminatory, misogynistic and racist biases by identifying, with too many occurrences to be fair, people from minorities as potential suspects," he explains.

Pressed by the left-wing opposition in the National Assembly to clarify which situations would be detected, Interior Minister Gérald Darmanin quipped: "Not [people wearing] hoodies." The French government believes the limits set by the law – no facial recognition, data protection – will be enough to prevent any drift.

"We have put in safeguards so that calls for tenders are reserved only for companies that respect a certain number of rules, including hosting data on national territory and compliance with the CNIL and the GDPR [General Data Protection Regulation]," argues MoDem MP Philippe Latombe, who co-signed an amendment with the National Rally requiring the call for tenders to prioritize European companies. "Clearly, we don't want a Chinese company processing the data in China and using it for other purposes."

"The guarantees provided by the government are not likely to reassure us. In reality, no amendment can really fix this: the technology is, in itself, problematic and dangerous for human rights," counters Katia Roux. "It will remain so as long as there has been no serious evaluation, no demonstration of the necessity and proportionality of its use, and no real debate with the various actors of civil society on the question."

Sport, eternal testing ground

While the Olympics are clearly the event ultimately targeted, the experiment could begin as soon as the law is promulgated and will end four months after the close of the Paralympic Games, on December 31, 2024. It could therefore cover a wide range of events, starting with the next Rugby World Cup in September-October.

Opponents of VSA fear that its use, exceptional at first, will eventually become generalized. Sporting events often serve as testing grounds for policing, security and new technologies. The London Olympics, for instance, helped generalize video surveillance in the British capital.

"We are afraid of seeing a generalization that will extend beyond this exceptional period," explains Katia Roux, who recalls that after the 2018 Football World Cup in Russia, the voice recognition technologies that had been authorized were then used to suppress the opposition.

Finally, Amnesty International is concerned that, in the long term, video surveillance will drift towards biometric or voice surveillance: "Facial recognition is just a feature waiting to be activated," warns Katia Roux.

The Olympic law has not yet completed its legislative journey. Beyond Tuesday's solemn vote in the National Assembly, the text must still shuttle back and forth with the Senate, which previously approved it but in different terms, until the two chambers agree.

Peter O'Brien of Tech 24 contributed to this article.
