The streets of Paris during the upcoming Olympics could be monitored by real-time cameras using artificial intelligence (AI) to spot potential suspicious activity. Civil liberties campaigners, however, warn that the technology poses a risk to individual freedoms. The BBC's Hugh Schofield reports.
François Mattens, whose Paris-based AI company is in the running for part of the Olympics video surveillance contract, insists: "We are not China; we have no wish to be an all-seeing 'Big Brother'."
Under a recent law, police will be able to use algorithms applied to CCTV footage to flag potential problems, such as sudden crowd surges, fights or abandoned bags.
The law explicitly rules out using facial recognition technology, as deployed in China, to identify "suspicious" individuals.
Opponents warn that the law is the start of a slippery slope: although the experimental period is due to end in March 2025, they fear the government intends to make the new security provisions permanent.
"We have seen this before at Olympic Games in Japan, Brazil and Greece," says Noémie Levain of the digital rights campaign group La Quadrature du Net (Squaring the Web). "What were presented as exceptional security arrangements for the special circumstances of the games ended up being normalised."
A version of the new AI security system is already in place at some French police stations, and Massy, just south of Paris, was among the first to deploy it.
Massy's mayor, Nicolas Samsoen, notes that the town has 250 security cameras, far too many for its team of four officers to monitor effectively.
"So the AI machine watches all the cameras, and it alerts us when it detects something it has been programmed to look out for, such as a sudden gathering of people.

"It is then up to the officers to analyse the situation and work out the right response. It could be something serious, or it could be nothing.

"Ultimately it is humans, not the computer, who decide how to react. The algorithm is empowering people."
We tested the system by leaving a piece of luggage on the street near the police station. Within 30 seconds the alarm was triggered, and the control room had footage of the bag on its screen.
Beforehand, the algorithm had to be taught to recognise an abandoned piece of luggage; that is where the AI comes in. Its developers fed it a large bank of images of lone bags on the street, an archive that grows as more photos are added.
The "learning" takes place not on the client's premises but back at the developers' office. The Massy police station has bought a finished product that monitors the cameras but cannot learn anything new itself.
Spotting unattended luggage is a relatively easy task. Much harder is detecting a person in a crowd, recognising a weapon concealed under someone's clothing, or distinguishing genuine physical aggression from a sudden surge in the number of people in one place.
XXII, a French start-up specialising in computer vision software, is waiting for more detail from the French government before finalising its bid for part of the Olympic video surveillance contract.
François Mattens of XXII says the government wants AI that can spot fires, fights, people on the ground and abandoned luggage, "but they need to get organised. We need a clearer vision of what they want."
The proposed new systems will not be ready in time for the Rugby World Cup in France this September, as putting them in place will take considerable time.
Mattens and other developers are well aware of accusations that they are enabling an unjustifiable level of state surveillance. At the same time, they insist that safeguards are in place.
"We are not going to, and legally cannot, use facial recognition, which makes us completely different from China," he said.

"Our selling point is that we can provide security while staying within legal and ethical limits."
Noémie Levain, the digital rights activist, says this is just a "story" that developers use to sell their product, confident that the government will favour French firms over foreign ones when awarding the Olympic contracts.
"They say that because there is no facial recognition here, it makes all the difference. For us, it is essentially the same thing.

"AI video monitoring is a surveillance tool that allows the state to analyse our bodies and our behaviour, and to judge whether it is normal or suspect. Even without facial recognition, it is a means of mass control.

"We see it as just as frightening as what is happening in China. It is the same principle: losing the right to be anonymous, the right to act as we wish in public, the right not to be constantly watched."
This week, BBC News is focusing on AI, exploring how the technology affects our lives and what effects it may have in the near future.