The evaluation committee expresses doubts about the effectiveness of the system put in place for the Paris Games

The evaluation committee tasked with assessing the use of algorithmic video surveillance (VSA) tested during the 2024 Olympic Games expresses doubts about the effectiveness of the system, in a report consulted by Le Monde and franceinfo on Wednesday, January 15. The report was submitted to the Interior Ministry on Tuesday.

As a reminder, algorithmic video surveillance is a technology that pairs software with surveillance cameras to identify events deemed suspicious or risky and alert an operator in real time. After the system was tested during the Olympics, the government considered generalizing it. The law already provides for an extension of the experiment until March 31, 2025. Matignon indicated that it was waiting for the evaluation committee's report before making its decision.

From this report of around a hundred pages, consulted by franceinfo and Radio France, it emerges "that the use of the algorithmic processing implemented as part of the experiment resulted in uneven technical performance, which varies greatly depending on the operators, use cases, contexts of use, and the technical characteristics and positioning of the cameras".

The committee considers that the usefulness of the system in the context of the experiment "depends largely on the context of use". Among its conclusions, the committee judges that VSA is, for example, "less effective in low light", and that the results are more relevant "in enclosed or semi-enclosed spaces, notably metro corridors and stations, than in open spaces".

The system's technical performance is "overall satisfactory" for certain use cases, such as "the intrusion of individuals or vehicles into an unauthorized area, traffic in an unauthorized direction, or crowd density", though with some reservations. Regarding crowd density, for example, "the software sometimes had difficulty counting large numbers of individuals (...) due to the height of the cameras". The system in fact appears "less effective when cameras are too close to the crowd, with bodies not fully visible". As for crowd movements, the effectiveness of the system is "difficult to assess", the report explains. "The few returns indicate the difficulties encountered in identifying real movements. The processing can, in particular, mistake groups of people moving in the same direction, without particular haste, for crowd movements. It is difficult to characterize movements of grouping or rapid dispersal."

Another point highlighted by the report concerns abandoned objects, where performance is judged "very uneven". For example, the software confuses unattended objects with those whose presence on the premises is not abnormal: "The processing thus regularly flags street furniture (benches, signs) or cleaning equipment (trash cans, buckets and cleaning machines) and other fixed or usual objects. More seriously, it sometimes flags people who are sitting or static, particularly homeless people."

Despite this, the report states that "the agents concerned are generally satisfied with the implementation of the system". "The integration of AI camera screens in command rooms, where it was achieved, helped promote the complementarity of conventional cameras and AI cameras."

For the committee, the "abandonment, extension or perpetuation" of the system is "a political choice" that does not fall within its mission. The committee nevertheless considers that if the system is kept in place, "particular vigilance is required (...) in particular to prevent any risk of diversion from its legal purposes or, more fundamentally, of habituation to the use of such technology for surveillance purposes".

If the experiment is extended or made permanent, the committee considers it important that the "legislator" reaffirm several general principles, such as "upstream parliamentary oversight and the submission of draft decrees to the CNIL". Other principles to reaffirm: "the ban on facial recognition outside the judicial context, the constant evaluation of the stakes for public freedoms and fundamental rights", and "clearly informing the public of the exact extent of the use of AI-equipped cameras with regard to their right of access". "It is essential," the committee insists, "that the information given is sufficient to guarantee that the public is informed and knows the rights recognized to them by European law and national legislation."

Laurent Nuñez, the Paris police prefect, has said he is in favor of generalizing the system, as have the Interior Ministry and the Minister of Transport, Philippe Tabarot, in an article in Le Parisien. Civil liberties associations, for their part, fear widespread surveillance despite the red line set so far by the government: the use of facial recognition.
