
Research - 30.01.2023 - 11:16 

HSG scientists conduct interdisciplinary research on automated facial recognition

Various Swiss cities want to ban automated facial recognition. At the HSG, media scientist Miriam Meckel and criminal law scholar Monika Simmler have been researching this topic for years. The HSG researchers say there needs to be a public debate about the technology, which is also used by the Swiss police.
Source: HSG Newsroom
Image: Facial recognition and personal identification technology in street surveillance cameras, used for law enforcement.

Automated facial recognition can identify individuals in crowds - or even from millions of images on the internet. In 2022, however, the technology became the subject of controversy in several Swiss cities: last September, for example, the St.Gallen city parliament decided to ban the use of facial recognition technology (FRT) in public spaces. Similar bans are under discussion in Basel, Lausanne and Zurich. "Digital surveillance and its regulation is one of the big political questions of our time," says HSG criminal law professor Monika Simmler. She has been researching the use of algorithms in policing and criminal justice for many years.

For Simmler, it is important that Swiss politics keeps moving on the question of FRT. "Individual police departments in Switzerland are already using the technology in their criminal investigations. But there is no legal basis for it." The Swiss Criminal Procedure Code (CrimPC) contains no article that covers FRT, "even on a generous interpretation" - in contrast to DNA analysis, for example, whose use is clearly regulated.

This legal vacuum is particularly tricky because FRT is "just the beginning," says Simmler. In policing and criminal justice worldwide, new digital tools are constantly being developed and combined. "This is why we need a broad public debate about the use of tools like FRT and about digital surveillance in general - by the state as well as by private actors. These decisions cannot be left to the authorities alone."

Simmler is currently working on two projects: one examines the existing legal bases for FRT in various European countries; the other maps trends in digital investigation in policing and criminal justice, with the aim of developing regulatory strategies at an early stage. "In this way, we also want to give legislators in Switzerland some impetus to formulate a CrimPC for the digital age," says Simmler.

Her work also raises questions of legal philosophy, such as: "What level of surveillance is the public willing to accept in the name of security?" For Simmler, the limit with FRT lies in its preventive use - that is, people being identified by cameras without specific cause and without having committed a crime. "The right to privacy, in my view, has to be given greater weight in this case than the prevention of possible crime." There could, however, be agreement on the use of FRT in investigations to solve serious crimes.

Trading privacy for safety and convenience?

HSG media scientist Miriam Meckel is also conducting research on FRT. In autumn 2022, she published a study entitled "Under the big brother's watchful eye: cross-country attitudes toward facial recognition technology". The study examined the social acceptance of FRT in China, the UK, Germany and the USA using online surveys and interviews. According to the results, approval of FRT use is highest in China, while Germany and the USA show the lowest ratings. "The results reflect the different traditions of data protection and privacy in different countries," says Meckel.

As the study points out, FRT also carries risks: various studies have shown, for example, that the AI behind FRT is less accurate at recognizing women and people with darker skin tones. "AI is trained on previous data, and so it reproduces biases and stereotypes," Meckel adds. What is more, various countries have used FRT to monitor minorities or journalists. And this is not only the case in China, Meckel emphasizes.
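To illustrate how such differences in accuracy can be made visible, here is a minimal, purely illustrative Python sketch that computes error rates per demographic group. The data, group labels and numbers are hypothetical and not taken from the study; real audits of face matchers (such as the NIST FRVT evaluations) use large labelled datasets and standardized protocols.

    # Minimal sketch: measuring demographic differences in a face matcher.
    # Each record is (group, ground_truth_same_person, matcher_said_match).
    # All records below are hypothetical placeholders.
    from collections import defaultdict

    results = [
        ("group_a", True, True),
        ("group_a", True, False),   # false non-match: true pair missed
        ("group_a", False, False),
        ("group_b", True, True),
        ("group_b", False, True),   # false match: different people matched
        ("group_b", False, False),
        # ... a real evaluation would involve many thousands of trials
    ]

    stats = defaultdict(lambda: {"fnm": 0, "fm": 0, "pos": 0, "neg": 0})
    for group, same_person, predicted_match in results:
        s = stats[group]
        if same_person:
            s["pos"] += 1
            if not predicted_match:
                s["fnm"] += 1  # failed to recognize a genuine match
        else:
            s["neg"] += 1
            if predicted_match:
                s["fm"] += 1   # wrongly matched two different people

    for group, s in stats.items():
        fnmr = s["fnm"] / s["pos"] if s["pos"] else 0.0
        fmr = s["fm"] / s["neg"] if s["neg"] else 0.0
        print(f"{group}: false non-match rate = {fnmr:.0%}, "
              f"false match rate = {fmr:.0%}")

If the error rates diverge sharply between groups, as the studies Meckel cites have found, the system's training data is a likely cause.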

"While the technology has established itself triumphantly around the world, the legislation has been lagging behind this rapid development." This is why she welcomes the EU's "AI Act", which is currently under discussion and is intended to regulate the use of AI uniformly throughout Europe.

"There is still urgent need for a public debate about government and private use of applications such as FRT, and what happens to the data they generate," stresses Meckel. The key question here is how much privacy people want to sacrifice for greater security and convenience. 
