Soccer Fans, You’re Being Watched

“Public safety is a longstanding justification for the spread of biometric surveillance systems, while Covid-19 introduced a health dimension through body-temperature monitoring,” Hutchins says. “Marketing speaks to a seamless consumer experience for attendees at high-profile and high-cost events and encompasses everything from ease of movement in and out of the stadium through to minimizing queues for toilets and food and drinks.”

Is the deployment of such systems inevitable? “The problem here is the idea that the rollout of such technologies and infrastructures is unavoidable and an increasingly ‘natural’ part of the stadium experience,” says Hutchins. He stresses the importance of “clear and visible notifications for spectators that such technologies are in use.” Most importantly, he advocates for the introduction of “strong legislative and regulatory safeguards governing the introduction and use of these systems and the control and use of data.”

Indeed, European lawmakers have been attempting to regulate biometric mass surveillance. In April 2021, the European Commission submitted a proposal for an EU regulatory framework on artificial intelligence. Currently, the European Parliament is forming its opinion on the proposal, while the European Council is due to discuss the file in early December.

“The European Commission’s draft AI Act recognized that biometric identification is an inherently risky technology but bizarrely put forward a prohibition in Article 5 that is so weak, if anything it amounts to more of a blueprint for how to conduct biometric mass surveillance than a genuine ban,” explains EDRi’s Jakubowska.

Although there hasn’t been a final vote yet, members of the European Parliament have supported a full ban on remote biometric identification in publicly accessible spaces, by both public and private actors, and are likely to adopt that final position. However, such a ban would not include emotion-recognition uses of biometric systems, nor biometric categorization (e.g. profiling people based on their age, gender, or ethnicity). “We think that those urgently need to be banned in the AI Act. Shockingly, in the vast majority of cases, the Commission’s text did not even make those uses high-risk,” says Jakubowska.

“There should be no exceptions to the ban, as even a supposedly narrow exception would mean that mass facial recognition infrastructure would be rolled out and primed to be switched on whenever it is deemed necessary,” she adds. By definition, these systems scan the faces or bodies of every person who passes by, so it is not technically possible to limit them to, for example, suspects or perpetrators of serious crime.

In the US, the Biden administration has proposed a blueprint for an AI Bill of Rights, which commentators consider toothless, as it contains no clear prohibitions on the most controversial AI deployments, such as the use of facial recognition for mass surveillance.

As Qatar prepares to roll out the red carpet, fresh reports suggest that everyone traveling to the country during the World Cup will be asked to download two apps that, according to experts, essentially hand over all the information on their phones. Experts say this highlights the urgent need for privacy regulation around global sporting events.

“Without regulation, there is a tendency to hoover up all available data and hold on to it indefinitely—this creates ‘honey pots’ for hackers and also contributes to function creep: the temptation to find other uses for the data,” says Hutchins.

“Law enforcement agencies should pursue the many other tools and techniques at their disposal and that are compliant with the rule of law and human rights, rather than resorting to the use of technologies that have been widely condemned by civil society, human rights lawyers, and even UN human rights authorities,” says Jakubowska.

“Once these tools are out there, governments will argue that they should be used widely,” she summarizes. “It’s a gateway to mass surveillance.”
