Biometric facial recognition and AI Act: the perfect storm


"Hard cases make bad law" – extreme cases produce bad rules – observed Oliver Wendell Holmes Jr., US Supreme Court justice and distinguished jurist, in 1904. It would therefore be too easy to exploit the Florence prosecutor's office's decision to analyze footage from all of the roughly 1,500 surveillance cameras in search of elements useful to finding the missing girl, in order to silence the "privacy extremists" who oppose the use of biometric identification technologies in the name of "fundamental rights" and who demand that a prohibition of this kind also be written into the forthcoming AI Act.



However, just as the "fanaticism" of the privacy devotees is unsustainable, it is equally unsustainable, in the name of absolute prevention, to transform our society into a huge accumulator of data to be used not so much in real time as (and this should be worrying) "in case of need", as already happens with traffic data.

This confrontation, which has been going on for decades, is now highly topical with regard to the use of biometrics, because of the "perfect storm" that is about to break.

The AI Act and the future Italian law on biometric identification

In Europe, the soon-to-be-adopted EU regulation on AI also intends to address the application of this technology to personal identification. The debate pits those who want to ban it completely against those who want an exception for security and anti-terrorism purposes (matters on which, moreover, the EU arguably lacks the power to decide).

In Italy, 31 December 2023 was the deadline set by art. 9 of Legislative Decree 139/21, which prohibited, until that date, the use of biometric identification in public places by parties other than the State and, with some limitations, local police forces. The legislator intervened at the very last minute, approving a bill that extends the ban to 2025 and now moves to the Senate, but the terms of the question remain the same.

A scenario is therefore taking shape in which the lack of synchronization between Rome and Brussels could produce paradoxical consequences such as, for example, the "liberalisation" of all those biometric identification systems that do not use artificial intelligence, or, on the contrary, an unmotivated and preconceived limitation of the use of certain technologies.


The (non)sense of a stale debate

In purely formal terms, the controversy over the use – or rather, the abuse – of these instruments by the State should not even exist: if we start from the assumption that we live in a liberal democracy, we must recognize that a system of guarantees is in place and conclude that, if anything, the issue concerns "how" particular technologies should be used, not "whether". In fact, the concern that the system of formal protections works only up to a point has been systematically reinforced by news stories that, over the years, have involved not countries with "variable-geometry democracies" but "highly civilized" representatives of the First World.


However, whatever one's view, basing a political choice on ideological radicalisation rather than on pragmatic considerations is not a very functional approach. It is therefore worth reflecting on what the word "security" means in concrete terms.

The inconsistencies of privatized security

It is a fact that the State is no longer able to guarantee crime prevention and public safety on its own, as evidenced by the devolution of large parts of these activities to local police forces, the changed role of private security firms within the framework of "subsidiary security", the use of stewards in stadiums, and the de facto spread of personal escort services (or "bodyguarding", for lovers of anglicisms), at once forbidden and hypocritically ignored. The issue is rather complex, but the short version is that in our country, too, we are moving towards a model of public safety management that increasingly involves private entities.

If this is true, then it would be at least contradictory to delegate this important function to subjects who are then not put in a position to exercise it fully.

Thus, for example, the ban on the use of biometric identification systems in public places by private parties clashes with the fact that private security firms (which, despite being companies, operate, it should be remembered, under a prefectural license) run control rooms through which they manage alarms and interventions. Why shouldn't they be allowed to equip themselves with advanced systems to do their job, even though they are not police forces?

Similarly, such a ban prevents shopping centers and the managers of areas such as stadiums or large public car parks (perhaps underground ones) from adopting systems that can more or less automatically identify potential aggressors or other critical situations, only to then find themselves accused of "failure to supervise" in the event of an accident or violent action.

The examples could go on, but even from these brief considerations it clearly emerges that the theme of biometric identification (possibly also carried out through AI) does not concern the limited and limiting debate on the "right to privacy" or the "surveillance society" but involves, at a structural level, the security model that a State intends to promote. In other words, this means taking a political decision on biometrics as such, regardless of specific uses.

Biometric security is among us (and not from today)

In this regard, it is worth reminding those outside the field that biometric identification – i.e. the use of personal data derived from the analysis of an individual's physical characteristics – is not a product of our times. Just to stay in Europe: in 1853 Alphonse Bertillon created anthropometry from scratch, a technique for measuring physical characteristics, used by the French police and then by many others around the world to build archives (i.e. databases) for identifying suspects and arrestees. Then came Sir Francis Galton's research on fingerprints (with better luck), which gave rise to the banks that collected them, followed by those that led to the classification of police photographic records, and finally those that underpin the DNA databases. It is therefore easy to see that biometric identification based on databases has been a fait accompli since time immemorial, and that concentrating on one limited aspect, such as software capable of analyzing video streams, is rather meaningless, especially when compared, for example, to the consequences of research on genetic phenotyping, which has been going on for years but which the general public does not care much about.

The importance of a responsible political choice

These reflections, despite their conciseness, show how the public debate on biometric identification and related issues is marred by the usual compulsion to repeat itself that occurs every time technology makes new possibilities available, and which systematically translates, on the one hand, into the irresistible drive to ban blindly and, on the other, into looking at the proverbial finger instead of the moon.

The regulation of social relations, including, like it or not, those related to security, passes through the collection of data and the ability to process them with the maximum possible efficiency. We can decide to ignore this fact, ban outright what we do not like, and keep demanding that reality bend to the representation we want of it. Or we can take note of what is happening and try to use to everyone's advantage what human ingenuity has been able to create, before someone else, perhaps Big Tech, does it for us, in their own interest and in spite of ours.
