The privacy fetish and the controversy surrounding facial recognition

The controversy triggered by the interior minister's recent statement on extending the use of facial recognition is once again marred by confusion between the regulatory and the political levels, by the distorted narrative that has built up over the years around the fetish of "privacy", and by non-experts' discovery of how the Italian public security apparatus has long been structured.

Public security powers exist above all to protect civil society and therefore, indirectly, the very survival of the State. In other words, guaranteeing the orderly conduct of daily life is the prerequisite for maintaining public order and thus for what Anglo-Saxon jurists, as early as the Middle Ages, called the peace of the land. If these words sound, to put it euphemistically, somewhat dated, that is because they are: they refer to a body of rules still based on the Royal Decree of 18 June 1931, known as the "Consolidated text of public security laws" (flanked, though that is another matter, by the "Consolidated text of health laws"), much amended but never repealed.

Because it concerns the protection of the State and, today, of the constitutional order, public security falls outside the reach of both EU intervention and that of the independent authorities, the data protection authority in particular. There are rules that provide for some involvement of the latter in the more technical, IT-related aspects of police work, but there is no question that the overriding interest of the State and the protection of the community cannot be limited wholesale "in the name of privacy": mechanisms already exist to protect citizens, under which public security activities are subject to judicial control, in other words to the judiciary, whenever they may result in a limitation of rights.

In the specific case of biometric facial recognition, its use for public security and criminal investigation purposes has already been permitted by current legislation (specifically, the law converting decree-law 139/21) for almost two years. To this should be added that the Court of Cassation has consistently held that in public places there is no reasonable expectation of privacy, so the crime of unlawful interference in private life does not arise. Finally, as anyone with a minimum of familiarity with criminal practice knows, the authorities' powers of control, including preventive ones, are already extremely broad and pervasive; if they were exercised in bad faith, the only differences with countries of "variable" democracy would be the climate and the architecture.

This last consideration brings us to the heart of the problem, which is not legal but political: on the one hand, security can only be guaranteed through prevention, and prevention implies widespread, pervasive and efficient control. On the other, the availability of broader and more effective forms of control provokes an instinctive rejection in people. Hence the catastrophist positions according to which who knows what would happen if the State had all the information about us, could track our movements, and knew what we do and whom we spend time with.

The problem is obviously real, but managing it by invoking a precautionary principle under which certain activities must be prohibited in advance because someone might abuse them is simply a bad choice. For all its problems, our political system (and consequently the legal one) is robust enough not to give in to authoritarian tendencies, and over the years the police forces have developed a deep democratic awareness. If this were not the case, we would have changed regime long ago, without having to wait for biometric facial recognition.

In other words, and on this I openly challenge the zealous "defenders of privacy": opposing the use of technology to guarantee security because someone could abuse it amounts to declaring that you have no trust in the institutions. If so, the inevitable logical consequence of this position would be to go underground and overthrow a State that has betrayed its democratic ideals. Let whoever has the courage to say as much launch the first tweet.

Certainly, as mentioned, the expansion of widespread control made possible by information technologies must be managed; but rather than ideological and rather dated proclamations (at least in the eyes of those who have been dealing with these issues for a while), what is needed is to strengthen the tools available to citizens for detecting and reacting to errors in the use of information or to deliberate violations of people's rights. In plain terms: we need to strengthen the right of access to data held for public security purposes and establish genuinely rapid procedures for obtaining answers from the competent authorities.

Directive (EU) 2016/680, which Italy has also implemented, addresses the issue and already provides tools for checking, case by case, whether a specific police activity involving the collection of data and information has exceeded the boundaries set by law. It is not perfect and there is certainly room for improvement, but the rule protecting people exists and can already be applied.

Instead of complaining about hypothetical abuses, then, one could count the proceedings initiated under Directive 680 that ended with proof of abuse by the executive. It would be an empirical indicator, as rough as one likes but still concrete, of whether and to what extent we are at the mercy of a spying, authoritarian State. Curiously, however, there is no trace of such rulings.

It may be true that, in the name of the precautionary principle, "absence of evidence is not evidence of absence" and that abuses have therefore been committed somewhere without being discovered. But in law negativa non sunt probanda (negative facts cannot be proven) and therefore, to date, we must conclude that no violations have been committed and acknowledge that the system, as a whole, holds.

On the contrary, applying the precautionary principle to the instruments that regulate social relations is an irrational act that serves only to paralyse parts of the State and undermine citizens' rights. Proof that surveillance technologies and democratic safeguards can be combined came, during the pandemic, from South Korea's choices. Contact tracing was possible thanks to a massive use of data accumulated on citizens, yet the country suffered no democratic setbacks as a result. This was certainly not because of the cliché that "Orientals" are more "respectful of the rules", but because of the government's transparency in its choices and in how it applied them.

Reality, though the zealous defenders of privacy refuse to admit it, is not made of zeros and ones but of an infinite series of intermediate values, and the law is neither deterministic nor scientific. Like politics, it is a human affair and as such intrinsically fallible and fallacious. But in its weakness it is certainly preferable to the binary certainties of those who worship fetishes and new gods, in the name of convictions that owe much more to faith than to reason.
