Tuesday 3 May 2022

HUMANS DEGRADED TO DATA POINTS

Algorithms can endanger freedom and manipulate people. The discussion about this must not be suppressed.

Dirk Helbing, Peter Seele and Thomas Beschorner

The original German version was published in the magazine “Schweizer Monat” in May 2022:

https://schweizermonat.ch/der-mensch-wird-zum-datenpunkt/

Background:

In a lecture in February 2022, ETH professor Dirk Helbing discusses the use of algorithms. He gives several examples: computer programs can be used to control supply chains, to feed pigs – or even to steer people, as is already happening to some extent in China. Chinese students in particular take offense at one of Helbing's slides, claiming that it equates them with pigs, and circulate it on the Internet. A shitstorm follows. Hate messages are posted around the world. The professor is called a racist and a fascist and even receives death threats, as does his family. In this article, Dirk Helbing and two of his colleagues explain what the lecture was actually about: that freedom and human dignity must be protected in the digital age as well. Both are in danger, as shown above all by so-called social credit score systems, which are already in use in China. (dj)

Read more at:

https://www.nzz.ch/schweiz/shitstorm-an-der-eth-ein-professor-erhaelt-morddrohungen-ld.1673554

https://www.srf.ch/kultur/gesellschaft-religion/eklat-an-der-eth-wenn-ein-angeblicher-schweinevergleich-zur-staatsaffaere-wird


Big Data and algorithms have become powerful tools for organizing processes and decisions efficiently and according to rules. This can be quite beneficial, but it sometimes involves serious problems that are systematic in nature. Algorithmic systems developed for a specific context – such as supply chains – are often generalized and transferred to other objects and processes, including humans and other living beings. Algorithmic systems are used in smart farming and livestock farming, for example. Moreover, algorithms are increasingly being used as instruments to regulate and organize human life.

It is often said that “code is law”, meaning that software determines what is still possible in our society and what is not. From hotel and flight bookings to online shopping, information and job searches, and dating, everything is now datafied and personalized. More and more effectively, we are made to buy certain products, to hold certain feelings and opinions, and to vote for particular political parties. Keyword: “Big Nudging”. The methods of manipulation are often so subtle that we do not even notice them. And yet they increasingly influence our actions, our lives, and our society in ways that suit the “social engineers” sitting in the digital control rooms of Big Tech companies and intelligence agencies. All of this is done with sensitive personal data that companies and states collect about us, often using spying methods. Three methods in particular are used here: “profiling”, “scoring”, and “targeting” (see below). Increasingly, people are being treated like things, data points, or animals. Doubts are growing as to whether the use of such technologies is compatible with human dignity and our conception of a free society, since they touch the core of democratic constitutions and central principles of the UN Human Rights Charter. What, exactly, is the problem?
  1. Mass surveillance: Digital mass surveillance has spread around the world, at the latest since the “War on Terror”. “Predictive policing” – identifying and tracking potential criminals before they have committed a crime – is now used internationally. Originally developed for terror investigations, the approach is now also used by regular police forces. The problem: the software treats every person as a potential criminal. The constitutional “presumption of innocence” has thereby, in principle, been replaced by a “presumption of guilt”.
  2. Error rate: The error rates of the algorithms used are often high; they can exceed 95 percent (see the illustrative calculation after this list). In other words, suspect lists contain many innocent people. Predictive policing is therefore increasingly viewed as problematic. Despite all this, “profiling”, i.e. the creation of highly detailed personal profiles based on sensitive personal data, is widespread, including in the advertising industry. Meanwhile, attempts are already being made to create “digital twins” that are supposed to be a living digital image of ourselves. They are used in virtual experiments to find out how our thinking and behavior – and, in the future, our health – can be manipulated most effectively. But are digital twins really accurate? Or do they merely resemble reality deceptively well without actually capturing it? And is it ethical to use them in this way?
  3. Discrimination: It is not always about criminals or suspects who might pose a potential threat. Facial recognition, chatbots, legal tech (i.e. software used to automate the legal system), and algorithms used to pre-screen job applicants have all come under increasing scrutiny recently. It turns out that the underlying Big Data and AI systems often make racial or otherwise discriminatory classifications and suggest decisions based on them. The risk of bias is great. Worse, “scoring”, i.e. the rating of people on a point scale, deliberately assigns different values to different people. Scores such as “customer lifetime value” decide who is offered which products and services at what price – or whether they are offered anything at all (see the sketch after this list). This implies discrimination. In principle, the Social Credit Score even decides who has what rights. The treatment of the Uyghurs in China in particular has been sharply criticized internationally. The “Karma Police” surveillance program of the British intelligence agency GCHQ and the Austrian triage system for the unemployed have both been found to violate applicable law.
  4. Dehumanization: The aforementioned developments are driving dehumanization. People are increasingly treated like robots, things, or data. Qualities that are of great importance to us humans, such as human dignity, consciousness, love, freedom, and creativity, tend to be neglected. They cannot be adequately captured by data and are therefore often ignored. A data-driven society is thus in danger of losing important qualities. A supposedly optimal world created on the basis of numbers can quickly become inhumane and dystopian.
  5. Dangerousness: “Profiling” and “scoring” are the basis of “targeting” – a method by which individuals with certain characteristics are specifically singled out. This typically triggers special treatment: personalized advertising designed to trigger the purchase of a particular product or service, the manipulation of people with certain personality traits toward desired voting decisions, the setting of rates or prices for an insurance policy, and so on. But targeting can also result in certain people being treated as suspects on the basis of questionable generalizations, or even in the triage of patients in hospitals, i.e. in decisions about life and death. Furthermore, targeting can be used to mobilize people whose sensitivity to certain issues has been identified. The phenomenon known as “footfall” is used in particular to initiate protest movements, which can sometimes end violently. Dangerous applications are therefore possible even with little personal data.
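
To see why suspect lists can consist almost entirely of innocent people (point 2 above), consider a simple back-of-the-envelope calculation. All numbers below are hypothetical and chosen only to illustrate the base-rate effect:

    # Purely illustrative base-rate calculation; all numbers are hypothetical.
    population = 1_000_000
    offenders = 100                    # actual offenders: a base rate of 0.01 percent
    accuracy = 0.99                    # assumed hit rate and correct-rejection rate

    true_positives = accuracy * offenders                        # ~99 offenders flagged
    false_positives = (1 - accuracy) * (population - offenders)  # ~9,999 innocents flagged
    flagged = true_positives + false_positives

    print(f"Share of innocent people on the suspect list: {false_positives / flagged:.1%}")
    # Prints roughly 99.0 % - far above 95 percent, despite the "99 percent accuracy"

The point is the base rate: when the group being searched for is tiny, even a very accurate classifier produces a suspect list that is dominated by false positives.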
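
How scoring translates into unequal treatment (point 3 above) can be sketched in a few lines. The scoring formula and thresholds below are invented for illustration; real “customer lifetime value” models are far more elaborate, but the gating logic is the same:

    # Hypothetical sketch of score-gated offers; formula and thresholds are invented.
    def customer_score(yearly_spend: float, years_active: int) -> float:
        """Toy stand-in for a 'customer lifetime value' estimate."""
        return yearly_spend * years_active

    def offer_for(score: float) -> str:
        # The same product is offered on different terms - or not at all -
        # depending on the value a customer has been assigned.
        if score >= 10_000:
            return "premium offer, 20 percent discount"
        if score >= 1_000:
            return "standard offer, list price"
        return "no offer"

    print(offer_for(customer_score(5_000.0, 3)))  # high score: discount
    print(offer_for(customer_score(200.0, 1)))    # low score: no offer

Whoever writes the formula and sets the thresholds decides, in effect, how people are valued and treated – which is precisely the point of the criticism above.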

So what can we do? Obviously, we need a new and better digitalization. It should protect human rights and human dignity and offer us the best possible opportunities for development in a free society. It should put people at the center and support creative, social, and environmentally friendly action. Digital platforms and applications need value-sensitive design. For when it comes to the question of how to design the digital world, nothing less is at stake than sovereignty – that of individuals and of democratically legitimized institutions alike.