Machines, and the algorithms that make them work, began to understand simple English sentences in the 1960s. Later they learned to translate increasingly complex texts into hundreds of languages, filter our email, and recognize handwritten text. Today they can understand what we say and act accordingly, assist doctors, and beat us at strategy and logic games, including beating the best chess and Go players.
A bot is a computer program that imitates human behavior; a chatbot is a bot that simulates a conversation with a person. The former has existed for fifty years; the latter is now springing up everywhere. In the press these programs are referred to as artificial intelligence. I prefer to call them computer algorithms, because they perform a great deal of computation on top of the specific artificial intelligence (AI) techniques that are one part of computing.
The most advanced programs are capable of understanding human speech and holding a logical, pseudo-intelligent conversation in multiple languages, which has also enabled great advances in machine translation and brings the universal instant translator within sight. The capabilities of ChatGPT are a good example of what we describe.
But achieving this automatic understanding of information is not easy, because the program must not only read or listen to it, but also relate and interpret it in order to understand it and act or react accordingly. If someone mentions Seville in a conversation, the algorithm must be able to distinguish between the city and the football club. To do so it must not only process the language, but also have been supplied with a great deal of information, definitions of concepts, and background knowledge, so that it can extract the answer from context as a human would.
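To illustrate the kind of computation involved, here is a minimal sketch of context-based word-sense disambiguation in Python. It is a deliberately toy approach (a bag-of-words overlap score); the sense inventories and cue words are illustrative inventions, not drawn from any real knowledge base, and real systems use far richer models:

```python
# Toy word-sense disambiguation: choose the sense of "Seville" whose
# cue words overlap most with the words of the surrounding sentence.
# The cue-word sets below are illustrative, not from a real lexicon.
SENSES = {
    "city": {"andalusia", "cathedral", "river", "tourism", "spain"},
    "football club": {"match", "league", "goal", "stadium", "coach"},
}

def disambiguate(sentence: str) -> str:
    """Return the sense whose cue words best match the sentence context."""
    words = set(sentence.lower().split())
    # Score each sense by how many of its cue words appear in the sentence.
    scores = {sense: len(cues & words) for sense, cues in SENSES.items()}
    return max(scores, key=scores.get)

print(disambiguate("Seville won the match with a late goal at the stadium"))
# prints "football club"
```

The point of the sketch is the article's own: the program cannot decide between the two senses from the word alone; it needs prior knowledge (the cue words) plus the context of the conversation.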
Computer machines and algorithms are the future, and in many cases the present, and their advances provoke surprise, fear, and most often apprehension. Undoubtedly the greatest fear about computer algorithms is their hypothetical ability to make decisions against human interests. The reality is that, contrary to what some films tell us, machines and algorithms are unaware of their own existence and have no goals or feelings of their own. That capability is centuries away from our current technology.
But do algorithms need an ethical guide? UNESCO has taken the initiative: its 193 member states this week adopted a list of recommendations that constitutes the world's first ethical guide for AI. The guide can be consulted here.
One recommendation, number 26, is categorical: “AI systems should not be used for social scoring or mass surveillance purposes.” Another, number 36, notes that an AI system should never be able to replace ultimate human responsibility and accountability. “As a general rule,” the text adds, “life and death decisions should not be ceded to AI systems.”
But the debate goes deeper
Could there be a political party run by a computer algorithm and led by a chatbot? It may sound like science fiction or the plot of a dystopian series, but it is real: such a party already exists and is taking part in the election campaign in Denmark. It is called the Synthetic Party and is fronted by Leader Lars, a chatbot that any citizen can talk to.
The promoters of the Synthetic Party aim to stand in Denmark's next elections, and to that end they are demanding that the law be changed so that an algorithm can run for office. They are not asking for a group of people to stand for election guided by the algorithm; they want the algorithm itself to go to the polls and make the relevant policy decisions based on how it interacts with voters.
But what is its ideology: progressive or conservative? “The party is synthetic, which literally means that it homogenizes what seems contradictory or disparate,” explains Asker Bryld Staunaes, a member of the Computer Lars artist group and the MindFuture technology center, and the principal creator and promoter of the Synthetic Party.
There are similar movements elsewhere in the world; their candidates are called virtual politicians. Information about these movements and related articles can be found on Wikipedia.
But the question is: do we want a computer algorithm standing in elections? My answer is no, for several reasons. First, politics is a fundamentally human endeavor: it is the activity through which we make collective decisions. Politics is inherently conflictual, because there are conflicting interests in society, and so we are being deceived when someone advocates a synthetic, neutral politics that stands above those interests.
In line with the UNESCO recommendations, I believe there must always be a person responsible for an algorithm's recommendations, especially in politics. Likewise, I believe we should demand that a person take responsibility for every decision an algorithm makes, rather than hiding behind it, whether that decision is approving a mortgage, setting priorities in a hospital, or anything else.
I assert the importance of politics, of political participation, and of the need for politicians. To argue that a computer algorithm should make the decisions is to believe that neutrality is possible and that politicians should go. It is an anti-political stance that, if we think about it carefully, prefers to leave things as they are.
Miguel Toro is Professor of Computer Languages and Systems at the University of Seville.
Chronicles of the Intangible is a space for the dissemination of computer science, coordinated by the academic society SISTEDES (Sociedad de Ingeniería de Software y de Tecnologías de Desarrollo de Software). The intangible is the immaterial part of computer systems, that is, the software, whose history and development are told here. The authors are professors at Spanish universities, coordinated by Ricardo Peña Marí (professor at the Complutense University of Madrid) and Macario Polo Usaola (professor at the University of Castilla-La Mancha).