by Vincenzo Curion
Many of the possibilities of the contemporary information society begin with algorithms: procedures that solve a given problem through a finite number of elementary, clear and unambiguous steps, in a reasonable time. Their origin dates back to antiquity; the word itself derives from the name of al-Khwarizmi, the ninth-century Persian mathematician who collected and formalized procedures for solving algebraic equations. Today the importance of algorithms has grown thanks to the automation and mechanization that pervade our society. Computers and automatic and mechanical tools use algorithms to encode the behavior they must perform. These tools help determine the quality of our lives and, increasingly, our social organization. We should know algorithms and their peculiarities, because in this way we become more conscious of the reality surrounding us. Through a critical approach we can establish whether algorithms represent a threat or an opportunity for the quality of life of large sections of the population. It is important to overcome the idea that algorithms, and technologies in general, are only for experts, because their action reverberates on people’s lives.
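As a minimal illustration of what is meant here by “a finite number of elementary, clear and unambiguous steps”, consider Euclid’s algorithm for the greatest common divisor, one of the oldest recorded algorithms. The example is illustrative and is not drawn from the text itself:

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: a finite sequence of elementary,
    unambiguous steps that is guaranteed to terminate."""
    while b != 0:
        # Each step is simple and mechanical: replace (a, b)
        # with (b, a mod b) until the remainder is zero.
        a, b = b, a % b
    return a

print(gcd(252, 105))  # → 21
```

Every step is fully specified and requires no judgment from the executor, which is precisely what allows a machine, rather than a mathematician, to carry it out.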
To overcome this prejudice about the knowledge of algorithms, let us go back to the nineteen-sixties. When programming was born and the companies that today are ICT giants were taking their first steps, the cultural debate in the pedagogical field absorbed the demands for autonomy and emancipation and produced a series of ideas that opened a new potential for social transformation. These models converged in critical pedagogy, which developed a critical approach to society. It originated from critical thinking, a basic human ability and the foundation of the processes of adaptation and inclusion that every individual follows from birth. To understand the mechanisms underlying reality and to interact with them correctly, people use critical thinking, an ability that goes beyond analysis: it includes an evaluative phase, in which each person weighs every input in order to produce the response best suited to his or her social adaptation.
During the sixties and seventies of the last century, the ideas of social criticism changed and upset structures that had appeared consolidated. While information technology grew, society lived through the youth protests, becoming aware of the importance of free access to information and re-evaluating individual creativity1. These facts nourished the minds and the entrepreneurial spirits of the time. They fueled speculation about mass communication, which saw in the nascent computer industry a means of expression, a way to count, to distribute knowledge, to unhinge an asphyxiated social system. This contributed to redefining roles and paradigms. While entrepreneurs discovered the power of software and of the Net, critical educators became conscious that their action could uncover social contradictions and constraints in the field of education, extending the critical view to the entire social system that generates the educational system. They became aware that education is inscribed in the process of reproducing society as a “social relationship of power”, a relationship that politics subsequently turned to profit. The conclusions were: 1) the center of critical pedagogy became education as a whole, in particular that part of education considered a fundamental aspect of the “production and reproduction of social life”; 2) there is no society without its own pedagogy, and no pedagogy that does not follow a social vision. Consequently, it is impossible to determine the meaning of pedagogical action without reference to the concrete process of social reproduction.
With this conviction, critical pedagogy has over time raised the problem of the “basic material conditions of society” that delimit pedagogical relations. To improve pedagogical relations it is necessary to improve those material conditions, because the mode of production, society itself, and the distribution of power and domination are the central categories that allow a critical analysis of educational processes2.
Today we must realize that some alarming phenomena, such as misinformation, the affirmation of the logic of individualization, the strength of impulsiveness, and the growth of superficiality and prejudice, degrade human relationships.
To stop them we need to reconsider individual autonomy, the core of critical thinking. It is important to accept that some hypotheses of the critical thinkers of the sixties were wrong. The activities that should have fueled the education of citizens and promoted the birth of a social awareness, one that would push other people to study, to analyze and, ultimately, to become promoters of social renewal, did not produce the intended effect. The vision of those critical thinkers did not come to pass: there has been no regeneration of social dynamics. We are witnessing a relational degradation, caused in part by confusing the emancipation of the individual with the emancipation of the group. This misunderstanding produced dependence, if not outright oppression and enslavement, of both the individual and the group, within an existing society that works against individual autonomy. To begin thinking about these problems correctly, it is necessary to admit that any attempt to reach emancipation on a purely individual level is bound to fail. Autonomy is not the work of the “moral force of noble souls”. People must conquer autonomy in laborious processes of formation, both historical and ontogenetic.
These processes, together with the theoretical criticism of education, are the basis of human growth, an open-ended process under the permanent threat of social strategies of seduction. Responsible people must have a permanent project for their own emancipation. Starting from these admissions, we have some chance of confronting social challenges such as neoliberalism, globalization, multicultural society, isolation and “desolidarization”, westernization in Latouche’s sense, the subjugation of experience by the cultural industry, and so on. These menaces have very powerful weapons, capable of striking people, starving them, and disrupting the social fabric.3
Among these powerful weapons, the most exploited are the instruments and algorithms that elaborate information. They manipulate daily the data that sustain the contemporary data economy, according to mathematical models, in ways that are very often unknown and based on the choices of fallible human beings.
Many of these models have unfortunately codified human prejudices. Some of the software that controls our daily lives contains misunderstandings and systematic errors. These bugs can be very disruptive for everyone.4
Data are the microelements that compose and grant us information, the resource that supports most of the opportunities of contemporary society. Goods, services and people have corresponding informational images, preserved and disseminated thanks to a technology that allows us an almost unlimited knowledge of every single event that happens on Earth: the Web, capable of collecting and aggregating information from everywhere. The Web contains an incredible and continuously increasing amount of information. Our abilities to pay attention and to respond are limited, so we cannot and should not know everything; but our choices and actions must be free from the constraints of ignorance and guided by free will. Our choices should be ethical; they should harm neither ourselves nor other people. That is why each of us lives in the tension between ignoring certain phenomena and the utopia of holding infinite knowledge.
This tension justifies critical action, where the term “critical” has the meaning that Horkheimer gave it: “the theoretical effort to critically illuminate the current society in the interest of a rationally organized future society”5. That means action to protect humanity’s deepest and most extensive interest, according to criteria of rational order.
To live this tension consciously, making free and justifiable choices and actions, we must educate ourselves to critical thinking. People do not need a simple observation of “the world as it is”, but a critical approach that works to promote emancipation and social transformation where they are necessary. Critical education theory must nurture the individual’s sense of responsibility to meet the challenge of change. It must educate people to grasp the links that support the built social structures, teaching them to accept the risks and opportunities of reality.
“The emancipatory approach of pedagogy could serve also to decipher the pedagogical practice of society, that is, to relocate all pedagogical measures in the context of their historical-social conditions”.6 These conditions are also made by technology, because every technicality is born in a historical context, often as a sediment of previous knowledge. Staying focused on the historical period helps human critical judgment to control the growth of technology. To do this, people can use critical pedagogy, which develops its analyses and perspectives starting “from the present tendencies of the social process”, from the concrete material conditions of existence, and comes to think “the idea of society as a subject”. With this approach, critical thinkers can investigate all the technicalities underlying social strategies: technological discoveries, algorithms, artificial intelligence, machine learning, decision support systems, predictive algorithms, and all their social, political and pedagogical effects.
Unfortunately, when we use information technologies or algorithms, two different rights come into conflict.
The first is the developers’ right to intellectual property: to keep their secrets and to earn money from their work. Proprietary technologies, and developers’ choices made to preserve a friendly appearance, have buried every mechanism behind the interface. It makes sense to speak of a limitation of the user’s free will, a “coercion” for which no one answers. The same happens when the user has no choice of instrument, owing to the scarcity of discoveries and the lack of other means.
On the other hand, people have the right to use any tool responsibly, avoiding the side effects this use can cause; but lack of knowledge reduces their autonomy and leaves them with very limited control over the operations they perform.
To operate correctly, people must use technology ethically, with a critical thinking learnt through a critical education. Where there are barriers or objective impediments to understanding algorithms, despite the charm of the tools and the advantages derived from their use, people should exercise the same caution anyone would use in an unknown, hostile environment. Very often, narratives cover technology.
Think of the emotional and experience design that has fueled the discourse of marketing and brand identification. If these curtains fell, the user, conquering a critical knowledge, could understand the cognitive distortions induced by the use of that specific technology in that particular experience. He would recognize that no technique is as neutral as it appears: all of them influence reality.
Awareness of this non-neutrality could help ensure that the user maintains control of the technology, and not the other way around.
Today, the design of every object considers not only objective parameters but also subjective ones, such as the sensations, emotions, perceptions, ideas and values of potential buyers. In this way producers maximize the possibility of selling, while the buyers’ autonomy of judgment decreases dramatically, causing impulsive purchases. Learning to be critical about a tool, to understand how the intelligent object actually works, which functionalities we really need and which are superfluous, becomes all the more necessary as the object grows smarter and is bought and sold in an “intelligent” process.
Experts build mathematical and statistical models to deduce important facts and trends, studying the movements, desires and purchasing power of people, because these elements influence people’s choices, from the information they consider to the questions they formulate. Analysts make predictions about people’s reliability and potential as students, workers, lovers, criminals.
However accurate, experts’ analyses require the omission of important information, owing to choices about what to include, in order to simplify the world into a sort of small-scale version that is more easily understood. Sometimes a model’s simplification derives, brutally, from the excessive complexity of modeling the direct data. When this happens, analysts resort to vicarious data, so-called proxy data, from which they extrapolate their knowledge base. End users have no control over the statistical correlations established from data or proxy data. In some cases these correlations have proved discriminatory and even illegal.
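A hypothetical sketch may clarify how a proxy can smuggle a discriminatory correlation into a model even when the protected attribute is never used. All names and figures below are invented for illustration: a scoring rule decides only on a postcode, but because the postcode is correlated with group membership, the decisions end up correlated with the group as well:

```python
from collections import defaultdict

# Invented data: (postcode, protected_group, repaid_loan).
# The postcode is a proxy: group "g1" lives mostly in "A", "g2" mostly in "B".
applicants = [
    ("A", "g1", True), ("A", "g1", True), ("A", "g2", True),
    ("B", "g2", False), ("B", "g2", False), ("B", "g1", True),
]

def score(postcode: str) -> bool:
    # The rule never sees the group: it decides on the proxy alone.
    return postcode == "A"

# Approval rate per group: the proxy reintroduces the correlation.
approved = defaultdict(list)
for postcode, group, _ in applicants:
    approved[group].append(score(postcode))

for group, decisions in sorted(approved.items()):
    rate = sum(decisions) / len(decisions)
    print(group, round(rate, 2))  # g1 ≈ 0.67, g2 ≈ 0.33
```

Even though the protected attribute is formally excluded, the approval rates of the two groups diverge, which is the mechanism behind correlations that, as the text notes, have proved discriminatory and even illegal.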
The simplifications create blind spots in the models, caused by the evaluations and priorities of their creators as they face problems in the course of modeling. Every blind spot is like a void in a wall: it weakens the whole wall.
Models are essentially opinions rooted in mathematics. Simple models sacrifice accuracy and judgment in the name of efficiency, while their reputation for impartiality is a reflection of goals and ideologies. Another problem is that models have a life cycle, like all human artifacts: a birth, an evolution and a death. Models built today will not work as well tomorrow. Constant updating is necessary; otherwise they age. What guarantees that updates are constantly made? What proves that past deviations do not weigh on unlucky users forever? What certainty is there that knowledge bases are updated with correctly acquired data?
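The aging of a model can be shown with a deliberately minimal, invented example: a threshold “trained” on yesterday’s data keeps being applied after the world it described has changed:

```python
# Invented sketch of model aging: a decision boundary calibrated on
# old data is applied unchanged after the data distribution shifts.
threshold = 50  # boundary "learned" when the true rule was x > 50

def predict(x: int) -> bool:
    return x > threshold

# The world changes: the true boundary has moved to 70,
# but the model has not been updated.
new_data = [(x, x > 70) for x in range(0, 100, 10)]

errors = sum(predict(x) != label for x, label in new_data)
print(f"errors on shifted data: {errors} / {len(new_data)}")
```

Nothing in the model signals its own obsolescence; only an external check against fresh, correctly acquired data reveals the errors, which is exactly why the questions about guaranteed updates matter.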
Finally, today we have large amounts of data available, and it is undeniable that we can find almost anything in them. For what purposes are the available data correlated?
Statisticians, when they can intervene at all, do so only a posteriori, in a correction phase. Updates and adjustments are necessary to create a “dynamic model”, formalized within the organization that operates it. For real transparency, the model would have to be transparent outside the organization as well, towards users. This is impossible: transparency towards users would destroy the organization’s competitive advantage, because it would mean sharing its expertise with everyone else.
We must therefore take it as a fact that there will always be algorithms with hidden errors, the result of modeling and simplification operations that are not always up to date.
Even in the age of Big Data, correct modeling remains a problem that only trained, competent and gifted human beings can solve. Since the existing society works against autonomy, people can reach it only through resistance to the social mainstream.
It is important to know how to sacrifice the aesthetics of goods in the right measure, to distinguish when an object is useful and when it is the incomplete objectification of an answer to a deeper question of the individual. It is necessary to recognize how the need for an object, or the need to experience a particular service or behavior, has been created by news disseminated “ad hoc”, selected on the basis of what we have told about ourselves through our behavior in social networks, online or offline.
To stop these disruptive phenomena, a critical theory of formation is necessary, one that discusses the problems revolving around information and the dangerous imbalance created in the relationship between users and technology.
To find countermeasures, first critical thinkers and then the whole society must accept that the fundamental transformations in social policy and education reform have remained far below expectations. Progress has blocked the democratization of social relations, and we are now witnessing a worrying regression of potential democrats and a deeper division of society. A different model of economic accumulation causes it. The capacity for valorization and value creation is increasingly dislocated to the technical-intellectual sector, whose systems managers can continuously perfect their offers of services in an ever more widespread and specific manner. To do this, they need to know every single person better and more deeply. Thus they profile everyone through processes, perhaps too many, that collect data from billions of users. These “wild aggregations” create bases of knowledge that allow continuous refinement of the systems, which are designed to incorporate an ever-increasing amount of data, refining the analysis techniques and earning ever more.
Autonomy of judgment exposes one to a certain risk of prejudice. This risk can be reduced to a threshold of tolerance in a dialectical process between individual and reality. It is not dangerous to talk about one’s prejudice; the danger lies in those who nurture prejudice silently, avoiding confrontation. In the case of intelligent technologies used for decision-making, feedback makes the difference; but even when feedback is provided, its review may not take place in real time, may not occur with due discrimination, and may be organized in a non-transparent or even unjust manner.
Therefore, although solutions that adopt artificial intelligence and machine learning are innovative, they are not free from errors. The machine performs a learning that always derives from a programming, like a discrimination to apply to the inputs it receives. The machine does not reflect on this discrimination; if it did, it would be able to think about consequences like a human being. We therefore need the human action of the system administrator, who works to modify the parameters of that discrimination.
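The division of labor described above can be sketched in a few invented lines: the machine applies its learned “discrimination” (here reduced to a single threshold parameter) blindly to every input, while only the human administrator judges the consequences and adjusts the parameter:

```python
# Invented sketch: the learned "discrimination" is a parameter the
# machine applies mechanically; only a human decides if it is acceptable.
scores = [0.2, 0.4, 0.55, 0.7, 0.9]  # hypothetical risk scores

def classify(scores, threshold):
    # The machine applies the rule; it does not weigh the consequences.
    return [s >= threshold for s in scores]

print(classify(scores, 0.5))  # → [False, False, True, True, True]

# The administrator, judging the social cost of false positives
# too high, raises the threshold by hand:
print(classify(scores, 0.8))  # → [False, False, False, False, True]
```

The machine behaves identically in both runs; everything that changed, and everything that could be held responsible, lies in the human choice of the parameter.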
There is much evidence of the importance of human action. From the point of view of market logic, given the pervasiveness and engineering of the instruments, the original discourse that supported capitalism collapses. Originally it was true that “by obtaining the interest of each one we obtain the common interest, because it is the result of the mediation between interests”. With current artificial intelligence systems, instead, “everyone’s interest is their own” until the system becomes unsustainable for everyone. This is the theoretical proof that pursuing one’s own interest as an end in itself, disregarding the interests of others, is not the economically convenient and sustainable solution, neither for those who suffer it nor for those who practice it. In the long term, this problem can also negatively affect politics.
Although a new industrial revolution is taking place, with machines able to dialogue with each other and to support once unthinkable work models, we do not live in technocracies. Experts estimate that current and future wars are and will be fought in cyberspace, but humanity strives to maintain its central position with respect to the fate of the planet. This means that any government, even a non-democratic one, remains a prerogative of humanity, and must deal with a physiological limitation of its governmental action7.
Taking into account that a government has a two-way relationship of influence with technology, increasing or reducing the resources available for research, productivity, competitiveness and education in the system it manages, do people move in the direction of greater democracy? Is there a possibility of acting on technology to direct its actions? Can the power of technical lobbies influence politics and steer it towards convenient choices?
In a condition of obvious information asymmetry, everyone’s choices may be highly impulsive and short-lived, because there are no stable references. The failure to understand complexity triggers aggression and confirms what the Nobel laureate Gary Becker calls crime: “a rational behavior in a situation of uncertainty”8. Paradoxically, the predictive crime algorithms in use in some areas crystallize the initial vision and trigger processes that increasingly consolidate expectations.
If a neighborhood is deemed at risk, the increase in patrols raises the spiral of arrests, even for minor problems, confirming the bad reputation. Technological discovery, which should fuel progress by acting on the causes of a problem, ends up creating discrimination and division, hiding the effects of the problem.
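The self-confirming spiral just described can be made concrete with a toy simulation, where all numbers are invented: patrols are allocated in proportion to past arrests, each patrol produces new arrests, and the new arrests attract yet more patrols:

```python
# Toy feedback-loop simulation (all figures invented): a small initial
# disparity in recorded arrests is amplified year after year, because
# patrols go where arrests were recorded and themselves generate arrests.
arrests = {"north": 10, "south": 12}  # slightly unequal starting point

for year in range(5):
    total = sum(arrests.values())
    for zone in arrests:
        # Patrol share proportional to past arrests; 20 new arrests
        # per year are distributed according to that share.
        patrol_share = arrests[zone] / total
        arrests[zone] += round(20 * patrol_share)

print(arrests)  # the initial gap of 2 has widened considerably
```

The model never observes actual crime rates, only its own past outputs, which is precisely how an algorithm can “crystallize the initial vision” rather than correct it.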
If we want to resist these worrying phenomena, we must hone our knowledge and technical skills. Ignorance of the mechanisms behind even a simple web page lowers each reader’s threshold of vigilance and allows false news to spread by word of mouth, feeding malfunctions in the network. To improve, each user will have to develop a critical eye for new tools, learning to use them for his or her own emancipation.
Therefore the learning process and critical education, the process of building critical thinking, are once again central. Is anyone in the information society working on these processes of learning and critical education with the aim of emancipation from the role of being “used by technology”? Theoretically, yes: the hacker. That passionate builder and rebuilder of existing technology, circuits, programs, mechanisms, who tirelessly strives to learn to interpret technology for his own purposes, proactively, driven by the desire to unhinge a social system that, through its own technical means, transmits values and triggers inequalities. The hacker aims to emancipate himself from the subordinate condition of the technology user.
In perfect analogy with the objective of critical education theory, ethical hackers aim at individual emancipation and the development of a strong ego, a stable and capable identity, so as to avoid falling into the fictitious, self-enclosed worlds of an experience reduced to sensations and impressions.
By working in organized, usually secret, communities of practice, hackers mature the skill and pragmatism of doing things. They gain the resilience that is lacking in other sections of the population.
Can these communities of practice be the model, aimed at emancipation, of an educational style that on the one hand fully takes into account the dangers and constraints of socialization, and on the other allows the transition to a critical education of the person?
Could this propensity to do be a medicine for the spirit? There is no definitive answer, but a reflection can be useful. We are still at the beginning of the study of the hikikomori phenomenon, but if we compare the hacker with the hikikomori, we see for both the omnipresence of the machine, yet diametrically opposed roles. The hacker is active in changing technology, gradually consolidating his identity. The hikikomori has a passive attitude towards technology, annihilating his own identity and life. Hackers work on technology, while hikikomori suffer it.
The only possible conclusion is that we need awareness and an education to complexity in order to achieve emancipation. If we renounce these, and treat mathematical models as if they were a neutral and inevitable force, like time or tides, we renounce our responsibility and our future.
Bernhard A., “Pedagogia critica: tendenze di sviluppo e progetti per l’avvenire”, Collana di Studi Internazionali di Scienze Filosofiche e Pedagogiche, Studi pedagogici, n. 1/2006.
O’Neil C., Armi di distruzione matematica, Giunti Editore SpA / Bompiani, 2016.
Odifreddi P., Incontri con menti straordinarie, TEA SpA, Mondadori SpA, 2007.
Mezza M., Algoritmi di libertà. La potenza del calcolo tra dominio e conflitto, Saggine, n. 305, 2018.
Aa. Vv., Umanistica_digitale, Ed. Oscar Mondadori, 2014.
Lévy P., L’intelligenza collettiva: per un’antropologia del cyberspazio, Feltrinelli, 1996.
De Kerckhove D., Architettura dell’intelligenza, Testo & Immagine, 2001.
Mittelstadt B. D., Allo P., Taddeo M., Wachter S., Floridi L., “The ethics of algorithms: Mapping the debate”, https://journals.sagepub.com/doi/pdf/10.1177/2053951716679679
Ethically Aligned Design, First Edition, https://ethicsinaction.ieee.org/
1 M. Mezza, Algoritmi di libertà. La potenza del calcolo tra dominio e conflitto, Saggine, n. 305, 2018.
2 A. Bernhard, “Pedagogia critica: tendenze di sviluppo e progetti per l’avvenire”, Collana di Studi Internazionali di Scienze Filosofiche e Pedagogiche, Studi pedagogici, n. 1/2006.
3 A. Bernhard, “Pedagogia critica: tendenze di sviluppo e progetti per l’avvenire”, cit.
4 C. O’Neil, Armi di distruzione matematica, Giunti Editore SpA / Bompiani, 2016.
5 A. Bernhard, “Pedagogia critica: tendenze di sviluppo e progetti per l’avvenire”, cit.
7 Let us assume that the best form of government is the democratic one. In an interview, the Nobel laureate Amartya Sen shows that Arrow’s theorem, which has passed into the collective imagination as “the mathematical proof that democracy does not exist”, derives from excluding, directly or indirectly, “the use of a piece of information that a functioning democracy should take into account”. When the restriction on information is removed, and democratic procedures are broader and more permissive from the informational point of view, the impossibility results disappear.
8 P. Odifreddi, Incontri con menti straordinarie, Longanesi, TEA SpA, Mondadori SpA, 2007, p. 27.
Vincenzo Curion is a telecommunications engineer and a teacher of technology at first-level secondary school, as well as an assistant in “Image education”. He collaborates with the Department of Humanities of the University of Naples “Federico II” on scientific research into gender pedagogy and new technologies, with particular attention to the digital divide, smart cities for women, and training in the STEM field. He also collaborates in the research activities of the Chair of Pedagogy of Family Relations on the themes of smart cities, equal opportunities and reconciliation policies.