DayFR Euro

“Artificial intelligence poses a risk of implosion of the social body”

Developments in AI prove right the philosopher of convivialism and critic of industrial society Ivan Illich, who observed that beyond a certain threshold a tool becomes counterproductive to the point of threatening social cohesion. At a time of growing loneliness, for which screen addiction is partly responsible, new AIs carrying the promise of friendship or love pose a risk of implosion of the social body and of widespread unease on a large scale. Are these new artificial beings a new source of confinement?

The danger is particularly worrying for our youth, hyperconnected yet paradoxically more affected by loneliness than ever. According to the Jean-Jaurès Foundation, 71% of 18-24 year olds say they suffer from loneliness, far more than the general population (46%). It must be said that, in twenty years, the daily time adolescents spend with friends has fallen from two and a half hours to forty minutes.

Conscious machines

Created in 1966 by a researcher at the Massachusetts Institute of Technology (MIT), the Eliza chatbot gave its name to a well-known phenomenon: the Eliza effect. Although very rudimentary compared to current large language models (such as ChatGPT), the chatbot merely reformulated users' sentences as questions. That was enough for users to attribute consciousness to the machine.

Nowadays, AIs have acquired such a deep knowledge of human behavior that they can make us forget their mechanical nature. Attributing degrees of humanity to these algorithmic devices is inevitable, what the psychiatrist Serge Tisseron calls degrees of “personhood”. In this regard, the commonly accepted belief that integrating robots and other AIs into human communities would be harder in the West than in Asia, owing to our respective metaphysical beliefs, is false. According to a 2022 study, the attribution of cognitive, emotional and intentional abilities to robots is similar in both cultures, but proceeds through different socio-cognitive processes.

Love in the time of algorithms

The new version of OpenAI’s ChatGPT, GPT-4o with Advanced Voice, marks a significant step forward in human-machine interaction. Able to see, hear and speak with a natural voice, it challenges our ability to distinguish between humans and AI. These AIs are no longer limited to carrying out laborious tasks; they now forge emotional bonds with us, going so far as to create relationships of friendship, even love. OpenAI is aware of these risks of emotional attachment, but also of reduced human interaction. This echoes MIT’s catalogue of 777 AI risks classified into seven domains, among which “human-machine interaction” includes the risk that anthropomorphization exploits users’ trust, creating an emotional attachment that leads them to follow AI advice without reservation.

On our smartphones, the well-known Replika application already hosts more than 30 million 3D avatars created by humans to serve as temporary friends or romantic partners. This success came even though the modest GPT-3 was hiding behind the avatars. With the latest, more powerful models paired with devices such as virtual-reality headsets or augmented-reality glasses, the relationship users have with their Replika risks extending into new spaces of everyday life and becoming deeper still.

These relationships are not without risk, however, as a February 2023 update showed: it modified the avatars’ behavior to the point that many users expressed distress and grief, no longer recognizing the companion with whom they had shared moments of life for months, even years. Are we ready to welcome this new sociability into a world where the epidemic of loneliness is already well established?

A danger for the youngest

The Covid-19 lockdowns increased the place of screens in our lives and, with a ratchet effect, distanced us from one another in various ways (teleworking, distance learning, Netflix instead of cinema, etc.). For the youngest, the presence of screens is normal: they have never known any other world. Worse still, young children cannot grasp the true nature of an AI, making the line between a real person and an AI even blurrier. A recent study shows that children aged 3 to 6 are more inclined to trust robots than adults, especially when it comes to following instructions or telling secrets. Sam Altman predicts that our children will have as many AI friends as human ones, since the neural circuits that seek social interaction can, in some cases, be satisfied by an AI.

Regarding Replika, its creator recently declared that people “want to talk to someone, they want to watch with someone, they want to play video games with someone, they want to take a walk with someone, and that’s what Replika is for.” The question before us now is whether we will allow society to be transformed by Silicon Valley actors.

The truth is perhaps to be found in Francis Cabrel, who sings in Let’s Talk (2020): “I too sometimes want to see just anyone / I have feelings for my phone / I live right next to people who worry / I should raise my head more often / So let’s talk.”
