Teenagers becoming friends with an AI: good or bad idea?

An AI psychologist to ease loneliness and isolation? That, at least, is the solution many teenagers have turned to on Character.AI, a site that draws nearly 3.5 million users every day.

Positive points

The service hosts everything from entirely invented characters to representations of celebrities and historical figures, as well as AI psychologists that chat with users and answer their questions 24 hours a day, seven days a week.

The Verge recently looked into this phenomenon and noted that some adolescents spend up to 12 hours a day on the platform. They say it is easier to talk to a machine than to a human, even if they also admit to a form of addiction to these chatbots.

For the most part, though, these exchanges are experienced positively. Frankie, a 15-year-old Californian user quoted by The Verge, puts it this way: “I have some mental issues that I don’t really want to talk to my friends about, so I use my bots as free therapy.”

The Psychologist chatbot is one of the most popular in this category. Since its creation, it has already received nearly 100 million messages. Created by the user @Blazeman98, it is reportedly trained in cognitive behavioral therapy, a talk-based approach to treating mental disorders.

Limits of AI

In its testing, however, The Verge was unable to establish the bot’s reliability. It reportedly even offered diagnoses for conditions such as bipolar disorder and depression, which require far more rigorous assessment protocols.

That doesn’t stop Kelly Merrill Jr., an assistant professor at the University of Cincinnati, from seeing the glass as half full:

Research shows that chatbots can help alleviate feelings of depression, anxiety, and even stress. But it’s important to note that many of these chatbots haven’t been around long and their capabilities are limited. At present, they are still often wrong. Those who lack the AI knowledge to understand the limitations of these systems will end up paying the price.

Among the other risks The Verge points out: adolescents could become so accustomed to the comfort of conversations with machines that they give up on conversations with human beings.

Back in 2023, Fast Company journalist Ainsley Harris had already pointed to a flaw in these “friendly” relationships between children and AI: the interactions have a commercial dimension, since the chatbot is built by a company that, logically, seeks to turn a profit. Users will ultimately have to choose between paying a monthly subscription or being targeted with advertising if the product is free.
