“I'm afraid to go see a psychologist”: for some, ChatGPT becomes an ear to confide in

One evening, in conversation, we learned, not without surprise, that many people turn to ChatGPT with their mental health concerns. But why do these individuals, often facing complex personal difficulties, choose to confide in the conversational agent (or chatbot) developed by OpenAI rather than in a professional psychologist? The question raises several issues: the trust to be placed in artificial intelligence (AI), the limits of its use, and the human, informed dimension of therapy.

For Quentin*, 30, as for Camille*, 28, it has become an almost daily reflex. “I am an executive and my work is emotionally draining,” the latter explains. “My colleagues are not pleasant, my boss demands a lot of us, and I have deadlines that would require seventy-two-hour days. On top of that, I have just separated from my partner and I feel alone. I am aware of being privileged by my ‘social status’ and I am ashamed of feeling bad… I don't want to talk about it to those around me, and I am afraid to confide in a psychologist. How could that person, however professional, not judge me when others, to whom ‘real misfortunes’ happen, seek their services?” So one day, like Quentin, Camille began using her ChatGPT app in a whole new way… For them, no more professional psychologists: it's “hello, boo-boo robot”.


Dealing with the fear and taboo of going to see a professional

For Stéphie Marius Le Prince, a psychologist based in Pignan (Hérault), this new “imaginary friend” could help remove a barrier, a blockage, before consulting a professional. “For a person who is afraid to take the plunge, having initial help, however limited, can lead them to consult later,” she believes. “We have already seen this with online consultations with professionals. Once people have gone through that, some then walk through the door of an office. What's more, on medical matters, ChatGPT always invites you to see a professional.”

For the moment, Quentin does not feel ready. “I have several intersecting issues and I do not want to consult,” he admits. “For lack of time, perhaps for lack of courage too. I find that you have to be really brave to dare to bring everything into the open and take the risk of uncovering other problems you didn't necessarily think about at the start. Perhaps one day, I hope, I will get there. In the meantime, I get quick answers. I know it's not a human answering me, but it helps a little.”

“When I need to, I take my phone out of my pocket, unlock it, get it all off my chest and move on.”

Camille, 28 years old, regular ChatGPT user

A behavior the Hérault-based psychologist can explain. “ChatGPT is a summary of information that remains very nuanced,” Stéphie Marius Le Prince concedes. “It doesn't tell you what to do or give its opinion; it advises. Therein lies the risk: some people who need to consult may keep putting it off, because they treat it as a crutch. ChatGPT allows them to keep walking in pain for a while longer before they consult.”

What Camille and Quentin appreciate also lies in the immediacy. There is no need to find a practitioner who suits you or whose hours match your availability. No need to travel either. “When I need to, I take my phone out of my pocket, unlock it, get it all off my chest and move on,” Camille says.

A risky flirtation between dependence and the limits of AI

The fact that this practice reaches a vulnerable public, sometimes struggling with loneliness (or even, in certain cases, with addictions), can foster dependence. According to Stéphie Marius Le Prince, that is where the danger insidiously hides. “Some users may feel they are chatting with a real person,” she warns. “This can lead to a relationship of dependence and to consulting the chatbot for every decision one has to make. ChatGPT then becomes a sort of guru.”

“Nevertheless, nuanced responses mitigate the ‘danger’ that such exchanges could pose,” the psychologist continues. “To take a concrete example, if you want to change jobs, the feedback will not be a flat ‘yes’ or ‘no’. It will rather lay out the possible consequences of resigning, the chances of finding something else afterwards, and so on. In short, ChatGPT shares what can be put in place to find solutions to your situation, but it does not offer a single answer.”

Camille unfortunately confirms this analysis: she told us she now tends to use it “a little too regularly, sometimes frenetically”, even having, from time to time, the impression of talking to a trusted friend. A forgetting of the artificial, as if something warm were embodied, immaterially and non-organically, inside her smartphone.

“ChatGPT stays on the surface, so the risk is minimal. But whatever happens, you must be accompanied by a qualified professional.”

Stéphie Marius Le Prince, psychologist based in Pignan (Hérault)

However, there is no need to panic: AI-powered chatbots are nowhere near replacing professional psychologists. And for good reason. The first big difference, as Stéphie Marius Le Prince points out, is the impersonal nature of the companion. Since the user is not dealing with a human being endowed with reason, it is impossible to build support based on understanding and knowledge of the person.

“In psychotherapy, the professional gradually gets to know the patient and their personality better and better,” the psychologist elaborates. “They can understand what lies behind a question. Sometimes the patient comes in with one problem, and by the end of the consultation another one regularly comes to light. There is a depth that only a human can grasp. As I said earlier, ChatGPT stays on the surface. That is also why the risk remains minimal. But whatever the case, I cannot say it enough: you must be accompanied by a qualified professional.”

Confronted with these analyses, Camille and Quentin both say they are aware of these realities but not ready, for the moment, to walk through the door of a practice. Their fears remain the same: judgment, the lack of immediacy in responses, and the gaze of others if those around them learned they were consulting. “There is still a taboo around mental health support in certain social circles, and that is the case in mine,” Quentin says. “Where I come from, it just isn't done. You have to keep going and, above all, not complain. I don't know how long I will hold out, but I think that one day I will take the plunge, in secret.” One thing is certain: ChatGPT holds secrets that society knows nothing of.

* First names have been changed.
