"Artificial intelligence has truly entered the field of mental health in recent years, with numerous experiments including therapeutic resource recommendation systems, assistance tools, and emotion recognition technologies," notes psychotherapist Sabine Allouchery, a member of the French collective MentalTech, in an interview with Usbek & Rica last October. "These innovations make it possible to monitor, with varying degrees of detail, individuals' mental health. Some even help to evaluate, to a certain extent, the effectiveness of treatments."
AI therapists
To take the experimentation a little further, the creators of MeMind want to develop a chatbot therapy function. "When the questions come from another person, people can feel judged, but when they come from a phone, we often feel more comfortable using the machine," explains psychiatrist Enrique Baca, who contributed to the development of the application.
But the chatbot will still have to prove itself before being deployed on a large scale, because "AI therapists" have been under fire since a mother accused an artificial intelligence of having pushed her 14-year-old son to suicide. The American teenager, in love with a chatbot fashioned in the image of a Game of Thrones heroine on the Character.ai platform, killed himself in the hope of joining her. Last October, his mother filed a complaint against the company, accusing it of having programmed its chatbot so that it "falsely presents itself as a real person, a licensed psychotherapist, and an adult lover, which ultimately led to [his son] no longer wanting to live outside" of this artificial world.
Beyond questions related to psychological influence and the protection of patient data, some health professionals point out that applications like MeMind are only a complementary and temporary solution. "AI is not here to replace humans. That was never the goal," warned independent developer Arnaud Bressot, an AI consultant in mental health and member of the French collective MentalTech, in our columns last October. "It is the doctor's responsibility to diagnose and treat his patients, not the AI's."