An AI as a psychologist? Report questions future use of AI in mental health


In a report published on Thursday, October 10, the MentalTech collective examines the arrival of AI in mental health care. Several experts are calling for the deployment of these technologies to be supervised.

Anticipating the emergence of artificial intelligence as an aid to mental health professionals… The association is setting out the conditions. MentalTech, which brings together several players from the tech and mental health sectors, has just published a report on the subject.

The French collective aims to support the deployment of ethical digital solutions in the field of mental health. For this report, the association interviewed experts on the issues raised by using AI as a consultation aid in the mental health sector.

Uses of AI

The pandemic took a toll on the mental health of the French population. Santé publique France recorded a sharp increase in cases of depression, particularly among young people: 11.7% of young people reported having experienced depression in 2017, a figure that jumped to 20.8% in 2021. In step with this increase, demand for psychiatric care has risen sharply.

Frightening as it may seem, the mental health sector will most likely not be spared by the democratization of AI. The technology could also represent an opportunity, since it would allow more efficient diagnosis: depending on the patient's symptoms and condition, it could suggest the most suitable treatment.

The report nonetheless expresses some reservations. Overusing the technological tool risks creating dependence, and basing a diagnosis solely on the machine's output offers no guarantee. The report specifies that an examination carried out by a professional will be more appropriate in certain situations.

What the report recommends

The report calls for safeguards to be built into the design of these AI systems. Among them, the authors suggest that health professionals be involved in the development of the algorithms; they want the training data used by the AI to come from professional practice.

Beyond development, the authors of the report want to guarantee a human control loop: health professionals who use AI will have to verify the conclusions produced by the machine. To that end, it is suggested that doctors receive training covering the advantages and drawbacks of using this technology.

On the user side, the report calls for total transparency about the use of AI. Patients must be able to choose whether or not to use these recognition tools, and the report recommends a personalized onboarding process adapted to any apprehensions they may have about this AI.

MentalTech seeks to anticipate the advent of AI in medicine, seeing it as an opportunity to facilitate the diagnosis of mental disorders while keeping in mind that such a system is not infallible. According to the WHO in 2022, "71% of people with psychosis worldwide do not benefit from mental health services".
