An AI can now listen to consultations and help doctors

On the Doctolib platform, a virtual assistant can now listen to consultations, transcribe what is said, and help doctors by producing a summary.

When you go to the doctor, be aware that from now on everything you say may be listened to by an artificial intelligence. The Doctolib platform has just launched a “virtual assistant” that could transform medical consultations. Doctors who use it must first obtain the patient’s consent. The principle of this “consultation assistant”, launched a few days ago, is as follows: everything you say to your doctor is listened to, transcribed and analyzed in real time by the AI, which sorts through what you say: the reason for the consultation, any medical history, your lifestyle, current treatments, and so on. At the end of the discussion, the virtual assistant produces a short summary, and all this data automatically populates the patient’s file.

The benefit for the doctor: being able to concentrate fully on what the patient is saying, with no more need to take notes and less fear of missing an essential detail. It also saves considerable time, which means more patients seen in a day. Given that 40% of doctors’ time is spent on tasks with low added value, such as paperwork, medical reports, administration and answering emails, delegating these to AI frees up more time for the doctor to care for patients, which remains the heart of the job.

The tool was tested for several months with 350 doctors. It has now been rolled out on a large scale for general practitioners, and will be extended gradually to specialists. The question of the acceptability of this type of tool will also arise, given fears about the confidentiality of the data collected, even though Doctolib insists that nothing is recorded and no element is stored, because the conversations are deleted after the consultation. Will that be enough to reassure patients?

We can’t stop progress: an AI that listens to doctors’ conversations – 10/28

Beware of the risks of AI

This is a first step, but AI in medical consultations could go much further. Listening and transcribing is useful, but the next step would be for the AI to make a diagnosis as well. Could this AI, which today summarizes what I say, also produce a diagnosis that supports or contradicts the doctor’s, for example? There have already been tests in which, based on symptoms, ChatGPT was better than human doctors at making a diagnosis. The idea is to use AI as additional support to avoid errors: missed diagnoses, drug contraindications, and so on. This is not easy for doctors to accept, because it somewhat challenges their position of authority, but we will get there little by little, if only because it is in the patient’s interest. Going further, one could even imagine a button on Doctolib indicating whether the doctor you are about to see uses AI, and therefore offers a digital eye in addition to their own.

But there is a risk of quickly reaching a point where people self-diagnose using ChatGPT and no longer go to the doctor. That is when it becomes dangerous, and it should absolutely be avoided. As in every field, AI has an unfortunate tendency to hallucinate, giving an answer even when it does not have one. In one study, ChatGPT was asked very specific questions about cardiovascular risk prevention. It answered 21 out of 25 questions correctly. But in the four cases where it got it wrong, it gave advice that could potentially have killed a patient! The results were similar for questions about breast cancer screening: 22 out of 25. Its rate of correct answers is impressive, but when it makes a mistake, it talks nonsense, going so far as to invent fake articles and fake scientific references. In an area as sensitive as health, this is clearly not acceptable.
