On September 13, 2024, Professor Jean-Christophe Bernhard, head of robotic surgery in the Urology department at Bordeaux University Hospital, assisted by Dr Gaëlle Margue, performed a total nephrectomy with caval thrombectomy, a complex intervention assisted by robotics and guided by augmented reality. This operation was broadcast live from the University Hospital to the Palais des Congrès in Bordeaux during the European Congress of Robotic Urological Surgery (ERUS), an event which brought together more than a thousand participants from all over the world. Professor Bernhard discusses this technological advance which, he believes, should ultimately become “the standard of care” for surgical interventions.
Can you tell us about the origins of this project?
This project is part of a broader program, the RHU Digital Urology 3D (Hospital-University Health Research program), funded by the government’s Investments for the Future plan via the National Research Agency (ANR) and part of the national France 2030 strategy. In this context, we have formed a consortium with several academic partners (Grenoble University Hospital, CNRS, the University and IUT of Bordeaux) and private partners (Fujifilm France, SurgAR, Rescoll, Sophia Genetics). The goal is to develop and evaluate three-dimensional technologies to improve renal surgery, specifically in the treatment of kidney cancer.
One of the major innovations of this project is the integration of virtual and augmented reality technologies into robotic surgery. We worked with our partners SurgAR and Fujifilm on creating digital twins of the patient from preoperative imaging (CT scans, MRI), in order to superimpose them in real time on the image that the surgeon sees during the procedure.
Concretely, how does this augmented and virtual reality technology help the surgeon during the operation?
When we perform robot-assisted surgery, we operate via a camera that lets us visualize the inside of the patient’s body. This stream of images is captured in real time and processed by a dedicated computer, which registers the patient’s 3D model, his “digital twin”, onto the real view of the operation. The 3D model, created from preoperative images such as CT scans, is thus superimposed on the intraoperative view in real time. The registered image is sent back to the console of the surgical robot, allowing the surgeon to visualize this superposition during the operation.
The research and development work focused on this real-time registration phase, because it must happen in milliseconds, without delay, so that the surgeon can act fluidly and precisely. This automatic registration is what makes the integration of augmented reality into surgery possible. It goes beyond virtual reality, where the surgeon sees the 3D model alongside the image of the operation, to augmented reality, where the two images are merged into a single view. This gives the surgeon a much more detailed and precise picture of the patient’s anatomy during the operation, which aims to reduce the risk of error and improve the quality of the procedure.
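At its core, the registration step described here is a pose-estimation problem: aligning the preoperative 3D model with what the camera sees. The actual SurgAR pipeline is proprietary and far more sophisticated (it handles deformable organs and live video), but as a loose illustration of the principle, here is a minimal rigid, landmark-based registration using the classic Kabsch algorithm, followed by a simple overlay blend. The function names and the landmark-based setup are illustrative assumptions, not the actual system.

```python
import numpy as np

def register_rigid(model_pts, observed_pts):
    """Kabsch algorithm: find rotation R and translation t such that
    R @ model_point + t best matches the corresponding observed point
    (least-squares over all landmark pairs)."""
    mu_m = model_pts.mean(axis=0)
    mu_o = observed_pts.mean(axis=0)
    # Cross-covariance of the centered point sets
    H = (model_pts - mu_m).T @ (observed_pts - mu_o)
    U, _, Vt = np.linalg.svd(H)
    # Sign correction to guarantee a proper rotation (det = +1, no reflection)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_o - R @ mu_m
    return R, t

def overlay(frame, model_mask, alpha=0.4):
    """Alpha-blend the projected model silhouette (a boolean mask)
    onto a grayscale camera frame, brightening masked pixels."""
    out = frame.astype(float).copy()
    out[model_mask] = (1 - alpha) * out[model_mask] + alpha * 255.0
    return out.astype(np.uint8)
```

In a real system this alignment runs on every frame against tracked anatomical features, and the recovered pose is used to render the full 3D model into the surgeon’s console view rather than a flat mask.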
Complementary to intraoperative ultrasound at this stage of development, the technology could ultimately supplant it. Intraoperative ultrasound involves inserting a probe into the patient’s body to visualize, through the vessel wall, the position of the thrombus in real time. This method is currently essential, but it has limitations in terms of learning curve, equipment availability, and time. By directly superimposing the patient’s 3D model onto the actual view of the operation, the surgeon can instantly see where critical structures are located, such as the thrombus in the vena cava.
To use a road analogy, it’s a bit like a surgical GPS. In a car, your GPS guides you along the best route and shows it to you on a map: that corresponds to virtual reality. With augmented reality, the GPS information is displayed directly on your windshield, superimposed on the road, showing you the way to follow, such as where to turn left.
Was September 13 the first operation with this device?
We had already carried out several dozen tests before this live demonstration. From September 11 to 13, we co-organized the European Congress of Robotic Urological Surgery (ERUS) in Bordeaux, which brought together more than a thousand participants from all over the world; it was the largest international robotics congress. As is traditional at this congress, live surgeries are performed: we operate in our operating theater and the procedure is broadcast live.
This surgery was therefore broadcast live at the Palais des Congrès in Bordeaux, and it presented several particularities. The patient had a kidney tumor invading the vena cava, the largest blood vessel in the body, which carries blood from the organs back to the heart. In some cases, when a kidney tumor is sufficiently advanced, it spreads into the vasculature, first into the renal vein and then into the vena cava. Surgery then consists of interrupting blood circulation, opening the vena cava, removing the tumor thrombus (that is, the extension of the tumor into the vessel), and then reconstructing the vena cava. To do this, it is crucial to know precisely the limits of the thrombus within the vessel. However, the vena cava is not transparent, and during the operation it is impossible to see directly what is inside it.
Because of the invasion of the vena cava, this was therefore a complex intervention, one rarely performed with a robot-assisted approach, and by few teams in the world. Usually, this type of surgery is done through an open approach. The first challenge therefore lay in carrying out this operation minimally invasively, with robotic assistance.
Adding the assistance of augmented reality, the fruit of our development work with the SurgAR team, was unprecedented. In this case, augmented reality allowed us to superimpose the patient’s “digital twin” on the image of the vena cava. By simulating transparency, it let us visualize the boundaries of the thrombus, indicating precisely where to interrupt circulation, clamp the vessel, and open it to remove the thrombus.
Does this technology influence patients’ apprehension regarding the operation?
The question of patient experience is precisely a subject we explore within the framework of our RHU. In the Kidney Cancer Innovation program, I.CaRe Bordeaux, one axis evaluates the impact these 3D technologies can have on patients’ understanding of their disease, acceptance of the disease, understanding of the proposed surgical technique, preoperative anxiety, and so on.
We therefore have clinical trials in progress where we compare, in a randomized manner (that is, by drawing lots), two or three groups of patients, depending on the trial. We compare standard information, where we explain to the patient, on the basis of the scan, what is happening and what we plan to do surgically, with an explanation based on a 3D model, either virtual or physical, produced using 3D printing.
And the first results are very promising. We observe that when we use these three-dimensional technologies, patients’ understanding increases from more than 16% to more than 52%.
Will this technology be used in other types of surgery in the future?
Absolutely. Although we are currently focusing our efforts on renal surgery, the applications of this technology extend to other surgical specialties: for example, thoracic surgery, where lung tumors are often difficult to locate precisely; hepatic surgery, for liver tumors; or even gynecological surgery. Augmented reality techniques are particularly suited to surgeries where the function of an organ must be preserved while removing a tumor.
This technology makes surgery more personalized, because each patient has a unique anatomy. We know, for example, that one in two patients may have a variation in their vascular anatomy. This may require adaptations in the way of operating. Augmented reality makes it possible to better understand this variability and plan the intervention accordingly.
How do you see the future of augmented reality-assisted surgery in the coming years?
I think it will become the standard of care. It is currently an innovation, a technology still in research and development, a bit like surgical robotics was around twenty years ago. At the time, that was a major breakthrough, and today it has almost become the norm. Likewise, I am convinced that these new technologies will become widespread in the future.
Of course, there remain imperatives to satisfy: making the technique reliable, mastering its uses, ensuring that the information provided is robust and reliable. This still requires several stages of research and development. But in the future, in my opinion, virtual reality will be the minimum, and augmented reality will be the next step.