A hundred years ago, medical journals reported discoveries like Landsteiner's findings on blood compatibility and rejection, the first suggestions of the existence of vitamins, the use of the electrocardiogram, and the merits of insulin for treating diabetes, among many, many others.  Over the last several years, medical discoveries have become so technologically impressive that they often seem closer to sci-fi or Jules Verne's books.  But today's discoveries go even beyond sci-fi or Jules Verne.  We are talking about truly amazing advances, like the subject of this article: robotic surgery guided by computer-generated augmented reality imaging, in real time and without radiation. How about that?


Small kidney cancer tumors are usually managed with a Laparoscopic Partial Nephrectomy (LPN) procedure. This Minimally Invasive Surgery (MIS) is now routinely performed robotically, at least in more specialized medical centers.  The LPN procedure consists of completely removing the tumor while preserving as much of the kidney as possible. It is a challenging procedure because of the limited field of view and the lack of haptic feedback, and because the excision must be completed very quickly while the kidney's blood supply is clamped.

Researchers from the Urologic Sciences and Mechanical Engineering departments at the University of British Columbia in Vancouver, Canada, designed a low-cost ($26 each), autoclave-sterilizable solution called the Dynamic Augmented Reality Tracker (DART).  The DART is a 3D-printed device that picks up ultrasound signals from a Laparoscopic Ultrasound Transducer (LUS) during an intra-operative freehand ultrasound scan of the tumor. After the scan, the system displays the segmented 3D tumor and tracks the position of the surgical instruments relative to the tumor throughout the surgery. And all of this is done with ultrasound alone.
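To make the idea of tracking instruments relative to a fiducial more concrete, here is a minimal sketch in Python/NumPy of how poses could be chained through a marker like the DART. The frame names, transforms, and numbers are entirely our own illustration; the source does not publish the ARUNS code.

```python
import numpy as np

def make_transform(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# Hypothetical poses, all in millimetres.
# T_cam_dart: pose of the DART fiducial in the laparoscopic camera frame.
T_cam_dart = make_transform(np.eye(3), [10.0, -5.0, 80.0])

# T_dart_tumor: where the segmented tumor model sits relative to the DART,
# fixed once the freehand ultrasound scan and segmentation are done.
T_dart_tumor = make_transform(np.eye(3), [0.0, 0.0, 15.0])

# T_cam_instr: current pose of a surgical instrument tip in the camera frame,
# e.g. reported by the robot's kinematics.
T_cam_instr = make_transform(np.eye(3), [12.0, -3.0, 70.0])

# Instrument pose relative to the tumor: invert camera->tumor, then apply camera->instrument.
T_cam_tumor = T_cam_dart @ T_dart_tumor
T_tumor_instr = np.linalg.inv(T_cam_tumor) @ T_cam_instr

distance_mm = np.linalg.norm(T_tumor_instr[:3, 3])
print(f"Instrument tip is {distance_mm:.1f} mm from the tumor centre")
```

The key point is that once the tumor model is fixed relative to the DART, any tracked instrument can be expressed in the tumor's own frame by simple matrix composition.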

The DART is used in an intra-operative Augmented Reality Ultrasound Navigation System (ARUNS) as an aid to robot-assisted minimally invasive surgery, in this case with the da Vinci surgical robot. For now, the system has only been tested in a preliminary evaluation, but further refinement and in vivo validation are expected soon!

Surgery starts by placing the DART near the tumor. A freehand ultrasound scan is then performed, during which the Laparoscopic Ultrasound Transducer and the DART are tracked and their images synchronized.  A 3D ultrasound volume is reconstructed and the tumor is manually segmented from it, and the resulting model is displayed to the surgeon as a direct augmented reality image in real time, allowing a better view during the procedure.  At the end, the DART is removed along with the tumor and surrounding tissue.
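The source does not detail how the 3D volume is built from the freehand sweep, so here is a rough sketch of one common approach, pixel-nearest-neighbour compounding of tracked 2D frames into a voxel grid, again in Python/NumPy with all names and parameters assumed by us.

```python
import numpy as np

def reconstruct_volume(frames, poses, voxel_size_mm=1.0, grid_shape=(100, 100, 100)):
    """Very simplified freehand 3D ultrasound reconstruction:
    scatter each tracked 2D frame's pixels into the nearest voxel
    of a fixed grid (pixel-nearest-neighbour compounding)."""
    volume = np.zeros(grid_shape, dtype=np.float32)
    counts = np.zeros(grid_shape, dtype=np.int32)

    for image, T in zip(frames, poses):
        h, w = image.shape
        # Pixel coordinates in the image plane (z = 0), in homogeneous form.
        ys, xs = np.mgrid[0:h, 0:w]
        pts = np.stack([xs.ravel(), ys.ravel(),
                        np.zeros(h * w), np.ones(h * w)], axis=0)
        # Map image-plane points into the common (e.g. DART) frame.
        world = (T @ pts)[:3]
        idx = np.round(world / voxel_size_mm).astype(int)
        # Keep only points that land inside the voxel grid.
        ok = np.all((idx >= 0) & (idx < np.array(grid_shape)[:, None]), axis=0)
        i, j, k = idx[:, ok]
        np.add.at(volume, (i, j, k), image.ravel()[ok])
        np.add.at(counts, (i, j, k), 1)

    # Average overlapping contributions; untouched voxels stay zero.
    return np.divide(volume, counts, out=volume, where=counts > 0)
```

Real systems add interpolation, hole-filling, and careful calibration between the transducer's image plane and its tracked pose; this sketch only shows the core idea of scattering tracked pixels into a common volume before segmentation.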

Testing was conducted by an expert urological surgeon experienced in robot-assisted LPN procedures.  The surgeon performed the surgery twice, once with the LUS only and once with the LUS+ARUNS.  Comparing the two, surgical planning time went from 2 minutes to just under 2 minutes, and execution time dropped from 10 minutes 45 seconds to 7 minutes 30 seconds.  The surgeon preferred the LUS+ARUNS procedure of the two because of its time and visual advantages.  The total system error of the ARUNS was 5.1 mm, but we expect this error can be reduced through further refinement and testing.
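To put those execution times in perspective, a quick calculation using only the numbers quoted above shows the saving is roughly 30 percent:

```python
# Execution times reported in the source, in seconds.
lus_only = 10 * 60 + 45    # 10 min 45 s with LUS alone
lus_aruns = 7 * 60 + 30    # 7 min 30 s with LUS + ARUNS

saving = lus_only - lus_aruns
print(f"Time saved: {saving} s, a {saving / lus_only:.0%} reduction")  # 195 s, about 30%
```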

Image-guided surgical procedures depend heavily on precision, and the LUS+ARUNS technique will need to become more precise; in spite of that, we believe this technology shows promise.  Obviously, more studies and more users are required for rigorous validation and refinement of the LUS+ARUNS technique.  As we have seen in other studies, pre-operative stages can take a lot of time to complete, which currently leaves many augmented reality techniques useful primarily for elective procedures (but as the tech continues to advance, we expect it to become useful, if not indispensable, across the board!).

Simplifying visualization with ultrasound, as is being done with the LUS+ARUNS procedure, could in the very near future reach a level of standardization that makes AR-aided procedures a reality for emergency surgery too.  Additionally, since low-cost implements are involved, AR-aided procedures could also expand to less advanced medical centers, improving many procedures for everyone and thus improving the quality of medicine as a whole.  And that, friends, is our goal!

Source: http://link.springer.com/chapter/10.1007/978-3-319-43775-0_13
