94.07 Novel development of navigation surgery by augmented reality using a tablet PC

J. Yasuda1, T. Okamoto2, S. Onda1, A. Hattori3, N. Suzuki3, K. Yanaga1  1Jikei University School of Medicine, Department of Surgery, Tokyo, Japan; 2Jikei Daisan Hospital, Department of Surgery, Tokyo, Japan; 3Jikei University School of Medicine, Institute for High Dimensional Medical Imaging, Tokyo, Japan

Introduction:

We previously reported the efficacy of an image-guided navigation system using augmented reality (AR-NS) for hepatobiliary and pancreatic surgery. Using a stereoscope, we superimposed a 3D organ model on the real organ on a display monitor and were able to interpret the patient's anatomy directly on the monitor (Monitor method).
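For illustration, the sketch below shows one common way such a superimposition can be rendered: projecting the registered 3D model into a live camera image. This is a minimal, assumed example using OpenCV; the function names, calibration parameters, and drawing style are ours, not details of the reported system.

```python
# Hedged sketch: overlaying a registered 3D model on a live camera frame.
# All names and parameters here are illustrative assumptions.
import cv2
import numpy as np

def overlay_model(frame, model_points, rvec, tvec, camera_matrix, dist_coeffs):
    """Project 3D model vertices into the image and blend them with the frame."""
    image_points, _ = cv2.projectPoints(
        model_points, rvec, tvec, camera_matrix, dist_coeffs)
    overlay = frame.copy()
    h, w = frame.shape[:2]
    for x, y in image_points.reshape(-1, 2):
        if 0 <= x < w and 0 <= y < h:
            cv2.circle(overlay, (int(x), int(y)), 2, (0, 0, 255), -1)
    # Semi-transparent blend so the real organ stays visible under the model.
    return cv2.addWeighted(overlay, 0.5, frame, 0.5, 0)
```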

However, this navigation system had some problems, such as the high price of the dedicated stereoscope and the need for the surgeon to look away from the surgical field toward the monitor.

Therefore, we developed a novel image-guided navigation system using a tablet PC, a device already in widespread use worldwide (Tablet method). This inexpensive device does not require the surgeon to look away from the surgical field.

Methods:

We applied this tablet PC to five patients who underwent navigation surgery in the hepatobiliary and pancreatic field. Operative procedures consisted of hepatectomy in 3 patients and pancreatectomy in 2.

Surgical planning was performed using a 3D organ model created from dynamic contrast-enhanced CT.
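As a hedged illustration of this modeling step (not the authors' actual pipeline, which the abstract does not describe), a surface mesh can be extracted from a segmented CT volume with marching cubes; the file name, binary mask, and voxel spacing below are placeholders.

```python
# Assumed sketch: CT segmentation mask -> triangle mesh via marching cubes.
import numpy as np
from skimage import measure

# Hypothetical pre-segmented liver mask (z, y, x), e.g. from dynamic CT.
volume = np.load("segmented_liver_ct.npy").astype(np.float32)

# level=0.5 extracts the boundary of the binary mask; spacing is the
# voxel size in mm (placeholder values).
verts, faces, normals, _ = measure.marching_cubes(
    volume, level=0.5, spacing=(1.0, 0.7, 0.7))
# verts/faces define a triangle mesh that a 3D viewer (e.g. VTK) can render.
```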

Positional measurement for registration was performed using an Optotrak® optical position sensor installed in a dedicated operating room. After paired-point registration, the surgical field was captured by the tablet PC, whose position was tracked by the infrared sensor, and the 3D organ model was superimposed on the captured image. The model data was sent to the tablet PC via Wi-Fi for real-time navigation. The operation was performed under navigation guidance, which included localization of the tumor and blood vessels at various angles. Registration accuracy was calculated as the fiducial registration error (FRE).

Furthermore, the tablet PC allowed characters to be written on the screen. We employed this annotation function for education: it was operated by a resident and a nurse while the superimposed image was transmitted to a bedside monitor via Bluetooth.
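Paired-point registration and FRE, as referenced above, are standard techniques; the sketch below shows the usual SVD-based (Kabsch) rigid alignment and the root-mean-square definition of FRE. We assume, but cannot confirm, that this matches the system's implementation.

```python
# Standard paired-point rigid registration (Kabsch/SVD) and FRE.
# Assumed to reflect the technique named in the abstract.
import numpy as np

def paired_point_registration(src, dst):
    """Rotation R and translation t minimizing ||(R @ src.T).T + t - dst||."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

def fre(src, dst, R, t):
    """Fiducial registration error: RMS residual over the paired fiducials."""
    residuals = (R @ src.T).T + t - dst
    return np.sqrt((residuals ** 2).sum(axis=1).mean())
```

Here src would be the fiducial coordinates in CT (model) space and dst the same fiducials as measured intraoperatively by the position sensor.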

Results:

The time required to build a 3D organ model was 2–3 hr per patient. Registration took only 1–2 min for each procedure. Using Wi-Fi, AR-NS was possible without a time lag, and navigation surgery was successfully performed in all patients. The visibility of the superimposed models on the tablet PC was comparable to that of the Monitor method [Fig. 1]. This resulted in improved understanding of the operation by residents and nurses. Regarding registration precision, the mean FRE was 6.3 mm, which did not differ from that of the Monitor method.

Conclusion:

The novel Tablet method may make AR-NS more convenient than the Monitor method in abdominal surgery.