

TECHNIQUE ASSESSMENT

Smart Glasses for Neurosurgical Navigation by Augmented Reality

Keisuke Maruyama, MD, PhD∗

Eiju Watanabe, MD, PhD‡

Taichi Kin, MD, PhD§

Kuniaki Saito, MD∗

Atsushi Kumakiri, MD∗

Akio Noguchi, MD, PhD∗

Motoo Nagane, MD, PhD∗

Yoshiaki Shiokawa, MD, PhD∗

∗Department of Neurosurgery, Kyorin University School of Medicine, Mitaka-City, Japan; ‡Department of Neurosurgery, Jichi Medical University, Shimotsuke-City, Japan; §Department of Neurosurgery, University of Tokyo Hospital, Bunkyo-ku, Japan

Correspondence: Keisuke Maruyama, MD, PhD, 6-20-2 Shinkawa, Mitaka-City, Tokyo, 181-8611, Japan. E-mail: [email protected]

Received, April 20, 2017. Accepted, December 5, 2017.

Copyright © 2018 by the Congress of Neurological Surgeons

BACKGROUND: Wearable devices with heads-up displays, or smart glasses, can overlay images onto the sight of the wearer. This technology has never been applied to surgical navigation.
OBJECTIVE: To assess the applicability and accuracy of smart glasses for augmented reality (AR)-based neurosurgical navigation.
METHODS: Smart glasses were applied to AR-based neurosurgical navigation. Three-dimensional computer graphics were created based on preoperative magnetic resonance images and visualized in see-through smart glasses. Optical markers were attached to the smart glasses and the patient's head for accurate navigation. Two motion capture cameras were used for registration and continuous monitoring of the location of the smart glasses in relation to the patient's head. After the accuracy was assessed with a phantom, this technique was applied in 2 patients with brain tumors located on the brain surface.
RESULTS: A stereoscopic view by image overlay through the smart glasses was successful in the phantom and in both patients. Hands-free neuronavigation inside the operative field was available from any angle and distance. The targeting error in the phantom, measured at 75 points, ranged from 0.2 to 8.1 mm (mean 3.1 ± 1.9 mm, median 2.7 mm). The intraoperative targeting error between the visualized and real locations in the 2 patients, measured at 40 points, ranged from 0.6 to 4.9 mm (mean 2.1 ± 1.1 mm, median 1.8 mm).
CONCLUSION: Smart glasses enabled AR-based neurosurgical navigation in a hands-free fashion. Stereoscopic computer graphics of targeted brain tumors corresponding to the surgical field were clearly visualized during surgery.

KEY WORDS: Augmented reality, Brain tumor, Motion capture, Neuronavigation, Smart glasses

Operative Neurosurgery 0:1–6, 2018 DOI: 10.1093/ons/opx279

There is now increasing use of augmented reality (AR)-based techniques for identifying the precise location of target lesions or body regions to improve the safety and accuracy of surgical intervention.1-4 Smart glasses can display computer-generated images onto the wearer's sight through the lenses and are now commercially available.5-7 Also known as wearable devices with heads-up displays, this technology and its clinical applications are rapidly progressing.8,9 However, previous reports were only experimental10,11 or provided supplemental reference to diagnostic images during surgery,9 and no studies have examined the practical clinical application of smart glasses for surgical navigation, likely because of accuracy limitations. Herein, we report our initial clinical experience with the use of smart glasses (Figure 1) for AR-based neurosurgical navigation.

ABBREVIATIONS: 3D, three-dimensional; AR, augmented reality.

Supplemental digital content is available for this article at www.operativeneurosurgery-online.com.

METHODS

Our recently reported AR-based navigation system using a tablet computer1 was modified to use smart glasses. Optical markers were attached to the smart glasses and the patient's head, and their locations were continuously monitored by at least 2 motion capture cameras (Figure 1). Three-dimensional (3D) computer graphics were then created based on magnetic resonance images taken prior to surgery.12,13 Visualization software (Amira; FEI, Hillsboro, Oregon) was used for segmentation of the scalp, skull, brain tumors, and vessels of the brain surface.

OPERATIVE NEUROSURGERY VOLUME 0 | NUMBER 0 | 2018 | 1

Downloaded from https://academic.oup.com/ons/advance-article-abstract/doi/10.1093/ons/opx279/4823484 by Kyorin University user on 26 January 2018


MARUYAMA ET AL

FIGURE 1. Overview of the system and smart glasses. A, Motion capture cameras. B, Markers for the patient's head. Arrowheads indicate optical markers used for registration. C, Smart glasses with markers. D, The visualized images through the glasses. The created brain is overlaid onto a physical skull model through the lenses.

3D reconstructed data (OBJ format) were transferred to another computer, a Sync VV (ACME Portable Machines Inc, Azusa, California), running original software based on Unity (Unity Technologies, San Francisco, California). The 3D graphics were displayed on commercially available smart glasses (Moverio BT-200; Seiko Epson Corporation, Suwa, Japan; Figures 1B and 1C).5 A set of 4 optical markers was attached to the smart glasses to monitor their position. The positions of the patient's head and the glasses were detected by a set of at least 2 3D motion capture cameras (Vicon, Denver, Colorado). The patient's head was registered by pointing to the nasion and the bilateral preauricular points with a stick-like pointing device containing 4 optical markers. Using the measured locations of the smart glasses and the patient's head, the 3D reconstructed head image was displayed on the smart glasses as a graphic overlay on the actual patient's head observed through the glasses. Thus, the surgeon can "look" into the head of the patient from any angle and distance. A video camera (Handycam HDR-CX370; Sony Corporation, Tokyo, Japan) attached to the right side of the glasses was used to record the view through the glasses (see Video, Supplemental Digital Content, which demonstrates the usage of smart glasses for AR neuronavigation). The position of both eyes, or of the video detector for video recording, was also registered separately.
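The paper does not publish its registration code. As a rough illustration of the paired-point registration step described above (pointing to the nasion and the bilateral preauricular points and matching them to the same landmarks on the MRI model), a rigid transform can be recovered with the standard Kabsch/SVD method. All function and variable names here are illustrative, not taken from the authors' software.

```python
import numpy as np

def register_landmarks(fixed, moving):
    """Rigid (rotation + translation) registration of paired 3D landmarks
    via the Kabsch/SVD method.

    fixed, moving : (N, 3) arrays of corresponding points, e.g. landmarks
    measured with a tracked pointer vs. the same landmarks on the 3D model.
    Returns (R, t) such that fixed_i ~= R @ moving_i + t.
    """
    fixed = np.asarray(fixed, dtype=float)
    moving = np.asarray(moving, dtype=float)
    fc, mc = fixed.mean(axis=0), moving.mean(axis=0)      # centroids
    H = (moving - mc).T @ (fixed - fc)                    # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Correct a possible reflection so R is a proper rotation (det = +1).
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = fc - R @ mc
    return R, t
```

A model point `p` is then mapped into patient space with `R @ p + t`. With only 4 fiducials, as in the system described here, the fit is exact up to measurement noise.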

Accuracy Measurement

Accuracy of the image overlay was evaluated using a phantom consisting of 3 cones with sharp tips and 4 fiducial points simulating the nasion, the bilateral preauricular points, and the point 10.0 cm above the nasion in the midline (Figure 2A). After registration, computer images of the phantom were visualized and overlaid through the smart glasses. Photographs of an overlay of the 3 tips were taken 5 times from 5 different directions and distances. The process of registration and visualization was performed 5 times, and the distances between the 3 visualized and real tips of the phantom were measured at a total of 75 points.
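The targeting-error summary used throughout this study (range, mean ± SD, and median of the distances between visualized and real points) can be computed as follows. This is a generic sketch of the statistics, not the authors' analysis code.

```python
import numpy as np

def targeting_error_stats(visualized, real):
    """Summarize targeting error between overlaid (visualized) and
    physically measured (real) 3D points, in the units of the inputs.

    visualized, real : (N, 3) arrays of corresponding point coordinates.
    Returns a dict with the range, mean, sample SD, and median of the
    point-to-point Euclidean distances.
    """
    d = np.linalg.norm(np.asarray(visualized, float) -
                       np.asarray(real, float), axis=1)
    return {
        "min": float(d.min()),
        "max": float(d.max()),
        "mean": float(d.mean()),
        "sd": float(d.std(ddof=1)),   # sample standard deviation
        "median": float(np.median(d)),
    }
```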

Clinical Application

Neurosurgical navigation through the smart glasses was performed in 2 patients with brain tumors located on the brain surface. The institutional review board at Kyorin University approved the study protocol, and written informed consent was obtained from the patients.

Computer graphics of the scalp, brain tumors, and vessels of the brain surface were created and visualized through the glasses after head registration for each patient. After the brain surface and tumor were exposed by opening the dura mater, photographs of an overlay were taken 5 times for each patient for evaluation of the overlay accuracy. The overlay data were compared at 4 points of the tumor border on the brain surface.

RESULTS

A stereoscopic view of the image overlay through the smart glasses was achieved for the phantom (Figure 2) and in both patients (Figures 3 and 4). Because we used only 4 fiducial points for registration, it took approximately 3 min for head registration and 3 min for validation. The targeting error of the phantom, measured at 75 points, ranged from 0.2 to 8.1 mm (mean 3.1 ± 1.9 mm, median 2.7 mm; Figure 2D).

FIGURE 2. Accuracy measurement using the phantom. A, A phantom built for accuracy measurement. B, View through the smart glasses without computer graphics. C, Overlay of the computer image onto the phantom through the smart glasses after registration. D, Superimposition of 3 visualized cone tips (light green crosses) onto the real tips (yellow crosses). Distances between each light green cross and yellow cross were measured.

This technique was first applied to a 63-yr-old woman suffering from metastatic brain tumors in the left cerebellum originating from the uterus (Figure 3). The tumor was 4.0 cm in maximum diameter and was located on the brain surface (Figure 3A). Computer graphics of the tumor were reconstructed in relation to the whole head (Figure 3B) and visualized through the glasses before the skin incision (Figure 3C). After exposing the tumor margin (Figure 3D), the real tumor location was confirmed through the glasses (Figure 3E). The targeting error among 20 measured points ranged from 1.2 to 3.8 mm (mean 2.2 ± 0.8 mm, median 1.8 mm). The tumor was totally removed successfully, and the patient suffered no postoperative complications.

This technique was next applied to a 69-yr-old woman with multiple meningiomas in the right frontal lobe (Figure 4). She had a total of 5 lesions, 1.2 to 5.3 cm in maximum diameter (Figure 4A). The most posterior tumor (arrow in Figure 4B) was selected for accuracy measurement. The tumors were visualized through the glasses before the skin incision (Figure 4C), and the location and size of the craniotomy were determined. After exposing the margin of 1 of the tumors (Figure 4D), the real tumor location was confirmed through the glasses (Figure 4E). The targeting error among 20 measured points ranged from 0.6 to 4.9 mm (mean 2.1 ± 1.3 mm, median 1.8 mm). All 5 tumor masses were totally removed successfully, and the patient suffered no postoperative complications.

In the surgical application in the 2 patients, the targeting error between the visualized and real locations ranged from 0.6 to 4.9 mm (mean 2.1 ± 1.1 mm, median 1.8 mm).

DISCUSSION

We found that overlaying computer graphics of surface brain tumors through smart glasses provided clear and accurate stereoscopic localization during brain surgery, with a clinically acceptable target registration error. With the AR technique, the surgeon does not need to mentally reconstruct a 3D view from the 2D tomography images usually necessary when using conventional neuronavigation. In addition, with smart glasses the surgeon does not need to move his gaze to a computer display; watching the surgical field alone provides stereoscopic navigation in a hands-free fashion. Thus, smart glasses may be an extremely effective technology for surgical navigation.

FIGURE 3. Image overlay during neurosurgery in patient 1. A, Magnetic resonance imaging of a 63-yr-old woman with metastatic brain tumors in the left cerebellum. B, Computer graphics reconstruction. C, Image overlay before skin incision. D, Physical tumor exposed after dural incision. E, Image overlay of the tumor.

Our system has several key advances. First, we used motion capture cameras to detect the accurate position of the smart glasses and the patient's head; we recently reported the clinical application of this technique.1 Computer images corresponding to the direction and the distance between the glasses and the patient's head were then visualized in real time from their relative locations. This enabled accurate visualization inside the brain from any angle or distance, with no parallax effect, provided the glasses and the patient's head were within the detection area of the cameras. Second, because the Moverio (Seiko Epson Corp) provides a semitransparent overlay of computer images onto the scene physically seen through the lenses,5 additional information on hidden brain structures was provided directly to the surgeons in real time during the surgical procedure. Importantly, the surgeon can also temporarily turn off the visualization or display it in a blinking manner if required. Third, a stereoscopic view was possible in our system by independently registering both eyes behind the smart glasses, which eliminated the parallax effect; unfortunately, we are unable to provide stereoscopic information to the readers, and recorded images from only one lens of the glasses are presented. After registration of the eye position, no additional correction for eye movements was necessary, because the displayed image always moved in accordance with the directly viewed object when the eyes moved. The mechanism can be explained as follows: the object to be observed, the displayed image, and the center of eye rotation always lie on a straight line regardless of eye movement. Finally, our system is simple, easy to handle, and cheaper than conventional neuronavigation systems. Devices other than the Moverio can also be used if full-color computer graphics can be displayed to both eyes from a real-time position.
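The parallax-free visualization described above amounts to chaining the tracked poses every frame, so that the rendered model follows the patient's head as seen from the glasses. A minimal sketch of this pose chain, with hypothetical frame names (the authors' Unity-based software is not published), might look like:

```python
import numpy as np

def make_pose(R, t):
    """Build a 4x4 homogeneous pose from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def model_to_glasses(T_world_glasses, T_world_head, T_head_model):
    """Chain tracked poses so the 3D model stays locked to the patient's head
    in the glasses' frame:

        glasses <- world <- head <- model

    Re-evaluated each frame from the motion-capture data, this keeps the
    overlay aligned from any viewing angle or distance.
    """
    return np.linalg.inv(T_world_glasses) @ T_world_head @ T_head_model
```

For example, if the glasses stay still while the head (with the model rigidly attached) translates, the composed transform translates the rendered model by the same amount, which is exactly the head-locked behavior the system relies on.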

FIGURE 4. Image overlay during neurosurgery in patient 2. A, Magnetic resonance imaging of a 69-yr-old woman with five meningiomas in the right frontal lobe. B, Computer graphics reconstruction. C, Image overlay before skin incision. D, Physical tumor exposed after dural incision. The most posterior tumor (arrow in B) was selected for accuracy measurement. E, Image overlay of the tumor.

Limitations

There are several limitations of our technique. First, the overlaid images are seen only by the surgeon wearing the smart glasses. Furthermore, although the accuracy of visualization of lesions on the brain surface was confirmed, the accuracy for deep lesions remains unknown. Further studies are required to examine the potential effects on visualization accuracy of brain shift caused by craniotomy and other surgical procedures inside the brain,14,15 although confirming the error on the brain surface is the most practical and reliable method to avoid brain shift. The time required to prepare the computer graphics from neuroimaging studies taken in routine clinical practice is also important; this is usually within 30 min at our hospital.1,12,13 However, if segmentation of the scalp, skull, tumor, and other structures could be performed using thresholds based on the image intensities, this would take only approximately 3 min for each structure.1 Future studies are required to develop automated image construction from magnetic resonance or computed tomography images.
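The intensity-threshold segmentation mentioned above can be sketched in a few lines: each structure is extracted as the set of voxels whose intensity falls within a window. The thresholds and the structure they select are scan- and protocol-dependent; the values below are purely illustrative, not those used by the authors.

```python
import numpy as np

def threshold_segment(volume, lo, hi):
    """Intensity-window segmentation of a 3D MRI/CT volume.

    volume : 3D array of voxel intensities.
    lo, hi : inclusive intensity window selecting one structure
             (e.g. a window for scalp, another for skull).
    Returns a boolean mask of the same shape as `volume`.
    """
    v = np.asarray(volume)
    return (v >= lo) & (v <= hi)
```

In practice such a mask would still need cleanup (e.g. keeping the largest connected component) before surface extraction to an OBJ mesh, which is why per-structure manual review remains part of the workflow.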

CONCLUSION

We report our initial clinical experience with the use of smart glasses for AR-based neurosurgical navigation in a hands-free fashion. Stereoscopic computer graphics of targeted brain tumors corresponding to the surgical field were clearly visualized during surgery. Our technique can also be applied to other regions of the body, provided that the image overlay is performed accurately. Therefore, with further development, our technique has the potential for widespread clinical use, such as visualization for whole-body cancer screening using smart glasses.

Disclosures

This work was supported by Grants-in-Aid for Scientific Research from the Ministry of Education, Science, and Culture of Japan (15K10371 to Dr Maruyama). The authors have no personal, financial, or institutional interest in any of the drugs, materials, or devices described in this article.

REFERENCES

1. Watanabe E, Satoh M, Konno T, Hirai M, Yamaguchi T. The trans-visible navigator: a see-through neuronavigation system using augmented reality. World Neurosurg. 2016;87(3):399-405.

2. Watanabe E, Watanabe T, Manaka S, Mayanagi Y, Takakura K. Three-dimensional digitizer (neuronavigator): new equipment for computed tomography-guided stereotaxic surgery. Surg Neurol. 1987;27(6):543-547.


3. Roberts DW, Strohbehn JW, Hatch JF, Murray W, Kettenberger H. A frameless stereotaxic integration of computerized tomographic imaging and the operating microscope. J Neurosurg. 1986;65(4):545-549.

4. Sugimoto M, Yasuda H, Koda K, et al. Image overlay navigation by markerless surface registration in gastrointestinal, hepatobiliary and pancreatic surgery. J Hepatobiliary Pancreat Sci. 2010;17(5):629-636.

5. Epson America, Inc. Epson smart glasses. Available at: https://epson.com/For-Home/Wearables/Smart-Glasses/c/h420. Accessed August 9, 2017.

6. Google Inc. Google Glass. Available at: https://x.company/glass/. Accessed August 9, 2017.

7. Microsoft Inc. Microsoft HoloLens. Available at: https://www.microsoft.com/en-us/hololens. Accessed August 9, 2017.

8. Ianchulev T, Minckler DS, Hoskins HD, et al. Wearable technology with head-mounted displays and visual function. JAMA. 2014;312(17):1799-1801.

9. Mitrasinovic S, Camacho E, Trivedi N, et al. Clinical and surgical applications of smart glasses. Technol Health Care. 2015;23(4):381-401.

10. Birkfellner W, Figl M, Huber K, et al. A head-mounted operating binocular for augmented reality visualization in medicine - design and initial evaluation. IEEE Trans Med Imag. 2002;21(8):991-997.

11. Shao P, Ding H, Wang J, et al. Designing a wearable navigation system for image-guided cancer resection surgery. Ann Biomed Eng. 2014;42(11):2228-2237.

12. Kin T, Oyama H, Kamada K, Aoki S, Ohtomo K, Saito N. Prediction of surgical view of neurovascular decompression using interactive computer graphics. Neurosurgery. 2009;65(1):121-129.

13. Maruyama K, Kin T, Saito T, et al. Neurosurgical simulation by interactive computer graphics on iPad. Int J Comput Assist Radiol Surg. 2014;9(6):1073-1078.

14. Nimsky C, Ganslandt O, Cerny S, Hastreiter P, Greiner G, Fahlbusch R. Quantification of, visualization of, and compensation for brain shift using intraoperative magnetic resonance imaging. Neurosurgery. 2000;47(5):1070-1080.

15. Keles GE, Lamborn KR, Berger MS. Coregistration accuracy and detection of brain shift using intraoperative sononavigation during resection of hemispheric tumors. Neurosurgery. 2003;53(3):556-564.
