
Augmented reality head-mounted display–based incision planning in cranial neurosurgery: a prospective pilot study

OBJECTIVE

Monitor and wand–based neuronavigation stations (MWBNSs) for frameless intraoperative neuronavigation are routinely used in cranial neurosurgery. However, they are temporally and spatially cumbersome; the OR must be arranged around the MWBNS, at least one hand must be used to manipulate the MWBNS wand (interrupting a bimanual surgical technique), and the surgical workflow is interrupted as the surgeon stops to “check the navigation” on a remote monitor. Thus, there is a need for continuous, real-time, hands-free neuronavigation solutions. Augmented reality (AR) is poised to address these issues. The authors present the first reported prospective pilot study investigating the feasibility of using the OpenSight application with an AR head-mounted display to map out the borders of tumors in patients undergoing elective craniotomy for tumor resection, and to compare the degree of correspondence with MWBNS tracing.

METHODS

Eleven consecutive patients undergoing elective craniotomy for brain tumor resection were prospectively identified and underwent circumferential tumor border tracing at the time of incision planning by a surgeon wearing HoloLens AR glasses running the commercially available OpenSight application registered to the patient and preoperative MRI. Then, the same patient underwent circumferential tumor border tracing using the StealthStation S8 MWBNS. Postoperatively, both tumor border tracings were compared by two blinded board-certified neurosurgeons and rated as having an excellent, adequate, or poor degree of correspondence based on a subjective sense of the overlap. Objective overlap area measurements were also determined.

RESULTS

Eleven patients undergoing craniotomy were included in the study. Five patient procedures were rated as having an excellent degree of correspondence, 5 as adequate, and 1 as poor. Both raters agreed on the rating in all cases. AR tracing was possible in all cases.

CONCLUSIONS

In this small pilot study, the authors found that AR was implementable in the workflow of a neurosurgery OR, and was a feasible method of preoperative tumor border identification for incision planning. Future studies are needed to identify strategies to improve and optimize AR accuracy.

ABBREVIATIONS

AR = augmented reality; ARHMD = AR head-mounted display; FIN = frameless intraoperative neuronavigation; MWBNS = monitor and wand–based neuronavigation station.

Monitor and wand–based neuronavigation stations (MWBNSs) for frameless intraoperative neuronavigation (FIN) are routinely used in brain tumor surgery for planning minimally invasive incisions by circumferentially identifying the tumor-brain interface.1 First introduced in 1991 by Kato et al.,2 MWBNSs represented a marked improvement in ease of use over cumbersome frame-based navigation systems. However, the MWBNS is unwieldy: the station has a large physical footprint that takes up valuable OR space, and the room must be arranged around it. Additionally, navigation information is only provided when a surgeon picks up the navigation wand (temporarily preventing the use of a bimanual surgical technique), which interrupts the normal surgical workflow as the surgeon looks away from the surgical field and toward a remote monitor. A direct line of sight must be maintained between the MWBNS camera, navigation wand, and navigation star, which can be challenging in an OR full of equipment and moving staff. Additionally, current MWBNSs do not allow for a true 3D visualization of the lesion or the approach to the lesion. Finally, understanding the spatial information provided by the MWBNS involves a steep learning curve and is not intuitive for trainees. Thus, innovations in FIN that are less cumbersome, provide continuous anatomical information, are hands-free, and are more anatomically intuitive are sought.

Advances in augmented reality (AR) have enabled the investigation of digital superimposition of radiographic images onto the surgical field using wireless glasses that do not require a physical footprint in the OR.3–7 This allows for FIN during neurosurgical cases in a potentially less cumbersome fashion than with MWBNS (Video 1).

VIDEO 1. 0:00: MWBNSs for frameless intraoperative neuronavigation are routinely utilized in brain tumor surgery. Here, two surgeons are seen using an MWBNS. The surgeon must look away from the patient and instead must look at a monitor located on the other side of the room, which can be cumbersome and nonergonomic. Additionally, MWBNSs do not allow for a true 3D visualization of the lesion or the approach one must take to get to the lesion. 0:32: Here, we demonstrate the novel use of HoloLens AR glasses running the OpenSight application used for neuronavigation in cranial neurosurgery. The patient’s preoperative T1-weighted brain MR image was reconstructed and exported to the HoloLens glasses running OpenSight, which were used to visualize the hologram of the brain MR image superimposed over the patient’s head. Hand gestures can be used to change the imaging windowing (as seen at 0:41). As you can see, the hologram remains superimposed over the patient’s head even if the HoloLens wearer moves throughout the room and changes the angle of sight. No navigation wand is required, so the surgeon’s hands remain free. A 3D understanding of the tumor and its location in the patient’s head is intuitively displayed. Copyright Daniel G. Eichberg. Published with permission.

To this end, several cranial neurosurgical studies have investigated the use of AR head-mounted displays (ARHMDs) with HoloLens (Microsoft Corp.) AR glasses running various navigation applications. Van Doormaal and colleagues used the Unity application (Unity Technologies) running on HoloLens glasses to measure fiducial registration error (FRE) on plastic heads and found a mean FRE of 7.2 mm with holographic neuronavigation compared with 1.9 mm for conventional neuronavigation.8 Similarly, Incekara et al. used the Verto Studio application (www.vertostudio.com) running on HoloLens glasses to compare tumor border tracing for preoperative planning in the OR in neurosurgical patients; they found that holographic navigation differed from conventional neuronavigation in 64% of patients.17 Thus, while ARHMDs show promise in cranial neurosurgery, technique refinement is required to improve accuracy.

Here, to our knowledge, we present the first reported prospective pilot study investigating the feasibility of using the OpenSight (Novarad) application, which can automatically generate a 3D hologram, running on HoloLens glasses to map out the borders of tumors in patients undergoing elective craniotomy for tumor resection, comparing the degree of correspondence with that of MWBNS tracing.

Methods

Patient Selection

Eleven consecutive patients undergoing elective craniotomy for brain tumor resection were prospectively identified and included in the study. IRB approval was obtained prior to study initiation. The requirement for consent was waived because all patient identifying information was removed and because the use of the HoloLens glasses running OpenSight did not impact patient care or surgical planning in any way. Each patient underwent tumor border tracing for intraoperative planning first by ARHMD, then by an MWBNS.

AR Tumor Border Tracing

Preoperative contrast-enhanced T1-weighted MR images of the brain were obtained for all patients. The DICOM images from the MRI were de-identified and opened with PACS Viewer (Novarad) software on a computer. PACS Viewer converts the DICOM images into a file with a proprietary format that is recognized by OpenSight. The proprietary file format enables consolidation of all MR images as one 3D object rather than separate images.9 After conversion to the proprietary file format, the files were saved to an online cloud service. Using the HoloLens, the file was downloaded from the online cloud service and opened with OpenSight, which then automatically generated a 3D holographic object. This 3D-constructed volume is displayed via the HoloLens glasses as a 3D hologram. The OpenSight software enables the user to leverage the hand motion and voice detection systems of the HoloLens to interact with the application. The rendered 3D object may be registered to the anatomical area of interest manually (Fig. 1), automatically with an algorithm that matches the holographic surface area with the physical surface area, or with a combination of manual and automatic registration.9 In our study, manual registration was used. For manual registration, hand gestures are detected by the OpenSight software running on the HoloLens glasses and are used to move and manipulate the hologram until it lines up with the facial anatomy of the patient. OpenSight allows for the 3D object to be fixed in space, enabling the user to see the object in a static area even when moving or viewing from multiple angles.
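The consolidation step can be pictured with open-source tooling. The sketch below, which assumes the pydicom library, a hypothetical folder of de-identified .dcm slices, and ordering by the ImagePositionPatient tag, shows the general idea of stacking 2D MR slices into a single 3D volume; it is only an illustration of the concept, not the proprietary PACS Viewer/OpenSight conversion itself.

```python
# Illustrative only: the PACS Viewer/OpenSight conversion is proprietary.
# This shows the general idea of consolidating 2D DICOM slices into one 3D volume.
from pathlib import Path

import numpy as np
import pydicom


def load_series_as_volume(series_dir: str) -> np.ndarray:
    """Read a de-identified DICOM series and stack it into a single 3D array."""
    slices = [pydicom.dcmread(path) for path in sorted(Path(series_dir).glob("*.dcm"))]
    # Order slices along the scan axis using the slice position tag.
    slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))
    return np.stack([s.pixel_array for s in slices], axis=0)


if __name__ == "__main__":
    volume = load_series_as_volume("deidentified_t1_mri/")  # hypothetical path
    print(volume.shape)  # (number of slices, rows, columns)
```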

FIG. 1. Manual registration of the generated 3D hologram to the patient’s head is straightforward. Hand gestures are detected by OpenSight software running on HoloLens glasses and are used to move and manipulate the hologram until it lines up with the patient’s facial anatomy. Upper: The hologram is almost aligned with the patient’s head and needs a final translation to the left for the holographic and real noses to match. Lower: A second patient after final alignment of the holographic and real faces.

All patients were positioned on the operating table and placed in a fixed three-point head clamp. The holographic projection of the preoperative brain MRI was visualized with the commercially available OpenSight application (running on HoloLens AR glasses). The hologram was aligned with the patient’s head using surface anatomy. This was done by matching the holographic surface to the patient’s ears and eyes. At least 5 surface points including the eyes, ears, and nose were used to verify the accuracy prior to tumor tracing. A surgeon wearing HoloLens glasses running the OpenSight application registered to the patient and their preoperative MRI performed circumferential tumor border tracing on the patient’s scalp using a red marker (Fig. 2). To facilitate tumor border tracing, the OpenSight application menu can be utilized with finger gestures (Fig. 3A) to change the MRI from 3D to 2D views and to change the windowing (Fig. 3B), or to change the MRI plane to the axial, coronal, or sagittal plane as necessary (Fig. 3C).

FIG. 2. A surgeon wearing HoloLens glasses running the OpenSight application registered to the patient and their preoperative MR images in 3D, performing circumferential tumor border tracing using a red marker.
FIG. 3. To facilitate tumor border tracing, the OpenSight application dropdown menu (green) can be utilized with finger gestures (A) to change the MR images to 2D views and to change the windowing (B), or to change the MR image plane to the axial, coronal, or sagittal plane as necessary (C).

The OpenSight ARHMD-based tumor border tracing was not utilized in any way for surgical planning or patient care.

MWBNS Tumor Border Tracing

The patient was registered to the StealthStation S8 optical system (Medtronic) using surface trace registration. An accuracy of 2 mm or less was achieved on all patients. Registration accuracy was verified using anatomical landmarks. A second surgeon (not the surgeon who performed the HoloLens glasses running the OpenSight application tracing), using the StealthStation S8 navigation wand, performed tumor border tracing using the trajectory 1 and trajectory 2 views (Fig. 4). The border was marked using a blue marker. In 1 patient undergoing a retrosigmoid craniotomy (patient 10), the transverse and sigmoid sinuses were traced rather than the tumor border.

FIG. 4. A second surgeon using the StealthStation navigation wand (yellow arrow) and looking at the trajectory views on the monitor (red arrow) to trace the tumor borders. This process is cumbersome because 1) the surgeon must look away from the patient and surgical field, 2) the surgeon must hold a large wand that must be seen by the StealthStation camera, and 3) a bulky neuronavigation star must be present (blue arrow) and the direct path to the camera must also be unobstructed. In contrast, the surgeon wearing the HoloLens glasses running the OpenSight application is able to see a holographic representation of the tumor projected directly onto the patient and does not need to hold a wand for navigation (green arrow).

An intraoperative photograph was taken of the patient after both the OpenSight ARHMD and the MWBNS tumor border tracing.

OpenSight ARHMD and MWBNS Tumor Border Tracing Overlap Analysis

At a later date, the intraoperative photographs of the OpenSight ARHMD and MWBNS tumor border tracings were reviewed by two board-certified neurosurgeons blinded to which color marker corresponded to the method used to perform the tracing. The neurosurgeon raters were able to review the preoperative brain MRI to understand the tumor anatomy. The OpenSight ARHMD and MWBNS tumor border tracings for each patient were compared by the two neurosurgeons. Based on a subjective sense of the degree of overlap between the OpenSight and MWBNS tracings, each OpenSight ARHMD tracing was reviewed to determine whether the difference between the two tracings was significant. The alignment of the tumor border tracings was then rated as having an excellent, adequate, or poor degree of correspondence. An excellent degree of correspondence would offer the same amount of information for a safe and adequate approach as the StealthStation navigation system. An adequate degree of correspondence would provide the information needed for a safe surgical approach. A poor degree of correspondence would provide insufficient information for a safe surgical approach.

Additionally, an objective, quantitative assessment of correspondence between tumor border tracings performed using HoloLens glasses running the OpenSight application (red) and StealthStation S8 navigation (blue) was performed (Fig. 5). Intraoperative photographs with the red and blue tracings were analyzed using ImageJ, a public domain image processing program developed at the National Institutes of Health.10 Tumor border tracings were divided into three distinct areas: the area within only the OpenSight tracing (red; A), the area within only the StealthStation S8 tracing (blue; C), and the intersection of the two (B). The area of both tracings combined was calculated (Fig. 5A), as well as the areas of the red (Fig. 5B) and blue (Fig. 5C) tracings alone. The area of intersection was thus calculated as the sum of the areas of the two individual tracings minus the area of the combined tracing. The percent overlap was calculated as the area of intersection divided by the area of the combined tracing, multiplied by 100. Thus, with perfect correspondence, in which both tracings lie directly over one another and have exactly the same area, the resulting percent overlap would equal 100%. This was a measure of how well tumor border tracing using the HoloLens compared with the current standard of MRI-based neuronavigation (StealthStation S8), which may be considered the most accurate currently available estimation of the tumor border. Additionally, percent overlap was calculated relative to the area of the red and blue tracings, respectively. A Pearson product-moment correlation was used to determine the relationship between percent overlap achieved and patient number to assess improvement in AR accuracy over successive trials. A repeated-measures t-test was used to compare area measurements between red (ARHMD) and blue (MWBNS) tracings.

FIG. 5. A demonstration of how the percent overlap was calculated using area measurements in the ImageJ program. A: The area of the combined red (OpenSight) and blue (StealthStation) tracings was calculated. Relevant areas include the area solely within the red tracing (A), the area solely within the blue tracing (C), and the area of intersection (B). B and C: The area of the blue tracing and red tracing would equal the sum of B and C, and the sum of A and B, respectively. Thus, the area of intersection was calculated as the area of the red and blue tracings combined ([A + B] + [B + C]) minus the area of the combined tracing (A + B + C). The percent overlap was calculated as the area of intersection (B) divided by the area of the combined tracing (A + B + C) multiplied by 100.
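For concreteness, the overlap calculation in Fig. 5 reduces to a few lines of arithmetic once the three ImageJ area measurements are in hand. The following minimal sketch uses illustrative function and variable names that are not part of the authors' workflow.

```python
# Minimal sketch of the percent overlap calculation described in Fig. 5,
# assuming the ImageJ area measurements (in any consistent unit) are known.
def percent_overlap(area_red: float, area_blue: float, area_combined: float) -> float:
    """Intersection area divided by combined (union) area, times 100.

    area_red      = A + B (area enclosed by the ARHMD tracing)
    area_blue     = B + C (area enclosed by the MWBNS tracing)
    area_combined = A + B + C (area enclosed by the union of both tracings)
    """
    area_intersection = area_red + area_blue - area_combined  # = B
    return 100.0 * area_intersection / area_combined


# Hypothetical example: two 10 cm^2 tracings whose union covers 12 cm^2.
print(percent_overlap(10.0, 10.0, 12.0))  # ~66.7% overlap
```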

Results

Eleven patients undergoing craniotomy for brain tumor resection were included. Five patient procedures were rated as having an excellent degree of correspondence, 5 as adequate, and 1 as poor (Fig. 6). Both blinded neurosurgeon raters agreed on the rating in 100% of cases. OpenSight tracing was possible in 100% of cases.

FIG. 6. Comparison of MWBNS (blue marker) and ARHMD (red marker) tumor tracings. A and B: Eleven patients undergoing craniotomy for brain tumor resection were included. The intraoperative photographs of the ARHMD and MWBNS tumor border tracings were reviewed by two board-certified neurosurgeons blinded to which color marker corresponded to the type of tracing. In 1 patient undergoing a retrosigmoid craniotomy (patient 10), the transverse and sigmoid sinuses were traced rather than the tumor border. The neurosurgeon raters were able to review the preoperative MR images to understand the tumor anatomy. Based on a subjective sense of the degree of overlap between the ARHMD and MWBNS tracings, each ARHMD tracing was rated as having an excellent, adequate, or poor degree of correspondence, reflecting whether a surgeon could plan the surgical approach appropriately from it.

Of the first 6 consecutive patients, 1 patient tracing (16.7%) had an excellent correspondence. Of the second 5 patients, 4 patient tracings (80%) had an excellent correspondence. In all cases, the surgeon using the OpenSight ARHMD felt that seeing the 3D MRI augmented onto the patient’s head added information compared with viewing the image on the navigation system’s 2D screen. Patient 10 was excluded from the objective quantitative tumor border tracing analysis because linear tracings of the transverse and sigmoid sinuses, rather than 2D tumor borders, were marked during surgery, so this analysis was not feasible.

Quantitative Correspondence Analysis

Correspondence between tumor tracings made using the OpenSight ARHMD (red) and StealthStation S8 navigation (blue) was measured. The percent overlap between tracings was calculated relative to the area of the combined tracing as well as relative to the areas of the red and blue tracings, respectively. Values with the corresponding subjective rating from the blinded neurosurgeons are listed for each patient in Table 1. The mean percent overlap relative to the combined tracing was 68.9 ± 30.3 (± SD). The mean percent overlap relative to the red and blue tracings was 81.2 ± 25.4 and 75.2 ± 23.5, respectively. There was a significant correlation between the patient number and the percent overlap achieved (R2 = 0.738, p = 0.015). There was no significant difference in the tumor border area between tracings made with ARHMD (red) and MWBNS (blue) guidance (p = 0.344).

TABLE 1. Quantitative analysis of ARHMD and MWBNS tracing correspondence for all patients

Patient No.  Subjective Rating  % Overlap, Combined Tracing  % Overlap, ARHMD (red)  % Overlap, MWBNS (blue)
1            Poor               21.1                         28.0                    46.3
2            Adequate           65.2                         77.1                    80.8
3            Adequate           26.9                         57.8                    33.5
4            Adequate           69.3                         57.8                    70.3
5            Adequate           40.9                         58.3                    57.8
6            Excellent          100.0                        100.0                   100.0
7            Excellent          98.7                         106.1                   93.4
8            Excellent          95.1                         96.6                    98.5
9            Excellent          98.6                         89.1                    98.6
10           Adequate           NA                           NA                      NA
11           Excellent          73.3                         100.0                   73.3

NA = not applicable.
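The statistical tests named in Methods can be reproduced in outline with SciPy. The sketch below feeds the combined-tracing percent overlap values from Table 1 (patient 10 excluded) into a Pearson correlation; because the authors' exact inputs are not fully specified, it is not guaranteed to return the reported R2 and p values. The raw tracing areas used for the repeated-measures (paired) t-test are not reported, so hypothetical placeholder values are shown for that call.

```python
# Illustrative sketch of the statistical tests named in Methods.
from scipy import stats

# Combined-tracing percent overlap values from Table 1 (patient 10 excluded).
patient_number = [1, 2, 3, 4, 5, 6, 7, 8, 9, 11]
combined_overlap = [21.1, 65.2, 26.9, 69.3, 40.9, 100.0, 98.7, 95.1, 98.6, 73.3]

# Pearson product-moment correlation: does overlap improve over successive cases?
r, p = stats.pearsonr(patient_number, combined_overlap)
print(f"R2 = {r**2:.3f}, p = {p:.3f}")

# Repeated-measures (paired) t-test comparing ARHMD and MWBNS tracing areas.
# The actual area measurements are not reported; these values are hypothetical.
arhmd_areas_cm2 = [12.0, 15.5, 9.8]
mwbns_areas_cm2 = [11.4, 16.1, 10.2]
t_stat, p_area = stats.ttest_rel(arhmd_areas_cm2, mwbns_areas_cm2)
print(f"paired t-test: t = {t_stat:.2f}, p = {p_area:.3f}")
```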

Discussion

MWBNSs for frameless intraoperative neuronavigation have become a critical tool in cranial neurosurgery. Although MWBNSs have been utilized since 1991,2 minimal improvements in OR efficiency have been made in this sphere over the past 30 years. Because of its large physical footprint, which takes up a large proportion of limited OR space, the OR must often be set up around the MWBNS, which can be quite inconvenient or even problematic. Furthermore, the anatomical information provided by an MWBNS is available only in a discontinuous manner; the surgeon must stop operating to pick up the wand and look away from the surgical field at a monitor to obtain updated positional information for navigation. Finally, because the spatial information is not superimposed on the surgical field, the anatomical position of the pathology and normal structures is not intuitive, so understanding the complicated 3D anatomy of the approach corridors requires a learning curve.

Advances in AR technology have elicited investigations into its utility in neurosurgical ORs. Previously published AR systems have superimposed images onto a remote screen or through a microscope,11–13 solutions that are arguably as cumbersome as the MWBNS. Besharati Tabrizi and Mahvash described an AR system that used a projector to superimpose images onto the surgical field, although these studies were performed in phantom heads.14 While this system permits radiographic information to be displayed ergonomically, it is not necessarily immersive, and operators and/or equipment must not come between the projector light source and the surgical field.

Systems using wireless ARHMDs are optimal, as they provide an immersive experience that has no physical footprint in the OR. No wand is required, so the surgeon’s hands remain free, and the radiographic information is provided continuously instead of piecemeal with stops to “check the navigation.” ARHMD has been investigated for pedicle screw placement in phantom models,9 cadaveric spine models,15,16 and human patients.12 Van Doormaal et al. used the Unity application running on HoloLens glasses to measure fiducial registration error on plastic heads,8 and Incekara et al. used the Verto Studio application running on HoloLens glasses to compare tumor border tracing for preoperative planning in the OR in neurosurgical patients.17 The Unity application requires manual hologram creation, and Verto Studio has a semiautomatic hologram creation process; OpenSight, the application used during our study, is user friendly for surgeons who do not have extensive experience with hologram creation, as the software creates the 3D object automatically.

Unlike expensive and bulky MWBNS equipment, HoloLens glasses have a low profile and are comparatively inexpensive, safe, and compact, potentially representing an attractive option for resource-constrained or remote healthcare systems. Additionally, because the superimposition of 3D radiographic images onto the surgical field is more intuitive than the 2D display of anatomical information at the tip of a wand displayed on a monitor across the room, such technology may be useful for resident education and anatomical understanding (Fig. 7).

FIG. 7. Preoperative MR image and corresponding 3D AR image of select patients as displayed on HoloLens glasses running OpenSight in the OR. The AR images enable an intuitive understanding of the tumor location in the patient’s brain.

ARHMD Equipment

This project used HoloLens glasses running the OpenSight application. In October 2018, the use of HoloLens glasses running OpenSight was approved by the FDA for preoperative surgical planning.

The standalone HoloLens headset has an integrated computer and battery that project images onto the organic light-emitting diode glasses. Because of this integrated hardware, the headset is comparatively heavy and somewhat cumbersome. Decreased weight or improved headband padding could improve the device ergonomics.

Loading and preparing patient-specific image files for the OpenSight system on the HoloLens requires relatively little time and effort from the user. The PACS Viewer file conversion step takes 10 to 20 minutes. The OpenSight software takes 5 to 10 minutes to download the PACS Viewer file. Because the 3D hologram is generated by the software automatically, little user experience with image segmentation and processing is required. Manual registration of the hologram to the patient’s head takes about 3 to 5 minutes.

Registration and Navigation

Manual registration was performed using surface anatomy. Hand gestures are detected by OpenSight software running on HoloLens glasses for manual registration and are used to move and manipulate the hologram until it lines up with the patient’s facial anatomy. Registration to the patient’s head may be performed manually, automatically with an algorithm that matches the holographic surface area with the physical surface area, or with a combination of manual and automatic registration.9

Other, potentially more accurate registration methods for ARHMDs are an active area of research. Van Doormaal et al. used a holographic point-matching registration technique.8 Fiducials were placed on a plastic head model. Then, a virtual holographic pointer created by the authors and recognized by the ARHMD was superimposed over a 3D-printed pointer, and the pointer was placed in each fiducial to register the hologram to the head.8 While the authors found that point matching with conventional neuronavigation resulted in a lower fiducial registration error than point matching with holographic neuronavigation, improvements in technique with fiducial point matching alone or in combination with surface anatomy tracing may eventually result in more accurate patient registrations.
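For readers less familiar with point-matching registration, the underlying computation can be sketched as a least-squares rigid transform (Kabsch algorithm) estimated from matched fiducial coordinates, followed by the fiducial registration error. The fiducial coordinates, noise level, and function names below are hypothetical; this is a generic illustration, not the implementation used by van Doormaal et al. or by OpenSight.

```python
# Generic sketch of point-based rigid registration and fiducial registration error (FRE).
import numpy as np


def rigid_register(source: np.ndarray, target: np.ndarray):
    """Least-squares rigid transform (Kabsch algorithm) mapping source onto target.

    source, target: (N, 3) arrays of matched fiducial coordinates.
    Returns rotation R (3x3) and translation t (3,).
    """
    src_c, tgt_c = source.mean(axis=0), target.mean(axis=0)
    H = (source - src_c).T @ (target - tgt_c)   # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tgt_c - R @ src_c
    return R, t


def fiducial_registration_error(source, target, R, t) -> float:
    """Root-mean-square distance between transformed source and target fiducials."""
    residuals = target - (source @ R.T + t)
    return float(np.sqrt((residuals ** 2).sum(axis=1).mean()))


# Hypothetical fiducial coordinates (mm), with simulated localization noise.
rng = np.random.default_rng(0)
physical = rng.uniform(0.0, 100.0, size=(5, 3))
holographic = physical + rng.normal(scale=2.0, size=(5, 3))
R, t = rigid_register(holographic, physical)
print(f"FRE = {fiducial_registration_error(holographic, physical, R, t):.2f} mm")
```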

After initial registration, if the patient’s position changes, it is currently not possible to automatically correct the hologram position. This is because the hologram is fixed in space with respect to the room, not the head. Future improvements could incorporate head position feedback with automatic adjustments to hologram positioning.

OpenSight ARHMD Accuracy

Five (45%) of 11 patient tracings had excellent correspondence between the ARHMD and MWBNS tumor tracings. It is important to note, however, that while the first 6 consecutive patients had a 16.7% excellent correspondence rate, the second 5 patients had an 80% excellent correspondence rate. Indeed, this trend is reflected in a significant correlation between the patient number and the percent overlap achieved on image analysis. Thus, using the ARHMD for tumor tracing appears to have a learning curve, in which accuracy improves with experience. For example, surgeons may require a slight transition period to adjust their visuospatial skills and movements while using the ARHMD. Additionally, accurate windowing of the MRI for each patient may require practice. Lastly, adjusting the tumor tracing to the contours of each patient’s scalp necessitates that the contours of the 3D tumor be transposed as a 2D skin marking. Thus, accurate tracing in itself may improve with experience.

As there was no statistically significant difference between the OpenSight ARHMD and traditional navigation, we do not believe that the OpenSight ARHMD is “better” than traditional navigation from an accuracy perspective, merely that it is comparable in terms of accuracy. Subjectively, however, we believe that AR is more intuitive to use for understanding the 3D location and anatomy of the tumor, particularly for residents new to cranial neurosurgery. An issue further complicating accurate tumor tracing is the influence of the perspective from which the surgeon views the tumor in AR. As one would expect, the contours of the 3D tumor shape when transposed onto a 2D plane vary dramatically depending on whether the surgeon views the tumor from a view parallel to the planned surgical corridor or tangential to it. A view that is slightly tangential to the planned surgical corridor, in which the angle of approach is slightly to the side of the planned corridor, may alter the contours of the resultant skin tracing dramatically and greatly influence the approach trajectory. This is in contrast to the trajectory-view perspective of the MWBNS wand, which always views the tumor head-on. Thus, the angle of AR viewing should be as parallel to the planned surgical corridor as possible to ensure the best accuracy for surgical planning and tumor tracing.

Future Directions

ARHMD will likely be useful intraoperatively for identifying and protecting neurovascular structures, as well as localizing tumor margins to ensure that maximal safe resection has been achieved. Future studies will validate the accuracy of OpenSight ARHMD intraoperatively, by prospectively measuring the correspondence between specific neurovascular structures with the unaided eye and with OpenSight ARHMD. Additionally, completeness of tumor resection and length of surgery aided by OpenSight ARHMD may be compared with use of standard MWBNS navigation.

Study Limitations

This was a small pilot study investigating the feasibility of using an ARHMD with OpenSight in cranial neurosurgery ORs to trace tumor borders in order to facilitate incision planning. Larger randomized studies are required to validate our findings. Investigation of the value of ARHMD technology for intraoperative visualization of intracranial structures is still ongoing. ARHMD technology is nascent; thus, continued improvement and validation are required before it can be relied upon solely in actual surgeries.

Conclusions

This small pilot study suggests that the implementation of ARHMD with OpenSight is feasible in a cranial neurosurgery OR without interrupting workflow. Further development and investigation of ARHMD-based systems to determine reliability of preoperative tumor border identification for incision planning will be necessary. Future studies are needed to identify strategies to improve AR accuracy, as well as to validate the reliability of accuracy intraoperatively to facilitate identification of tumor borders and neurovascular structures during resection.

Acknowledgments

Dr. Daniel Eichberg is supported by a grant from the National Cancer Institute (NCI; T32 CA 211034). We thank Roberto Suazo for assistance with video editing.

Disclosures

Drs. Ivan and Urakov hold a grant from the office of the Provost at the University of Miami in association with Magic Leap (Plantation, Florida) augmented reality applications development for neurosurgery. Dr. Ivan reports being a consultant to and receiving research funding from Medtronic and the NX Development Corp. Dr. Urakov reports being a consultant to J&J and Medtronic.

Author Contributions

Conception and design: Eichberg, Komotar, Urakov. Acquisition of data: Eichberg, Di, Urakov. Analysis and interpretation of data: Eichberg, Ivan, Di, Luther, Lu. Drafting the article: Eichberg, Di, Shah, Luther, Lu. Critically revising the article: Eichberg, Ivan, Shah, Luther, Lu, Komotar. Reviewed submitted version of manuscript: all authors. Approved the final version of the manuscript on behalf of all authors: Eichberg. Statistical analysis: Di, Shah, Urakov. Administrative/technical/material support: Komotar, Urakov. Study supervision: Ivan, Komotar, Urakov.


References

1. Figueroa J, Morell A, Bowory V, Shah AH, Eichberg D, et al. Minimally invasive keyhole temporal lobectomy approach for supramaximal glioma resection: a safety and feasibility study. J Clin Neurosci. 2020;72:57-62.
2. Kato A, Yoshimine T, Hayakawa T, Tomita Y, Ikeda T, et al. A frameless, armless navigational system for computer-assisted neurosurgery. J Neurosurg. 1991;74(5):845-849.
3. Karmonik C, Elias SN, Zhang JY, Diaz O, Klucznik RP, et al. Augmented reality with virtual cerebral aneurysms: a feasibility study. World Neurosurg. 2018;119:e617-e622.
4. Jean WC. Mini-pterional craniotomy and extradural clinoidectomy for clinoid meningioma: optimization of exposure using augmented reality template: 2-dimensional operative video. Oper Neurosurg (Hagerstown). 2020;19(6):E610.
5. Ramirez-Zamora A, Giordano J, Gunduz A, Alcantara J, Cagle JN, et al. Proceedings of the Seventh Annual Deep Brain Stimulation Think Tank: Advances in Neurophysiology, Adaptive DBS, Virtual Reality, Neuroethics and Technology. Front Hum Neurosci. 2020;14:54.
6. Perin A, Galbiati TF, Gambatesa E, Ayadi R, Orena EF, et al. Filling the gap between the OR and virtual simulation: a European study on a basic neurosurgical procedure. Acta Neurochir (Wien). 2018;160(11):2087-2097.
7. Jean WC, Felbaum DR. The use of augmented reality to improve safety of anterior petrosectomy: 2-dimensional operative video. World Neurosurg. 2021;146:162.
8. van Doormaal TPC, van Doormaal JAM, Mensink T. Clinical accuracy of holographic navigation using point-based registration on augmented-reality glasses. Oper Neurosurg (Hagerstown). 2019;17(6):588-593.
9. Gibby JT, Swenson SA, Cvetko S, Rao R, Javan R. Head-mounted display augmented reality to guide pedicle screw placement utilizing computed tomography. Int J CARS. 2019;14(3):525-535.
10. Schneider CA, Rasband WS, Eliceiri KW. NIH Image to ImageJ: 25 years of image analysis. Nat Methods. 2012;9(7):671-675.
11. Vassallo R, Kasuya H, Lo BWY, Peters T, Xiao Y. Augmented reality guidance in cerebrovascular surgery using microscopic video enhancement. Healthc Technol Lett. 2018;5(5):158-161.
12. Elmi-Terander A, Burström G, Nachabe R, Skulason H, Pedersen K, et al. Pedicle screw placement using augmented reality surgical navigation with intraoperative 3D imaging: a first in-human prospective cohort study. Spine (Phila Pa 1976). 2019;44(7):517-525.
13. Cabrilo I, Bijlenga P, Schaller K. Augmented reality in the surgery of cerebral aneurysms: a technical report. Neurosurgery. 2014;10(suppl 2):252-261.
14. Besharati Tabrizi L, Mahvash M. Augmented reality-guided neurosurgery: accuracy and intraoperative application of an image projection technique. J Neurosurg. 2015;123(1):206-211.
15. Urakov TM, Wang MY, Levi AD. Workflow caveats in augmented reality-assisted pedicle instrumentation: cadaver lab. World Neurosurg. 2019;126:e1449-e1455.
16. Elmi-Terander A, Nachabe R, Skulason H, Pedersen K, Söderman M, et al. Feasibility and accuracy of thoracolumbar minimally invasive pedicle screw placement with augmented reality navigation technology. Spine (Phila Pa 1976). 2018;43(14):1018-1023.
17. Incekara F, Smits M, Dirven C, Vincent A. Clinical feasibility of a wearable mixed-reality device in neurosurgery. World Neurosurg. 2018;118:e422-e427.
