Augmented reality in non-instrumentation minimally invasive spine surgery: a narrative review and future perspectives
Review Article


Feng Han1#, Feifan Xu2#, Yun Yang3#, Colin J. Willoughby4, Andrew K. Chan4, Dean Chou4

1Department of Neurosurgery, The Affiliated Hospital of Guizhou Medical University, Guizhou, China; 2Department of Neurosurgery, Peking University First Hospital, Beijing, China; 3Medical Integration and Practice Center, Cheeloo College of Medicine, Shandong University, Jinan, China; 4Department of Neurological Surgery, Columbia University Vagelos College of Physicians and Surgeons, The Och Spine Hospital at NewYork-Presbyterian, New York, NY, USA

Contributions: (I) Conception and design: F Han, F Xu, Y Yang; (II) Administrative support: D Chou; (III) Provision of study materials or patients: F Han, F Xu, Y Yang; (IV) Collection and assembly of data: F Han, F Xu, Y Yang; (V) Data analysis and interpretation: F Han, F Xu, Y Yang; (VI) Manuscript writing: All authors; (VII) Final approval of manuscript: All authors.

#These authors contributed equally to this work.

Correspondence to: Dean Chou, MD. Department of Neurological Surgery, Columbia University Vagelos College of Physicians and Surgeons, The Och Spine Hospital at NewYork-Presbyterian, 5141 Broadway, New York, NY 10034, USA. Email: dc3658@cumc.columbia.edu.

Background and Objective: Augmented reality (AR) is being increasingly integrated into spine surgery. However, most existing reviews predominantly focus on its use in instrumentation-based procedures. The broader role of AR in non-instrumentation minimally invasive spine surgery (MISS), including decompression, endoscopic, and tubular techniques, has not been fully synthesized. This narrative review aims to bring together the current evidence on AR applications across the full range of MISS, with a particular focus on visualization- and workflow-critical non-instrumentation procedures, an area that has been underrepresented in prior reviews.

Methods: A comprehensive literature search was conducted in PubMed, Embase, and the Cochrane Library for articles published between 2010 and 2025. Search terms included Medical Subject Headings (MeSH) and free-text keywords related to augmented reality, minimally invasive spine surgery, decompression, and navigation. Only peer-reviewed English-language articles were included. Study selection was performed independently by three reviewers, with discrepancies resolved by consensus.

Key Content and Findings: The review synthesizes AR applications across percutaneous procedures, tubular surgery, endoscopic spine surgery (ESS), microscopic surgery, and lateral approaches. The literature indicates that AR can enhance intraoperative visualization through in-situ image overlays, reduce visual attention shifts and cognitive workload, improve workflow efficiency, and lower radiation exposure. However, current evidence is largely derived from small-sample, single-center feasibility studies (levels IV–V evidence). Comparative analyses suggest AR is most valuable as a complementary visualization layer within multimodal navigation ecosystems rather than a standalone replacement.

Conclusions: AR represents a promising adjunct in MISS, particularly for non-instrumentation procedures in which visualization is limited. To define its definitive clinical value, future research must prioritize workflow standardization, validation of performance metrics, and adequately powered multicenter trials that assess clinically meaningful outcomes. Integration with artificial intelligence (AI) and robotic platforms may further enhance surgical precision and efficiency, shaping the next phase of MISS.

Keywords: Augmented reality (AR); minimally invasive spine surgery (MISS); surgical navigation; decompression; workflow


Submitted Oct 14, 2025. Accepted for publication Jan 08, 2026. Published online Feb 06, 2026.

doi: 10.21037/jss-2025-aw-190


Introduction

Minimally invasive spine surgery (MISS) has expanded beyond pedicle screw instrumentation to encompass a diverse range of techniques (1). This spectrum includes decompressive and discectomy procedures—such as tubular microscopic decompression, endoscopic discectomy, biportal endoscopic surgery, and microscopic surgery—as well as minimally invasive fusion techniques. Examples of the latter include lateral and anterior retroperitoneal approaches like extreme lateral interbody fusion (XLIF), lateral lumbar interbody fusion (LLIF), oblique lateral interbody fusion (OLIF), and anterior lumbar interbody fusion (ALIF). Although these procedures aim to reduce tissue disruption and accelerate recovery, they impose unique technical constraints, including a heavy reliance on precise anatomical orientation and depth perception within narrow, deep, or anatomically complex surgical corridors. Additional challenges include limited visualization, radiation exposure from fluoroscopy, and steep learning curves (Table 1).

Table 1

Major technical challenges in MISS

Category Technical challenge Specific issues encountered during surgery Primary impact
Visualization and anatomical exposure Limited visualization Small incisions and narrow working channels; endoscopic view easily obscured by bleeding; unstable microscopic/endoscopic field Higher risk of nerve injury; inadequate decompression
Loss of anatomical orientation Key landmarks are difficult to expose; more pronounced in obese, revision, or severe stenosis cases Increased surgical risk; reduced procedural efficiency
Working space and instrument manipulation Restricted degrees of freedom Long, slender instruments in confined tubular or endoscopic corridors; limited angulation and maneuverability Greater technical difficulty; prolonged operative time
Instrument crowding Instruments interfere with each other within the confined working channel or biportal system Reduced surgical fluency; increased risk of complications
Imaging dependence and navigation Increased radiation exposure Frequent C-arm fluoroscopy or intraoperative CT; endoscopic procedures rely heavily on real-time imaging Higher radiation dose to surgical team
High accuracy requirement for needle placement and screw trajectory Percutaneous screws, endoscopic fusion, and lateral approaches require highly precise trajectories Risk of misplacement, fixation failure, or neurovascular injury
Steep learning curve Technically demanding procedures Techniques such as tubular surgery, endoscopic lumbar discectomy, biportal endoscopy, OLIF, and percutaneous instrumentation require extensive experience Higher complication rates in inexperienced surgeons; lower efficiency
Bleeding and soft-tissue management Difficulty in bleeding control Even minimal bleeding can obscure the endoscopic or tubular view completely Increased risk of nerve injury; incomplete decompression
Risk of soft-tissue injury Limited space restricts nerve root retraction and soft-tissue handling Postoperative pain; potential neurologic deficits
Management of intraoperative complications Challenging durotomy repair Limited working space makes dural tear repair difficult May require conversion to open surgery; prolonged hospitalization
Difficult to manage nerve injuries Limited corridor restricts hemostasis and neural decompression Higher risk of permanent neurological deficits
Device-, navigation-, and AR-related issues Registration drift or navigation inaccuracy Navigation/AR systems may lose accuracy during long procedures Screw malposition; trajectory deviation
Limited visualization of needle or instrument trajectory Real-time deep-structure tracking can be difficult, especially in anterior/lateral approaches Increased risk of vascular or visceral injury
Patient-specific challenges Obesity significantly increases difficulty Deeper working corridors; harder anatomical localization; restricted needle trajectories Poor visualization; longer operative time
Deformity and revision cases Altered or destroyed anatomical landmarks; extensive scar tissue Heavier reliance on navigation; higher complication risk

AR, augmented reality; CT, computed tomography; MISS, minimally invasive spine surgery; OLIF, oblique lateral interbody fusion.

While augmented reality (AR) has been actively explored in spine surgery, existing literature and prior reviews have predominantly concentrated on its application in instrumentation-focused procedures, such as pedicle screw placement and conventional minimally invasive fusion (2-6). As a result, the literature remains limited on AR use in the broader set of non-instrumentation MISS procedures, despite the high demands these workflows place on visualization and spatial orientation. Several key questions are still insufficiently explored: How can AR enhance visualization in tubular or endoscopic decompression? Can it improve efficiency and safety in percutaneous or endoscopic techniques? What technical challenges arise when integrating AR with microscopes or endoscopes in confined operative fields?

Addressing these questions is important because MISS frequently requires navigation where millimeters matter and anatomical landmarks are partially obscured. In this context, AR’s ability to overlay virtual guidance onto the real-world surgical field could offer transformative advantages through reduced radiation dependency and improved spatial navigation.

To address this MISS-specific gap, the present review provides a critical appraisal of AR applications across the full spectrum of MISS, with particular emphasis on non-instrumentation, decompression-dominant workflows in which visualization and spatial orientation are most limited. We evaluate current evidence regarding the clinical utility, technical performance, and workflow impact of AR in percutaneous, tubular, endoscopic, and microscopic MISS procedures. By clarifying the current state of evidence, identifying persistent technical and methodological limitations, and outlining future translational and research directions specific to MISS, this review aims to define the unique value proposition of AR in advancing MISS beyond conventional instrumentation-centered applications. We present this article in accordance with the Narrative Review reporting checklist (available at https://jss.amegroups.com/article/view/10.21037/jss-2025-aw-190/rc).


Methods

A comprehensive literature search was conducted across three major databases: PubMed, Embase, and the Cochrane Library. The search strategy incorporated both Medical Subject Headings (MeSH) and free-text terms. The search period was restricted to publications from January 1, 2010 to September 1, 2025.

Inclusion criteria were as follows: peer-reviewed publications, articles written in English, and relevant case reports or case series. Non-English studies were excluded. The selection process was independently performed by three reviewers to ensure consistency and minimize selection bias. Discrepancies were resolved through discussion and consensus (Table 2).

Table 2

The search strategy summary

Items Specification
Date of search September 10, 2025
Databases searched PubMed, Embase, Cochrane Library
Search terms used Augmented reality; minimally invasive spine surgery; surgical navigation; workflow; decompression; percutaneous; tubular; endoscopic; microscopic; lateral approach
Timeframe January 1, 2010–September 1, 2025
Inclusion and exclusion criteria Inclusion: peer-reviewed articles, studies in English, case reports/series. Exclusion: non-English articles
Selection process Conducted by three authors (F.H., F.X. and Y.Y.) independently; discrepancies were resolved through discussion and consensus

The findings were synthesized narratively using thematic analysis focused on technical applications, clinical outcomes, and comparative advantages across different non-instrumentation MISS procedures.


Discussion

The distinct advantage and integrative role of AR in non-instrumentation MISS

While AR has demonstrated clear advantages in instrumented spine surgery by enhancing pedicle screw placement accuracy, its role in non-instrumentation MISS is distinct. In contrast to instrumentation-centered workflows where accuracy metrics often predominate, non-instrumentation MISS depends on continuous spatial orientation and real-time anatomical clarity within deep, narrow, and visually restricted corridors. Accordingly, the evaluation of AR in this setting should extend beyond screw accuracy to include workflow efficiency, visual attention shifts, registration stability, and surgeon cognitive load.

The anticipated advantages are therefore fundamentally different: rather than preventing implant malposition, AR primarily aims to enhance real-time anatomical understanding, reduce fluoroscopy-associated radiation exposure, and facilitate the procedural learning curve through intuitive in-situ guidance. This distinction underscores the need for dedicated research frameworks and outcome measures tailored specifically to non-instrumentation MISS.

By addressing the core challenges of MISS—limited visualization, constrained spatial orientation, and heightened cognitive load—AR provides intuitive guidance without fundamentally altering established surgical workflows (Table 3). Given the procedural diversity within MISS, its role is best examined within specific operative contexts. The following sections therefore summarize current evidence for AR applications in major non-instrumentation MISS procedures, with study quality and level of evidence detailed in Table 4.

Table 3

Summary of advantages of AR in MISS

Domain Key features Clinical relevance in MISS
Enhanced intraoperative visualization Real-time in-situ overlay of imaging data (CT/MRI/navigation) onto the surgical field Overcomes limited visualization caused by narrow working corridors and deep operative fields
Improved spatial orientation Intuitive three-dimensional representation of anatomical structures and trajectories Reduces the risk of disorientation, particularly in tubular and endoscopic procedures
Increased navigational accuracy Precise delineation of critical structures (nerves, vessels, tumors, OPLL margins) Enhances surgical safety and minimizes neurovascular injury
Reduced cognitive workload Eliminates frequent attention shifts between the operative field and external displays Improves intraoperative focus and decision-making efficiency
Improved ergonomics Promotes more neutral posture with fewer head and trunk adjustments Reduces physical strain and supports long-term surgeon occupational health
Lower perceived workload Decreased subjective workload scores Enhances procedural comfort and sustainability
Support for complex cases Reliable guidance in revision surgery, tumors, OPLL, and fusion-related procedures Expands the applicability of MISS to technically demanding scenarios
Reduced reliance on fluoroscopy Decreased need for repeated C-arm confirmation in selected steps Lowers radiation exposure for both patients and operating room staff
Facilitated learning curve Provides intuitive anatomical feedback and trajectory guidance Assists training and skill acquisition for early-career surgeons
High system compatibility Seamless integration with microscopes, endoscopes, and navigation platforms Preserves established MISS workflows and facilitates clinical adoption

AR, augmented reality; CT, computed tomography; MISS, minimally invasive spine surgery; MRI, magnetic resonance imaging; OPLL, ossification of the posterior longitudinal ligament.

Table 4

AR-related studies in non-instrumentation MISS

Author [year] Procedure Interface type Study type Sample size Level of evidence* Key study design limitations
Wu et al. [2014] (7) Percutaneous vertebroplasty Projector (ARCASS) Clinical feasibility study (case series) 3 patients Level IV Small clinical sample size; absence of control group
Hu et al. [2020] (8) Percutaneous vertebroplasty Projector (ARCASS) Prospective case-control study 9 patients in ARCASS group, 9 patients in control group Level III Limited sample size; single surgeon; potential selection bias
Wei et al. [2019] (9) Percutaneous kyphoplasty AR-HMD (HoloLens) Prospective randomized study 40 patients (20 in mixed reality group, 20 in fluoroscopy group) Level II Small sample size; requires C-arm fluoroscopy assistance
Heinrich et al. [2019] (10) Spinal needle injections AR-HMD (HoloLens) Phantom study 21 participants, 9 needle insertions each (total 189 insertions) Level V Phantom study only; manual registration process
Gibby et al. [2020] (11) Percutaneous spine procedures AR-HMD (HoloLens) Clinical case series with phantom control 10 patients (18 procedures); phantom control (32 needle passes) Level IV Small clinical sample size; reliance on manual registration in some cases; no direct comparison group
Agten et al. [2018] (12) Lumbar facet joint injections AR-HMD (HoloLens) Phantom study 2 radiologists, 20 injections per modality Level V Phantom model only; manual registration was tedious and time-consuming; spatial drift of hologram with head movement
Fritz et al. [2013] (13) Bone biopsy Projector (Perk Station) Prospective feasibility study (cadaveric) 4 cadavers, 16 target lesions Level IV Small cadaveric sample; needle path limited to axial plane
Jun et al. [2023] (14) Transforaminal epidural injection Self-developed AR-assisted navigation system Phantom study with comparison (AR-guided vs. fluoroscopy-guided) 1 torso phantom, 5 targets per side (total 10 needle insertions) Level V Phantom model only; single physician; potential learning curve and adaptation bias
Abe et al. [2013] (15) Percutaneous vertebroplasty AR-HMD (Custom) Clinical feasibility study (case series) combined with phantom validation 5 patients (10 needle insertions), plus 40 phantom trials Level IV (clinical) & Level V (phantom) Small clinical sample; manual marker-based registration; accuracy dependent on fluoroscopic adjustment
Sommer et al. [2022] (16) MIS-TLIF AR-microscope (BrainLAB) + tubular retractor Prospective case series (feasibility study) 10 patients Level IV Small sample size; no control group
Sommer et al. [2022] (17) Intradural spinal tumor surgery AR-microscope (BrainLAB) + tubular retractor Prospective case series (feasibility study) 3 patients Level IV Small sample size; no control group
Kirnaz et al. [2022] (18) Intradural spinal tumor surgery AR-microscope (BrainLAB) + tubular retractor Surgical video/technical report 1 patient Level V Single-case video report; no comparative data
Schmidt et al. [2025] (19) MIS-TLIF AR-microscope (BrainLAB) + tubular retractor Prospective randomized-controlled trial on simulation model 12 residents (24 procedures: 12 with AR, 12 without) Level II Simulation model study; limited to one procedure type; may not fully replicate intraoperative variability
Jamshidi et al. [2021] (20) Endoscopic TLIF AR-HMD + endoscopy Surgical video/technical report 1 patient (video demonstration) Level V Single-case video report; no comparative data
Park et al. [2025] (21) Biportal endoscopic unilateral laminotomy and bilateral decompression AR-HMD (Apple Vision Pro) + endoscopy Case report 1 patient Level V Single-case report; no control or comparative data
Van Isseldyk et al. [2025] (22) Uniportal endoscopic lumbar discectomy AR-HMD (low-cost headset) + endoscopy Prospective controlled within-subject study 10 surgeons × 20 surgeries (200 cases total) Level III Within-subject design reduces inter-surgeon variability; subjective (NASA-TLX) and observational (RULA) measures
Carl et al. [2019] (23) Degenerative spine surgery (various approaches) AR-microscope (HUD) Prospective case series 10 patients Level IV Small sample size; no control group; lack of clinical outcomes comparison
Umebayashi et al. [2018] (24) Transvertebral anterior cervical foraminotomy and posterior cervical laminoforaminotomy AR-microscope (HUD) Technical report/case series 2 patients Level IV Small sample size; no control group; lack of clinical outcomes comparison
Kosterhon et al. [2017] (25) Osteotomy AR-microscope (HUD) Technical case report/feasibility study 1 patient (+ 1 spine model) Level IV Small sample size; no control group
Benjamin et al. [2019] (26) Intramedullary spinal cord neoplasms resection AR-microscope (BrainLAB) Case series/technical note 2 patients Level IV Small sample size; no control group
Carl et al. [2019] (27) Spine tumor surgery (intra- and extradural lesions) AR-microscope (HUD) Prospective case series 10 patients Level IV Small sample size; no control group; lack of clinical outcomes comparison
Carl et al. [2019] (28) Intradural spinal tumor surgery AR-microscope (HUD) Prospective case series 10 patients Level IV Small sample size; no control group; lack of clinical outcomes comparison
Carl et al. [2020] (29) Various spine surgeries (tumors, degenerative cases, infections, deformities) AR-microscope (HUD) Prospective observational study/case series 42 patients Level IV Absence of control group; lack of clinical outcomes comparison; selection bias
Onuma et al. [2023] (30) Anterior decompression (floating method) for cervical OPLL AR-microscope (HUD) Retrospective observational study with historical control 14 AR patients (vs. 53 historical controls) Level III Retrospective design; small sample size; comparison with historical controls; lack of randomization
Pojskić et al. [2024] (31) Intradural spinal tumor surgery AR-microscope (BrainLAB) Retrospective cohort 120 surgeries (46 with AR) Level III Retrospective cohort comparison, relatively small sample, surgeon heterogeneity, subjective AR benefit assessment
Pojskić et al. [2021] (32) Lateral approaches AR-microscope (HUD) Prospective case series 16 patients Level IV Small sample size; non-homogeneous pathologies and procedures; absence of control group
Urreola et al. [2025] (33) XLIF AR-HMD (Augmedics xvision Spine System) Prospective case series 5 patients Level IV Small sample size; absence of control group; heterogeneous procedures and patient characteristics

*, level of evidence is based on the Oxford Centre for Evidence-Based Medicine (OCEBM) 2011 Levels of Evidence. AR, augmented reality; ARCASS, augmented reality computer-assisted spine surgery; HMD, head-mounted display; HUD, heads-up display; MISS, minimally invasive spine surgery; MIS-TLIF, minimally invasive transforaminal lumbar interbody fusion; NASA-TLX, National Aeronautics and Space Administration-Task Load Index; OPLL, ossification of the posterior longitudinal ligament; RULA, Rapid Upper Limb Assessment; XLIF, extreme lateral interbody fusion.

Lumbar puncture and percutaneous vertebroplasty

Early applications of AR in MISS have focused on technically less complex percutaneous procedures, including lumbar punctures and vertebroplasty, in part due to technical and equipment constraints (7,8,10,11). Across available studies, AR-assisted techniques achieve accuracy comparable to conventional computed tomography (CT) or fluoroscopic guidance while also offering distinct workflow advantages.

In simulated lumbar facet joint punctures, AR-assisted navigation achieved an accuracy of 97.5%, compared with 100% under CT guidance, while eliminating the need for lead garments (12). Similar precision was shown in cadaveric biopsies of spinal metastases under magnetic resonance imaging (MRI)-AR navigation, with a mean deviation of 4.3±1.2 mm and efficient procedural times (13). A comparative study on lumbar models further reported equivalent target deviation between AR and fluoroscopy for epidural punctures, but AR reduced total operative time by two-thirds, suggesting a significantly shortened learning curve (14). Additionally, a pilot AR-guided vertebroplasty study reported successful needle trajectory planning without pedicle breach or cement leakage (15).

AR in tubular surgery

Tubular surgery is a representative technique within MISS and is primarily used to treat herniated discs and lumbar stenosis, and to perform transforaminal lumbar interbody fusion (TLIF) procedures (34). Integration of AR with the surgical microscope has yielded emerging and clinically relevant applications in tubular MISS (16-18,35). Visualization in this setting is inherently constrained by the narrow working corridor and deep operative field, and subtle anatomic cues may be obscured.

AR-assisted surface mapping enables surgeons to visualize preplanned trajectories and critical bony landmarks directly overlaid onto the patient’s anatomy (36). This in-situ guidance supports more accurate entry-point selection and trajectory planning.

After the tubular retractor is docked, surgeons must identify key anatomical structures under the microscope. With conventional CT-based navigation, this process often requires repeated pauses to confirm landmarks using a navigated pointer. This shifts attention between the operative field and the navigation monitor and can disrupt surgical flow. AR mitigates this limitation by directly highlighting relevant anatomical structures—such as the lamina, pars interarticularis, facet joint, pedicle, disc space, and neural elements—within the microscopic field of view (16,23). This real-time overlay reduces workflow interruptions, streamlines intraoperative decision-making, and enhances surgeon comfort by decreasing cognitive and physical demands, potentially minimizing fatigue-related errors (19).

Although AR-microscope navigation requires additional steps, including the fusion of preoperative planning CT with intraoperative navigation CT and subsequent system calibration, these processes are brief (approximately 5 and 2 minutes, respectively) and appear to have minimal impact on the overall operative workflow (16).

AR in endoscopic surgery

Endoscopic spine surgery (ESS) represents one of the most technically demanding procedures within MISS, particularly with respect to visualization and spatial orientation (37). In recent years, only a limited number of studies have begun to explore the potential role of AR in ESS.

In 2021, Jamshidi et al. reported the first clinical application of AR in ESS (20). The authors demonstrated that overlaying navigation data directly onto the surgical field was particularly advantageous during endoscopic TLIF. Furthermore, the ability to simultaneously visualize the operative field and three-dimensional imaging during hardware placement was shown to mitigate attention shift, thereby improving surgical workflow, operative efficiency, and patient safety (37,38).

In a separate case evaluating the Apple Vision Pro headset, Park et al. reported that AR not only effectively minimized attention shift but also enabled simultaneous display of patient medical records and preoperative imaging within the eyepiece, allowing surgeons to access relevant information intraoperatively (21). In addition, the AR display provided high-resolution (5K) endoscopic video with improved line-of-sight ergonomics directly in front of the surgeon’s eyes. Consistent with these observations, a prospective within-subject study of AR headset use in uniportal endoscopic discectomy found significantly reduced perceived workload and ergonomic strain among endoscopic surgeons, suggesting a potential role for AR in improving surgeon well-being and the long-term sustainability of surgical practice (22). Notably, no latency or interruption was observed between the real-time endoscopic video and the AR display during surgery, ensuring procedural continuity and operational safety.

AR in microscopic spine surgery

Building upon the applications of AR in tubular surgery and ESS, where visualization is constrained by narrow working corridors and limited lines of sight, the operating microscope represents a natural and widely adopted platform for integrating AR in MISS. As the most commonly used modality for magnification and illumination in MISS, the surgical microscope can be paired with AR technology as a standalone system, avoiding the need for head-mounted displays (29).

Early clinical use of AR-integrated microscopy in MISS focused on improving visualization of resection planes during spinal osteotomies and supporting orientation in complex keyhole procedures, including transvertebral anterior cervical foraminotomy and posterior cervical laminoforaminotomy (24,25). Subsequently, research groups at the University of Marburg and Weill Cornell Medicine each published a series of clinical studies evaluating commercially available AR-assisted microscope systems across a broad range of routine minimally invasive spinal procedures, including intradural tumors and lesions, extradural tumors, and degenerative spinal diseases (16-19,23,27-29,31).

Collectively, these studies demonstrated that when standardized workflows are rigorously followed, particularly with respect to achieving high registration accuracy and appropriate system calibration, AR-assisted microscopy can be applied across a wide range of spinal disease entities, anatomical regions, and surgical procedures traditionally addressed with conventional microscope-based techniques. Importantly, satisfactory navigational accuracy was reported for critical and complex anatomical structures, including revision cases, tumors, neural elements, and vascular structures (39). In more technically demanding and anatomically complex procedures, including cervical ossification of the posterior longitudinal ligament (OPLL) drilling and minimally invasive transfacet TLIF, AR-assisted microscopy has been reported to provide satisfactory intraoperative guidance and encouraging surgical results (30,40).

AR in lateral approaches to the spine

AR navigation is increasingly used in lateral spinal approaches. Its main contribution often occurs during non-instrumentation steps such as corridor establishment and anatomical exposure, rather than implant placement alone.

Pojskić et al. first systematically reported the application of intraoperative CT-based AR navigation in various lateral spinal approaches, including discectomy and tumor resection, which typically do not involve implant placement. Their study showed that AR clearly delineates disc spaces, tumor margins, and adjacent critical structures (e.g., major vessels and nerve plexuses), thereby assisting surgeons with precise anatomical localization and lesion removal while reducing reliance on intraoperative fluoroscopy (32).

The recent study by Urreola et al. further clarified the utility of AR throughout XLIF/prone transpsoas procedures (33). Their findings indicate that AR navigation not only optimizes interbody cage placement but also plays a key role in critical non-instrumentation steps such as establishing a transpsoas corridor, exposing the disc space, and preparing the endplates. By providing real-time three-dimensional anatomical overlays, AR enables surgeons to safely traverse the psoas muscle, avoid neurovascular bundles, and efficiently plan and dock the working channel. These features may improve efficiency and procedural safety.

To date, no published studies have described the use of AR in OLIF or ALIF. As AR technology continues to mature and the demand for surgical precision grows, its applications in various lumbar fusion techniques are expected to expand.

Comparison of AR with existing navigation modalities

In contrast to AR-based navigation, CT-based navigation relies on real-time registration of intraoperative instruments with patient-specific CT imaging, providing the surgeon with multiplanar three-dimensional visualization on a separate monitor. Its principal advantages include a high pedicle screw placement accuracy (>95%), clear depiction of the spatial relationship between instruments and adjacent neural and vascular structures, reduced complication rates, and reduced surgeon radiation exposure (41-43). These benefits are particularly evident in cases with complex anatomy. However, CT navigation is limited by high equipment costs, the requirement for intraoperative CT acquisition, potential prolongation of operative time, and the risk of image drift (41,44).

Robotic navigation systems offer exceptional precision and reproducibility, with reported pedicle screw accuracy rates often exceeding 98% (45). These systems can significantly reduce the risk of neurovascular injury, promote procedural standardization and minimally invasive techniques, and further reduce surgeon radiation exposure (41,46). Nonetheless, adoption remains limited by acquisition and maintenance costs and a steep learning curve. In addition, guidance is often relatively static, with limited real-time feedback and no tactile information (47,48).

Fluoroscopy-based navigation remains widely used because of broad availability, real-time dynamic imaging, lower cost, and ease of use. It is particularly common in procedures requiring frequent adjustment, such as cement augmentation (47). However, major drawbacks include high radiation exposure, lack of three-dimensional visualization, relatively lower accuracy, and frequent interruptions to surgical workflow (41).

Overall, each navigation modality exhibits distinct strengths and limitations with respect to accuracy, cost, radiation exposure, workflow, and learning curve (as summarized in Table 5). In current clinical practice, there is an increasing trend toward hybrid navigation strategies, such as combinations of AR with robotic systems or AR integrated with CT- or fluoroscopy-based navigation (49-51). Future systems may pair AR’s intuitive visualization with robotic execution, supported by low-dose intraoperative imaging for calibration. This approach may help balance accuracy, safety, and operative efficiency (52). These comparisons underscore that AR should not be viewed as a replacement, but rather as a complementary visualization layer within multimodal navigation ecosystems.

Table 5

Comparison of navigation technologies in spine surgery

Accuracy
- Fluoroscopy-based navigation (C-arm): Moderate. Based on 2D imaging and highly dependent on the surgeon’s 3D spatial interpretation
- CT-based navigation (intraoperative CT/O-arm): High. 3D image-based navigation providing real-time multiplanar visualization
- Robotic navigation: Very high (theoretical). Robotic arms execute preplanned trajectories; however, accuracy may be affected by registration errors, bony shifts, and intrinsic mechanical tolerances
- AR navigation: High (registration-dependent). Accuracy primarily relies on precise registration between virtual images and real anatomical structures

Cost
- Fluoroscopy-based navigation: Low. Lowest acquisition and per-case costs; remains the mainstay in primary and resource-limited centers
- CT-based navigation: Very high. Intraoperative CT systems (e.g., O-arm) are expensive, with additional disposable costs for sterile drapes per procedure
- Robotic navigation: Extremely high. Highest acquisition costs; procedure-specific instruments (e.g., guiding sleeves) further increase per-case expenses
- AR navigation: High. High initial investment for head-mounted displays or AR-integrated microscopes, but currently no per-case disposable costs (primarily software- and service-based)

Radiation exposure
- Fluoroscopy-based navigation: High (to both staff and patient). Repeated intraoperative fluoroscopy is required for confirmation, resulting in cumulative radiation exposure
- CT-based navigation: High (to patient). High-dose radiation is delivered during intraoperative scanning, typically in one or several acquisitions
- Robotic navigation: Moderate to low. Mainly relies on preoperative CT or a single intraoperative scan
- AR navigation: Very low. Primarily dependent on preoperative imaging, with minimal intraoperative radiation exposure

Workflow integration
- Fluoroscopy-based navigation: Interrupted. Requires repeated fluoroscopic checks, frequent repositioning of the C-arm, and temporary pauses in surgical manipulation
- CT-based navigation: Structured but time-consuming. Requires intraoperative scanning, image processing, and registration steps, which may prolong operative time
- Robotic navigation: Highly structured and rigid. Involves preoperative planning, system setup, registration, and robotic arm positioning, resulting in a less flexible workflow
- AR navigation: Intuitive and continuous. Allows in-situ visualization with minimal interruption to surgical flow once accurate registration is achieved

Learning curve
- Fluoroscopy-based navigation: Shallow. Most familiar modality for surgeons; requires proficiency in C-arm manipulation and interpretation of 2D images
- CT-based navigation: Moderate. Requires adaptation to new workflows, device operation, and interpretation of 3D navigation interfaces
- Robotic navigation: Steep. The steepest learning curve; requires mastery of system setup, planning software, robotic arm positioning, troubleshooting, and close team coordination
- AR navigation: Moderate to steep. Surgeons must adapt to operating within an AR visual environment and master new registration and calibration workflows

2D, two-dimensional; 3D, three-dimensional; AR, augmented reality; CT, computed tomography.

Limitations and perspectives

A primary constraint in interpreting the current literature is the predominant reliance on low-level evidence. Table 4 shows that the vast majority of studies are level IV or V. These include small-sample, single-center feasibility reports, case series, and technical descriptions. There is a conspicuous lack of high-level (I–III) comparative studies directly evaluating AR against conventional navigation modalities in non-instrumentation MISS workflows. As a result, the current evidence is promising but remains preliminary. It is still insufficient to draw definitive conclusions regarding the superior clinical effectiveness or cost-benefit profile of AR in this specific domain.

Despite growing interest in AR-assisted spine surgery, several challenges must be addressed before broader clinical adoption.

Registration accuracy remains a central technical challenge. Submillimeter precision is essential to prevent clinically significant errors. This requirement is complicated by intraoperative bone motion and soft tissue deformation (29). In addition, current AR hardware still faces limitations in field of view, image resolution, system latency, and ergonomic comfort during prolonged procedures (53).

Standardization is another major requirement. Future research should focus on establishing unified protocols for image acquisition, registration, calibration, and intraoperative verification, as well as standardized workflows for operating room integration (54). These consensus frameworks would reduce inter-operator variability, facilitate multicenter studies, and improve reproducibility.

With regard to validation metrics, most existing studies emphasize pedicle screw accuracy relative to freehand techniques. More rigorous evaluation should incorporate quantitative measures such as registration error, angular and translational deviation, procedural efficiency, workflow disruption, complication rates, and surgeon cognitive workload. Comparative trials against CT-based navigation and robotic-assisted systems are particularly important to define the incremental clinical value of AR. Well-designed multicenter clinical trials are also strongly recommended. Future multicenter trials in MISS decompression could compare AR-assisted and conventional navigation with endpoints including complication rates, operative time, radiation exposure, and learning curve.
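For readers unfamiliar with how such quantitative measures are computed, the following Python sketch illustrates one conventional way to derive angular and translational deviation from planned and executed trajectories, each defined by an entry point and a target point in a common coordinate frame (millimeters). This is an illustrative helper written for this review, not code from any cited AR system; function and variable names are our own.

```python
import numpy as np

def trajectory_deviation(planned_entry, planned_target, actual_entry, actual_target):
    """Return (angular deviation in degrees, entry offset in mm, target offset in mm)
    between a planned and an executed trajectory.

    Each trajectory is defined by an entry point and a target point,
    given as 3D coordinates in a shared (e.g., CT) coordinate frame.
    """
    v_plan = np.asarray(planned_target, float) - np.asarray(planned_entry, float)
    v_act = np.asarray(actual_target, float) - np.asarray(actual_entry, float)

    # Angle between the two trajectory vectors, clipped to guard against
    # floating-point values marginally outside [-1, 1].
    cos_a = np.dot(v_plan, v_act) / (np.linalg.norm(v_plan) * np.linalg.norm(v_act))
    angle_deg = float(np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0))))

    # Euclidean (translational) offsets at the entry and target points.
    entry_offset = float(np.linalg.norm(np.asarray(actual_entry, float) - np.asarray(planned_entry, float)))
    target_offset = float(np.linalg.norm(np.asarray(actual_target, float) - np.asarray(planned_target, float)))
    return angle_deg, entry_offset, target_offset
```

For example, an executed trajectory parallel to the plan but shifted laterally by 1 mm yields an angular deviation of 0 degrees with 1 mm offsets at both entry and target. Reporting such metrics alongside registration error would allow direct comparison of AR against CT-based and robotic navigation.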

The integration of AR with artificial intelligence (AI) and robotic systems represents a key future direction. AI-based tools may enable automated image segmentation, recognition of critical structures, trajectory planning, and intraoperative risk prediction. Convergence with robotic platforms and force-feedback systems may enable a unified visual-haptic environment that improves precision and consistency (55,56).

Finally, seamless and aseptic workflow integration remains a practical barrier. AR platforms must interface efficiently with existing navigation systems, picture archiving and communication system (PACS), and endoscopic or microscopic setups without increasing cognitive burden (16). Addressing these challenges through standardized protocols, validated performance metrics, and intelligent system integration will be essential for guiding future research and advancing AR toward routine clinical use.


Conclusions

AR is a promising adjunct in MISS, particularly for non-instrumentation and decompression-dominant workflows where visualization and spatial orientation are limited. Current studies suggest that AR may improve intraoperative anatomical understanding, reduce visual attention shifts, and enhance workflow efficiency across a broad range of MISS procedures.

Nevertheless, the available evidence is largely derived from small, heterogeneous, single-center studies. AR should therefore be regarded not as a substitute for established navigation or robotic systems, but as a complementary visualization layer within multimodal navigation ecosystems. To define its role in routine MISS practice, multicenter clinical studies should prioritize clinically meaningful outcomes such as complication rates, operative time, radiation exposure, and learning curve. To translate this potential into clinical practice, we propose a staged research agenda that begins with standardizing workflow and performance metrics, advances to rigorous comparative trials, and culminates in seamless technological convergence. Through collaborative innovation, the integration of AR for intuitive visualization, AI for intelligent analysis, and robotics for precise execution holds the potential to enable a next-generation surgical platform that supports safer, more efficient, and more accessible MISS (57).


Acknowledgments

None.


Footnote

Reporting Checklist: The authors have completed the Narrative Review reporting checklist. Available at https://jss.amegroups.com/article/view/10.21037/jss-2025-aw-190/rc

Peer Review File: Available at https://jss.amegroups.com/article/view/10.21037/jss-2025-aw-190/prf

Funding: None.

Conflicts of Interest: All authors have completed the ICMJE uniform disclosure form (available at https://jss.amegroups.com/article/view/10.21037/jss-2025-aw-190/coif). D.C. serves as an unpaid editorial board member of Journal of Spine Surgery from December 2024 to December 2026. The other authors have no conflicts of interest to declare.

Ethical Statement: The authors are accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved.

Open Access Statement: This is an Open Access article distributed in accordance with the Creative Commons Attribution-NonCommercial-NoDerivs 4.0 International License (CC BY-NC-ND 4.0), which permits the non-commercial replication and distribution of the article with the strict proviso that no changes or edits are made and the original work is properly cited (including links to both the formal publication through the relevant DOI and the license). See: https://creativecommons.org/licenses/by-nc-nd/4.0/.


References

  1. McCloskey K, Turlip R, Ahmad HS, et al. Virtual and Augmented Reality in Spine Surgery: A Systematic Review. World Neurosurg 2023;173:96-107. [Crossref] [PubMed]
  2. Liu A, Jin Y, Cottrill E, et al. Clinical accuracy and initial experience with augmented reality-assisted pedicle screw placement: the first 205 screws. J Neurosurg Spine 2022;36:351-7. [Crossref] [PubMed]
  3. Felix B, Kalatar SB, Moatz B, et al. Augmented Reality Spine Surgery Navigation: Increasing Pedicle Screw Insertion Accuracy for Both Open and Minimally Invasive Spine Surgeries. Spine (Phila Pa 1976) 2022;47:865-72. [Crossref] [PubMed]
  4. Molina CA, Sciubba DM, Greenberg JK, et al. Clinical Accuracy, Technical Precision, and Workflow of the First in Human Use of an Augmented-Reality Head-Mounted Display Stereotactic Navigation System for Spine Surgery. Oper Neurosurg 2021;20:300-9. [Crossref] [PubMed]
  5. Molina CA, Phillips FM, Colman MW, et al. A cadaveric precision and accuracy analysis of augmented reality-mediated percutaneous pedicle implant insertion. J Neurosurg Spine 2021;34:316-24. [Crossref] [PubMed]
  6. Elsayed GA, Dykhouse G, Ikwuegbuenyi CA, et al. The Evolution of Spatial Computing in Spine Surgery: Tracing the Historical Arc to Present Day Implementation. World Neurosurg 2025;204:124514. [Crossref] [PubMed]
  7. Wu JR, Wang ML, Liu KC, et al. Real-time advanced spinal surgery via visible patient model and augmented reality system. Comput Methods Programs Biomed 2014;113:869-81. [Crossref] [PubMed]
  8. Hu MH, Chiang CC, Wang ML, et al. Clinical feasibility of the augmented reality computer-assisted spine surgery system for percutaneous vertebroplasty. Eur Spine J 2020;29:1590-6. [Crossref] [PubMed]
  9. Wei P, Yao Q, Xu Y, et al. Percutaneous kyphoplasty assisted with/without mixed reality technology in treatment of OVCF with IVC: a prospective study. J Orthop Surg Res 2019;14:255. [Crossref] [PubMed]
  10. Heinrich F, Schwenderling L, Becker M, et al. HoloInjection: augmented reality support for CT-guided spinal needle injections. Healthc Technol Lett 2019;6:165-71. [Crossref] [PubMed]
  11. Gibby J, Cvetko S, Javan R, et al. Use of augmented reality for image-guided spine procedures. Eur Spine J 2020;29:1823-32. [Crossref] [PubMed]
  12. Agten CA, Dennler C, Rosskopf AB, et al. Augmented Reality-Guided Lumbar Facet Joint Injections. Invest Radiol 2018;53:495-8. [Crossref] [PubMed]
  13. Fritz J. Augmented reality visualization using image overlay technology for MR-guided interventions: cadaveric bone biopsy at 1.5 T. Invest Radiol 2013;48:464-70. [Crossref] [PubMed]
  14. Jun EK, Lim S, Seo J, et al. Augmented Reality-Assisted Navigation System for Transforaminal Epidural Injection. J Pain Res 2023;16:921-31. [Crossref] [PubMed]
  15. Abe Y, Sato S, Kato K, et al. A novel 3D guidance system using augmented reality for percutaneous vertebroplasty: technical note. J Neurosurg Spine 2013;19:492-501. [Crossref] [PubMed]
  16. Sommer F, Hussain I, Kirnaz S, et al. Augmented Reality to Improve Surgical Workflow in Minimally Invasive Transforaminal Lumbar Interbody Fusion - A Feasibility Study With Case Series. Neurospine 2022;19:574-85. [Crossref] [PubMed]
  17. Sommer F, Hussain I, Kirnaz S, et al. Safety and Feasibility of Augmented Reality Assistance in Minimally Invasive and Open Resection of Benign Intradural Extramedullary Tumors. Neurospine 2022;19:501-12. [Crossref] [PubMed]
  18. Kirnaz S, McGrath LB Jr, Sommer F, et al. Minimally Invasive Resection of an Intradural Extramedullary Spinal Tumor Using 3-Dimensional Total Navigation and Microscope-Based Augmented Reality: 2-Dimensional Operative Video. Oper Neurosurg 2022;22:e88. [Crossref] [PubMed]
  19. Schmidt FA, Hussain I, Boadi B, et al. The Use of Augmented Reality as an Educational Tool in Minimally Invasive Transforaminal Lumbar Interbody Fusion. Oper Neurosurg 2025;28:183-92. [Crossref] [PubMed]
  20. Jamshidi AM, Makler V, Wang MY. Augmented Reality Assisted Endoscopic Transforaminal Lumbar Interbody Fusion: 2-Dimensional Operative Video. Oper Neurosurg 2021;21:E563-4. [Crossref] [PubMed]
  21. Park DY, Park SM, Hashmi S, et al. Enhancing endoscopic spine surgery with intraoperative augmented reality: A case report. Int J Surg Case Rep 2025;131:111342. [Crossref] [PubMed]
  22. Van Isseldyk F, Chavalparit P, Bassani J, et al. Low-Cost Augmented Reality System in Endoscopic Spine Surgery: Analysis of Surgeon Ergonomics, Perceived Workload and A Step-by-Step Guide for Implementation. Global Spine J 2025; Epub ahead of print. [Crossref]
  23. Carl B, Bopp M, Saß B, et al. Microscope-Based Augmented Reality in Degenerative Spine Surgery: Initial Experience. World Neurosurg 2019;128:e541-51. [Crossref] [PubMed]
  24. Umebayashi D, Yamamoto Y, Nakajima Y, et al. Augmented Reality Visualization-guided Microscopic Spine Surgery: Transvertebral Anterior Cervical Foraminotomy and Posterior Foraminotomy. J Am Acad Orthop Surg Glob Res Rev 2018;2:e008. [Crossref] [PubMed]
  25. Kosterhon M, Gutenberg A, Kantelhardt SR, et al. Navigation and Image Injection for Control of Bone Removal and Osteotomy Planes in Spine Surgery. Oper Neurosurg 2017;13:297-304. [Crossref] [PubMed]
  26. Benjamin CG, Frempong-Boadu A, Hoch M, et al. Combined Use of Diffusion Tractography and Advanced Intraoperative Imaging for Resection of Cervical Intramedullary Spinal Cord Neoplasms: A Case Series and Technical Note. Oper Neurosurg 2019;17:525-30. [Crossref] [PubMed]
  27. Carl B, Bopp M, Saß B, et al. Implementation of augmented reality support in spine surgery. Eur Spine J 2019;28:1697-711. [Crossref] [PubMed]
  28. Carl B, Bopp M, Saß B, et al. Augmented reality in intradural spinal tumor surgery. Acta Neurochir (Wien) 2019;161:2181-93. [Crossref] [PubMed]
  29. Carl B, Bopp M, Saß B, et al. Spine Surgery Supported by Augmented Reality. Global Spine J 2020;10:41S-55S. [Crossref] [PubMed]
  30. Onuma H, Sakai K, Arai Y, et al. Augmented Reality Support for Anterior Decompression and Fusion Using Floating Method for Cervical Ossification of the Posterior Longitudinal Ligament. J Clin Med 2023;12:2898. [Crossref] [PubMed]
  31. Pojskić M, Bopp M, Saß B, et al. Single-Center Experience of Resection of 120 Cases of Intradural Spinal Tumors. World Neurosurg 2024;187:e233-56. [Crossref] [PubMed]
  32. Pojskić M, Bopp M, Saß B, et al. Intraoperative Computed Tomography-Based Navigation with Augmented Reality for Lateral Approaches to the Spine. Brain Sci 2021;11:646. [Crossref] [PubMed]
  33. Urreola G, Cranick MG, Castillo JA Jr, et al. Augmented Reality Navigation for Extreme Lateral Interbody Fusion with Posterior Instrumentation: Feasibility, Outcomes, and Surgical Technique. Bioengineering (Basel) 2025;12:1262. [Crossref] [PubMed]
  34. Park SM, Kim HJ, Yeom JS. Is minimally invasive surgery a game changer in spinal surgery? Asian Spine J 2024;18:743-52. [Crossref] [PubMed]
  35. Pierzchajlo N, Stevenson TC, Huynh H, et al. Augmented Reality in Minimally Invasive Spinal Surgery: A Narrative Review of Available Technology. World Neurosurg 2023;176:35-42. [Crossref] [PubMed]
  36. Patel AA, Srivatsa S, Davison MA, et al. Posterior and Transforaminal Lumbar Interbody Fusion: Recent Advances in Technique and Technology. Neurosurg Clin N Am 2025;36:11-20. [Crossref] [PubMed]
  37. Sharma AK, de Oliveira RG, Suvithayasiri S, et al. The Utilization of Navigation and Emerging Technologies With Endoscopic Spine Surgery: A Narrative Review. Neurospine 2025;22:105-17. [Crossref] [PubMed]
  38. Muhlestein WE, Strong MJ, Yee TJ, et al. Commentary: Augmented Reality Assisted Endoscopic Transforaminal Lumbar Interbody Fusion: 2-Dimensional Operative Video. Oper Neurosurg 2022;22:e66-7. [Crossref] [PubMed]
  39. Goldberg JL, Hussain I, Sommer F, et al. The Future of Minimally Invasive Spinal Surgery. World Neurosurg 2022;163:233-40. [Crossref] [PubMed]
  40. Ghellab O, Seas A, Shaker E, et al. Overlay of Nerve Roots to Aid in Augmented Reality-Guided L4 to L5 Transforaminal Interbody Fusion: A Methodologic Pilot. Int J Spine Surg 2025;19:743-50. [Crossref] [PubMed]
  41. Shirbache K, Heidarzadeh M, Qahremani R, et al. A systematic review and meta-analysis of radiation exposure in spinal surgeries: Comparing C-Arm, CT navigation, and O-Arm techniques. J Med Imaging Radiat Sci 2025;56:101831. [Crossref] [PubMed]
  42. El-Hajj VG, Charalampidis A, Fell D, et al. Study protocol: the SPInal NAVigation (SPINAV) trial - comparison of augmented reality surgical navigation, conventional image-guided navigation, and free-hand technique for pedicle screw placement in spinal deformity surgery. BMC Musculoskelet Disord 2025;26:543. [Crossref] [PubMed]
  43. Chan V, Etigunta S, Gausper A, et al. Navigation is associated with lower risk of neurological injury and transfusion in pediatric idiopathic scoliosis surgery. Spine Deform 2025;13:1911-9. [Crossref] [PubMed]
  44. Kanno H, Handa K, Murotani M, et al. A Novel Intraoperative CT Navigation System for Spinal Fusion Surgery in Lumbar Degenerative Disease: Accuracy and Safety of Pedicle Screw Placement. J Clin Med 2024;13:2105. [Crossref] [PubMed]
  45. Jiang B, Pennington Z, Zhu A, et al. Three-dimensional assessment of robot-assisted pedicle screw placement accuracy and instrumentation reliability based on a preplanned trajectory. J Neurosurg Spine 2020;33:519-28. [Crossref] [PubMed]
  46. Shaftel KA, Chang V. Commentary: How Do Robotics and Navigation Facilitate Minimally Invasive Spine Surgery? A Case Series and Narrative Review. Neurosurgery 2025;97:e109-10. [Crossref] [PubMed]
  47. Zhu Y, Zhu S, Li Y, et al. Robot-Assisted, Conventional Fluoroscopy (C-Arm), O-Arm Navigation, and Freehand Pedicle Screw Fixation in Thoracolumbar Spine Fracture Surgery: A Network Meta-Analysis. Orthop Surg 2025;17:3302-17. [Crossref] [PubMed]
  48. Yang Y, Jia Y, Liu C, et al. Simulation and analysis of non-navigational errors in robot-assisted pedicle Kirschner wire placement surgery. J Orthop Surg Res 2025;20:440. [Crossref] [PubMed]
  49. Sommer F, Goldberg JL, McGrath L Jr, et al. Image Guidance in Spinal Surgery: A Critical Appraisal and Future Directions. Int J Spine Surg 2021;15:S74-86. [Crossref] [PubMed]
  50. Hussain I, Cosar M, Kirnaz S, et al. Evolving Navigation, Robotics, and Augmented Reality in Minimally Invasive Spine Surgery. Global Spine J 2020;10:22S-33S. [Crossref] [PubMed]
  51. Virk S, Qureshi S. Narrative review of intraoperative imaging guidance for decompression-only surgery. Ann Transl Med 2021;9:88. [Crossref] [PubMed]
  52. Muthu S, Ramasubramanian S, Jeyaraman M, et al. Framework for Adoption of Enabling Technologies for Improved Outcomes in Spine Surgery. Global Spine J 2025;15:2977-85. [Crossref] [PubMed]
  53. Park SM, Kim D, Park J, et al. Augmented reality-guided pedicle screw fixation: an experimental study. Asian Spine J 2025;19:896-903. [Crossref] [PubMed]
  54. Nadeem-Tariq A, Kazemeini S, Kaur P, et al. Augmented Reality in Spine Surgery: A Narrative Review of Clinical Accuracy, Workflow Efficiency, and Barriers to Adoption. Cureus 2025;17:e86803. [Crossref] [PubMed]
  55. Seetohul J, Shafiee M, Sirlantzis K. Augmented Reality (AR) for Surgical Robotic and Autonomous Systems: State of the Art, Challenges, and Solutions. Sensors (Basel) 2023;23:6202. [Crossref] [PubMed]
  56. Bhimreddy M, Jiang K, Weber-Levine C, et al. Computational Modeling, Augmented Reality, and Artificial Intelligence in Spine Surgery. Adv Exp Med Biol 2024;1462:453-64. [Crossref] [PubMed]
  57. Jitpakdee K, Boadi B, Härtl R. Image-Guided Spine Surgery. Neurosurg Clin N Am 2024;35:173-90. [Crossref] [PubMed]
Cite this article as: Han F, Xu F, Yang Y, Willoughby CJ, Chan AK, Chou D. Augmented reality in non-instrumentation minimally invasive spine surgery: a narrative review and future perspectives. J Spine Surg 2026;12(2):21. doi: 10.21037/jss-2025-aw-190
