
1 Introduction

Stereotactic neurosurgery is a form of minimally invasive surgery that uses a stereotactic frame to locate targets inside the brain. This method enables interventions that require highly precise targeting, such as biopsy, ablation, lesioning, injection, stimulation, implantation, and radiosurgery. The two most important criteria when planning such interventions are to precisely deliver the treatment at the desired target and to avoid proximity to any risk structures along the trajectory, such as blood vessels or critical brain regions.

Planning trajectories for a stereotactic neurosurgery procedure is a highly complex task, which requires a good understanding of the location and structure of the targeted brain regions, as well as of all regions affected by any planned trajectory. Neurologists and neurosurgeons explore the original image datasets and segmented volumetric structures to gain an understanding of the patient's anatomy. They perform trajectory planning before a surgery to identify target regions and possible paths, and during surgery, where the pre-planned trajectories are refined and verified to avoid any proximity to risk structures. The intra-operative planning procedure is performed while the patient is under full anesthesia, after the stereotactic frame has been mounted, and after the pre-operative images have been registered with an intra-operative CT scan so that the planned trajectories can be computed in frame coordinates. The intra-operative planning can take up to two hours while the patient is in the operating theatre. In this paper, we propose a novel Visuo-Haptic Augmented Reality (VHAR) user interface for neurosurgery planning, which reduces the required planning time, thereby lessening the burden on the patient and making the procedure more cost-effective.

Volumetric image data exploration and trajectory planning in medical practice are typically done using desktop-based workstations with mouse and keyboard. Specialized planning software provides 2D/3D views of the available patient image data. Operators define trajectories in slice views of the medical volumes. During planning, they iterate between exploring the volume with mouse and keyboard, relocating trajectory points with the mouse to avoid proximity to identified risk structures, and verifying that the recently changed path does not interfere with previously identified risk areas. They also switch often between different image modalities, since not all structures are visible in all images. The type and amount of information available to neurosurgeons and neurologists have vastly increased. This includes new imaging modalities [6], improved computer-assisted segmentation of volumetric data [4], computer-assisted risk assessment [15], and automated trajectory planning [35]. Selecting and visualizing the appropriate data remains a challenge, especially since additional visual overlays often interfere with the requirement to see the original image dataset in full detail.

Fig. 1. The Visuo-Haptic Augmented Reality (VHAR) user interface of the neurosurgery application enables users to naturally interact with medical volume datasets: (a) Users explore volumetric data using oblique slicing and bi-manual interaction. (b) Segmented structures and planned trajectories (yellow lines) are displayed. (c) Users inspect and refine trajectories to avoid proximity to risk structures using an oblique slice plane that is perpendicular to the trajectory. (Color figure online)

The VHAR user interface of our neurosurgery planning application integrates a force feedback haptic device into an Augmented Reality (AR) system. Operators view the patient data visualization through a Head-Mounted Display (HMD) and can intuitively explore it using bi-manual 3D interaction (see Fig. 1). The haptic device provides force feedback that enables precise 3D targeting and informs operators about nearby risk structures. By providing haptic guides, we can reduce visual clutter while still providing sufficient information to efficiently plan the required trajectories.

2 Related Work

Neurosurgical intervention planning requires a precise understanding of the spatial relations between target areas, access paths, and critical regions. Current planning systems often use slice visualization for volumetric data and user interfaces with keyboard and mouse for navigation and planning, but previous research has shown that 3D user interfaces can greatly enhance the spatial understanding for such tasks. Goble et al. [9] presented a novel user interface for neurosurgical visualization. Their system tracks passive interface props, such as a solid sphere, a cutting plane, and a pointer, for user interaction. They found that medical experts prefer the intuitive bi-manual 3D interaction over traditional slice viewers. Our planning system improves on their work by using an AR display so that the medical visualization appears co-located with the interface props. Eagleson et al. [7] presented an interactive neurosurgery planning system in which multiple users can explore and annotate patient datasets using a tabletop display with a touch surface. They use a haptic device placed on the display to provide 3D input for volume exploration, but their system does not provide any haptic feedback.

Researchers have also studied the benefits of AR for surgery planning. Abhari et al. [1] evaluated user performance during the planning of a brain tumor resection. Such tasks require a good understanding of the spatial relationships between relevant anatomy such as tumors and risk structures. They compared the task performance of medical experts when using different visualization techniques. In a preliminary study, they showed that users performed as well as or better in AR than in all other modes. The benefits of AR are, however, larger for novices than for experts. Shamir et al. [14] presented an augmented reality user interface for improved risk assessment in image-guided keyhole surgery. They augment the risk of all possible trajectories onto the surface of a head phantom. Based on an expert review, they conclude that the proposed AR user interface is beneficial for planning difficult operations and for education. Our system builds upon and enhances their interaction techniques, and adds haptic feedback as another modality to display important information.

Recent advances in image registration and segmentation algorithms enable researchers to build complete image processing pipelines. D’Albis et al. [4] presented a fully integrated and automated planning solution for deep-brain stimulation (PyDBS). Their workflow automates pre-operative, intra-operative, and post-operative imaging tasks, such as registration and segmentation. Our neurosurgery planning system integrates the data generated by PyDBS.

Researchers are also working on automated trajectory planning systems for neurosurgery interventions [2, 5, 15], which autonomously suggest candidate trajectories to operators. These systems can reduce planning times and help users to better estimate the risk of the planned trajectory. However, such automated planning methods require a broad body of medical knowledge in digital form and accurately segmented and labeled patient data in order to produce acceptable results.

Information generated by automated trajectory planning systems can be used either to suggest trajectories or to improve the guidance of users during the planning procedure. During interviews with medical experts, we realized that they often prefer access to additional information over automated processes that they are not fully in control of. Operators can, for example, be guided during trajectory planning using visual and haptic guides that draw on data extracted from automated pipelines. Such guides inform operators in real-time about accessibility and the risk of certain configurations.

The contributions of our work are improved interaction techniques for neurosurgery planning using VHAR, which combine bi-manual volume exploration with haptic visualization and guides. The intuitive positioning and slicing of medical volumes and segmented structures helps operators to better understand the spatial relations, which reduces the time required to identify target regions and risk structures. The application provides haptic guides that display context-dependent constraints as forces in addition to the visual rendering of medical image data and segmented structures in AR. Replacing some of the visual information with haptic feedback reduces visual clutter while still providing the required information. The presented trajectory planning method further reduces the required time for planning of trajectories by reducing the number of iterations needed for refining and verifying trajectories.

3 System Design

This section presents the system design of the proposed user interface for neurosurgery planning, which simplifies the exploration of medical datasets and the trajectory planning procedure. When using the planning system, users wear an HMD and hold a haptic stylus in their dominant hand. In the other hand, they hold a tracked handle with buttons that serves as secondary input to the application. As shown in Fig. 1, a patient dataset consisting of volumetric images and segmented structures is displayed in the haptic workspace. The proposed application allows users to pick up, transform, and place the patient dataset with the tracked handle in order to adjust the viewing angle. They use the haptic stylus to control slicing operations during exploration and verification, and to define trajectories.

3.1 Volume Exploration

The volume exploration component visualizes volumetric and geometric information interactively. During volume exploration, users mainly need control over the displayed content, effective ways to locate regions of interest inside the volume, and views that allow them to understand spatial relations between risk structures and planned trajectories.

In the proposed planning tool, users can directly manipulate the pose of the volume using the tracked handle, which gives them an intuitive way to place and orient the medical dataset as needed, similar to [9]. At the same time, they operate the haptic stylus with their dominant hand to interactively explore the volume using slice views on the volumetric data.

The visual augmentations are rendered using a multi-volume raycaster for volumetric image datasets (see Fig. 1a) and a geometry renderer for segmented structures (see Fig. 1b). Users inspect data using the oblique slicing method [13], which is a generic version of the typically used axis-aligned slice planes. In contrast to axis-aligned slice planes, users can additionally control the orientation of the slice. The combination of oblique slicing with the two-handed interaction allows operators to quickly change the volume pose and the slice properties. Preliminary reviews with medical experts showed that the proposed method reduces the time required to find regions of interest and helps users understand spatial relations within the volume.

The concept of the oblique slicing technique is shown in Fig. 2a. Users place a volume V at a convenient pose \(O_V\) and control the slice plane SP using a haptic stylus S. The plane \(SP(p, n)\) is defined by a point p, which is controlled by the position of the haptic stylus S, and a normal vector n. The normal vector is selected depending on the current task and the user's preference. Currently, the oblique slicing component supports three viewing modes, which differ only by the data source for the plane normal n (a sketch of the mode selection follows the list):

  • View-Aligned: The slice plane is always parallel to the view plane of the current camera and therefore provides the most detailed view of the current slice. Users can additionally use head movement to control the slicing. The plane normal n is the normal of the camera view plane.

  • Stylus-Aligned: The slice plane is controlled by the haptic stylus, which gives the operator maximum control over the slicing operation. The plane normal n is the unit vector along the longitudinal axis of the haptic stylus S.

  • Trajectory-Aligned: The slice plane is always perpendicular to the current trajectory. This mode was suggested by medical experts for trajectory verification. The plane normal n is the unit vector along the current trajectory T.
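The following minimal Python/numpy sketch illustrates how the plane SP(p, n) is assembled for the three modes. The function and parameter names are our own for illustration (the actual system is built on H3DAPI), so treat this as an assumption-laden sketch rather than the production code.

```python
import numpy as np

def slice_plane(mode, stylus_pos, stylus_axis, camera_normal, trajectory_dir):
    """Return the oblique slice plane SP(p, n) for the selected viewing mode.

    stylus_pos     -- 3D position of the haptic stylus tip (the point p)
    stylus_axis    -- unit vector along the stylus' longitudinal axis
    camera_normal  -- unit normal of the camera view plane
    trajectory_dir -- unit vector along the current trajectory T
    """
    p = np.asarray(stylus_pos, dtype=float)
    if mode == "view-aligned":
        n = camera_normal      # plane parallel to the camera view plane
    elif mode == "stylus-aligned":
        n = stylus_axis        # plane orientation follows the stylus
    elif mode == "trajectory-aligned":
        n = trajectory_dir     # plane perpendicular to the trajectory
    else:
        raise ValueError(f"unknown viewing mode: {mode}")
    return p, np.asarray(n, dtype=float) / np.linalg.norm(n)
```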

The oblique slice view is automatically activated once the stylus tip S is inside the bounding box of the volume V. The slice plane is computed every frame relative to the volume origin \(O_V\) using the current pose of the haptic stylus and the selected plane normal. The slice plane SP controls a clipping plane, which hides unwanted geometry and volumetric image data. Additionally, we compute the locations of three to six vertices \(x_i\) that define a polygon representing the intersection of the slice plane with the volume bounding box. In addition to the vertex locations, we determine their texture coordinates in relation to the volume bounding box (see Fig. 2b). The polygon is textured using an OpenGL shader by interpolating over data from a three-dimensional texture that represents the medical volume.
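For illustration, the polygon vertices \(x_i\) and their texture coordinates can be computed on the CPU by clipping the slice plane against the twelve edges of the bounding box, as in the numpy sketch below. It assumes the plane actually intersects the box (which holds while the stylus tip is inside it); the system itself performs the texturing in an OpenGL shader.

```python
import numpy as np
from itertools import product

def slice_polygon(p, n, box_min, box_max):
    """Compute the 3-6 polygon vertices x_i of the plane/box intersection
    and their 3D texture coordinates relative to the bounding box."""
    box_min = np.asarray(box_min, dtype=float)
    box_max = np.asarray(box_max, dtype=float)
    corners = np.array(list(product(*zip(box_min, box_max))))
    # The 12 box edges connect corners whose indices differ in exactly one bit.
    edges = [(a, b) for a in range(8) for b in range(a + 1, 8)
             if bin(a ^ b).count("1") == 1]
    pts = []
    for a, b in edges:
        da, db = np.dot(corners[a] - p, n), np.dot(corners[b] - p, n)
        if da * db < 0:                      # edge crosses the slice plane
            t = da / (da - db)
            pts.append(corners[a] + t * (corners[b] - corners[a]))
    pts = np.array(pts)
    # Order the vertices by angle around the centroid within the plane.
    c = pts.mean(axis=0)
    u = (pts[0] - c) / np.linalg.norm(pts[0] - c)
    v = np.cross(n, u)
    angles = np.arctan2((pts - c) @ v, (pts - c) @ u)
    pts = pts[np.argsort(angles)]
    tex = (pts - box_min) / (box_max - box_min)  # texture coords in [0, 1]
    return pts, tex
```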

Fig. 2. Oblique slicing of medical datasets in AR: (a) Slice plane SP is defined using point p and normal vector n. The plane normal can be controlled by the view plane of the current camera \(n_V\), the haptic stylus \(n_S\), or the trajectory \(n_T\). (b) Volumetric medical dataset with oblique slice plane. Vertices \(x_i\) of the slice plane and their corresponding texture coordinates are computed for real-time rendering using OpenGL shaders.

Once users have found the desired view of the data, they can lock the slice plane. A locked slice plane enables a haptic plane effect, which attracts the stylus tip to the slice plane when it is close enough. The haptic feedback guides users to stay on the selected plane, for example when placing points for trajectory planning.

3.2 Trajectory Planning

A trajectory is defined by two 3D positions in a pre-defined reference frame: the target point (TP) and the entry point (EP). The target point is located near or inside the region of interest, and the entry point is placed on the skin surface of the head. While defining these two points is straightforward once a promising trajectory has been identified, refining the trajectory is not. Modifying the entry or target position to adjust the trajectory invalidates the previous checks for nearby risk structures in every iteration. Verifying the trajectory therefore requires users to revisit all previously identified areas.

The planning procedure can be optimized if previously known information about risk is used to guide operators. Risk structures can be identified manually in the dataset or determined using automated trajectory planning methods [2, 12, 15]. Such additional information is typically displayed visually, but it then often obstructs the view of other relevant information that is needed for the task.

Our proposed system improves the user interface for trajectory planning with the aim of reducing the required number of iterations between trajectory refinement and verification. This is done in two ways. First, the proposed system allows operators to mark safe zones in problematic areas near the planned trajectory. Defining such zones automatically repositions the entry point so that the trajectory passes critical regions within zones that are marked safe. Second, we replace visual information about proximity to marked risk structures with haptic guides to reduce visual clutter.

Fig. 3. The initial (blue) trajectory from TP to EP is optimized with three refinement points \(RP_1\), \(RP_2\), \(RP_i\). Safe zones are defined as conic volumes around the refinement points with an additional radius. A new (red) trajectory from TP to \(EP'\) is calculated where the mass-spring system defined by the projected points \(RP_1'\), \(RP_2'\), \(RP_i'\) is at equilibrium. (Color figure online)

Mark Safe Zones: Users specify one or more safe zones near identified risk structures while verifying the trajectory using the trajectory-aligned oblique slice view. Safe zones have the shape of a circular disc and are defined by a center (\(RP_i\)) and a radius (\(r_i\)) in the currently displayed slice (see Fig. 5e). The trajectory intersects each plane that contains a safe zone at an intersection point (\(IP_i\)). A valid trajectory must intersect each such plane at a distance \(d_i\) between \(IP_i\) and \(RP_i\) that is less than or equal to \(r_i\). To achieve this, the entry point EP is moved to a new location \(EP'\) to change the trajectory accordingly (see Fig. 3). Users also define an initial confidence radius (\(r_t\)) when specifying EP to limit the area in which the entry point can be repositioned.
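As a concrete illustration, the validity condition can be checked as in the following numpy sketch. The helper and the zone representation (center \(RP_i\), disc normal \(n_i\), radius \(r_i\)) are our own assumptions derived from the definitions above, not code from the planning system.

```python
import numpy as np

def trajectory_is_safe(TP, EP, zones):
    """Check that the trajectory TP->EP passes every safe zone.

    zones -- iterable of (RP, n, r): disc center, disc plane normal, radius.
    Valid if the intersection point IP_i with each zone's plane lies within
    distance r_i of the zone center RP_i.
    """
    TP, EP = np.asarray(TP, float), np.asarray(EP, float)
    d = EP - TP
    for RP, n, r in zones:
        RP, n = np.asarray(RP, float), np.asarray(n, float)
        denom = np.dot(d, n)
        if abs(denom) < 1e-9:          # trajectory parallel to the disc plane
            return False
        IP = TP + (np.dot(RP - TP, n) / denom) * d  # plane intersection IP_i
        if np.linalg.norm(IP - RP) > r:
            return False               # d_i > r_i: passes outside this zone
    return True
```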

Reposition the Entry Point: The goal when repositioning the entry point is that the refined trajectory should pass only through areas that are marked as safe. This is achieved using springs that pull the trajectory towards the safe zone centers. The proposed algorithm simulates the trajectory as a pendulum with the springs attached and takes the equilibrium position as the new entry point \(EP'\), subject to the constraints specified as safe zones. The application invokes the algorithm whenever a trajectory point is modified. Prior to invocation, the target point, the entry point, and all centers of safe zones are transformed into a reference frame (I) with TP as its origin and the negative y-axis pointing towards the entry point. From this definition it follows that \(TP = [0, 0, 0]\) and \(EP = [0, -l_t, 0]\), where \(l_t\) is the length of the trajectory. The position of the new entry point \(EP'\) is expressed in generalized coordinates \(\alpha _0\) and \(\beta _0\).

$$\begin{aligned} EP' = \left[ \begin{matrix} l_{t} {\text {sin}}\left( \alpha _{0}\right) \\ - l_{t} {\text {cos}}\left( \alpha _{0}\right) {\text {cos}}\left( \beta _{0}\right) \\ - l_{t} {\text {sin}}\left( \beta _{0}\right) {\text {cos}}\left( \alpha _{0}\right) \end{matrix}\right] \end{aligned}$$
(1)

To complete the definition of the pendulum, a rigid sphere with mass \(m_1\) is attached at the location of \(EP'\), and the link between TP and \(EP'\) is modeled with mass \(m_2\). Without considering any safe zones and assuming normal gravity \(g = 9.81\,\frac{m}{s^2}\) along \([0, -1, 0]\), the pendulum is in a stable state at \(\alpha = 0\) and \(\beta = 0\), where Eq. (1) reduces to \(EP' = [0, -l_t, 0] = EP\).

Safe zones affect the trajectory as follows. First, all centers of safe zones \(RP_i = [x_i, y_i, z_i]\) and their radii (\(r_i\)) are projected onto a plane defined by the point EP and the vector \([0, -1, 0]\) as its normal.

$$\begin{aligned} RP_i' = RP_i \frac{-l_t}{y_i} \end{aligned}$$
(2)
$$\begin{aligned} r_i' = r_i \frac{-l_t}{y_i} \end{aligned}$$
(3)

Springs between the projected centers \(RP_i'\) and \(EP'\) pull the trajectory away from nearby risk structures by applying forces (\(F_i\)) to the pendulum as shown in Fig. 3. The springs are parameterized using a stiffness factor (k) and the ratio between the initial confidence radius (\(r_t\)) and \(r_i'\).

$$\begin{aligned} F_i = k \frac{r_t}{r_i'} (EP' - RP_i') \end{aligned}$$
(4)

The motion of the pendulum is limited using a damping force (\(F_d\)) with the factor (c) that is calculated using the current velocity (V) of \(EP'\).

$$\begin{aligned} F_d = -c V \end{aligned}$$
(5)

The effects of all forces that are applied to the pendulum are simulated using Kane's method [11] to calculate the new location of \(EP'\) where all forces are at equilibrium. The static parameters used in the current system are: \(m_1 = 0.1\) kg, \(m_2 = 0.01\) kg, \(c = 5\), and \(k = 25\). With these parameters, the simulated duration can be limited to 0.3 s, which is sufficient to reach equilibrium. The new location \(EP'\) can nevertheless be computed faster than real-time, so position updates are supplied without noticeable delay in the current implementation. The final entry point for the surgical procedure is located at the intersection of the refined trajectory and the patient's head surface.
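The sketch below approximates this refinement numerically. As simplifying assumptions (all our own, not the paper's implementation): we integrate a damped point mass \(m_1\) constrained to the sphere of radius \(l_t\) around TP with explicit Euler steps instead of deriving the dynamics with Kane's method, ignore the link mass \(m_2\), and write the spring of Eq. (4) as attractive so that it pulls \(EP'\) towards the projected zone centers, matching the stated intent.

```python
import numpy as np

def refine_entry_point(EP, zones, r_t, m1=0.1, c=5.0, k=25.0,
                       g=9.81, dt=1e-3, duration=0.3):
    """Approximate the new entry point EP' in frame I (TP at the origin,
    EP = [0, -l_t, 0]).

    zones -- iterable of (RP_i, r_i): safe-zone centers/radii in frame I.
    r_t   -- initial confidence radius around EP
    """
    EP = np.asarray(EP, dtype=float)
    l_t = np.linalg.norm(EP)
    # Project zone centers and radii onto the plane through EP (Eqs. 2-3).
    springs = []
    for RP, r_i in zones:
        RP = np.asarray(RP, dtype=float)
        s = -l_t / RP[1]                          # projection factor (y_i < 0)
        springs.append((RP * s, r_i * s))
    x, v = EP.copy(), np.zeros(3)
    for _ in range(int(duration / dt)):
        F = m1 * g * np.array([0.0, -1.0, 0.0])   # gravity on the bob
        for RP_p, r_p in springs:
            F += k * (r_t / r_p) * (RP_p - x)     # spring towards RP_i' (Eq. 4)
        F += -c * v                               # damping (Eq. 5)
        v += F / m1 * dt
        x += v * dt
        x *= l_t / np.linalg.norm(x)              # stay on the pendulum sphere
        v -= np.dot(v, x) / l_t**2 * x            # keep velocity tangential
    return x                                      # approximate equilibrium EP'
```

With the stated parameters, the damped system settles close to its equilibrium within the simulated 0.3 s horizon.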

Fig. 4. Haptic guides are computed based on the specified trajectory and the marked safe zones. This example shows a trajectory with one safe zone, defined by its center RP and its radius r. A conical volume defined by RP, TP, and r marks a safe corridor. If the haptic stylus TIP is outside the safe corridor, a spring-damper system is activated that pulls the stylus back into the area marked grey. The conic volumes are intersected if multiple refinement points are active.

Haptic Guides for Safe Zones: Safe zones are also used to visualize the boundaries of safe corridors using force feedback. Such haptic guides help operators to stay within previously defined safe zones while editing the trajectory, and to identify impossible configurations earlier in the planning procedure. An impossible configuration exists when no trajectory passes all safe zones within their defined radii, i.e., no safe path is possible with the current configuration.

Figure 4 shows an example of how haptic guides are calculated for a single refinement point. The haptic stylus TIP is placed near the trajectory. A safe zone is defined by its center RP and a radius r. In this example, the TIP is outside of the conical volume (grey) described by TP, RP, and r. In this case, a haptic guide is activated that pulls the stylus back into the safe zone by displaying a force. The force is generated using a spring-damper system that pulls the TIP towards the safe volume (blue arrow). Multiple safe zones generate multiple overlapping conical volumes; the intersection of these volumes is rendered using force feedback.

Feedback forces to display haptic guides are computed as follows. First, for each safe zone, we transform the TIP position (T), the target point (TP), the entry point (EP), and the center (RP) into a reference frame with TP as its origin and the positive y-axis pointing towards the entry point. The trajectory is therefore defined by \(TP = [0, 0, 0]\) and \(EP = [0, l_t, 0]\), where \(l_t\) is the length of the trajectory. Then, we define a ray from the origin TP through \(T = [x_t, y_t, z_t]\) and compute the intersection point (IP) of the ray with the plane that is defined by the center RP and the normal [0, 1, 0].

$$\begin{aligned} IP = T \frac{l_t}{y_t} \end{aligned}$$
(6)

Then, we compute the distance \(d_t\) from IP to RP. If \(d_t\) is larger than the radius (r) of the current safe zone, the system activates the haptic guide. Finally, we calculate the guide force \(F_g\) using the unit vector \(v_g = \frac{RP - IP}{||RP - IP||}\), the magnitude \(m_g = d_t - r\), and a stiffness factor k.

$$\begin{aligned} F_g = k (m_g v_g) \end{aligned}$$
(7)

To ensure stability of the system, a damping force is displayed if any haptic guide is active. The damping force (\(F_d\)) is computed using the velocity (V) of the TIP and a damping factor (c) (see Eq. 5). The stiffness factor k and the damping factor c need to be adapted to the properties of the haptic device and the end-to-end latency of the haptic rendering component.

Finally, all forces that are generated by active haptic guides are summed and divided by the number of active guides. Then, we add the damping force, transform the resulting force vector back into world coordinates, and display the haptic feedback to the user. The feedback forces are disabled if the distance of the TIP to the closest safe corridor exceeds an activation distance, to avoid unwanted forces while moving the haptic stylus. Furthermore, the haptic effect is disabled if the TIP is too close to TP, since the safe corridor becomes too narrow to render stable forces.
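Putting Eqs. (5)-(7) together, a per-frame guide force computation might look like the sketch below. We intersect the ray with the plane through RP, following the geometric description (Eq. (6) instead scales to the entry plane, which agrees if the radius is projected consistently); the threshold values and helper names are our own assumptions.

```python
import numpy as np

def haptic_guide_force(T, V, zones, k, c,
                       activation_dist=0.05, y_min=0.01):
    """Guide force at the stylus tip T (frame: TP at origin, +y towards EP).

    V     -- current velocity of the tip
    zones -- iterable of (RP, r): safe-zone centers and radii (same frame)
    """
    T = np.asarray(T, dtype=float)
    if T[1] < y_min:                # too close to TP: corridor too narrow
        return np.zeros(3)
    forces = []
    for RP, r in zones:
        RP = np.asarray(RP, dtype=float)
        IP = T * (RP[1] / T[1])     # ray TP->T hits the plane through RP
        d_t = np.linalg.norm(IP - RP)
        if d_t <= r:
            continue                # tip inside this safe corridor: no force
        if d_t - r > activation_dist:
            continue                # tip too far away: effect disabled
        v_g = (RP - IP) / d_t       # unit vector towards the zone center
        forces.append(k * (d_t - r) * v_g)       # guide force (Eq. 7)
    if not forces:
        return np.zeros(3)
    F = sum(forces) / len(forces)   # average over all active guides
    return F - c * np.asarray(V, dtype=float)    # add damping (Eq. 5)
```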

4 Implementation

We implemented our system to explore and evaluate novel interaction techniques in neurosurgery trajectory planning. The system was developed with H3DAPI for visual and haptic rendering. We integrated the tracking and sensor-fusion library Ubitrack [10] and calibrated the complete system using the calibration method presented by Eck et al. [8]. The remaining registration errors between haptic and visual stimuli are typically between 1 and 2 mm after calibration. The target environment for running the application is a VHAR workspace that consists of a custom HMD, an AR-Tracking DTrack system, a PHANToM Premium haptic device, and a tracked Kensington Wireless Presenter.

As previously discussed in Sect. 3, the user interface consists of two main components: volume exploration and trajectory planning. Both components access a shared model consisting of all volumetric images and segmented structures including their attributes and spatial relations. The implemented planning workflow guides operators through the planning process and provides sensible defaults for the visibility of information in the visual and haptic channels. Patient data is loaded from PyDBS [4] datasets.

Fig. 5. Screenshots from the application. In (b–f), the area of interest is magnified for better visibility (the magnification is not part of the visualization).

Figure 5 presents screenshots from the different trajectory planning stages: (a) The planning process starts with volume exploration. Users can attach the dataset to the sub-dominant hand controller to alter its position and orientation. With the haptic stylus tip, they control the view-aligned oblique slicing to explore the volume and identify the region of interest. (b) A target point is specified on a locked slice plane near or inside the region of interest. (c) The display switches to rendering risk structures for identifying possible entry point regions. Once found, the entry point and its confidence region are defined. Users receive haptic feedback when they hit the head surface. (d) Verification is required after each modification of the trajectory. Operators load different image modalities and inspect the regions near the planned trajectory using the trajectory-aligned slice view. (e) If they find any risk structures nearby, they mark a safe corridor near the identified risk, which alters the planned trajectory. (f) The operator inspects a refined trajectory. Note how the trajectory no longer passes directly through the defined trajectory points, but shows the result of the physics-based optimization.

Haptic feedback is generated with context-dependent force effects. These effects receive the haptic pose in real-time and display guide forces. The following effects generate haptic feedback (a sketch of the plane constraint effect follows the list):

  • Plane Constraints are active when oblique slice planes are locked or during interaction with other planar user interface elements like menus. They are defined by a plane, force calculation parameters (stiffness and damping), and an activation distance that disables the force effect if the distance of the haptic stylus exceeds the limit. Once the haptic stylus is close enough, the force effect pulls the haptic stylus towards the plane.

  • Rigid Surfaces are rendered for selected segmented structures like the head surface during entry point definition. Such guides help during 3D point placement, because operators do not need to verify the placement from different perspectives.

  • Safe Corridors are defined by safe zones. Section 3.2 provides details about haptic guides for safe corridors.
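As an example, the plane constraint effect can be sketched as a spring-damper acting along the plane normal. The parameter names and the activation test below are our assumptions for illustration, not H3DAPI's API.

```python
import numpy as np

def plane_constraint_force(tip, V, p, n, k, c, activation_dist):
    """Pull the stylus tip towards a locked slice plane SP(p, n).

    tip -- stylus tip position; V -- tip velocity
    Active only while the tip is within activation_dist of the plane.
    """
    d = np.dot(np.asarray(tip, float) - p, n)  # signed distance to the plane
    if abs(d) > activation_dist:
        return np.zeros(3)                     # too far away: effect disabled
    F = -k * d * np.asarray(n, float)          # spring towards the plane
    return F - c * np.asarray(V, float)        # damping for stability
```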

During interviews with medical experts, we gathered feedback at various stages of development. The volume exploration component received positive feedback during reviews with junior and senior medical experts. The experts did not require any training to use the system and stated that navigating the medical dataset is very intuitive and efficient. They suggested extending the current implementation with capabilities for scaling the dataset and with magnification lenses, so that operators can interact more precisely and get a more detailed view of the patient data. The final review with a junior medical expert showed that the current application provides a useful interface for trajectory planning that should be further evaluated with a group of more experienced neurosurgeons and neurologists to validate the utility of the proposed planning tool. During the review, the junior expert tested the trajectory planning workflow by performing several planning tasks. He found the haptic guide that displays the head surface very useful for correctly positioning the initial entry point. The proposed method to improve trajectory refinement looks promising; however, the definition of safe zones using a disc-like shape could be improved by letting operators mark them more flexibly, for example with free-hand segmentation.

5 Conclusions and Future Work

The neurosurgery planning application provides a novel user interface for trajectory planning in stereotactic interventions. The intuitive user interface simplifies volume exploration. The additional haptic guides reduce visual clutter and help operators to efficiently plan trajectories. The interactive refinement approach reduces the number of iterations that are required for editing and refining trajectories by optimizing the entry point using manually defined safe corridors.

In future work, we plan to evaluate the VHAR user interface for trajectory planning with a group of medical experts. Such an evaluation would include comparing user performance and accuracy with and without haptic risk guides, comparing the proposed oblique slicing method with traditional cross-section slices, studying the benefits of the proposed viewing modes for the interactive volume exploration component, and comparing the proposed system to current state-of-the-art desktop-based planning systems.

Furthermore, we plan to extend the algorithms that refine the trajectory and provide haptic guides to support more flexible shapes for marking safe zones during the planning process. A possible approach is to allow operators to draw safe zones in the slice images instead of specifying a center and a radius. With more detailed information about risk structures in patient datasets such as blood vessels and critical brain regions, the haptic guides could also be automatically generated once a target point has been defined, similar to Shamir et al. [15].

Finally, the work presented in this paper enables the evaluation of 3D user interfaces for trajectory planning. The presented interaction techniques can potentially improve the outcome of stereotactic interventions for patients by reducing the required time for planning and by increasing the safety of the planned trajectories.